Fox News, one of the most relentless critics of the war on disinformation, now faces a complication: its own parent company is building internal capabilities to combat disinformation.
Last week, Fox Corporation posted a job listing for a corporate “trust and safety behavioral analyst” tasked with identifying “misinformation/disinformation.” The role will help implement a content moderation system across Fox’s businesses, including Fox News. According to the posting, the analyst will work closely with unnamed partners both inside and outside the company, and Fox plans to use pattern recognition, a core technique in artificial intelligence, to “identify hostile users.”
The analyst will focus on the “ongoing community health and brand safety of Fox sites and apps that directly engage with users” to “safeguard … user communities.” The job announcement highlights a background in “psychology, criminal justice, social media, gaming, news or media” as a plus.
Asked about the job listing, Fox did not comment.
The corporate interest in combating disinformation contrasts sharply with Fox News’s heavy criticism of anti-disinformation efforts aimed at regulating social media content, which the network consistently equates with censorship.
Following the creation of a now-defunct Disinformation Governance Board by the Department of Homeland Security in 2022, prominent Fox News hosts denounced the move in sensational terms. Hosts like Sean Hannity and Tucker Carlson likened the board to a “Ministry of Truth” from George Orwell’s “1984,” while Brian Kilmeade echoed their sentiments. Since then, Fox News has been fixated on the disinformation battle.
After the Disinformation Governance Board was revealed, 70 percent of Fox’s hour-long segments in the following week mentioned disinformation and the DHS official who led the board, according to a defamation lawsuit Nina Jankowicz filed against Fox News. Over the course of 2022, Fox News mentioned Jankowicz more than 300 times, the lawsuit says.
Fox’s corporate focus on disinformation differs from that of the federal government, as Fox is interested in audience “engagement,” a term emphasized multiple times in the job posting.
“Helping deliver innovative technology solutions to support user safety and increase engagement” is among the responsibilities listed in Fox’s job posting.
While the debate on content moderation often centers on freedom of speech and foreign influence campaigns, the profitability of removing content offensive to advertisers or audiences is a significant factor. Advancements in AI technology have made it increasingly feasible to conduct content moderation at scale.
Beyond machine learning, Fox’s job posting invokes other AI staples such as large language models and natural language processing. These technologies allow software to sift autonomously through vast amounts of data, making content moderation cheaper to run at scale than ever before.
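To make the idea of automated, pattern-based flagging concrete, here is a minimal sketch of the simplest version of the technique: rule-based pattern matching over user posts to surface accounts worth a human reviewer's attention. Everything in it (the patterns, the `flag_score` and `hostile_users` names, the threshold) is hypothetical and illustrative; production systems like the one Fox's posting describes would rely on trained models rather than hand-written rules.

```python
import re
from collections import Counter

# Hypothetical patterns a naive moderation pass might flag;
# real systems use trained classifiers, not hand-written rules.
FLAGGED_PATTERNS = [
    r"\bclick here to claim\b",   # spam-like phrasing
    r"(.)\1{5,}",                 # long character repeats, e.g. "!!!!!!"
]

def flag_score(text: str) -> int:
    """Count how many flagged patterns appear in a piece of text."""
    return sum(1 for p in FLAGGED_PATTERNS if re.search(p, text, re.IGNORECASE))

def hostile_users(posts: dict[str, list[str]], threshold: int = 2) -> list[str]:
    """Return users whose cumulative flag count meets a threshold."""
    scores: Counter = Counter()
    for user, messages in posts.items():
        for msg in messages:
            scores[user] += flag_score(msg)
    return [u for u, s in scores.items() if s >= threshold]

posts = {
    "alice": ["Great article, thanks!"],
    "spambot": ["Click here to claim your prize!!!!!!", "CLICK HERE TO CLAIM now"],
}
print(hostile_users(posts))  # → ['spambot']
```

The economic point from the paragraph above is visible even in this toy: once the rules (or, in practice, a model) exist, scoring a million posts costs roughly the same per post as scoring two, which is what makes moderation at scale feasible.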
Fox is not alone in leveraging AI advances to combat disinformation. Meta CEO Mark Zuckerberg has touted AI’s role in removing hate speech and terrorist content from Facebook, and the federal government is likewise turning to AI to identify foreign influence operations.
Despite the rapid changes AI has brought, these advances have yet to fully register in the disinformation debate.
“I am pro-disinformation because one man’s disinformation is another person’s fact,” Fox News host Greg Gutfeld stated in 2022.
Gutfeld may need to discuss this viewpoint with his employer.