The Hollywood Reporter

How the World’s Top News Orgs Hope to Tackle Misinformation as a Historic U.S. Election Looms

Lily Ford
8 min read

With a democracy-defining election on the horizon, it’s never been more vital to trust the news.

Republican presidential candidate Donald Trump went head-to-head with Democratic Vice President Kamala Harris in a feisty ABC News debate on Sept. 10 that saw the former president fact-checked in real time over false claims that Haitian immigrants in Springfield, Ohio, were eating the pets of the town’s residents.



“Now she wants to do transgender operations on illegal aliens who are in prison,” the 78-year-old also claimed, referring to Harris saying “transgender individuals who rely on the state for care [should] receive the treatment they need,” in response to an ACLU questionnaire in 2019.

The Republican repeatedly reinforced his belief that he won the 2020 election against Joe Biden, despite election officials across the country confirming the election was not rigged. ABC News moderators David Muir and Linsey Davis were there to debunk Trump’s various statements across the lengthy debate, including his claim that some states allow the “execution” of babies after they are born. Davis swiftly told Trump: “There is no state in this country where it is legal to kill a baby after it’s born.”

However, verifying claims on social media has not proved as rigorous, or indeed as instantaneous. Trump, for example, took some heat in recent weeks for sharing AI-generated images that falsely depicted Taylor Swift endorsing him for the 2024 election (the pop star later declared her intention to vote for Harris on her official Instagram account). It didn’t stop the images from swirling around on Elon Musk’s X and Mark Zuckerberg’s Facebook for weeks.

A World Economic Forum survey has named misinformation the biggest global risk over the next two years — ahead of climate change and war. This year, it’s not just Americans voting for their leader. Almost half the world is voting in elections across over 50 countries. Deepfakes, fake news and propaganda are now posing unprecedented content verification challenges for newsrooms and trusted media brands.


This puts pressure on some of the world’s biggest news organizations to get authentic and truthful journalism to the online masses. According to the Pew Research Center, just over half of U.S. adults (54 percent) get a portion of their news from social media. In a world where even a former president and presidential candidate can lie repeatedly on the debate stage, amplifying the truth remains a priority.

Now a number of major global media organizations are embarking on a cross-industry initiative, led by the BBC and CBS News, called Design Your Weapons in the Fight Against Disinformation. Part of the IBC Accelerator Media Innovation program, its goal is to help news organizations address the challenges of disinformation by, for the first time, teaming up to tackle the problem.

It’s not just the BBC and CBS News taking part. The Associated Press (AP), Paramount, Reuters, ITN, Globo, Al Jazeera and Channel 4 are all part of the initiative. “Look, disinformation is nothing new,” Tim Forrest, ITN’s content editor for projects, tells The Hollywood Reporter. “But what we’ve seen in the last few years has been a growth in fakery. Fundamentally, what we’re seeing is that journalists and news organizations have always asked the question: Is it true? And that continues to be the case today, but increasingly and going forward, we’re also going to have to ask: Is it real? That is a change for all of us that we need to wrap our head around.”

The AP’s social media and user-generated content editor Nadia Ahmed adds: “How can we help viewers to understand what is fact and what isn’t, and what is real and what isn’t?” Simply put, they are working to make sure falsities are filtered out before they reach you, their audience.

Mark Smith, head of the IBC Accelerator program and Judy Parnall of the BBC.

The initiative is three-fold, and starts with provenance — where a claim, image or video has originated from. “Provenance is similar to a nutrition label with food,” explains the BBC’s head of standards & industry Judy Parnall. “So we’re used to what’s in your piece of food, but actually what’s in your media? Understanding where it came from, who is standing behind it and how it was made. So you’re not just blindly receiving a video and going: ‘Gosh, that must be true, because that’s a video I’ve received.'”


Parnall says the BBC has even seen this with “shallowfakes,” which are far less edited than a deepfake but can still change a story entirely. To address this, news orgs want to find out what tools are out there to investigate a story’s provenance. “What are the different approaches? Do you need to use blockchain? How complex is it to use?” she says.

Ahmed says detection becomes vital if a user can’t get that provenance information, and detection tools must step in to decipher what’s real or true. “It’s very, very new technology,” Ahmed says. “Because creation is very new, and the detection tools are, if we’re honest, struggling to keep up. For the most part, there’s not enough money in it. So what we’ve found through the Accelerator initiative is looking at a wide variety of tools that are available on the market.”

She continues: “What often becomes a good approach is to use a combination of them to try and get as much contextual information as possible, a manual verification process, so you’re not necessarily relying on a machine to tell you ‘this is true’ or ‘this is fake.'”

New technology can now be employed to quickly determine if an image has been altered or tampered with through the use of AI.


Says Ahmed: “You’re looking at machine results that tell you with this percentage confidence rate [how confident the machine is in its findings], we can tell that there are some issues and artifacts around the face in this photo, or the voice in this video, that seem AI-generated. And then the journalists — or hopefully in the future, just any user — is able to go in and look at that information and make an editorial or a trust decision.”

But what makes this initiative a game-changer is collaboration. The outlets are hoping to figure out a way forward together and, crucially, share their resources. “We will be in a position very soon where the human eye cannot tell the difference [between real and fake],” Forrest says. “No one organization can fix this. The scales that we’re talking about here mean that even as national broadcasters, we’re often operating against platforms that are the size of nations and have the budgets to go with it, and unless we collaborate, then we really can’t hope to get a grip on this.”

Ahmed adds that the only way to move forward is with transparency. Identifying falsities and being upfront with your readership is what matters most. She says the threat misinformation poses is palpable: “If I’m honest at this point, specific instances of misinformation are changing public opinion and affecting elections everywhere. It’s a huge issue.”

It’s on Zuckerberg-owned Meta and Musk’s X to help, too. “One of the things I hear a lot is that the social media platforms, they’ve been absent from our group,” says Mark Smith, chair of IBC Council and head of the IBC Accelerator Programme. “Much more dialogue is required there, because they are a huge part of this.”


Only last year, the EU issued a warning to Musk to comply with new laws on fake news and Russian propaganda, after X, formerly Twitter, was found to have the highest ratio of disinformation posts of all large social media platforms.

Says Forrest: “As news providers, we have a heritage that has built trust over decades, and I think we can bring that expertise to the space. That’s what we’re looking to do here.” Adds Parnall: “Everybody wins if truth wins out, and it’s not just the broadcasters and the traditional news providers, but it’s the social media platforms. Ultimately, if people see that they can’t rely on social media platforms, they’ll start to move away from them. This rising tide lifts all boats.”

A December 2020 survey reported by Redline said that 38.2 percent of U.S. news consumers reported unknowingly sharing fake news or misinformation on social media. With the rapid development of AI in even the past two to three years, the percentage is likely to be higher in 2024.

IBC2024, the annual media trade show that takes place in Amsterdam, recently announced that 45,085 visitors from 170 countries attended the conference from Sept. 13-16, bringing together the global media, entertainment and technology community to showcase innovations and tackle pressing industry challenges — including misinformation.


“As media organizations, we’re not necessarily wanting to place ourselves as the arbitrators of truth,” says Ahmed. “Our job is to put out the facts as we find them, and facts do and should always have power.”


