Author: Julia Mallon
Disinformation is a global problem. It exists everywhere, including in the United States, where disinformation about the recent presidential election, the pandemic, and vaccines has fueled the polarization of the American electorate.
US experts agree that something has to be done, but whether it is the responsibility of the government, social media companies, educators, or all three is a subject of debate.
“I can’t say what kinds of regulations would be advisable or effective. I can only say the effort is all hands on deck, requiring work from politicians, professional journalists, tech companies, and educators,” said Jonathan Anzalone, Assistant Director of the Center for News Literacy at Stony Brook University in New York.
Disinformation and its impact on democracy have been the focus of conferences, Congressional investigations, studies, and books. The issue has sparked grassroots movements and has been tied to a growing distrust of news outlets.
According to a 2019 study from the Pew Research Center, 50% of American adults surveyed said made-up news and information is a big problem, and 68% said it affects Americans’ confidence in the government.
In April, former US President Barack Obama described disinformation as a threat to democracy in his keynote address at Stanford University.
“People are dying because of misinformation,” he said, referring to the misinformation and disinformation surrounding vaccines.
The United States Government’s Role
To what extent, if at all, the US government should get involved in addressing this problem is a topic of ongoing debate and the focus of numerous groups.
The Stanford Internet Observatory (SIO), the University of Washington’s Center for an Informed Public (CIP), Graphika, and the Atlantic Council’s Digital Forensic Research Lab (DFRLab) formed the Election Integrity Partnership (EIP) ahead of the 2020 presidential election; the group published its final report after the January 6, 2021 attack on the US Capitol.
Their stated goal is to “create a model for whole-of-society collaboration and facilitate cooperation among partners dedicated to a free and fair election.”
The EIP made key recommendations for the federal government, Congress, state and local officials, and social media platforms to help achieve this while combatting disinformation.
In its final report, the EIP recommended the federal government establish a clear role in “identifying and countering” election-related disinformation and that it recognize elections as a national security priority.
One of the group’s recommendations at the state and local level is to “establish trusted channels of communication with voters,” including a .gov website and use of both traditional and social media. This is an election-specific example of addressing disinformation.
Other experts agree the government can and should take decisive action to combat the broader problem of disinformation.
Martha Minow, Harvard Law professor and author of Saving the News: Why the Constitution Calls for Government Action to Preserve Freedom of Speech, says three well-supported government initiatives are needed to combat disinformation.
“The government should treat digital platform companies as responsible for their product the same way other media are responsible under copyright law, defamation and fraud law, and privacy and other laws,” says Minow.
Minow believes consumer protection laws should be used to hold media companies responsible for content they run that violates well-established consumer protection standards against fraud and misrepresentation. She also believes the country needs to increase the supply of quality information through greater support for public, nonprofit, and local media, as well as media education.
During his keynote address, President Obama acknowledged how tech companies have evolved and said he would support revamping a section of the Communications Decency Act, which shields what are now tech companies from being held accountable for content produced on their platforms.
“This provision of the Communications Decency Act was intended to shield fledgling internet-based companies from legal burdens and grow the sector,” Minow says. She believes the current act grants an ‘immunity’ from liabilities, such as fraud and defamation, and treats platform companies differently from any other media.
“Internet-based companies are now among the largest and most well-capitalized companies in history and do not need what is in essence a subsidy, giving them an advantage over other media companies,” says Minow. “Revamping the act could mean taking the advantage away or conditioning it on accountability requirements, duties to disclose moderation practices, or other specified responsibilities.”
In the meantime, some states have begun to respond to the call for a greater government role in combating disinformation, particularly around elections. Connecticut, Colorado, and California have each hired what amounts to a disinformation czar, whose role includes monitoring sites and attempting to stop disinformation and false narratives before they go viral.
Tech Companies’ and the Media’s Role
Along with advocating for possible government regulation, some experts believe the large social media platforms and the media in general need to play a more significant role in limiting disinformation.
According to the Pew Research Center, “Journalists are not blamed most for creating made-up news and information, but Americans say the news media are most responsible for fixing it.”
A recent report from a commission at the Aspen Institute, a nonprofit organization “committed to realizing a free, just, and equitable society,” agrees. The Commission on Information Disorder recommended increasing transparency, specifically within social media platforms.
One key recommendation is what the commission calls “high-reach content disclosure”: a “legal requirement for all social media platforms to regularly publish the content, source accounts, reach and impression data for posts that they organically deliver to large audiences.”
A second is what it calls content moderation platform disclosure. This entails requiring “social media platforms to disclose information about their content moderation policies and practices and produce a time-limited archive of moderated content in a standardized format, available to authorized researchers.” Minow also believes social media platforms should face more regulation where lawful.
Relying only on self-regulation has failed because “private companies are motivated to make profits and the dominant model of both social media and increasingly conventional media focuses on advertising and keeping viewers and users ‘engaged,’” Minow said. “The companies themselves indicate that the problem is bigger than any one of them to solve.”
During his address, President Obama also said that tech companies need to be more transparent about how they operate and that regulatory structures need to be designed in collaboration with companies, experts, and communities affected.
“In a democracy, we can rightly expect companies to subject the design of their products and services to some level of scrutiny. At minimum, they should have to share that information with researchers and regulators who are charged with keeping the rest of us safe,” President Obama said.
Education System’s Role
According to a study from the Pew Research Center, only 26% of U.S. adults surveyed could correctly classify all five factual statements presented to them, and only 35% could correctly classify all five opinion statements. A separate study found that 96% of U.S. high school students surveyed failed to challenge the credibility of an unreliable source.
To combat the average American’s information gap, one of the Aspen Institute’s key recommendations is to generate what it calls civic empowerment.
Specifically, the commission recommends the US should, “invest and innovate in online education and platform product features to increase users’ awareness of and resilience to online misinformation.”
To address this challenge, many colleges and high schools in the US and internationally have launched media literacy programs. Media literacy is the ability to analyze and evaluate information from media outlets to determine what’s true and what’s false.
Erin McNeill, president and founder of nonprofit organization Media Literacy Now, has been tracking media literacy and policies related to it for ten years.
“There’s this question of do we do education or do we do regulation,” McNeill says. “If we educate all of our students as they come into the workforce, they’ll be more prepared for how we deal with these really huge challenges right now in the media system.”
The Center for News Literacy, based at the State University of New York at Stony Brook, has been the leader in media literacy education since 2007.
“Our focus is educating the public,” says Stony Brook’s Anzalone, “which we wouldn’t be doing if we didn’t think it would make an impact.” When it comes to the education-versus-regulation debate, Anzalone believes it’s a joint effort.
The course description of Stony Brook’s JRN 101: News Literacy reads, “Armed with critical-thinking skills, a firm grasp of relevant history and practical knowledge about the news media, News Literacy students learn how to find the reliable information they need to make decisions, take action, make judgments and responsibly share information through social media.”
The program at Stony Brook goes beyond teaching media literacy courses at the college level. The center’s staff trains high school and middle school educators to teach media literacy and provides an extensive collection of media literacy resources, along with free online courses such as “Making Sense of the News” and “News Literacy in the Age of Coronavirus.”
One of the challenges the program at Stony Brook and other organizations have faced is that each city, town, or county across the US has its own school system that decides if and how media literacy courses will be part of its curriculum.
“Admittedly, it’s very difficult to do at scale, with the decentralized education system in the United States,” said Anzalone, but he is optimistic.
“With the challenges we’re currently facing comes the opportunity to reinvigorate our civic life and strengthen our democracy by teaching people how to find the reliable information that’s essential to their lives,” he said.