Solutions to Disinformation: Whose Responsibility Is It, Anyway?

April 18, 2022, 2:52 p.m.



On April 6, 2022, the first day of the “Disinformation and the Erosion of Democracy” conference, Maria Ressa, CEO of the Philippine news website Rappler and Nobel Peace Prize winner, quoted Justice Louis D. Brandeis: “If there be time to expose through discussion the falsehoods and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence.” Ressa pushed back on this axiom, arguing that combating online disinformation today demands more than “more speech” and will require new approaches to regulating online platforms through technological and legislative means. Her comments raise the questions of what is needed to counter disinformation and who bears responsibility for doing so.

The Disinformation conference, created in partnership with the University of Chicago’s Institute of Politics and The Atlantic, convened speakers like former president Barack Obama, Senator Amy Klobuchar, former director of the Cybersecurity and Infrastructure Security Agency Christopher Krebs, and a variety of policy and technology experts to spark conversation and debate on approaches to disinformation. The conference tackled not only the root causes of disinformation and its spread, but also what forms effective counter-regulations and policy might take.

Krebs remarked that though some false information is designed to mislead in pursuit of specific agendas, much of it simply aims to “flood the zone.” For many speakers, however, the distinction between intentional manipulation and organized chaos is secondary to the fact that both have succeeded in creating widespread distrust. The panelists at Friday’s session discussed ways to combat disinformation in all its forms, from technology company accountability to digital literacy education, and which groups must lead efforts to create an informed and regulated online space for the future.

Legislating a Solution

A potentially powerful legislative solution lies in holding tech companies accountable for what occurs on their platforms. Minnesota Sen. Amy Klobuchar spoke at length about her current antitrust bill, which, at a high level, aims to prevent monopolies by promoting competition among tech businesses. To Klobuchar, the antitrust bill is also poised to combat disinformation: given the ties between tech monopolies and data privacy issues (she cited Google, a “dominant gatekeeper”), she hopes that the bill will hold tech companies to higher standards of ethical consumer data use.

Klobuchar also underscored the overlooked danger that tech monopolies pose to media transparency, proposing another piece of legislation that she believes will improve journalistic practice: the Journalism Competition and Preservation Act. The act is intended to “provide a so-called ‘safe harbor’ from antitrust laws, over a 48-month period, for news companies to collectively negotiate how their content is distributed on places like Google and Facebook.” In particular, it aims to support local news sources struggling to compete with major news companies by giving local news agencies a chance to negotiate revenue-sharing terms with major online platforms. Klobuchar emphasized that “people need to have a shared understanding of what is happening in their communities,” and that a bill protecting these companies will help promote accurate reporting.

For Klobuchar and co-panelist Deval Patrick, policy solutions function as a preventative “guard rail” rather than a surveillance network. Many panelists, however, also made the case for extra-governmental action. Stressing public doubts about government oversight, Michigan Rep. Elissa Slotkin argued, “Outside of government is where we can do the most open thinking, and then we can bring that to the government.”

Technology Transparency

Several panelists also discussed transparency and regulation of social media platforms’ policies. Historian and Atlantic writer Anne Applebaum suggested that platforms could encourage greater review and reflection (in the spirit of Wikipedia) by holding content for several hours before it is posted publicly and by focusing on shutting down fake accounts. Erin Simpson, Director of Technology Policy at the Center for American Progress, said that such regulations could bring about “a marginally less-bad social media, which is worth doing.”

Social media accountability advocate and Facebook whistleblower Frances Haugen applauded Twitter’s safeguards, which include prompting users to open links before sharing them and a company-wide commitment to providing researchers with random samples of 10% of Twitter activity through its application programming interface (API). Similar accountability measures, Haugen emphasized, could benefit other platforms. She also stressed the need for linguistic equity: equal content moderation across different languages, even when the content itself differs. Approximately 35.6% of the world has access to Facebook in 111 supported languages, only 41 of which have sufficient content moderation. As a dominant online news source, Haugen said, Facebook has a responsibility to adhere to its content moderation standards regardless of geography, content, or language.

Combating Disinformation with Education

Panelists also raised the idea that disinformation may be fought through history and digital media education. Kathleen Belew, a University of Chicago history professor, used the example of the Jan. 6 Capitol insurrection, arguing that many involved in the attack had embraced a particular “narrative construction” that portrayed the modern age as a threat to the white race. Belew highlighted the need for robust civics and history education to counter these narratives and better inform future generations. Both she and Slotkin also pointed to the value of digital literacy education in promoting better practices for viewing and interacting with content online.

Belew also spoke about the role education plays in debunking conspiracy theories, which she sees as a major source of disinformation. She described conspiracy theories as the product of fear, whether of political change or of apocalyptic events. Though these fears are often associated with certain political groups, Belew reminded the audience that fear of the end times is shared by nearly everyone, and she raised this common fear as a potential point of connection across different communities.

Thinking Beyond the Internet As We Know It

Though much of the conference centered on private-sector versus governmental responsibility, several panelists offered solutions that resisted this divide. Ethan Zuckerman, a professor at the University of Massachusetts Amherst, discussed his work theorizing alternatives to profit-driven platforms. “I don’t want to just make Facebook less awful,” Zuckerman stated. “I want to make social media that is good for us as citizens and as neighbors.” In his talk, he argued that profit-driven content moderation should be replaced with consumer-moderated online spaces. He also called for new platforms that supplement, rather than replace, current social media, comparing his vision to outlets like PBS and NPR, which were created in response to dissatisfaction with the state of television and radio in the 1960s.

Like Zuckerman, Erin Simpson said that regulation and accountability alone are not enough and that there is ample room to imagine better social media. Drawing on her work in technology policy, she argued that “we have a lack of imagination as to what consumer tech can be” because of the monopoly that major tech companies and social media platforms hold. “We want pluralism in our information infrastructure,” she remarked.

The conversations at “Disinformation and the Erosion of Democracy” suggest that a problem as universal as disinformation requires all hands on deck. Maria Ressa, who has been the target of online attacks and threats as a journalist, emphasized that disinformation is a major global crisis: “Every day that there isn’t a law, that there isn’t a guard rail in place, someone dies.” Illinois Rep. Adam Kinzinger, meanwhile, spoke hopefully about the future of democracy in spite of disinformation, saying, “Democracies aren’t defined by bad days, but how they come back from bad days.” It will remain up to all involved, from the courts to social media giants to the general public, to continue envisioning ways to transform social media platforms into spaces for truth, productive dialogue, and community.

The image featured in this article was taken by the article's author while reporting the story.


Molly Morrow

