
Social Media Platforms Fight Back Against Manipulative Foreign Actors

  • Although Americans expect to encounter disinformation on social media in the run-up to the 2020 elections, many continue to consume news online primarily via those platforms.
  • Foreign actors have used social media to increase discord in the United States, capitalizing on racial and economic tensions, and have even attempted to illegally influence U.S. political processes.
  • Social media companies are combating foreign actors’ efforts to influence and manipulate Americans via social media.
  • Social media companies have a responsibility to protect Americans from manipulation and disinformation by foreign actors on their platforms.
  • Social media companies’ preventative efforts to curb foreign actors are part of a larger societal effort to protect against disinformation, but more research and action are needed as disinformation efforts evolve.

Social media companies are taking meaningful steps to curb foreign adversaries’ efforts to influence the American people. Foreign actors have been manipulating Americans’ access to truthful information and, with it, their ability to reason about events. Polling shows a large majority of Americans expect to encounter disinformation on social media in the run-up to the 2020 elections. Despite that expectation, Americans continue to use social media as a primary vehicle for consuming news. U.S.-based social media companies, especially those with large American user bases, have a responsibility to combat foreign interference taking place via their platforms.

The steps that some of the largest social media companies have taken show their determination to combat disinformation. In addition to its ongoing work to fight disinformation, Facebook recently announced that it would begin blocking state-controlled media outlets from purchasing advertising in the United States this summer and would add labels so users can identify posts from state-controlled outlets. Instagram, a Facebook-owned platform, has also taken steps to combat disinformation. Twitter introduced new labels and warning messages for posts that contain misleading information on COVID-19, and is considering labeling other tweets that could mislead or confuse users on other topics.

How did we get to this point?

These efforts fit into a broader context of preventing foreign adversaries from manipulating U.S. current events. For example, foreign actors have attempted to utilize disinformation to exacerbate and exploit pre-existing racial tensions domestically, to promote extreme points of view, and to illegally influence American political processes.

  • The protests against the Minneapolis Police Department following George Floyd’s death in May 2020 are a prime example. According to a Politico review of social media activity, Chinese- and Russian-backed outlets tweeted and retweeted about U.S. domestic affairs more than 1,200 times, using hashtags like #BlackLivesMatter and #Minneapolis. These digital actors portray the U.S. as a country on the brink, often highlighting the perceived hypocrisy of America’s rebukes of Moscow’s and Beijing’s civil rights abuses.
  • Similar disinformation efforts surrounded the killing of Michael Brown in 2014 and the emergence of the #BlackLivesMatter and #BlueLivesMatter movements. Twitter accounts associated with the Internet Research Agency (IRA, a Russian government-backed online disinformation promoter) were particularly active during that period. They mixed retweets and links to both mainstream news sources and more inflammatory conspiracy-promoting sources in an effort to normalize and promote more extreme views. The IRA’s actions manipulated discussions on the platform to deepen divisions among Americans across the spectrum of opinion. Foreign efforts to widen those divisions even extended from social media into physical action when a Russian-run Facebook group turned its online presence into thousands of protesters in New York City.
  • In another example of disinformation, less than a month after the first reported COVID-19 death in the United States, Chinese officials attempted, without evidence, to paint the U.S. Army as responsible for the pandemic. This was among the latest illustrations of ongoing Chinese information warfare. Meanwhile, the Russian government spun information to shift blame to the United States and other Western countries.
  • As discussed by the U.S. Senate Select Committee on Intelligence, Russian operatives, via the IRA, used social media to conduct an information warfare campaign designed to spread disinformation and societal division in the U.S. This campaign sought to polarize Americans along societal, ideological, and racial lines, provoked real-world events, and formed part of the Russian government’s covert support of its favored candidate in the 2016 U.S. presidential election.

What happens next?

The steps social media companies are taking to combat such efforts are hopeful signs. Nonetheless, these technological efforts are only one part of a broader societal fight against manipulation by foreign actors. Evidence suggests foreign actors will remain interested in undermining the confidence Americans have in civil society, and even in one another. The disinformation tactics that foreign actors use are known, even as they evolve to exploit the latest communications methods. Research into how to combat these efforts is widespread and ongoing.

These preventative actions by social media platforms have the potential to push foreign actors to modify their tactics and techniques. Their effectiveness may be diminished, however, if foreign actors respond with narratives that undermine the legitimacy of the precautions themselves. Such narratives could promote the view that social media companies are biased and that their measures are political in nature, provoking a backlash. Social media companies should therefore consider proactively addressing criticism of their efforts to protect their platforms from foreign manipulation.

Not enough is known yet about Americans’ reactions to the preventative measures social media companies are taking. A majority of Americans already believe that social media companies have too much control over the content on their platforms. Considering that so many Americans continue to get news from social media even while acknowledging that they do not trust that news, public reaction to these measures is difficult to predict. Research into how Americans feel about these measures is needed, and additional efforts will be required to track how foreign actors react to social media companies’ moves to curb state-sponsored manipulation.

Technical solutions are not the only answer

As foreign actors continue to seek competitive advantages over the United States, the U.S. must reorganize its strategic thinking about public manipulation so that all social groups are aware of their vulnerability to it. Actions taken in the private sector, such as the social media companies’ preventative measures, are additive and work in concert with efforts from government and academia. The Federal Bureau of Investigation (FBI), the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (DHS/CISA), and the Office of the Director of National Intelligence (ODNI), for example, produce a number of tools and resources through the joint Protected Voices initiative that can help protect against online foreign influence operations. Social media companies could follow this example and produce similar trusted content to help users spot and avoid disinformation and manipulation efforts on their platforms.

These steps come at a time of growing recognition that Americans are already on the battleground where foreign adversaries are fighting to influence us and manipulate our thinking. Truly effective defenses against disinformation and manipulation are unlikely to be easy to find or implement, and especially unlikely to arrive before the U.S. elections in November. More research is also needed into the long-term effects of disinformation, which are not yet fully known. There is still time, however, for all of these efforts to be brought together into a cohesive whole that can help build a more resilient U.S. population, one with effective defenses against manipulation.

American University School of International Service Disinformation Research Team
The authors - Matthew Castle, Shaunil Chokshi, Daniel Jativa, Fahad Mirza, Michael Pattullo, Stephen Rudd, and Geoffrey Wright - are part of the School of International Service Disinformation Research Team, a team of graduate students from American University’s School of International Service working on themes related to disinformation and the 2020 elections process. Through its collaborative efforts, with the assistance of Professor Jorhena Thomas, the team works with government and private sector partners to identify, expose, and counter disinformation targeted at the United States populace. Team members have a range of backgrounds and expertise (intelligence, military, Capitol Hill, etc.) that shapes their approach to the topic. The work herein reflects the contributors’ research, analysis, and viewpoints and is not reflective of any professional or other affiliations of the contributors.
