Don’t miss this piece by Renee DiResta (director of research at New Knowledge and a Mozilla Fellow on media, misinformation, and trust) in Wired on the extent of the misinformation problem on the internet today. Some highlights:

 

But, ultimately, what the government—and the general public—is realizing is that while disinformation, misinformation, and social media hoaxes have evolved from a nuisance into high-stakes information war, our frameworks for dealing with them have remained the same. We discuss counter-messaging, treating this as a problem of false stories rather than as an attack on our information ecosystem. We find ourselves in the midst of an arms race, in which responsibility for the integrity of public discourse is largely in the hands of private social platforms, and determined adversaries continually find new ways to manipulate features and circumvent security measures. Addressing computational propaganda and disinformation is not about arbitrating truth. It’s about responding to information warfare—a cybersecurity issue—and it must be addressed through collaboration between governments responsible for the safety of their citizens and private industry responsible for the integrity of their platforms.

Social platforms have begun to take steps to reduce the spread of disinformation. These steps, several of which were inspired by prior tech hearings, are a good start. But as platform features and protections change, determined adversaries will develop new tactics. We should anticipate an increase in the misuse of less resourced social platforms, and an increase in the use of peer-to-peer encrypted messaging services. Future campaigns will be compounded by the use of witting or unwitting people through whom state actors will filter their propaganda. And we should anticipate the incorporation of new technologies, such as videos (“deepfakes”) and audio produced by AI, to supplement these operations, making it increasingly difficult for people to trust what they see.

In the short term, our government, civil society, political organizations, and social platforms must prioritize immediate action to identify and eliminate influence campaigns, and to educate the public ahead of the 2018 elections. In the longer term, it’s time for an updated global Information Operations doctrine, including a clear delegation of responsibility within the U.S. government. We should pursue the regulatory and oversight frameworks necessary to ensure that private tech platforms are held accountable, and that they continue to do their utmost to mitigate the problem in our privately-owned public squares. And we need structures for cooperation between the public and private sectors; formal partnerships between security companies, researchers, and government will be essential to identifying influence operations and malign narratives before they achieve widespread reach.

 


Posted by Carlos Alvarenga

Carlos A. Alvarenga is the Executive Director of World 50 Labs and Adjunct Professor in the Logistics, Business and Public Policy Department at the University of Maryland’s Robert H. Smith School of Business.
