
ASU’s Weaponized Narrative Initiative Teams Up with Truepic to Fight Fake News, Verify Images

By Henry Kronk
October 06, 2018

During the past two presidential administrations, internet trolls have manipulated images of George Bush and Barack Obama talking on the phone to make it appear the device was upside-down. These are some of the lighter instances of what a group at Arizona State University (ASU) calls weaponized narratives: fake stories created to undermine an opponent. The ASU Weaponized Narrative Initiative focuses on these malicious uses of fake news, and on Tuesday, it announced a new partnership with Truepic, an image-verifying app and platform, in the run-up to the 2018 midterms.

In the weeks and days before an election, many images of dubious origin circulate in the hopes they’ll create at least a temporary impression to sway voters one way or another. With the new partnership, ASU’s Weaponized Narrative Initiative is hoping to recruit the higher education community to participate with journalists and others in spreading awareness and verifying images.

“It is incredible to watch the development and deployment of technologies that weaponize information streams to the extent that citizens can no longer tell what is real and what has been manufactured to mislead them,”  said Brad Allenby, Co-Director of the Weaponized Narrative Initiative, in a statement. “In an effort to raise awareness and stem this trend, ASU is partnering with Truepic to try to counter these attacks with authenticated images that defend rationality and transparency in yet another critical way.”

ASU’s Weaponized Narrative Initiative Is Ramping Up for November

The ASU initiative defines weaponized narrative as “an attack that seeks to undermine an opponent’s civilization, identity, and will. By generating confusion, complexity, and political and social schisms, it confounds response on the part of the defender,” according to their site.

“A fast-moving information deluge is the ideal battleground for this kind of warfare – for guerrillas and terrorists as well as adversary states. A firehose of narrative attacks gives the targeted populace little time to process and evaluate. It is cognitively disorienting and confusing – especially if the opponents barely realize what’s hitting them. Opportunities abound for emotional manipulation undermining the opponent’s will to resist,” they write.

Russian interference in the 2016 election, including false and misrepresented news stories disseminated through Facebook and other social media platforms, is a good example of a weaponized narrative.

There have already been so many reported and identified Russian weaponized narratives in the run-up to the 2018 midterm elections that there’s a Wikipedia page dedicated to them.

Enter Truepic

The free Truepic mobile app acts as a digital notary. It first checks images for signs of editing or alteration. It then gives the image a watermark and logs it onto Bitcoin’s blockchain ledger along with numerous metadata points, such as a geocode and timestamp.
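Truepic’s actual pipeline is proprietary, but the notarization idea described above can be sketched in a few lines: hash the image bytes, bundle the hash with capture metadata, and derive a single fingerprint that could be committed to a public ledger. The function names and record fields below are illustrative assumptions, not Truepic’s API.

```python
import hashlib
import json
from datetime import datetime, timezone

def notarize_image(image_bytes, latitude, longitude):
    """Build a tamper-evident record for an image: a content hash plus
    capture metadata. (Illustrative sketch; not Truepic's real format.)"""
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "geocode": {"lat": latitude, "lon": longitude},
    }
    # Serializing with sorted keys gives a canonical form; hashing that
    # form yields the single value one would anchor on a blockchain.
    canonical = json.dumps(record, sort_keys=True).encode()
    ledger_key = hashlib.sha256(canonical).hexdigest()
    return record, ledger_key

def verify_image(image_bytes, record):
    """Re-hash the image and compare against the notarized digest.
    Any edit to the pixels changes the hash and fails verification."""
    return hashlib.sha256(image_bytes).hexdigest() == record["sha256"]
```

Because the ledger entry is derived from the image’s hash rather than the image itself, anyone can later confirm a photo is unaltered without the notary storing or revealing the photo.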

“We believe that the ultimate solution to the crisis of disinformation will not come from one sector, but, instead, will be a whole-of-society approach, including collaboration between technologists and academia. We look forward to our partnership with Arizona State University which will no doubt help us continue innovating and developing technology to return trust in images around the world,” said Jeff McGregor, CEO, Truepic, in a statement.

Both Truepic and the Weaponized Narrative Initiative are part of the U.S. Department of State’s Global Engagement Center Tech Demo series, an effort launched to harness tech to fight propaganda from foreign countries and terrorist organizations.

In June, Truepic secured a funding round of $8 million to develop its technology. The company has formed diverse partnerships around the world, such as with the subreddit AMA (Ask Me Anything) and the Syrian American Medical Society.

Featured Image: Saffu, Unsplash.