One data scientist's quest to quash misinformation

One day in early June 2018, Sara-Jayne Terp, a British data scientist, flew from her home in Oregon to Tampa, Florida, to take part in a training exercise organized by the US military. On the anniversary of D-Day, US Special Operations Command had brought together a group of experts and soldiers for a thought experiment: if the invasion of Normandy took place today, what would it look like? The 1944 operation succeeded in large part because the Allies had spent about a year planting false information, convincing the Germans there were troops where there were none, broadcasting fictitious radio transmissions, even staging decoy tanks at key locations. Now, with today’s tools, how do you deceive the enemy?

Terp spent the day in Florida thinking about how to fool a modern enemy, though she never saw the results. “I think they promptly filed the report away,” she says. But she ended up at dinner with Pablo Breuer, the Navy commander who had invited her, and Marc Rogers, a cybersecurity expert. They got to talking about modern deception and, in particular, a new danger: campaigns that use ordinary people to spread false information through social media. The 2016 election had shown that foreign countries had playbooks for such operations. But in America there had been little response, and no defense.

“We’ve done enough admiring the problem,” Breuer says. “Everybody’s chasing it. Nobody’s doing anything.”

They talked about starting their own effort to track and counter misinformation: if someone launched a campaign, they wanted to know how it worked; if people around the world began repeating the same strange theory, they wanted to know who was behind it. As hackers, they were used to taking things apart to see how they worked, using artifacts hidden in code to tie malware to a Russian crime syndicate, for example, or reverse-engineering a denial-of-service attack to find a defense. They realized that misinformation could be treated the same way: as a cybersecurity problem.

The trio left Tampa convinced there had to be a way to break down disinformation campaigns so that researchers could understand how they worked and counter them. Soon after, Terp helped convene an international group of security experts, academics, journalists, and government researchers to work on what she called “misinfosec.”

Terp knew, of course, that there was a key difference between malware and influence campaigns. A virus spreads through vulnerable endpoints and nodes in a computer network. With misinformation, those nodes aren’t machines; they’re human. “Beliefs can be hacked,” Terp says. If the network was to be protected from attack, she thought, its weaknesses had to be identified. In this case, the network was the people of the United States.

So when Breuer invited Terp back to Tampa six months later to talk through her idea, she decided not to fly. On the last day of 2018, she packed up her red Hyundai for a few weeks on the road. She stopped by a New Year’s party in Portland to say goodbye to friends. A storm was coming in, so she left well before midnight to cross the mountains east of the city, skidding through the pass as highway crews closed the roads behind her.

So began an odyssey that started with a 3,000-mile drive to Tampa but didn’t stop there. Terp spent nearly nine months on the road, from Indianapolis to San Francisco, Atlanta to Seattle, building a playbook for fighting misinformation and pitching it to colleagues across 47 states. Along the way, she also took note of the vulnerabilities in America’s human network.

Terp is a shy but warm middle-aged woman with hair she likes to change: now short and gray, now a blond bob, now a shade of reddish lavender. She once gave a talk titled “An Introvert’s Guide to Presentations” at a hacker convention, the kind of event to which she brings a teddy bear. She likes finishing the half-done cross-stitch projects she buys in thrift stores. She is also an expert at making the invisible visible, at detecting submerged threats.

Terp began her career doing defense research for the British government. Her first job was developing algorithms that could combine sonar readings with oceanographic data and human intelligence to locate submarines. “It was big data before big data was big,” she says. She quickly became interested in how data shapes beliefs, and how it can be used to manipulate them. It was the Cold War, and keeping the upper hand meant knowing how the enemy would figure out whether you were lying.

After the Cold War ended, Terp shifted her focus to crisis response; she became a crisis mapper, collecting and synthesizing data from sources in the field to build a coherent picture of what was happening.

It was during disasters like the Haiti earthquake and the BP oil spill in 2010, when Terp’s job was to gather real-time data from social media, that she began to notice what looked like deliberately false information designed to confuse an already chaotic situation. One article, citing Russian scientists, claimed the BP spill would cause the ocean floor to collapse and trigger a tsunami. At first Terp saw these as isolated incidents, junk clogging her data streams. It wasn’t until the 2016 election that it became clear to her, and to many others, that disinformation campaigns were being waged and coordinated by sophisticated adversaries.

When Terp drove across the country in 2019, it was a bit like crisis-mapping the United States. She watched people in cafés. She struck up conversations over breakfast at Super 8s. She tried to get a read on the landscape: what communities did people belong to, how did they see one another, what were they thinking, how did they talk to each other? Slowly, she amassed her impressions.

In Tampa, Terp and Breuer quickly began sketching out their defense against misinformation. The premise was that small clues, such as specific phrasings or misspellings in viral posts, or the way a Twitter profile is put together, can reveal a campaign’s origin and scope. These “artifacts,” as Terp calls them, are the breadcrumbs an attack leaves behind. The most effective approach, they thought, would be to organize a way for the global security community to trace those trails of breadcrumbs.

Because cybercriminals tend to draw their exploits from a stockpile of common techniques, many cybersecurity researchers rely on an online database called ATT&CK, a shared catalog of the tactics and techniques attackers are known to use.

Terp stayed in Tampa for a week before getting back on the road, but she kept working as she traveled. To start their database, the misinfosec team analyzed past campaigns, from the 2015 military training exercise Jade Helm 15, which was spun on social media into an attempted imposition of martial law in Texas, to the Russia-linked Blacktivist narratives that stoked racial division before the 2016 election. They wanted to break down how each campaign worked, cataloging artifacts and identifying methods that showed up again and again. Does a retweet from an influencer lend a message legitimacy and reach? Was a hashtag borrowed from another campaign in hopes of poaching followers?
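
A toy version of that recurrence analysis might look like the sketch below, written in Python. The campaign names come from this story; the method labels are illustrative shorthand, not the team's actual catalog entries.

```python
# Tally which methods appear across past campaigns; methods seen more than
# once become candidates for a shared technique catalog.
from collections import Counter

past_campaigns = {
    "Jade Helm 15 (2015)": ["influencer_retweet", "borrowed_hashtag", "fake_local_accounts"],
    "Blacktivist (2016)": ["fake_local_accounts", "influencer_retweet", "divisive_memes"],
}

recurring = Counter(
    method for methods in past_campaigns.values() for method in methods
)
print(recurring.most_common())
```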

Once they recognized the patterns, they reasoned, they would also see the choke points. In cyberwarfare there is a concept called the kill chain, adapted from the military. Map the stages of an attack, Breuer says, and you can anticipate what the adversary will do next: “If I can break that chain, if I can break a link somewhere, the attack fails.”
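
The kill-chain logic can be put in a few lines of code. This is only an illustration: the stage names below are invented for the example and are not AMITT's actual phase labels. The point it demonstrates is Breuer's, that a campaign has to complete every stage, so blocking any single link stops the whole attack.

```python
# A toy model of the kill-chain idea: the campaign must complete every stage
# in order, so breaking any one link defeats the attack.

CAMPAIGN_STAGES = [
    "plan_narrative",      # decide what story to push
    "create_accounts",     # stand up sock puppets and pages
    "produce_content",     # write posts, cut video, forge "evidence"
    "seed_content",        # post into fringe groups and forums
    "amplify",             # hashtags, bots, influencer retweets
    "mainstream_pickup",   # the story jumps into large outlets and feeds
]

def campaign_succeeds(blocked: set) -> bool:
    """The campaign only works if every stage in the chain remains open."""
    return all(stage not in blocked for stage in CAMPAIGN_STAGES)

print(campaign_succeeds(set()))          # True: no defense, attack completes
print(campaign_succeeds({"amplify"}))    # False: one broken link stops it
```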

The misinfosec group set about building a comparable catalog of disinformation techniques, modeled on the ATT&CK framework; they called it AMITT, for Adversarial Misinformation and Influence Tactics and Techniques.

Last October, the team incorporated AMITT into an international open source threat-sharing platform, meaning that anyone, anywhere, can upload a misinformation campaign and, with just a few clicks, specify which tactics, techniques, and procedures were in play. Breuer has adopted the term “cognitive security” to describe the work of keeping attackers from hacking people’s beliefs, a job they hope global cybersecurity teams and threat researchers will take on, much as they might manage a brand’s reputation, protect against market manipulation, or shield a platform from legal risk.
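
What “uploading a campaign” might look like as data is sketched below. The record structure and the “T00xx” ID format are assumptions for illustration, not any platform’s real schema; the three technique labels are the AMITT examples Breuer cites later in this story.

```python
# Illustrative incident record tagged with AMITT-style techniques.
import json

incident = {
    "name": "Spanish-language Covid-19 bioweapon video",
    "first_seen": "2020-02",
    "regions_observed": ["MX", "GT", "AR"],
    "artifacts": [
        "narration in Spanish",
        "on-screen patent numbers that do not exist",
        "shared by sock-puppet Facebook accounts",
    ],
    "techniques": [
        {"id": "T0007", "name": "Create fake social media profiles"},
        {"id": "T0009", "name": "Use fake experts"},
        {"id": "T0044", "name": "Seed distortions"},
    ],
}

# Serialized, the record can be shared, searched, and compared with others.
print(json.dumps(incident, indent=2))
```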

As Terp drove, she listened to a lot of radio. It told a long story of a country under siege: a liberal plot to ruin America, outsiders bent on destroying a way of life. Online, people on the left were likewise constantly agitated by existential threats.

That kind of fear and division, Terp thought, made people prime targets for misinformation. The irony is that the people hacking those fears and beliefs often really are hostile foreign actors. Purveyors of disinformation have a goal, whether it is destabilizing a political system or simply making money, but the recipients rarely see the bigger picture; they just see a friend’s posts about 5G or Pizzagate, or, as 2020 got under way, links to sensational videos about a new virus out of China.

In February, Terp was attending a hacker conference in Washington when she began to feel ill. She limped back to an apartment she had rented in Bellingham, north of Seattle. A doctor told her she had a pneumonia that was going around the area. A few weeks later, Seattle became the first coronavirus hot spot in the United States, and soon the Covid pandemic was running in parallel with what some described as an “infodemic,” a tide of false information spreading along with the disease.

Around the time Terp got sick, Breuer’s parents sent him a slick video on Facebook claiming that the new virus was an American-made bioweapon. His parents are from Argentina and had received the video from worried friends back home. The video presented a chance to test AMITT, so Breuer began cataloging its artifacts. The narration was in Spanish. At one point, the camera pans over some patent numbers that, according to the narrator, are viral mutations. Breuer looked up the patents; they didn’t exist. When he traced the video, he found it had been shared by sock-puppet accounts on Facebook. He called friends in South and Central America to ask whether they had seen it, and learned it had circulated in Mexico and Guatemala a couple of weeks before appearing in Argentina. “It’s a bit like tracing a virus,” Breuer says.

As Breuer watched the video, he identified several disinformation techniques from the AMITT database. “Create fake social media profiles” is Technique 7. The video used fake experts to appear more legitimate (Technique 9). He suspects it may also have been seeding narratives for other disinformation campaigns (Technique 44: Seed Distortions).

As with malware, tracing misinformation to its source is not an exact science. The Spanish narration seemed designed to give the video an air of authority in Latin America. Its high production values pointed to significant financial backing. Its early appearance in Mexico and Guatemala, and the timing of its release in February, just before migrant workers left for spring planting in the United States, suggested that its purpose might be to undermine America’s food security. “This is somebody who really understood the geopolitical implications,” Breuer says. All of it led him to suspect a professional job, probably Russian.

Of course, he could be wrong. But by analyzing a video like this and logging it in the database, Breuer hopes that the next time a polished Spanish-language video sweeps across South America on the backs of sock puppets, authorities and researchers will be able to see how it spread the last time, recognize the pattern, and inoculate against it sooner.

About a month after she recovered, Terp got a message from Marc Rogers, the man she had dined with after the D-Day exercise. Rogers had helped organize an international group of volunteer researchers who were defending hospitals against cyberattacks and virus-related scams. Misinformation like the video Breuer analyzed was flooding in, and Rogers wanted to know whether Terp would lead a team dedicated to tracking Covid exploitation campaigns.

One Tuesday morning in August, Terp was at home dissecting the latest misinformation. A video released the day before, claiming that Covid-19 was a hoax perpetrated by the World Health Organization, had already racked up some 150,000 views. She had also heard about a pair of Swiss websites claiming that Anthony Fauci doubted a vaccine against the virus would work and that doctors believed masks were useless. Her computer was searching for other URLs hosted on the same domain, identifying the ad tags the sites used in order to trace their funding, and cataloging specific phrases and stories, like one claiming the German government wanted Covid-infected children moved to internment camps, to find out where else they appeared. Everything went into the database, adding to the arsenal of data for fighting misinformation. She is optimistic about the project’s momentum: the more it is used, the more effective AMITT becomes, Terp says, adding that her group is working with NATO, the EU, and the Department of Homeland Security to test the system.
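
A rough sketch of that kind of artifact hunting is below, assuming placeholder domains, ad tags, and phrases rather than anything Terp actually found: cluster URLs by host domain, flag ad-network tags shared across domains as a possible hint at common funding, and check page text against a catalog of tracked phrases.

```python
from collections import defaultdict
from urllib.parse import urlparse

urls = [
    "https://example-health-news.ch/fauci-doubts-vaccine",
    "https://example-health-news.ch/doctors-say-masks-useless",
    "https://another-site.example/reprint-of-the-same-story",
]

# 1. Cluster URLs that sit on the same host domain.
by_domain = defaultdict(list)
for url in urls:
    by_domain[urlparse(url).netloc].append(url)

# 2. Flag ad-network tags that appear on more than one domain.
ad_tags_by_domain = {
    "example-health-news.ch": {"adnet-12345"},
    "another-site.example": {"adnet-12345", "adnet-99999"},
}
shared_tags = set.intersection(*ad_tags_by_domain.values())

# 3. Check scraped page text against a catalog of tracked phrases.
catalog = ["covid-infected children moved to internment camps"]
page_text = "... wants covid-infected children moved to internment camps ..."
hits = [phrase for phrase in catalog if phrase in page_text.lower()]

print(dict(by_domain))
print(shared_tags)  # tags shared across domains
print(hits)         # cataloged phrases found on this page
```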

She is also cautiously optimistic about the resilience of the network under attack. On her road trip, Terp says, the longer she drove, the more hopeful she became. People were proud of their towns; they cared about their communities. She saw that when people have something concrete to work on, they are less likely to be drawn into phantom battles against illusory enemies. “You have to involve people in their own solution,” she says. By building a world in which misinformation is better understood, Terp hopes, more people will be able to reject it.

During the George Floyd protests, Terp’s team tracked another rumor: a meme, resurfacing in various forms, that “busloads of antifa” were coming to stage demonstrations in small towns. One of the things she noticed was people in small conservative communities debunking the idea themselves. “Somebody said, ‘Wait, that doesn’t look right,’” she says. Those people understood, on some level, that their communities were being hacked and needed defending.

SONNER KEHRT (@etskehrt) is a freelance writer based in California. This is her first story for WIRED.

This article appears in the October issue. Subscribe now.

Tell us what you think about this article. Send a letter to the editor at mail@wired.com.
