The UK government has reportedly warned that adversaries of the UK, including Russia, China and Iran, may deliberately spread misinformation and conspiracy theories about the Princess of Wales.
Rumours, gossip and unfounded theories have circulated on platforms such as Facebook, X and TikTok since Kate underwent abdominal surgery in January and stepped back from royal engagements while recovering.
Citing an anonymous Whitehall source, The Telegraph said fears were growing that China, Russia and Iran could be spreading conspiracy theories around the princess, with the aim of destabilising Britain.
The source said: “Part of the modus operandi of hostile states is to destabilise things: it undermines the legitimacy of our elections or other institutions.”
Rumours surrounding Kate intensified after her husband, William, withdrew from a memorial service at Windsor Castle for his godfather, the late King Constantine of Greece, citing an unspecified “personal matter”. Kensington Palace on Saturday confirmed that William’s absence was related to Kate’s diagnosis.
The rumours came to a head earlier this month when Kate apologised after posting an edited photograph of herself with her three children to mark Mother’s Day on March 10.
The use of conspiracy theories and other misinformation is a familiar tactic of Russian “troll farms”, which have spread falsehoods about issues such as the war in Ukraine and COVID-19 vaccines.
A government report on a “troll farm” uncovered in 2022 said Russian agents were operating on social media platforms including Telegram, Twitter, Facebook and TikTok.
Agents use VPNs to make it appear they are in other countries, and comment on posts rather than creating their own in order to increase reach. They also amplify “organic” content, such as news articles or YouTube videos from other people, that aligns with the trolls’ goals.
These strategies make the trolls harder to detect, the UK government said.
James McQuiggan, a security specialist at KnowBe4, told Yahoo News that the use of social media by cybercriminals, state actors and conspiracy theorists has been a persistent problem, particularly over the past decade.
He said: “With Brexit and the 2016 US election, social media was used to spread messaging built on inaccurate and unverified data, so that opposing camps spent more time refuting each other’s claims than making their own.
“Historically, state actors have used conspiracy theories to divert attention from domestic issues, discredit the opposition, or manipulate public opinion to their advantage. They can fabricate or amplify narratives that align with their strategic interests, exploiting existing social tensions or crises to make those theories more credible to the public.”
McQuiggan said state actors routinely use social media to spread conspiracy theories widely, “employing bots, trolls, and fake accounts to create the illusion of popularity for those narratives.”
Prince William is known to fiercely protect his family’s privacy. He was infuriated, for example, when a French magazine published topless photographs of Kate, taken from a distance, in 2012.
Royal commentators said the way Kate released her video message was in part a response to recent speculation, or an attempt to quell it. But conspiracy theories continued in the wake of her announcement, with some commentators falsely claiming the video was generated with artificial intelligence (AI), or questioning the accuracy of what she said.
Imran Ahmed, head of the US-based Center for Countering Digital Hate, told the BBC that the conspiracy theories show the “inhumanity” of social media.
He said: “I think it’s the inhumanity of the way social media has led us to behave, demanding that people communicate about things that can be very deeply personal.
“And also, of course, to see the effect this has on our society, how rapidly it has spread to millions of other people and how much damage it causes to the royal family itself.”
In September of last year, the Washington Post reported that after Twitter laid off thousands of workers and rolled back its misinformation rules following Elon Musk’s takeover in 2022, platforms such as YouTube and Facebook also stopped labelling posts repeating the unfounded claim that the 2020 election was “stolen”, a sign that companies are stepping back from their most aggressive strategies for countering misinformation.
McQuiggan said: “Social media corporations have come under scrutiny for their role in spreading conspiracy theories, which has led to increased pressure to address the issue. For example, platforms such as Facebook, Twitter, and YouTube have taken steps to remove or limit content related to specific conspiracy theories, such as QAnon, and have banned accounts that systematically spread misinformation.
“Some technological efforts include implementing algorithms designed to reduce the visibility of this content, introducing fact-checking features, and partnering with third-party fact-checkers to verify claims and label or remove false content.”
But in a year when dozens of countries, including the UK and the US, hold general elections, questions will no doubt grow as to whether social media corporations are doing enough.