YouTube will not allow content that “realistically simulates” deceased youth or victims of violent crimes.
The video platform has updated its harassment and cyberbullying policy to prohibit content that “realistically simulates deceased minors or victims of deadly or well-documented major violent events describing their death or the violence they experienced.”
The updated policy will go into effect on Jan. 16, said Google, which owns YouTube.
This is part of its broader “responsibility efforts,” which in the past have included removing Covid-19 vaccine content that contradicted the consensus of health authorities and introducing rules that apply age restrictions to certain videos.
The policy change comes after the proliferation of true crime videos on platforms like TikTok and YouTube that use AI to recreate the likeness of dead or missing children and have them narrate the stories of what happened to them.
Some videos used AI-generated depictions of James Bulger, a two-year-old British boy who was kidnapped and murdered in 1993, and Madeleine McCann, a three-year-old British girl who disappeared in Portugal in 2007.
Bulger’s mother described her son’s AI clips as “disgusting” and “beyond sickening” to The Mirror.
TikTok subsequently began removing those videos, stating that “our Community Guidelines make clear that we don’t allow synthetic media that contains the likeness of a young person,” but many remained on YouTube.
Last year, YouTube began requiring creators to label realistic AI-generated content as synthetic and warned that failure to do so could result in suspension or other penalties. The company also said that creators and artists will be able to request the removal of content that simulates their likeness without consent.