Lenny Pozner, whose son Noah was killed in the Sandy Hook Elementary shooting, was harassed by conspiracy theorists who claimed the mass casualty at the Connecticut elementary school was a hoax and that he was a crisis actor. Pozner started the HONR Network to assist victims of highly publicized mass casualty events who have been revictimized by online conspiracy theorists.

Pozner and hundreds of volunteers with the HONR Network have flagged thousands of videos of Alison's death on YouTube as violent or graphic content, according to the complaint. Despite multiple attempts at flagging the videos, which YouTube's terms of service prohibit, the complaint alleges that the online video platform often takes little to no action to remove them from its site.

"Parker and his family have had only one tool available to defend themselves from such traumatic vitriol and the nightmare of seeing their daughter's death: watch these videos one-by-one in order to report them," reads the complaint.

The FTC complaint alleges that YouTube violates its own terms of service

Because there are no laws making it illegal to host disturbing videos, filing an FTC complaint is the Parker family's only real legal recourse. Rachel Guy and Spencer Myers, law students at the Georgetown University Law Center who helped draft the complaint, called on the FTC to "end the company's blatant, unrepentant consumer deception."

The complaint, seen by Insider, claims that YouTube's community guidelines proclaim that violent and graphic videos are not allowed, "leading users to reasonably believe that they will not encounter it." However, it alleges that these videos are "commonplace" and have remained on the site for several years.

"The dark web is here - and it's called Google," Parker told Insider. "The founders started out saying they don't do evil. What they're doing with the content they're offering up, they're doing evil every day."

The complaint also argues that Alison's video could be traumatizing to children who may accidentally stumble upon it on the platform. Guy told Insider that "YouTube isn't giving parents the truth about what can be exposed to."

[Photo caption: Silhouettes of mobile users are seen next to a screen projection of the YouTube logo in this picture illustration.]

A spokesperson for YouTube told Insider that YouTube uses a "combination of technology and people to enforce these guidelines," and that the guidelines specifically prohibit "videos that aim to shock with violence, or accuse victims of public violent events of being part of a hoax."

"Our Community Guidelines are designed to protect the YouTube community, including those affected by tragedies," the spokesperson told Insider. "We rigorously enforce these policies using a combination of machine learning technology and human review, and over the last few years we've removed thousands of copies of this video for violating our policies. We will continue to stay vigilant and improve our policy enforcement."

YouTube relies on users to flag content and to describe the violence within offensive videos in order to moderate the platform. The complaint claims that YouTube fails to convey that the burden is on consumers to report graphic content, a process that can re-traumatize victims who are forced to watch and describe these scenes repeatedly. "After hundreds of reports, it takes days, weeks, or months before any certain action is taken, if at all," Guy told Insider.

The worldwide video platform has come under fire for how it moderates content on its site. Although conspiracy theories about Alison's death and the Sandy Hook shooting have proliferated on the site for years, Google only announced last June that it would ban Sandy Hook conspiracy videos, white supremacists, and Nazis from YouTube, the Verge reported. Guy claimed that YouTube only began to scrub its platform of videos depicting Alison's last moments after the family filed the FTC complaint.