
Hundreds Of People Share Stories About Falling Down YouTube’s Recommendation Rabbit Hole

It all started with a simple keyword search for “transgender” on YouTube. Alex had just come out as trans and was looking for videos from other queer people who had been through a similar experience. YouTube was helpful at first, but soon enough it served up a series of videos that portrayed being transgender as a mental illness and something to be ashamed of.

“YouTube reminded me why I hid in the closet for so many years,” Alex said, adding that the platform “will always be a place that reminds LGBT individuals that they are hated and provides the means for bigots to make a living spouting hate speech.”

YouTube’s recommendation algorithm, which drives 70% of the site’s traffic, is designed to maximize ad revenue by keeping viewers watching for as long as possible, even if that means pushing out increasingly extreme content for them to binge. Alex, who is identified here by a pseudonym, is one of hundreds of people who were pulled down dark rabbit holes by the algorithm and shared their stories with the Mozilla Foundation, a San Francisco-based nonprofit that’s urging YouTube to prioritize user safety over profits. (It is also the sole shareholder of the Mozilla Corp., which makes the Firefox web browser.)

One YouTube user who was curious about scientific discovery videos was sucked into a web of conspiracy theories and fringe content. Another who searched for humorous “fail videos” was later fed dash cam footage of horrific, fatal accidents. A third who watched confidence-building videos from a drag queen ended up in an echo chamber of homophobic rants.

Through the crowdsourced compilation of such anecdotes, which reflect the findings of investigative reporting from news outlets including The New York Times, The Washington Post and HuffPost, the Mozilla Foundation aims to show just how powerful the recommendation algorithm can be.

Earlier this year, YouTube announced it would tweak its algorithm to reduce the spread of harmful misinformation and “borderline content,” and to feature authoritative sources more prominently in search results. The Google-owned company pushed back against Mozilla’s research and told HuffPost that such changes have already yielded progress.

“While we welcome more research on this front, we have not seen the videos, screenshots or data in question and can’t properly review Mozilla’s claims,” said YouTube spokesperson Farshad Shadloo. “Generally, we’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in recommendations,” he added, noting that the number of views of borderline content has been cut in half since YouTube adjusted its algorithm.

But independent researchers have no way to verify such claims, as YouTube does not share the underlying data.

