The three companies share some basic DNA. They're widely used by pre-teens and teenagers and count their users by the hundreds of millions. On Tuesday, each will need to reckon with the same broad topics: content moderation and protections for young users. And none of them will likely face the frosty hostility that Facebook received.

The similarities mainly end there, though, and their exact fates in Congress on Tuesday and beyond will differ based on their varying stances on content moderation and corporate histories, as well as the nuances of how each app works.

In the trio, YouTube is the biggest, oldest, and most well known, having grown to over 2 billion users worldwide over 16 years. Teens watch YouTube videos, according to data from Statista, a statistics research firm, and close to 70% of all Americans spend time there.

The site has a checkered past with content moderation. One famous debacle came in 2018, when the influencer Logan Paul posted a video that went viral featuring a dead body in a Japanese forest tragically popular as a suicide site. At times, YouTube has seemed to act inconsistently in taking down content, sometimes flip-flopping on decisions. Several years ago, for instance, it was allowing some neo-Nazis on its site but later banned one group, Atomwaffen, amid pressure from the Anti-Defamation League. YouTube is partly but not entirely built around a recommendation algorithm, a feature that can lead users to increasingly radicalized content and has drawn the attention of lawmakers looking to latch onto possible areas for reform.

More recently, YouTube has exercised a greater degree of self-restraint, cracking down on things like right-wing hate speech and vaccine misinformation. In August, YouTube CEO Susan Wojcicki penned a Wall Street Journal op-ed defending her company's policies and signaling that YouTube would welcome new regulations.