A new study by the software nonprofit Mozilla Foundation found that 71 percent of videos study participants deemed objectionable were suggested to them by YouTube’s own recommendation algorithm.
“Research volunteers encountered a range of regrettable videos, reporting everything from COVID fear-mongering to political misinformation to wildly inappropriate ‘children’s’ cartoons,” Mozilla Foundation wrote in a statement.