‘Dislike’ on YouTube barely works, according to new research on recommendations

If you’ve ever found it difficult to “train” YouTube’s algorithm to keep a certain type of video out of your recommendations, you’re not alone. In fact, getting YouTube to understand what you don’t want to see may be even harder than you think. A big part of the problem, according to new research from Mozilla, is that YouTube’s in-app controls, such as the “dislike” button, are largely ineffective as tools for managing recommended content. According to the report, these buttons “prevent less than half of unwanted algorithmic recommendations.”
Researchers at Mozilla used data gathered through RegretsReporter, a browser extension that lets people donate their recommendations data for use in studies like this one. In all, the report drew on millions of recommended videos, as well as anecdotal accounts from thousands of people.
Mozilla tested the effectiveness of four different controls: the “dislike,” “not interested,” “don’t recommend channel,” and “remove from watch history” buttons. The researchers found that these had varying degrees of effectiveness, but that the overall impact was “small and incomplete.”
Of the four controls, the most effective was “don’t recommend channel,” which blocked 43% of unwanted recommendations, while “not interested” was the least effective, preventing only about 11% of unwanted suggestions. The “dislike” button was nearly the same at 12%, and “remove from watch history” screened out about 29%.
In their report, the Mozilla researchers noted the lengths study participants said they would sometimes go to in order to avoid unwanted recommendations, such as watching videos while logged out or while connected to a VPN. The study highlights the need for YouTube to better explain its controls to users and to give people more proactive ways to define what they want to watch, the researchers said.
Becca Ricks, a senior researcher at Mozilla who co-authored the report, said: “The way YouTube and a lot of platforms operate is that they rely a lot on passive data collection in order to infer your preferences. But it’s a bit of a paternalistic way of operating where you’re making choices on behalf of people. You could ask people what they want to do on the platform instead of just watching what they’re doing.”
Mozilla’s research comes amid growing calls for major platforms to make their algorithms more transparent. In the United States, legislators have proposed bills to rein in “opaque” recommendation algorithms and to hold companies accountable for algorithmic bias. The European Union goes even further: the recently passed Digital Services Act will require platforms to explain how their recommendation algorithms work and to open them up to outside researchers.