Remember Mozilla? Some of you will take offense at that question, but we suspect most of you won’t. Mozilla is the company best known for creating the Firefox browser, and for the past few years, Firefox has been losing market share. Improvements in Microsoft’s default browser, coupled with Google Chrome’s ever-expanding dominance, have pushed Firefox into a very distant third place in overall web traffic and made Mozilla an afterthought in most people’s minds. Now the company is suddenly back in the news, and it’s because it has come to take shots at YouTube.
If you’ve ever loaded YouTube in your browser or as an app and been confronted by ‘recommended videos’ that you’d never dream of watching in a million years, you’re not alone. It’s a known problem, and one that YouTube attempted to fix with a code change in January 2019. Odd, inappropriate videos still turn up, though, and some people believe they’re more than an annoying quirk of using YouTube. In fact, some people believe they might be harmful. Mozilla is among that number, and now it wants the help of internet users all over the world to address the issue, crack YouTube’s algorithm, and try to create something better.
It might be hard to imagine how recommended YouTube videos could be a problem, so allow us to provide an example. Imagine you were addicted to gambling. We have nothing against gambling or online slots websites; it simply makes for a useful example. One way you might try to break the habit is to go to YouTube and search for videos of people giving advice on how to stop gambling and battle against an addictive mindset. You might search for terms like “quit online slots websites” or “stop gambling,” and you’ll probably find some helpful videos by doing so. The next time you log in to YouTube, though, you could easily be confronted by videos demonstrating that week’s best new Jumpman casino. Worse still, you might see a video offering advice on how to get the best incentives and offers from online slots websites. That’s because YouTube’s algorithms remember that you’ve been looking at gambling videos, but don’t appear to have the nuance to understand that you were trying to get away from online slots websites, not sign up for more. To an addict, that could be extremely damaging.
This is where Mozilla comes in, and by extension, this is also where you could come in if you want to help. The company has introduced a new browser extension for Firefox that allows you to ‘donate’ (their term, not ours) your YouTube recommendations to Mozilla, feeding them the data they need to work out what YouTube’s algorithm does and the patterns it follows. The extension, which has been given the somewhat strange name ‘RegretsReporter,’ has also been made available for Chrome. When activated, the extension ‘monitors’ the user’s use of YouTube, tracking how much time they spend on the app or website without actively recording what each user is watching. Every time someone sees a recommendation that they don’t think should be there, they can report it anonymously through the extension. Mozilla hopes to build up a pattern of recommendations in relation to users’ viewing habits, and from there, develop an understanding of what ‘triggers’ YouTube to make a specific recommendation.
The gambling example we used earlier would be problematic, but it represents one of the less troubling trends that people have reported seeing while using YouTube. The platform stands accused of pushing conspiracy theorist videos and blatant misinformation at people browsing entirely unrelated topics, and of not doing enough to ensure that the content of said conspiracy videos is suitable for the potentially young viewers who end up watching them. The more conspiracy theory videos a user watches, the more of them they’re likely to be recommended. Unlike Facebook and Twitter, which have both started tagging misinformation and offering viewers alternative news sources for balance, YouTube operates no such ‘flagging’ system. To put that another way, someone who’s watching a video full of misinformation has no way of knowing that what they’re seeing and hearing isn’t correct.
YouTube is aware of Mozilla’s plan and has already spoken out against it. Farshad Shadloo, a spokesperson for the company, has complained that the voluntary reporting method will result in only ‘anecdotal’ evidence being submitted to Mozilla, from which it will be ‘impossible’ to draw conclusions. He added that YouTube has made thirty ‘policy adjustments’ to its recommendation system during the past twelve months alone, and continues to review and adjust the system on a day-to-day basis. As a final flourish, he insisted that the company has recently ‘cracked down’ on conspiracy theory content, as well as videos deemed to contain medical misinformation. In other words, YouTube believes its house is already in order, and it doesn’t want the help of a third party to ensure that’s the case.
YouTube’s rejection of Mozilla’s project is not surprising. This is not the first time Mozilla has incurred YouTube’s displeasure. Last year, Mozilla produced a ‘recommendation bubbles’ project showing the difference in YouTube recommendations between people on different sides of the political spectrum, which YouTube didn’t appreciate. Representatives from Mozilla have also met with Google in the past to make recommendations about how the company’s code could be improved, which apparently didn’t sit well with YouTube either. It’s not known how long Mozilla will collect data for this latest project, but they’ve already promised that once they feel they have enough to draw conclusions, they’ll make the information public – along with all their findings. If we had to put money on it, we’d bet that YouTube isn’t in love with that idea either. In the grand scheme of things, though, YouTube isn’t obliged to act on anything Mozilla does or doesn’t find, and given the seemingly lax reporting system, it may be that they don’t find anything worth reporting anyway. We’ll find out soon enough.