Mozilla Is Crowdsourcing Research into YouTube Recommendations (foundation.mozilla.org)
201 points by fraqed on Sept 17, 2020 | hide | past | favorite | 173 comments



What's impressive to me is that YouTube Recommendations are so ... stale. All it recommends to me is content from the same 3 categories, and the same videos over and over and over (you would think that if I didn't click on a suggestion in the past 6 months it would adjust).

Now the topics it recommends to me are non-political, but I can very well see that if you fall down one or more extremist rabbit holes, that's all you will be fed on your recommendations page for months.


Youtube's recommendation algorithm is just dumb, in my opinion.

At least half of the time on Youtube, I watch classical music, and you would think Youtube would recommend me some classical performances that I haven't yet seen.

Nope. It keeps feeding me exactly the same videos I've already watched, despite it hosting basically the largest catalogue of classical music videos.

What's the point of a recommendation system if it doesn't even recommend new videos in the categories I like? Might as well just shuffle my watching history.


Every Google product I've used has had hilariously awful recommendations.

For a company that has so much of my data and indexes the entire internet, it sure is surprising how bad they are at recommendations.


> What's the point of a recommendation system if it doesn't even recommend new video of the same category I like?

Because YouTube doesn't care what you want--YouTube cares what it can monetize.

I suspect that YouTube gets very little money from showing you classical performances but gets lots from showing you the latest "Funny Cat Video".

You are the product, not the customer.

Once you internalize that, stuff that Google does makes sense.


> Because YouTube doesn't care what you want--YouTube cares what it can monetize.

> You are the product, not the customer.

> Once you internalize that, stuff that Google does makes sense.

They published their algorithm. They rank items according to predicted watch-time. The other poster is probably correct, in that many people use these long classical music videos as background music over and over (i.e. high watch time) and so they're highly ranked.
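The watch-time idea is easy to sketch. A toy ranking in the spirit of that paper, where candidates are scored by expected watch time rather than novelty (all titles and numbers here are hypothetical, not YouTube's actual model):

```python
# Toy sketch of watch-time-weighted ranking. All videos, click
# probabilities, and watch times below are made up for illustration.

def rank_by_expected_watch_time(candidates):
    """Sort candidates by p(click) * E[watch minutes | click], descending."""
    return sorted(
        candidates,
        key=lambda v: v["p_click"] * v["expected_minutes"],
        reverse=True,
    )

candidates = [
    # A 2-hour symphony someone loops as background music: modest click
    # probability, but huge expected watch time once clicked.
    {"title": "Beethoven 9 (full)", "p_click": 0.10, "expected_minutes": 70.0},
    # A novel video the user hasn't seen: higher click odds, short watch.
    {"title": "New quartet premiere", "p_click": 0.25, "expected_minutes": 12.0},
]

ranked = rank_by_expected_watch_time(candidates)
# The already-watched "background music" video outranks the fresh one
# (score 0.10 * 70 = 7.0 vs 0.25 * 12 = 3.0).
print([v["title"] for v in ranked])
```

Under that objective, re-recommending long videos people replay is exactly what the ranker is rewarded for.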


Interesting. When and where did "they publish[] their algorithm?"


RecSys 2016

Covington, Paul, Jay Adams, and Emre Sargin. "Deep neural networks for youtube recommendations." Proceedings of the 10th ACM conference on recommender systems. 2016.

Nvidia furthermore implemented/released it for TensorFlow, and for their new Recsys engine:

https://ngc.nvidia.com/catalog/resources/nvidia:wideanddeep_...


The thing is, it doesn’t even show me funny cat videos anymore. It’s the same classical music video recommended over and over again.

To be honest, the more I think about it, the less I understand what money YouTube is trying to earn here.

It’s probably just a dumb algorithm.


I think it has a grasp of revisits; I forget the exact terminology, but if you rewatch stuff, it tries to predict whether you will want to watch it again. Some stuff is not worth watching multiple times, though; once is enough. But music videos can indeed be watched multiple times.

I think I heard somewhere that a lot of kid videos rank in the top stratosphere of view counts on YouTube because kids just play stuff on repeat. There's probably a lot of repeat plays going on across YouTube.


It's because a large segment of classical music watchers are using it as a background music playlist, and listen to the same recordings repeatedly.


They should be able to distinguish between users who always play the same video in a loop vs. users who never watch anything twice (or anything intermediate between the two extremes) and adjust the recommendations based on that.

But maybe their recommendation engine is hyper-focused on the matrix-completion framework and doesn't use any information other than which users watched which videos.
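For what a bare matrix-completion view of "who watched what" looks like, here is a minimal sketch (the users, videos, and rank are all hypothetical; this is the textbook technique, not YouTube's system):

```python
import numpy as np

# Tiny matrix-completion sketch using only "who watched what", as the
# comment describes. 1 = watched, 0 = didn't, NaN = unknown cell we want
# to predict. Everything here is a made-up toy example.
R = np.array([
    [1.0, 1.0, 0.0, 0.0],     # user 0: classical videos A and B
    [1.0, np.nan, 0.0, 0.0],  # user 1: classical A; B is the unknown cell
    [0.0, 0.0, 1.0, 1.0],     # user 2: gaming videos C and D
])
mask = ~np.isnan(R)

# Iterative hard-impute: fill unknowns, take the best rank-2 approximation
# via SVD, re-fill the unknowns from it, and repeat.
X = np.where(mask, R, 0.0)
for _ in range(50):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    low_rank = (U[:, :2] * s[:2]) @ Vt[:2]  # rank-2 reconstruction
    X = np.where(mask, R, low_rank)         # keep observed cells fixed

# X[1, 1] converges toward 1: user 1 looks like user 0, so the model
# predicts they would also watch classical video B.
print(round(X[1, 1], 2))
```

The limitation the parent describes falls out of this framing: nothing in the matrix distinguishes a loop-rewatcher from a never-rewatcher unless that signal is added as an extra feature.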


The recommendation system is not "dumb" in the opinion of its designers. Perhaps it is performing exactly as intended.

In the same way that online advertising needs to display millions of ads to reach only hundreds of people, the recommendation system may not be intended to work at 100% efficiency.

YouTube "recommendations" work as far as I can tell. If you take two people searching for some subject area on YouTube, and only one of them knows how to search past the initial "popular" recommendations, each person will produce very different results.

If neither person is a skilled searcher (most people are not), they both end up with the same results. Those videos thus receive more and more views. Videos with millions of views help Google sell online ad services. If views were distributed more evenly across all videos, would that make advertisers more interested or less interested?

What's the point of a recommendation system that does not recommend a new video in the same category? To promote certain videos to attract large audiences and thus draw more interest from potential advertisers. More advertisers bidding on fewer popular videos would seem to favour higher prices paid to Google.

Some of the best videos I have seen on YouTube have less than 500 views, some even in the single digits. I found them through automated searches, not "recommendations".

Google makes information accessible with the belief that relevance is directly proportional to popularity. The company was founded on this idea.


This isn’t how digital ad inventory works. Advertisers don’t bid to win a slot that all viewers of a particular video will see. They bid to win a view. A million views on one video, or the same million impressions split across a thousand videos doesn’t change the supply/demand equation like you suggest. Buying an ad on the most popular video isn’t like buying an ad on the Super Bowl, because not all viewers are seeing the same ads.
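A toy model of that point, with made-up per-view bids (in cents) and a simplified second-price auction per impression: total revenue depends on the impression count, not on how those impressions are spread across videos.

```python
# Toy per-impression auction (hypothetical bids, in cents): each view is
# auctioned separately, so concentrating views on one video doesn't change
# the supply/demand picture.

def revenue_cents(impressions, bids):
    """Winner of each impression pays the second-highest bid (simplified)."""
    ordered = sorted(bids, reverse=True)
    second_price = ordered[1] if len(ordered) > 1 else ordered[0]
    return impressions * second_price

bids = [5, 4, 2]  # three advertisers' per-view bids, in cents

one_big_video = revenue_cents(1_000_000, bids)
thousand_small_videos = sum(revenue_cents(1_000, bids) for _ in range(1_000))
print(one_big_video == thousand_small_videos)  # same supply, same revenue
```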


A video with only a few hundred views is not going to produce much YouTube ad revenue. In order to sell the idea of making money from YouTube ads, one needs to convince ad buyers and video uploaders that videos can receive millions of views.


Same problem with music. I thought I had somehow broken their engine, because it keeps feeding me some random hit songs I liked 5 years ago and has done so for the past 5 years. You would think they would want to show me something new for their engagement metrics, but I guess not.


Contrary data point: I watch a lot of classical music on YouTube and it gives me consistently excellent recommendations.

(Although I only take recommendations from the front page, and ignore the playlists entirely.)


For me it has been the opposite, as in it has so little hysteresis. Just one day I watch something I do not usually watch, say cat videos, and for weeks I will get only cat videos as recommendations. I think their algorithm needs to be cognizant of, and adapt to, temporal shifts in a viewer's interests. Some users may like a change in focus at the slightest deviation; others may prefer more hysteresis.


It's both for me. It locks on to a few categories or channels and occasionally, almost randomly decides to start showing some other category or channel based entirely upon a single view.

This is not the same algorithm that they've always used. It began changing somewhere around 3 or 4 years ago and it's completely different now. Honestly, I think they've intentionally broken it. I used to derive a lot of enjoyment from going on an algorithm enhanced "wikiwalk" of YouTube content, discovering many interesting and bizarre videos. It's impossible to do that now, you just wind up stuck on a channel or going around in circles.

To put it plainly, I think YouTube has decided to bury all of the fringe and weird one off content in favour of mainstream channels because they can't monetize the former. Good job Google, you killed another golden goose.


Because of this, when I'm sent video links to watch I will open them up in a Private Window / Incognito so I do not pollute my account.


I never use non-private windows, so I use Invidious or mpv for this. I put

  control + alt + m
    notify-send "mpv loading $(xclip -o)"; mpv --player-operation-mode=pseudo-gui "$(xclip -o)" || notify-send 'mpv error'
in my ~/.config/sxhkd/sxhkdrc to conveniently play any youtube-dl-supported URL I’ve copied or highlighted.


Thank you for this tip, just added it to my sxhkdrc.


The main Invidious site just shut down, although various alternative Invidious sites still exist.


I set my browser to block all cookies. Has the same effect as private window mode. That way I can still browse youtube without having to open a new window. Yea, it sucks that I can no longer sign in, but at least the recommendations are better.


This criticism shouldn't apply anymore. They predict categories you like and you can select a category. The 'All' page should be a mix of all categories, not just cat videos. Amazon on the other hand...


> you can select a category

Could you tell me how? I would love to have that feature. They seem to have turned this off.


I'm sorry I don't know how to turn it on. When I go to youtube.com, at the top it has about 15 categories that it has gleaned from my use (e.g. Guitar, Flight Simulators), some of which I'm surprised it learned, as well as All, Mix, Trending. I use Edge and I'm logged in to YouTube.


I know what you are talking about. They had rolled it out in my region and I loved it. Now they seem to have rolled it back.


I use YouTube infrequently. Usually for binging random content like comedy sketches or speedrunning. No matter what I'm on, there's always a few videos in my recommendations that make no sense and look like trash. Not generically, but specific videos that ALWAYS appear. One of them is "I PAID FIVE BASSISTS ON FIVR TO PLAY AN IMPOSSIBLE BASSLINE"

...

Why is this so bad? What are your hundreds of data scientists doing?


I think these are exploitation/exploration trade-offs.

YouTube only knows your interests by the videos you watch. It knows you like speedruns, so it will show you speedruns, easy. But you certainly have other unrelated interests: cooking, woodworking, old cars, whatever. So, from time to time, Google takes a chance and shows you a random topic just to gauge your interest.

It works on a global scale too. They want some variety; they don't want to depend on only a few topics and previously successful YouTubers. So sometimes they may pick a random video and show it to everyone, and if it succeeds, they have something new.

It is actually rather clear, when you look at your recommendations, which are which. Typically you are going to find a mix of videos related to the video you just watched, videos related to your global interests, trending videos, and random stuff.

If some generally effective machine learning algorithm acts really stupid from time to time, chances are that it is exploring, that is: it is asking you a question instead of giving you an answer.

Edit: someone mentioned the multi-armed bandit; yes, this is such an algorithm.
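The simplest version of that trade-off is epsilon-greedy: exploit the best-known topic most of the time, explore a random one occasionally. A sketch with entirely made-up topics and "reward" estimates:

```python
import random

# Epsilon-greedy bandit sketch of the explore/exploit idea above.
# The topics and their estimated reward rates are made up.

def pick_topic(estimated_value, epsilon=0.1, rng=random):
    """With probability epsilon explore a random topic, else exploit the best."""
    if rng.random() < epsilon:
        return rng.choice(list(estimated_value))   # explore: ask a question
    return max(estimated_value, key=estimated_value.get)  # exploit: best guess

values = {"speedruns": 0.9, "cooking": 0.2, "bass covers": 0.1}

rng = random.Random(42)
picks = [pick_topic(values, epsilon=0.1, rng=rng) for _ in range(1000)]
# Mostly speedruns, with occasional "why am I seeing this?" explorations.
print(picks.count("speedruns") / len(picks))
```

Those rare off-topic picks are the "it is asking you a question instead of giving you an answer" moments.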


I don't think it's bad actually. I mean the recommendations could be far better, but this specific case isn't bad.

Say you go on reddit/r/iloveredcars - all you will see upvoted (ie ... RECOMMENDED by users) will be red cars. You'll never see blue cars. You'll never know they even exist. All you've ever seen are red cars. Not only this but you'll notice that over time it's always the same red car pictures being reposted over and over again.

By inserting "wrong" results a small percentage of the time, this ensures that users will be exposed a tiny bit to different content that maybe they end up liking, opening an entire new tree of recommendation opportunities.

Now I'm sure some people don't want this and they do want to see the same red car pictures over and over, but personally, I appreciate that they do this.


aka Multi-Armed Bandit


The function of an adtech recommendation system is not to find content that you want to see but rather to show you content that's the most profitable for the adtech firm that's serving it.

The videos that YT recommends over and over again are probably videos that have a high ratio of advertising revenue (meaning good demographics) to playback cost (meaning low overall byte size) and a high likelihood of going viral by being shared off-platform (meaning broad appeal, non-controversial, short, safe for religious conservatives, safe for work, and easy to summarize within 280 characters).

The videos you want to watch don't meet these criteria, so Youtube won't find them for you.


Recommending you a video you don't want to watch is a wasted opportunity. If you don't want to watch it, you won't, so YouTube's earnings are zero. If something you do want to watch is recommended instead, you will, and YouTube's earnings are nonzero.

I'm right now on a browser without my account and on which I don't usually watch YT, basically only when someone pastes a link while gaming. About half of the front page are things I might have clicked on when bored. A third are long. Two are definitely controversial. I don't buy your hypothesis.

Disclosure: I work in Google, but my closest contact with YouTube was drinking whisky with a YouTube SRE some four years ago.


I'd say there's definitely a lot of "LOOK AT MY YOUTUBE SEO TITLE" but in actuality that bassist guy is really good and generally plays some great riffs.


It's strange.

It's like a wild species with vibrant, sometimes garish colors saying 'Look at me!', where to discriminating viewers it's typically poisonous.

I'm pretty cautious.

But for a large number of my favorite content creators it's a bait and switch: bait your clicks, then actually provide really meaty content.

That aside, my personal favorite trend in the digital media space is the 1-4 minute tutorials for people who already know the software.

These people typically make money outside of Youtube and don't rely on snaring views.


> "use YouTube infrequently"

> "binging random content"

gets recommended random popular content

> "why are recommendations so bad"

They can't read your mind, you know


They could probably infer that if I'm watching comedy, and the last 5 videos were comedy by the same artist, I don't want to watch some guy's channel about bassists that they've tried to get me to watch 100 times prior.


If they were only recommending comedy from the same artist you may then have the same issue as another comment, which is that recommendations aren't novel enough.

I'd take your point if they only recommended bass videos, but presumably it's just one recommendation in a list of many, with the point being to explore your preferences while also satisfying your demands


The problem everyone is complaining about is that they used to recommend a good mix of content that was actually relevant. Like if you were watching a bunch of comedy videos by one comic it would recommend more of them but also another comic that you would probably enjoy. I don’t know how they did it, maybe it was just based on what other people with similar demographics and interests watched. If it was then they probably broke it when they decided to pop the filter bubbles. Turns out some of those bubbles were useful, they just decided to throw the baby out with the bath water.


If they used to recommend a good mix of content, what do you feel is the issue with recommendations now? Not enough variety? Not enough relevance?


> They could probably infer that if I'm watching comedy ... I don't want to watch some guy's channel about bassists that they've tried to get me to watch 100 times prior.

davie504 is basically a musically-themed comedy act, so this doesn't seem that unreasonable.


Urgh him. I watch a lot of music videos (as in videos about making music + music theory). I can not get this guy off my recommends. He has the personality of a sofa cushion. I cannot understand why he's popular.

Other than that, I'm pretty happy with my recommends. I think the Google spooks have built a good file on me.


I use this Firefox add-on; it puts a cross next to user comments and videos and lets you hide videos and comments from those users/channels in the recommended videos, searches, etc.:

https://addons.mozilla.org/en-US/firefox/addon/youtube-clean...


They stop appearing if you specifically mark them as not interested


I do this, and it does work, but it doesn't seem to learn that I don't want to see generic youtube garbage.

I'd love for it to learn that anything with a shocked face thumbnail and CrAzY question for video name is not desired.


Shocked face or a big red arrow pointing to something in the thumbnail make the video an instant pass for me. It would be amazing if YouTube's algorithm could recognize that.


Every so often I do that. I spend five minutes selecting channels I'm not interested in, because I watched a gamedev video once and now I'm being suggested hundreds of gamedev videos, or whatever. Pretty soon, it's like the algorithm just... gives up and starts suggesting me generic videos about celebrities and other vapid things that I'd never click on in a million years.


I often AM interested in a topic, I just don't want every little piece of content drivel shoved down my throat. I still want the cream of the crop to occasionally pop up in my feed.

That's the problem with marking things as not-interesting. The algorithm will wildly overreact in the opposite direction.


I've marked dozens of videos on a single topic as "not interested", only for them to be replaced by different videos on the same topic soon after. Marked all those as "not interested" again, and the next day there were more.

The only thing that works for me is finding the video in my watch history that's causing the recommendations and removing it there. Often it's even something I didn't watch all the way through or even gave a thumbs-down; apparently thumbs-down also means I secretly want more of this content?

YouTube's recommendation algorithm is somehow worse than useless.


The fact that this feature exists shows that even YouTube knows their algorithm needs help.


Everyone lauds TikTok’s algorithm for some reason, but they have the same thing and it seems just as worthless there.


Tiktok by design has more fresh content and, by being newer, is not gamed as much as Youtube's algorithm.

With enough time Tiktok will look as bad. I.e. I don't think it's algorithm-related, for the most part.


Hahah! I get this one too.


> One of them is "I PAID FIVE BASSISTS ON FIVR TO PLAY AN IMPOSSIBLE BASSLINE"

Slap like right now


I've had the same experience. I use YT to listen to music pretty regularly and if I don't go and actively select a different song it will play the same 5-10 over and over. Even when I do select a song it will make its way back to those same 5-10 songs. There's a lot to discover on there so I still use it, but it's also pushing me back to my other services or libraries.


Not just that, but I get super annoyed that I exclusively listen to YouTube music on my desktop, and exclusively watch shows etc on my Apple TV. And for some reason YouTube cannot figure out that no, I don’t want to watch these music videos on my Apple TV, and no I don’t want to watch this documentary on my desktop.

And yeah even within music videos, the recommendations seem really basic. Compared to Spotify, Google is looking like real amateurs when it comes to recommendations.


> Compared to Spotify, Google is looking like real amateurs when it comes to recommendations.

I agree about Google's bad recommendations, but for me Spotify is at that same bad place too. It used to be way better, but now 80% of my daily recommended playlists are stuff that I have in my own music library, just shuffle-played. It's absolutely terrible.


It works well enough for me in some categories. Like if I'm listening to some guitar demo or pedal maker demo or shootout, I'll get similar videos coming through and I've stumbled onto some great ones that way (JHS pedals has a great channel).

But yeah, the music cycle is funny. And for some reason, no matter how many times I say "please don't show me this video", YouTube really wants me to listen to Mandolin Orange. I can't escape that recommendation even though I think I listened to them once and decided it wasn't to my taste and have always removed them from the queue. (Sorry, I'm sure they're nice but it's just not for me)


It actually works well for me since I mostly want to listen to similar mixes while working so my work Youtube account is pretty spot on.


What freaks me out is how invasive YouTube/Google is. I have browser ad blockers, script blockers, privacy addons. Sure I am logged in though which pretty much defeats the purpose of all those.

My often-used example is the time I went car shopping with my sister. She was buying a new car and I went with her (the age-old women vs. salesman bias). The dealership was over an hour from my town, and my sister lives in a different town. I didn't do anything other than stand there: no info of mine given, no forms filled out.

Back home two hours later, YouTube was showing me recommendations for new-car-shopping videos.


I'd guess location tracking (if you have an Android phone) with you stopped at a particular place for long enough.


Yeah, it especially annoys me that it recommends videos I've already watched. Even if I say "don't recommend this" and give "I've already watched the video" as the reason, YT will still keep recommending videos I've watched.


It's ironic to see a complaint that youtube recommendations are stale, in the comments of a complaint that they lead people to discover offensive things. Can't have it both ways. Either you let people discover new things (some of which might be offensive) or not.


Just because every now and then it shows something interesting or different doesn't mean that they aren't stale for the most part.


This misses my point entirely. I'm not commenting on whether youtube recommendations are stale or not. I'm pointing out that it's contradictory to complain both that the recommendations are too stale and that people discover offensive things.


My point is that it's not. Mostly stale doesn't mean there is never something outside the norm. As such, I do not believe I am missing your point, but that you are willfully and incorrectly assuming that everything is always an absolute.


Some of this blandness may be by design, at least when it comes to political content. YouTube caught a lot of flak in 2018 for allegedly being a "radicalization pipeline", but research[0] shows that their "next video" recommendations consistently direct users away from fringe content and towards mainstream media outlets. One of the authors of that paper made a website to visualize their results[1], which is pretty neat.

The research in question was limited by not using logged-in accounts with a significant view history, but it's still the best attempt to quantify the YouTube algorithm's influence that I've seen so far. Mozilla's initiative sounds interesting, but the fact that they're asking users to only report negative interactions rather than getting a representative snapshot of all YouTube use makes me suspect that Mozilla is approaching this from the perspective of advocacy, not scientific inquiry.

---
[0] https://arxiv.org/pdf/1912.11211.pdf
[1] https://recfluence.net/


Yup. I'm somehow bored by the greatest collection of human knowledge ever captured on video, because it can't seem to find anything new for me to watch, even with thousands of hours of content created every hour.


> What's impressive to me is that YouTube Recommendation are so ... stale.

I think part of it is them removing certain content from the recommendations. I would be happy with a chronological order of all videos from all my subscriptions. Perhaps with groups of subscriptions, so I can make a group of comedians, a group of nerds, a group of girls dancing, etc.

The real problem is that youtube is still so much better than everyone else at just playing videos. How has it been 10 years with still no real competition in the realm of usability? Even most paid services are not as good. I think a bunch of the streaming services should pool their resources and come up with a good open source player with libraries for all the major frontends and backends, instead of all of them poorly reinventing the wheel over and over.


The fact Google makes the browser and the OS for most users certainly puts them at an advantage.

Video is much, much more complex than a .mp4 file on a server. The effort required to get the first frame from the server to the user's screen as quickly as possible, and then not drop any frames after that, is pretty high to begin with, but it's even higher if you don't have control of the exact way the browser allocates CPU/GPU time, for example...


> I would be happy with a chronological order of all videos from all my subscriptions.

Is this not the subscription view? I've never noticed a video from any channel I subscribe to that didn't show up there.


That's my experience as well. It will only recommend videos that I assume are popular globally, and old videos from channels I've watched in the past (including videos I've already watched), until I watch enough of something new and it suddenly switches to only recommending that. There's gotta be some middle ground...

I don't understand how the recommendation algorithm seems so brittle, given that advertising is their core business, and that I am manually doing so much of the work by subscribing to channels I enjoy. A simple feed of recent updates from channels I subscribe to would be an improvement; in fact I think I mostly find videos to watch from Twitter.


The YT algo used to be crazy and recommend insane stuff and you could learn from it and do a deep dive, a lot like Wikipedia. Not anymore.


> What's impressive to me is that YouTube Recommendation are so ... stale.

There isn't any "depth" to recommendations anymore. Right now it's the same recommendations over and over again. Lots of celebrity, corporate media, etc.

When youtube wanted to grow, their recommendation was so good. You could go down that rabbit hole forever really and actually discover content/creators. Now that they have a monopoly position, it's about limiting options and guiding you to established content.


Not just the same categories, but videos I've seen before and videos it's been "recommending" for months that I don't watch.


[semi-speculation presented as if I actually know]

Youtube recommendations are best effort. That effort has a computational budget and a time limit. The fastest and least computationally intensive recommendation is to pull the last set of recommendations out of the on-device local cache.

The next fastest thing is to recommend whatever is cached near the edge of your internet connection, whether it fits your previous viewing or not. It generally does not do this. Instead it recommends edge-cached material with cosine similarity to your preferences. However, there might not be much of that if your preference is for outlying content.

The slowest and most computationally intensive recommendation would be to compute a new tensor of your preferences and run cosine similarity across the entire vector space of Youtube. It doesn't do this, for obvious reasons, even though that's what we all want.
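For concreteness, cosine similarity between a preference vector and candidate video embeddings looks like this (the vectors are entirely hypothetical; real systems use learned embeddings with hundreds of dimensions):

```python
import math

# Cosine similarity between a user-preference vector and candidate video
# vectors. All vectors here are made-up 3-dimensional toys.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

user = [0.9, 0.1, 0.0]  # leans heavily toward topic 0
videos = {
    "mainstream": [1.0, 0.0, 0.0],
    "outlier":    [0.0, 0.0, 1.0],
}

# The candidate nearest the user's preference direction wins, which is why
# outlying content loses when there's little similar material cached nearby.
best = max(videos, key=lambda name: cosine(user, videos[name]))
print(best)
```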

Of course, this ignores the issue of Google revenue. Advertisers target you based on the videos you've watched. So there's that too. Google recommends videos for which it has relevant advertisers.


With most sufficiently large companies it all boils down to making money (which usually is why they have been able to grow so large to begin with). It is a very fine line between making as much profit as possible from users and not exploiting them so much that they leave.

Google does not have much interest in serving new, interesting videos over what earns the most clicks and therefore advertising revenue. A lot of users might also be pretty fine with what is already served. I have seen more and more alternatives popping up, though, several with a monthly payment plan instead. While free content is nice, a paid plan incentivizes different business practices than one where revenue is closely coupled with the number of clicks.


>Of course, this ignores the issue of Google revenue. Advertisers targeted you, target you based on the videos you've watched. So there's that too. Google recommends videos for which it has relevant advertisers.

I think this is exactly the problem.

Contrast this with the recommendations I get from my paid Spotify account, where I don't get ads: they're absolutely excellent. Seriously, whoever worked on Spotify's recommender system deserves a huge pat on the back. They have the best recommender I've encountered anywhere on the net, and I pay a lot of attention to recommenders. It's just light years ahead of everyone else.


> (you would think if I didn't click on the suggestion in the past 6 months it would adjust).

This applies to YouTube music as well. I find this incredibly annoying.

Know what's even worse? Searching for something, then liking that song. On my end, that generates an incessant stream of titles from the same artist that no amount of downvotes can stop.

YM sees the pattern +--------------------- and yet continues to predict +.


Accidentally clicking on an ad from the President's re-election campaign made YouTube think that I'd be interested in watching videos of different people getting destroyed by facts and logic, or watching sweaty people get really angry at pop culture for some reason.


Systemic fixes are needed. In the meantime you can improve your own experience (and hopefully others' a bit too). When you hover over a recommendation, there's a menu icon on the right side (three vertical dots, gray). Clicking it opens a menu with options to indicate "Not interested" and "Report". The former is supposed to train your own instance of the recommendation algo; the latter does the usual reporting.

According to the current scuttlebutt lore, Youtube makes a concerted effort to steer users towards authoritative sources (typically the big media brands), especially on spicier subjects. This is in response to the recent slew of articles accusing Youtube of user radicalization.


Oh I've been doing that... but it's a lot of tedious manual work (5!!! clicks per recommendation)


Not interested doesn't work. One or two weeks later it 'forgets' your preferences and someone else uploads the same content as a clip so it gets suggested too.

Googling around shows there used to be a 'Don't recommend this channel' button which would be awesome to have back.

I find it very annoying listening to a two hour debate, understanding difficult topics and then disagreeing with the speaker and being unable to block content from said speaker.

There's a bunch of one-topic people pushing their opinions out to the youtube user base and it would be great not to have to see their face in 30 different suggestions after you find yourself deeply disagreeing with their opinions or you're just done with their conversations after hearing all of their views.


> Googling around shows there used to be a 'Don't recommend this channel' button which would be awesome to have back.

It's still there on my Chrome desktop or Youtube app on android.


Those don't do shit. I tried getting it to stop recommending me fucking Jordan Peterson and Ben Shapiro videos for years, but I guess I just can't escape the angry 30-something category.

I think Youtube's recommendation algorithm is an abomination that has caused real harm to society.


Same here! God forbid I watch a video by someone who's ever been in front of a camera with those two guys, I'm immediately pushed into a rain of suggestions.

Try watching a comedian that's slightly edgy like Bill Burr, you get immediately showered in videos of people "showing feminists what's what" or "shutting up liberals". I'm not even American, not right wing, but YouTube seems to try its hardest anyway.


The rage is strong in those.


I'm afraid this, too, is because it works. I don't have a specific example at hand but multiple times I've seen people say something like "youtube kept recommending me this so I finally watched it".


I honestly don't understand this either...

Youtube knows exactly which videos I've watched and which I haven't.

Why does it think it's a good idea to ever recommend videos that I've already watched?


I had the same question!

On mobile YT suggested I watch an old video from a channel I'm subscribed to. It's a good video, but since I have watched it already (with my account logged in) and remember it, who cares? This one time the interface asked me if it was a good suggestion! So I clicked the bad suggestion link, and then the app asked me to categorise why it was a bad suggestion. All the options were about the video itself. The categories were "overly sexualised", "gory", "boring" etc. There was no option to say "I have seen it already, you silly". Way to set up a feedback system to get no useful information out of it.


I think there's a category or two where it makes sense:

- fitness videos

- music

- how-to & instructional videos


Same thing with my 'Google News' app on my phone. If I ever (god forbid) google a person's name, I get stories about that person for months. I also get the same results (mixed in with new results) for days, sometimes a week at a time. Are they really so bad at recommendations that they can't find another option that ranks higher in my "interest" score?


The recommendations are designed to be videos you will most likely click on... not necessarily ones that match your current interests.


Comparing my TikTok "For You" page on an account that doesn't follow any other accounts with my YouTube Home on an account with tons of subscribed channels that are relevant to my interests I'm astonished at how fucking terrible the YouTube recommendations are. Absolute dogshit. Awful.


I find Youtube recommendations much better than recommendations elsewhere, e.g. Netflix. When I say recommendation, I mean "helping me with discovery of new stuff". They also have a problem - they keep showing what I have already seen - I don't know whether this is by design or a bug.


Try clicking novel suggestions from the home page. It will start altering your profile in a few days. Also don't do this from a browser session tied to a Google account. There will still be a profile attached to the cookie but it isn't permanently linked to you.


Couldn't care less about this. Just put out a paid version of Firefox that dedicates this revenue stream exclusively to continued Firefox development and I will happily pay fistfuls of money for it. Seriously, Firefox adds so much value to my life and I'd happily pay for it.



I think the "exclusively for Firefox development" is pretty important. Mozilla is doing a lot of random things these days


>Mozilla is doing a lot of random things these days

Yeah, like paying three million to its CEO annually. That's not going into Firefox development alright.


The one who no longer works there? And how do you think they would build and manage a development team without executives?


>The one who no longer works there?

As far as I know, Mitchell Baker is still Mozilla's CEO. And even if she resigned, maintaining her salary would have been at odds with keeping many employees from being fired.

>>I criticize unreasonably high salaries on a nonprofit organization that just laid off almost three hundred people

>"you are saying that they can build and manage a development team without executives"

If you want to engage in a discussion or a debate, at least be honest about it.


Mitchell Baker began as interim CEO in December, after the resignation of Chris Beard, and was permanently appointed CEO in April. Her salary for the position has not been released.

Mozilla is a nonprofit, not a charity. Their executives already receive less than market rate for similar companies, by some accounts far less. They can't offer nothing.


I specifically want to fund Firefox, not this kinds of research. I will also pay something for continued Rust development.


Same. Firefox is a great browser.


Except for the memory leaks.


Chrome eats a lot more memory for me than Firefox.


The difference is, chrome allocates what it needs upfront, and that's all it uses. Firefox keeps increasing its usage.

That means that when I want to leave a website open on my server for a month, I use chrome because I know exactly how much memory it's going to use. If it was firefox, at the end of the month even the swap would be filled.


If you try to open something else after that month, does your computer crash or does Firefox happily give up the memory?


The only way to free up the memory is to close firefox. Closing tabs doesn't do anything. I can 'killall firefox' and then reopen it to the same 50 tabs, and the memory usage after they're loaded again will be several GB lower.

I don't know if you count memory exhaustion as a 'crash' per se.


Does opening another program cause system instability? I don't know how you are monitoring it, but it sounds like Firefox is making use of the available ram for a cache. There's no real difference between cache and free memory, so using more is only a good thing.


Yes, it does. Monitoring is easy with htop - just look at the ram and swap usage. Leave FF open too long with too many tabs, and even the swap starts to fill up and the whole computer gets laggy.

This is a well-known problem. What's hard to believe?


Semi related: there's a browser plugin[1] available to "de-mainstream" Youtube. It removes from recommendations the well known channels of the big name media brands[2]. You can add more to the list.

--

[1] https://demainstream.com/

[2] https://github.com/miscavage/De-Mainstream-YouTube-Extension...


This works pretty well (I use it) because it removes recommendations and search results (which are really recommendations) that are artificially inserted/ranked at the top every single time.

That said I don't think it's what Mozilla and many commenters here are looking for, they're looking for banning content they disagree with (i.e. censorship).


> they're looking for banning content they disagree with (i.e. censorship).

That's an extremely unfair read of the situation. The extension doesn't even ban anything. It is a research project attempting to understand why the YouTube algorithm recommends the videos it does. Whether or not you think Plandemic (a nonsense conspiracy theory video) should be banned or not you can still see research into the YouTube algorithm as a worthwhile endeavour.


Literally the first sentence of the article is about dangerous recommendations. And the second is about harmful content.

This is not just an open-ended inquiry of what youtube is recommending. It's about finding the bad and getting rid of it.

EDIT: I also want to remind people about how Mozilla used its push notifications to call for a Facebook boycott over objectionable content.


> YouTube recommendations can be delightful, but they can also be dangerous. The platform has a history of recommending harmful content — from pandemic conspiracies to political disinformation — to its users, even if they’ve previously viewed harmless content.

What part of this is untrue?

And again, you're saying "getting rid of it" without evidence. It's a study. It does not mention banning anything, anywhere. If the argument is that they should not study this because it might lead to bans in the future then you're being pro-censorship in order to promote anti-censorship which doesn't make a whole lot of sense.


Having tried to get rid of things in the past is evidence. The very loaded choice of words they use is another.

I'm not saying they should not study this thing. I am just discussing their aims, because I think it's interesting, and it may inform others' choice of whether to participate or not.


> What part of this is untrue?

The characterization of pandemic conspiracies and political disinformation as harmful.

It's like kids and allergies. You never hear about kids who grew up on farms being allergic to animals - it's always those whose parents didn't have any when they were growing up.

If people aren't subject to misinformation, they'll never develop the sense of who's lying and who isn't.

It used to be that we gave common-sense advice - "don't believe everything you read on the Internet". Now, it's the other way around - "we must cleanse the Internet of harmful content".

Being exposed to misinformation is good for you, and it's good for democracy.


"I never heard of it happening, so it is universally true that it never happens."

Meanwhile, the actual state of affairs is[0] that there's one allergic farmer child per three allergic non-farmer children. Don't believe everything you read online.

[0]: https://pubmed.ncbi.nlm.nih.gov/11048766/


A few months ago YouTube was flighting a feature I liked a lot. The homepage would group my interests into named categories: "Cute dog videos", "Strangeloop", "Debussy", "PyData", "Algebraic Topology" and so on. I really liked that because I could choose which deep end to sink into. Not just that, it would also serve as a reminder of what I had been interested in but had become too distracted to remember.

They seem to have stopped this and now the recommendations featured on my homepage are all over the place.


Same. It was refreshing to see options for computer science, electrical engineering, etc... instead of the usual stuff. Found a few new channels the day or so that feature was up.


I still see these category groupings on the YouTube Apple TV app today. I agree, they’re a really good way to browse.


I'm pretty sure they're still testing that feature. I see it show up occasionally.

Interestingly, I think there are a couple of variants for the number of videos to display on the home page, too.


How can I get myself on the better side of their A/B test?


Unfortunately, I don't think there's any (public?) way to force it. I've seen the feature disappear or reappear on a reload -- it's not consistent across a session, let alone on a user-by-user basis.

Try refreshing the home page until it shows up, maybe?


At youtube.com/new, they say you can opt-in to experiments if you have premium, but I don't have premium and am not sure if that is one of the options.


I'm always mind-blown reading comment threads like this about recommendations, with people annoyed that explicit feedback helps recommenders, or exasperated at being recommended an item they haven't approved while also wanting novel recommendations.

1000s of hours of videos are uploaded every minute. 5+ billion videos are watched daily. Are you not rather quite pleased that YouTube can deliver highly relevant top-20 recommendations within milliseconds from such a large and changing corpus, while keeping track of your past preferences, your interactions, your social circle's preferences, popularity and trends, content features, etc?

It's pretty amazing to me. I think people expect recommender systems to read minds.
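For what it's worth, the descriptions YouTube engineers have published (e.g. the 2016 "Deep Neural Networks for YouTube Recommendations" paper) split the problem into a cheap candidate-generation pass over millions of videos followed by a heavier ranking pass over a few hundred. A toy sketch of that two-stage shape, with entirely made-up data and a simple co-watch count standing in for both stages' actual models:

```python
# Toy two-stage recommender: candidate generation, then ranking.
# This only mirrors the shape of the pipeline YouTube has described;
# the data and the scoring below are invented for illustration.
from collections import Counter

# Hypothetical watch histories: user -> list of watched video ids.
HISTORIES = {
    "u1": ["v1", "v2", "v3"],
    "u2": ["v2", "v3", "v4"],
    "u3": ["v3", "v4", "v5"],
}

def candidates(user, histories):
    """Stage 1: cheap candidate generation via co-watch counts."""
    seen = set(histories[user])
    counts = Counter()
    for other, vids in histories.items():
        if other == user:
            continue
        # Users sharing at least one watched video contribute their
        # other videos as candidates (already-seen ones are excluded).
        if seen & set(vids):
            counts.update(v for v in vids if v not in seen)
    return counts

def recommend(user, histories, k=2):
    """Stage 2: rank candidates (here, simply by co-watch count), keep top-k."""
    return [v for v, _ in candidates(user, histories).most_common(k)]

print(recommend("u1", HISTORIES))  # -> ['v4', 'v5']
```

The real systems obviously replace the co-watch counter with learned models and add freshness, watch-time and diversity signals, which is exactly the part this thread is arguing about.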


Consider that a lot of the complaints are that YouTube constantly recommends already-watched videos instead of new and relevant ones. Then, if a user watches a single video on a topic outside their usual interest scope, YouTube replaces any semi-relevant recommendations with that new category, regardless of whether the user is actually interested in the topic.


> Then, if a user watches a single video on a topic outside of their usual interest scope, YouTube replaces any semi-relevant recommendations with that new category, regardless of whether the user is actually interested in the topic.

That's the way it used to work, now it adds that new category to the list of categories it thinks you like (top of the home page), and you can filter or include that category.


The complaints are due to a significant decline in the quality of recommendations. It’s like that scene from 2001 where HAL is slowly lobotomized by Dave.


Since when, do you think? I personally think that they're better than ever. On the other hand, I think they moved to more deep learning approaches over the last 3-5 years (based on their research output at least) so perhaps recommendations are perceivably different since then


I’d say more within the last 3. Definitely in the last 2. It’s much worse for me, just repeats the same channels and categories.


YouTube recommendations, for me, are the greatest music discovery resource. I've found more new artists through YouTube's algorithm than Spotify, Pandora or Amazon Prime Music.


YouTube recommendations suck so hard.

I'm subscribed to all this stuff but it never seems to give me similar content, it recommends all the videos I've already watched.

If I happen to watch something new, like a People's Court segment, then all I see is People's Court stuff.

There never seems to be balance.

Then there's the problem where a new video (like Gourmet Makes from BA) gets posted on a channel I like. I'll watch it with my partner on his machine, but then YouTube just chokes up, wondering why I never click on it, and then shoves it in my face ad nauseam.

Like, their search engine doesn't even seem to scratch the surface of content they're holding. It seems to be really stupid machine learning or AI.

Why can't it show me balanced recommendations from everything I watch or subscribe to, toss in some new things which are similar/popular, and maybe differentiate between content likely to be consumed multiple times (music videos) vs stuff people don't often watch a rerun of?

They seem to have so much data they could work with. They have amazing engineers and Google's expertise in algorithms.

Facebook even had problems with the news feed being overrun with low-quality content early on but seems to have figured it out fairly well - at least in my experience on the site (I know there are big echo chamber problems over there as well, don't get me wrong).

Even a lot of my non-techie friends seem to complain often about how terrible YouTube's recommendations are.


It feels like YouTube's recommendations were better around 2010 to 2015. I think back then they recommended videos based on the tags a video had, which seemed to work much better.


I'm assuming there were fewer people adding fake tags to earn more views back then as well.


> recommendations can be delightful

They are really not. I struggle page after page to find something interesting. And then days later I randomly stumble on a link that has interesting stuff. Even search is bad. And there's always the knowledge that Google is censoring stuff and manipulating people.

There's definitely space for a better youtube/video recommendation site.


This is what you're gonna do with the money saved from the firings?


No. This is the Mozilla Foundation not Mozilla Corp.


I don't know why people pretend like the distinction matters.


In terms of where money goes, it does. Mozilla Corp both makes and spends vast sums of money (browsers are fantastically expensive).

Mozilla Foundation, in contrast, makes and spends relatively little compared to that. Seriously how much do you think this crowd-sourced project is costing the Foundation?


Because they want to pretend that Mozilla aren’t throwing money down the drain.


This claims that YouTube recommendations are opaque, but YouTube often notes that viewers of a specific video also like another. YouTube also tags content, which could be extended to note related or contrasting content, or to map how videos that share a tag relate.


I previously used my partner's Google account to watch on YouTube, and it had built up recommendations based on what I watched (I didn't use the subscriptions). But one day it all just disappeared and only showed recommendations from my partner's subscriptions.

I'd prefer it if they implemented a user-driven exploration system rather than attempting to guess what I want, but it must be useful for them to be able to change how it works without modifying the user interface.

I eventually gave up and used my own account, now with 32 subscriptions, and it's better than it was, although it shows a lot of videos I've already watched. It throws in a few videos from other channels but it's mostly from the subscriptions.


Facebook got very upset when ProPublica used a similar approach (browser plugin) to surveil their ads ecosystem:

https://www.propublica.org/article/facebook-blocks-ad-transp...


Am I incorrect in assuming that Mozilla’s goal was at one time to expand access of information to the world?

I don’t see any likely outcome of this “research” other than a fresh new media hue and cry detailing chains of “wrong-think” that will lead to activists who masquerade as journalists calling for more content they deem unfit for the plebs’ consumption.


I guess I just don't understand why Mozilla would spend money on this? Google is under no obligation to care about their findings. Why not spend money in areas that will actually benefit people? What do they hope to gain by this?


It could build the case internally for them to build something like peertube or support it.


Fundamental flaw with this approach. Since I don't click on or watch conspiracy theories, none of my YouTube recommendations are those. But I have seen people watch incendiary and conspiracy theory videos exclusively on autoplay.


Was trying to find the source for the extension, looks like the link was in the privacy notice. Github link here for those who are also looking for it: https://github.com/mozilla-extensions/regrets-reporter


Seems silly. Why not just build a better (user-configurable) recommendation engine?


Over a database of videos they don't own? That changes rapidly in ways they can't control?


I don't think it's a bad suggestion, though their plan is most likely to influence people to complain to Google.

Same as politics, you want a maximum of people to be convinced by your idea/opinion/etc to make it happen faster. If you build your idea it might be great but if no one is convinced of it it won't work.


Mainly because most people won't be exposed to it. They will be exposed to Youtube's, and the effects on the majority are what matter for the goals of the study.



This must be part of Mozilla's new focus on profit.



I use a ublock filter to block YouTube recommendations. I think they're unhealthy.


There are a few good "un-frak your youtube" extensions as well. I've got mine set to disable autoplay, trending, chat, "related" at the end of video playback, and recommendations. Helps quite a bit.


Seconded. Here are the filters I use for blocking recommendations on YT, in case anyone else finds them useful:

    ! Suggestions overlay in the player (paste these into uBlock Origin's "My filters")
    youtube.com##.ytp-suggestions
    ! "More videos" overlay shown while paused
    youtube.com##.ytp-pause-overlay
    ! End-of-video wall of recommendations
    youtube.com##.videowall-endscreen
    ! Related-videos sidebar
    youtube.com###related
    ! Recommendation feed on the home page
    youtube.com##ytd-browse[page-subtype="home"]


[flagged]


Oh no, it will survive, Yahoo-ifying itself in the process.


[flagged]


Mozilla is quite clear about that. They consider Firefox a political product like any other (yes, privacy is a political stance). A costly one.


There is politics as in FSF / GNU, and there is politics as in Mozilla. A lot of Mozilla's activism is not relevant to privacy / the open web. Not to mention it also deters many people from donating to them.


Why does privacy have anything to do with conspiracy theories and “disinformation”? Privacy should in fact be about respecting the individual and their choices, and believing that people (as groups) are intelligent enough to make good choices.

Basically the opposite of this pro censorship Mozilla stance.


Maybe because rationality can be exercised only if a person is given diverse, high-quality information. Being tracked means being figured out, put in a Turing box, and utilized in accordance with someone else's agenda. This research seems to be an attempt to observe how digital bubbles form and how they could be circumvented.


Recommendation bubbles are partly a privacy matter -- without the ability to track users, there would be no recommendation bubbles. It may sound like a far-fetched possibility -- how do you maintain the privacy of logged-in users? But I figure it is doable from a technical standpoint; the key question would be how to transform the financial part of the operation to keep Youtube afloat.

At this point, even awareness of recommendation bubbles and their shape & size can help quite a bit. It's valid research.

It grates on me that Mozilla seems to take a very one-sided view of the politics of it, but hey, it's a start. Hopefully others will follow to counter-balance Mozilla's slant.


You can recommend just based on content. I hardly ever open the youtube homepage. And often open links in incognito mode. Still get to see all these recommendations next to a video.

I don't see how you could not have a "recommendation bubble" as long as you have recommendations. Even if they were to be 100% manual it's still a bubble of some kind.

Not only that, you effectively have these bubbles outside the internet too. A book will recommend further reading, friends will recommend certain things and so forth. There's no escaping it.

When I go to the YouTube homepage these recommendations are absolute junk. Links friends send me are much more relevant and influential.

IMO the most charitable reading of this initiative is that they believe the algorithm is somehow biased towards exposing and warming people to harmful content in an unobvious way.


Another name for recommendation bubble is sub-culture. Google has decided that there should be only one and the results are just as inane as a high school clique taking over the school newspaper.


Censorship? It seems to me that they want to use the data for research.


Yep: this project is funded by donations, and donations tend to favor the politics of the donors.


Crowdsourcing censorship!


I'm sorry, but the crowd has run out of money. Try the people that print and take it.



