It's crazy to think that clicking on the "not interested in videos from this channel" button previously did not remove videos from that channel in recommendations. I used to press it up to 5 times on the same video and it still did nothing.
I click "not interested" when it shows videos I've watched the day before, but then assuming I don't want to see videos of a similar type would be a mistake. Yet that's how broken the system is. Based on past experience, any attempt to "train" the algo is wasted effort. You could say we users have been "trained" by the algo not to even bother anymore.
My approach is to vote on videos I've watched and just skip ones I've already seen. It works for me. I'm concerned that I'd see less relevant content if I start marking previously watched, interesting videos as "not interested".
> I'm concerned that I'd see less relevant content
I used to have that concern as well, but then I realised I didn't actually get that much value from yt recommendations. The stuff I want to see comes from channels I'm subscribed to and only very rarely have I discovered something good from recommendations - most of the discovery comes from the right-hand side video column.
> I didn't actually get that much value from yt recommendations.
The last rec engine I got much value out of was early Netflix, around the first public competition to improve their engine.
My nightmare is that corporations suddenly realized that recommendation engines don't actually help them sell content. Maybe it's easier to persuade tons of people to watch something they'll all think is "good enough" rather than to identify or develop content some tiny subgroup of people will think is sublime.
I think this arises because we can't really model intentions, as they involve irrational, subjective, qualitative preferences. That is to say, I completely agree with you.
To further the point, the algos are training YT admins based on suspect user metrics.
Did anyone ask for the "What did you think of this video?" 5-star feature, or "Recent Posts", i.e. "Hi guys, I know I haven't done a video recently blah blah"?
It appears they are repeating the same mistakes made on G+.
The Android app "Google Opinion Rewards" occasionally asks me to rate a video I've recently watched, then follows up with a binary recommendation choice, after which I have to rate the quality of that recommendation.
I'd assume that data is being used for training somewhere.
Also helps to clear your viewing history (Youtube's "History" sidebar tab) every once in a while when you're tired of seeing the same recommendations.
For example, I kept seeing Starcraft 2 replays on the homepage after watching the AI vs human tournament which kept helping me procrastinate. I don't even play the game nor subscribe to a single SC2 channel.
Cleared history, started off strong watching some of my language-learning channels, and now my Youtube homepage is pretty productive.
I've turned off my view history, but my recommendations are bloody awful. YouTube knows what channels I subscribe to, it knows what I click like on, it knows what video I'm currently watching, but it still serves me recommendations that are apparently totally unrelated to any of that.
> it actually does this on purpose, if you've watched it once there is a good chance you will watch it again.
Bullshit.
If I start a random video and let it automatically progress to the next one, I will have to watch it (or parts of it) to tell if I’m interested or not.
This is particularly annoying for Youtube music where starting a radio from something you like now is guaranteed to add new artists to “your artists” because you listened to them once, only to click “next” or “not interested”.
Young kids love the ability to predict what will happen.
An example of this is Blue's Clues: initially, producers and schedulers were afraid that airing reruns in the same timeslot would doom the show. Instead, kids being able to watch the episodes again was a great boon to the target audience's enjoyment.
I’m confused. What was this feature for, then? Just to let off steam when they suggest terrible videos?
Another infuriating thing about that button is that it visually overlaps the video itself (at least in the iPhone app). I don’t know how many times I’ve fatfingered it and started playing an upsetting video rather than report it.
I thought for a moment they were going to turn off recommendations. For users not signed in, they are on by default.
Instead this is like asking users to click thumbs up or down in response to an ad, "Don't show me this ad again" or fill out a survey about the ads.
The issue is not that the user is annoyed by a specific toxic video. That is only the effect, not the cause. The issue is that Google's recommendation system is, as they say, "broken". Either fix it to stop promoting "popular" garbage (which is "popular" because it is auto-recommended and auto-played), or turn it off by default.
This is another example of an "opt-out" that most users cannot be bothered with. Instead of sane defaults: recommendations off, auto-play off.
If Google must force users to interact with a particular video and click something so they can collect another data point, then a more sensible approach would be "opt-in". Let users decide whether they want recommendations based on a particular video they manually select: "Show me recommendations based on this video."
Just more dark patterns. If a choice was one users wanted (recommendations on, auto-play on), then the way to test that is to make them choose it. Instead, Google makes the default choices that benefit Google and requires users to change settings. Often forcing them to sign-in and be tracked in order to change a default setting.
> Either fix it to stop promoting "popular" garbage (which is "popular" because it is auto-recommended and auto-played), or turn it off by default.
Or maybe people like content that you don't? People aren't drones who watch something because it was shovelled into their watch next. Chances are it was decent content. For them.
If you consider this a dark pattern then you probably need to get a better perspective. There are far worse actors online.
> People aren't drones who watch something because it was shovelled into their watch next.
Actually, in this context, they are. If YouTube were content with people using recommendations as you describe, they could simply keep the sidebar list where you could click on one if you're interested. (or the recommendations that pop up after a video is played, or the home screen or the seemingly half dozen other places they show recommendations)
Instead, they implemented Autoplay mode, which basically works exactly like the force-fed drone model you described.
Easy: survivorship bias. A lot of people use YouTube, and people who get good recommendations don't have any fuel for memes. So even if the number of people who get bad recommendations is minimal, the meme would still exist.
This doesn’t really solve some of the more concerning things about recommendations, which is the gradual ease into more extreme content. i.e., “super Mario speedrun” to “Top 10 speedrunning fails” to “Super Mario speedrunner gets owned” to “Why women shouldn’t be allowed to play video games” or other such hogwash.
Was just talking to a family member recently about this. Her daughter is into horses and so is watching a lot of showjumping videos. Apparently the algorithm keeps trying to lead her off towards increasingly bad showjumping accident videos and from there basically into /r/watchpeopleandhorsesdie
My mom likes watching animal documentaries. The recommendations have completely gotten out of hand, showing things like very explicit videos of animals mating, wild animals eating people, and other terrible things...
Same with videos of cute/funny babies. It slippery-sloped into explicit childbirth videos and other almost pedophilic things.
I had to manually delete every video in her history to fix that.
Daily reminder that the internet doesn't absolve you of actually parenting your kids and keeping an eye on what they browse on the internet. This isn't unique to YouTube; there are all sorts of bad places kids can end up if left alone on the web.
The point of emphasis in this particular situation, though, is that the recommendations are actively encouraging bad content that the kid wasn't originally looking for.
It can also actively encourage good content that someone wasn't originally looking for. This is one of the things I most love about the internet. XKCD did a good job highlighting this in "The Problem with Wikipedia"
The problem we have is determining what is good content and what is bad content and who is the arbiter of what is good or bad. Something you think is bad, might be something someone else thinks is good. These determinations are highly subjective.
Is it bad content? That's subjective. They are suggesting popular content with similar keywords, using the fact that it is popular, combined with the absence of flags, to deem it safe to recommend.
I don't think she's in much danger at this stage. It's no different from a kid wanting to watch videos of racing cars: they don't need to see the possible grisly consequences unless they're thinking of actually taking up street racing or something.
While I can understand not wanting to watch such videos because of how the events are covered in them, mainly gawking at grievous injuries and incidents, I do not like the idea of filtering reality. A video could instead show that if you do stupid thing X, this happens. Then again, it also depends on the age and maturity of the kid/teen. You don't want to terrify them or scare them away from trying or learning new things. However, learning from others' mistakes is better than finding out firsthand yourself. Like I said, most videos on YouTube are not an analysis of the mistakes that led up to an event.
Then you just have unfortunate accidents/events which don’t really tell you more than sometimes life can be brutal.
Although, I think YouTube filters violent stuff unless your account says you're over 18, and removes really disturbing stuff outright.
But it is filtered. Just the other way around. The most horrible content gets morbid curiosity clicks and the normal stuff is under-represented, filtered out.
And sure, you as an adult should be able to watch whatever you want. But we're talking about little kids here, It's OK to shield them from the more horrible aspects of life. Let them be kids for a little while.
At this point I'm pretty sure third parties understand the algorithm a lot better than anyone at YouTube; this would match my experience in MMOs, where groups like GoonFleet would reverse engineer the hell out of the games to win.
To be honest, this looks like a CYA feature pushed out the door, not something that is actually intended to be used.
They took pains to give the "follow the recommendation" action the most frictionless path possible in the UI: just wait and do nothing. In contrast, this new feature to dial back recommendations is buried in an obscure menu, only reachable by clicking a button labeled "..." in a counterintuitive location. I can think of few ways to add any more friction.
Are you proposing that they don't autoplay anything that you haven't explicitly opted in to? Discovering new creators is pretty important to both Youtube and its users. Or do you want it to autoplay something that's not based on any personalized recommendation? That would reduce the value of Youtube over TV in that it caters to niche interests.
Actually yes, I would propose that you opt-in to Autoplay as a whole. Make it a separate function that people can activate if they want to binge watch or whatever, but keep it off by default.
I have seen a good number of people who struggled with the Autoplay mode in presentations: they either could not stop it in time or forgot about it and then had their presentation derailed by the next video. I've seen exactly zero people who actually wanted to use it.
Of course this is wishful thinking when the CEO's performance metric is to keep their users glued to the screen as long as possible.
> The move comes after Susan Wojcicki and other YouTube executives were criticized for being either unable or unwilling to act on internal warnings about extreme and misleading videos because they were too focused on increasing viewing time and other measures of engagement.
It actually does. The desktop website has a permanent setting. Mobile website has the setting, too, but occasionally resets it to Twitter’s “Top tweets” sorting.
I'd also appreciate if it went back to recommending videos similar to the current video I was on instead of having the same recommendations on every video now.
No kidding. You can hit refresh as much as you want and it keeps showing the exact same recommendations. Very annoying, but not nearly as bad as all of the censorship going on, imho.
This change drives me nuts! Earlier, listening to a song brought up similar songs. But now the recommendation list is filled up with old videos that I've already watched and have no relation to the song being played. Why would I want to watch a Paw Patrol cartoon after listening to a sad Bollywood song?
That's the worst: share a computer (or a Google account) with someone? Well, you get to see all the stuff YT wants them to see as well.
YT is useful, but a trainwreck I simply don't allow my family to watch. It even got my Dad (who's aged/infirm) to become paranoid due to the push-to-extreme-content problem. Best way to solve? Remove YT from his TV.
Or at least keep it in the "watch again" section. I appreciate it from time-to-time for certain music videos and think "sure, I'll listen to that again".
This is a flaw of the fundamental nature of the service. If it were a client-side application, keeping track of your entire viewing history forever would be trivial. But because it's on a remote server, keeping everyone's history forever would be very difficult. I would guess they're probably using a Bloom filter and just not dedicating enough resources to it per user. Or, maybe even more likely, they ran one experiment one time on a few users, discovered that re-watching previously watched content was popular, and extrapolated from that, ignoring that their sample included a great many children (since they outnumber adults), resulting in them expecting everyone to act like children.
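For readers unfamiliar with the Bloom filter speculated about above: it answers "have I probably seen this?" in tiny, fixed space, with occasional false positives but never false negatives. A minimal sketch of the data structure (a toy illustration with made-up video IDs, not anything YouTube has confirmed using):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: a space-efficient 'probably seen' set.

    May return false positives (claims an item was added when it
    wasn't), but never false negatives."""

    def __init__(self, size_bits=1 << 20, num_hashes=4):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive num_hashes independent bit positions from one item.
        for i in range(self.num_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

history = BloomFilter()
history.add("dQw4w9WgXcQ")          # hypothetical watched-video ID
print("dQw4w9WgXcQ" in history)     # definitely reported as seen
print("some-other-id" in history)   # almost certainly reported unseen
```

With these parameters (1 MiB of bits, 4 hashes) the false-positive rate is only around 1% after 100,000 insertions, which gives a feel for how cheaply per-user watch history can be approximated; a much smaller per-user budget is where such a scheme would start misbehaving.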
Is there a way to hide particular channels from search results? There are a handful of channels that post videos I have absolutely no interest in, but they do really well in YouTube's search algorithm.
This is what I really want ... the ability to blanket block certain channels. I have zero interest in Joe f’ing Rogan... but apparently everyone else in my marketing profile does nothing but watch his videos. No matter what I look for, or what I’m watching, there’s some goddamn Joe Rogan video in my search results or recs.
Nope, but they do give you an option to block people from commenting on your non-existent channel, which is the most useless feature for 99.9% of users. I just don't understand why YouTube refuses to add this very simple feature.
The main thing I want them to do is not to keep repeatedly showing me the same videos I've already watched. They have little bars under them that show I've watched them all the way through, it shouldn't be that hard.
I'm really confused as to why they do this: there must be hundreds of people and hundreds of millions of $s working on the YouTube algo. I'm seemingly in the minority that thinks it's largely brilliant, it regularly uncovers things that are related to stuff that I'm interested in but new takes on them, and sometimes throws up some absolute genius recommendation from nowhere. However, like you, my "Up Next" selection is plagued by videos I've previously watched! It's horrible UX.
I have two behaviours; TV-program-style videos I'll watch once, and music-style videos I'll watch repeatedly.
My theory is they have difficulty telling the difference, so they think the fact I've watched "80s-90s Hip Hop & R&B Playlist" repeatedly indicates I'd also like to watch "Plumbing bottle air vent teardown" repeatedly.
Agreed, I click through them once in a while when I'm bored and it consists mostly of stuff I've already watched mixed with stuff I have chosen not to watch from channels I follow combined with random crap. The most annoying thing is that it just fills your feed with stuff similar to whatever you recently did. This is made worse by the fact that they only show a handful of recommendations. The only way to get new recommendations is to either wait hours/days for the stupid stuff to go away or to engage with their fiddly "this recommendation was BS" system for getting rid of recommendations. Google news has the same "more of the same shit" masquerading as machine learning style algorithms.
On a positive note, I like what e.g. Spotify is doing in some places and I think Youtube can learn from that. They provide some great content discovery mechanisms. Curated playlists, users that listen to artist X also listen to artist Y, these are the top tracks for this artist (based on play statistics), etc. Very useful. Very low tech. Very easy to implement.
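The "users who listen to artist X also listen to artist Y" mechanism praised above really is low tech: count how often pairs of artists co-occur across users' listening histories. A rough sketch with invented session data (the artists and sessions are made up; this is not Spotify's actual pipeline):

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical listening sessions: each is the set of artists one user played.
sessions = [
    {"Radiohead", "Portishead", "Massive Attack"},
    {"Radiohead", "Portishead"},
    {"Radiohead", "Massive Attack"},
    {"ABBA", "Boney M."},
]

# Count how often each pair of artists appears in the same session.
co_counts = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(session, 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def also_listen_to(artist, n=3):
    """Artists most often co-occurring with `artist` in user sessions."""
    return [other for other, _ in co_counts[artist].most_common(n)]

print(also_listen_to("Radiohead"))
```

Real systems add normalization (so globally popular artists don't dominate every list) and scale the counting out, but the core signal is exactly this kind of co-occurrence statistic.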
If only they'd stop confusing geo location with my interests. E.g. their release radar is full of German crap because I happen to live in Germany (I'm actually Dutch). Youtube does the same "you are in Germany! Here's some random German crap!" style promoted content on their front page. Most of it is very obviously not even remotely close to anything I watch. Location seems to drown out recommendation signals and all forms of common sense. It is used to push locally promoted content regardless of the not so subtle hints that I don't ever engage with that (my German sucks; why would I?).
There are hundreds of millions of migrants worldwide; many of them have spending power, and they are ignored and underserved by most big tech companies (Amazon, Netflix, Spotify, Apple, Google, etc.). This is surprising because these companies actually employ a lot of migrants: they should know better. Getting content in the right language vs. the local language, localizing promoted content to the local language instead of what the user actually speaks and consumes on the service, etc. It's not hard, and it's not because the content isn't there, but because the algorithms are optimizing for the local lowest common denominator, and here most of these companies shoot themselves in the foot.
I kind of disagree on what you try to accomplish in your second half. If you want to live here, you should be exposed to local culture and language, and have a somewhat hard time retreating into your own, separate filter bubble. Sure, Youtube could find more effective ways to achieve that, maybe by more gradually introducing you to it (especially language). But with their current capabilities, I'd vote for them pushing location based crap over not doing anything.
PS: So nice to have a day with halfway survivable weather again.
Why should I be denied access to content I might be reasonably assumed to enjoy? What kind of weird argument is that? From my point of view it's just a good example of broken assumptions in what is obviously a not very sophisticated way to promote and recommend content.
I am not sure where you've seen anyone advocate denying you access to content. We are talking about a recommendation system. Recommendations push content, and as such they embody beliefs, values, etc. This gives companies a responsibility for what they advocate and what effect it will have on our societies. If you're talking about opportunity cost: well, an imaginary flat-earther or science denier would say the same about getting recommended stuff beyond his bubble.
Maybe I notice it because I use it more now, but it's shocking how bad it's become. I often get a heavy stream of suggestions for: ancient videos, and videos I've already watched. Throw in some clickbait and outrage focused content.
Then it seems to try to make me more extreme in some way (flat earther, hardcore gamer, etc) based on perhaps one partial watch of a semi-related video.
Surely there's some folks at Google who can see the raging dumpster fire this has become? I realize it's hard given all the competing interests (esp. advertisers), but this is horrendous.
As bad as it is, Netflix's suggestions are almost worse. I think these companies are optimizing on metrics that (ostensibly) show an improved experience while in reality they're destroying almost all user enjoyment.
Is it really a good thing that I "engage" longer with Netflix/YouTube trying to find something to watch with their terrible suggestions? (vs quickly navigating to a better video?)
>I think these companies are optimizing on metrics that (ostensibly) show an improved experience
They're optimizing for highest profits. For Netflix, that means showing you videos with the lowest licensing fees. For Youtube, that means showing videos where the uploader has paid to boost it.
I think the uploader doesn't even have to be the one paying now. If I remember right, Coke can pay for someone's positive Coke related video to be shown more and it bypasses the sponsorship disclosure requirements.
> Surely there's some folks at Google who can see the raging dumpster fire this has become? I realize it's hard given all the competing interests (esp. advertisers), but this is horrendous.
I'm sure bad recommendations are an issue for some people, but for me they aren't that bad. I've found some pretty awesome stuff I probably wouldn't have found otherwise through it, and I'm pretty sure even my worst recommendations are better than what I see opening youtube.com in an incognito window.
We still get recommended extremist videos on YouTube with no way to opt out, unless we create a Google account. Casual visitors will continue to be exposed to that type of content.
Everything is extremism to somebody. Just because you class what you saw as extremism doesn't mean it's wrong for casual visitors to see it. YouTube already has its own (however capricious) standard for what to hide from recommendations or remove completely.
There is no way that "3 hours of planes taking off" or "3 hours of rain" are getting tens of millions of views without being recommended widely, including to me. 18 hours of "ambient sounds" has 130 million views, so that's some heavy promotion, not just the occasional insomniac using search.
Which is kind of my point. The amount of HD traffic and wasted bandwidth is just insane. At what point will YT acknowledge that the same viewer is watching (or listening to) the same video over and over and over again, and offer a download button? (Sure, tech people can use a third-party tool, but those are all blocked on Google Play, and most others are limited to short videos only.)
Anything that wastes bandwidth, I see as a positive thing because it's pressure on networks to increase bandwidth. Once it's in place, using it is harmless since it would otherwise have been wasted serving nobody. Netflix forced ISPs in my country to pull finger when their customers would have regular evening internet slowdowns. I'm looking forward to more improvements as they cater to cloud gaming.
Now that's a video idea. I currently live right under the flight path for planes flying out of SFO; I should set up a camera and just record the sky, since they fly pretty close overhead.
It really depends. If it's a silent video, yeah, I can see why it's bogus. But if it's background noise, then yes, it's useful.
Sometimes when I have trouble sleeping, I search "white noise" and find videos like this. Usually I click on something like a 'fan blowing' video or a 'thunder storm' video. The noise/pattern combination somehow helps me sleep. It also helps me relax and focus while working.
I'd always assumed you could block specific ads or advertisers, since there's a button on each ad with a menu for that (on the circled "i" button). But when I just tried it, it refused because it said personalized ads were turned off, even though they were turned on. So I guess that's just another weirdly broken pretend-feature.
Before I got YouTube premium I'd try that button on some ads that were annoying and repetitive (Grammarly) but it made no difference and the same ads kept showing.
This reminds me of the days of free hosting with banner ads, when you'd get a whole month of surgical banners. I'm sure YT has good metrics on what level of discomfort will prompt certain folks to go paid.
For me it's not about pressure; I run an ad-blocker anyway. YouTube is one of the services I use the most in my everyday life, and considering the amount of value I get out of it, giving $10 back is the least I can do in my mind. I find that many people take online services for granted these days and don't take a step back to look at how much value these services bring them. I get 50+ hours of entertainment per month out of YouTube; if that's not worth $10, then I don't know what is.
Besides the protection-racket vibes mentioned, you are paying Google first and foremost, not the actual content creators. I would argue paying for the hosting actually creates the wrong incentives and empowers the platform over the people whose work you enjoy.
According to YouTube [0], part of the Premium cost goes directly to creators based on how much time you spend watching their channels. Also, you're not only paying for pure hosting, but for the millions of engineering hours that went into creating the platform. I understand average users not realizing this, but HN should know more than most how much work goes into implementing these services.
And going back to the ad discussion: with traditional media you pay more than $10 and still have to sit through 2-3 minutes of advertisement, compared to the 5-10 second ads on YouTube.
>Users will now be able to tell YouTube to stop suggesting videos from a particular channel by tapping the three-dot menu next to a video on the homepage or Up Next, then choosing “Don’t recommend channel.”
By "now" do they mean "eventually"? Because I just tried looking for said option with no success.
As someone with a phobia of snakes, I don't appreciate seeing such recommendations in my feed. I have clicked "not interested" so many times, but another snake video inevitably finds its way into my feed.
> While YouTube is introducing the feature now, this kind of tool is pretty common place on other digital services. Spotify Technology SA has a version for artists people don’t want to hear from.
Where!? I've been missing this Spotify feature for ages. I do not like Mastodon. I do not want Spotify to ever automatically put on Mastodon. It keeps putting on Mastodon.
I have a hard time understanding how youtube can screw this up so badly.
Spotify does it nicely. They clearly recognized the few different tastes I have, and they offer separate discovery queues for each of them and let me decide depending on what mood/situation I'm in.
But that might be the main difference... they signal their knowledge back to the user. Steam kind of does the same with its "Is this game relevant to you?" info. More clearly telling me what YouTube thinks about me would be the first step towards correcting it...
One of the things I hate about the current algo is that I get suggestions from freebooters who upload pirated content. Say for example, John Oliver airs at 9pm, by 9:32 his opening monologue is already pirated and at the top of my feed.
No matter how much I dislike, report, or ignore these videos, they always pop to the top. I don't view them and never give them clicks, but because I watch every official John Oliver video, they get recommended to me.
Edit: I'm dismayed to see that you've posted many unsubstantive comments recently. Could you please not do that? As I said, the hope is for this site to be a bit better than internet default. We can't do that if established users don't take care of it.
I can understand wanting to hide politics. But JP is such a misrepresented character. He makes motivational content helping guys put their lives back together, nothing inherently bad or political.
Agreed. Each to their own, but I am tired of seeing JP recommended because I watched one Joe Rogan video once. I click "not interested" on each JP video, but still get recommendations. It is quite frustrating.
Hell, I'd even enjoy a filter that analyzes the audio and filters out anything that's below a given (low) bar of sensible discourse. I wonder how hard it would be to categorize the really awful rhetoric? It's very formulaic.
Maybe give me a slider like the old Slashdot days so I could fine-tune it depending on my mood :)
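For what it's worth, even a crude version of that filter-plus-slider idea is easy to prototype on titles alone. A toy sketch (the phrase list, threshold, and example titles are invented for illustration; a real system would need transcripts and a trained classifier, not keyword matching):

```python
# Hypothetical marker phrases for formulaic outrage content.
OUTRAGE_PHRASES = ["destroyed", "owned", "you won't believe",
                   "the truth about", "wake up", "exposed"]

def outrage_score(title):
    """Fraction of known outrage phrases appearing in the title (0.0-1.0)."""
    t = title.lower()
    hits = sum(phrase in t for phrase in OUTRAGE_PHRASES)
    return hits / len(OUTRAGE_PHRASES)

def filter_feed(titles, threshold=0.15):
    """Keep only titles scoring below the user's slider threshold."""
    return [t for t in titles if outrage_score(t) < threshold]

feed = ["Relaxing rain sounds", "Feminist DESTROYED by facts", "Intro to Rust"]
print(filter_feed(feed))  # the DESTROYED clickbait is dropped
```

The `threshold` parameter is exactly the "slider": turn it up when you don't mind edgier content, down when you want a quieter feed.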
At least for me, consuming any content related to my hobbies results in political content being recommended, despite my having avoided political videos entirely throughout the lifetime of my YouTube account.