> I want my search results to prefer [.....] other pages marked high for quality by people who [some algorithm] to this page.
Facebook/YouTube already do this, and it is a disaster, one that may even have indirectly contributed to the rise of antivax, QAnon, the Trump election, the Capitol attack, etc.
Before personalized recommendation systems, when an antivaxxer made a video, hardly anyone saw it; when someone liked it, the like affected nothing.
Once their likes started affecting their personal search results, liking one antivax video made search produce hundreds more of them. It finds so much of the same stuff that they no longer have the free time to check an opposing opinion.
You're operating under the assumption that your judgement is more correct, or at least better for you, and that the entire problem is finding gems in an ocean of trash. The reality is more complicated: for you, antivax content is trash; for them, it is your content that is trash.
> my judgement is more correct for me - because it is
That is plain arrogant. Not only do you refuse to consider that you may be wrong, you also want search engines to protect you from ever discovering that.
What if your neighbor likes something that will eventually hurt you? For example, burning down the nearby 5G tower that serves your phone, because they like the theory that doing so will improve everyone's health. Is their judgement correct for them? Do you want them to get more of the stuff they like? Should their community of like-minded people be assisted by search engines in avoiding (what they think is) spam?
I understand everybody wants less SEO spam; I'm simply pointing out that the solution you're describing has already been tried and found to have consequences no one expected. "Give me more stuff that I like" is the old problem. The new problem is "how do I find more stuff that I like without creating an echo chamber".
You seem to be all about straw-man situations affecting other people. I'm (perhaps selfishly) only interested in the quality of my own search results, and how much better they could be if I could give feedback to the search engine.
If my neighbour burns down the cell tower, or if another neighbour has trapped themselves in an antivax echo chamber on Facebook - well, those are hypotheticals that have literally nothing to do with my point, which is simply that I want better search results, driven by my own selections.
Have you tried making your own Google custom search? You can pick which sites it will search. I'm not sure whether you can say "prioritize these sites, but still give me results from other sites too".
I do wish there was something next to each search result, though, like "I like this site, include it more often" or "This site is garbage, ignore it forever".
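The like/block idea above can be sketched as a small client-side reranker: keep local lists of "liked" and "blocked" domains and apply them on top of whatever the underlying engine returns. This is only a toy illustration of the concept; the class name, the `boost` parameter, and the `(url, score)` result shape are all assumptions, not any real search API.

```python
# Hypothetical sketch: per-user feedback applied on top of an
# existing search engine's results. Nothing here reflects a real API.
from urllib.parse import urlparse

class FeedbackReranker:
    def __init__(self, boost=5.0):
        self.liked = set()    # domains to rank higher
        self.blocked = set()  # domains to drop entirely
        self.boost = boost    # arbitrary score bonus for liked domains

    def like(self, domain):
        # "I like this site, include it more often"
        self.liked.add(domain)
        self.blocked.discard(domain)

    def block(self, domain):
        # "This site is garbage, ignore it forever"
        self.blocked.add(domain)
        self.liked.discard(domain)

    def rerank(self, results):
        """results: list of (url, score) pairs from the underlying engine."""
        kept = []
        for url, score in results:
            domain = urlparse(url).netloc
            if domain in self.blocked:
                continue                 # drop blocked sites entirely
            if domain in self.liked:
                score += self.boost      # nudge liked sites upward
            kept.append((url, score))
        return sorted(kept, key=lambda pair: pair[1], reverse=True)
```

Note that this is also exactly the feedback loop the earlier comments warn about: every `like` makes the liked domains crowd out everything else a little more.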