Maybe an unpopular opinion, but Google feels nearly useless these days. Between results full of outdated or broken links to empty discussions, ads being their main priority, and "fuzzy" search results where you search for one thing and get something completely unrelated because Google decided you also meant something contextually adjacent, I can't really get good results from it anymore.
I mean, I can still get answers to simple questions, but when it comes to anything unique or complex I usually just get frustrated and go to DuckDuckGo or something else. Mostly ChatGPT nowadays.
Sure, ChatGPT can hallucinate, but Google's results rely on some random person somewhere having properly answered something. The reality is that so many of the "answers" I find are forum discussions between random people with no real credentials or factual answers, just opinions based on something else they read on Google. People google something, read the blurb at the top of the results, then go answer other people's questions.
I honestly think Google is losing favor at this point. I've even been considering moving away from Android because the OS just feels like the Walmart iOS nowadays. It has the same problems, but without any of the polish of iOS.
Google needs to stop just following everyone else. Everything nowadays feels like the ol' Google+ move: "Ah, a successful product someone else made, let's remake it and name it Google Something!"
> Sure, ChatGPT can hallucinate, but Google's results rely on some random person somewhere having properly answered something.
I don't know about your experience with ChatGPT, but to me it's been untrustworthy. So much so that every answer I get needs to be double-checked against Google and that "random person".
I think Reddit has a golden opportunity here, considering how often I prefix searches with "site:reddit.com" to filter out SEO bullshit and usually get an answer from a subject-matter expert in the top ten results. They could even tie an LLM into it and regurgitate the highest-voted comment, and it would likely be more effective than Google or GPT.
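For what it's worth, that pipeline is easy to prototype. Here's a minimal sketch using Reddit's public JSON endpoints (the endpoints are real, but they're rate-limited and their policies change); the function name and the example query are just placeholders:

```python
# Rough sketch of the "surface the top-voted comment" idea via Reddit's
# public JSON endpoints (append .json to most Reddit URLs). An LLM could
# summarize the result instead of returning it verbatim.
import requests

HEADERS = {"User-Agent": "search-sketch/0.1"}  # Reddit rejects blank user agents

def top_voted_answer(query: str) -> str | None:
    # Find the most relevant post for the query.
    search = requests.get(
        "https://www.reddit.com/search.json",
        params={"q": query, "limit": 1},
        headers=HEADERS,
    ).json()
    posts = search["data"]["children"]
    if not posts:
        return None
    post_id = posts[0]["data"]["id"]

    # Fetch its comment tree; element [1] of the response holds the comments.
    thread = requests.get(
        f"https://www.reddit.com/comments/{post_id}.json",
        params={"sort": "top"},
        headers=HEADERS,
    ).json()
    comments = [c["data"] for c in thread[1]["data"]["children"] if c["kind"] == "t1"]
    if not comments:
        return None
    # "Regurgitate" the highest-voted comment.
    return max(comments, key=lambda c: c.get("score", 0))["body"]

print(top_voted_answer("is fix-a-flat ok to use in tires"))
```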
I've been saying it for a while: you could basically replace Google entirely with an index of 10k websites and beat their quality of results 90% of the time.
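As a toy illustration of what that looks like at its simplest: an inverted index restricted to a hand-picked allowlist. The allowlist, the crawl step, and the boolean AND retrieval here are all stand-ins for what a real system would need:

```python
# Toy curated-allowlist search index. Everything outside the allowlist is
# simply never indexed, which is the whole point of the idea.
from collections import defaultdict

ALLOWLIST = {"reddit.com", "stackoverflow.com", "wikipedia.org"}  # ...up to ~10k

index: dict[str, set[str]] = defaultdict(set)

def add_page(url: str, text: str) -> None:
    domain = url.split("/")[2].removeprefix("www.")
    if domain not in ALLOWLIST:
        return  # skip everything outside the curated set
    for token in text.lower().split():
        index[token].add(url)

def search(query: str) -> set[str]:
    # Pages containing every query token (boolean AND retrieval).
    tokens = query.lower().split()
    results = [index[t] for t in tokens if t in index]
    return set.intersection(*results) if results else set()

add_page("https://www.reddit.com/r/cars/abc", "fix a flat sealant in tires")
print(search("fix a flat"))  # -> {'https://www.reddit.com/r/cars/abc'}
```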
That's fair, and I can definitely agree with that, but my point was more that neither is a very trustworthy source anyway. It's the same way I'd double-check one source against another. It's just easier to start with GPT at this point, because at least I know it'll answer based on what it was trained on rather than on special interests.

One example I've used is the question "Is Fix-A-Flat OK to use in tires?" The top results are from the company that makes Fix-A-Flat basically saying "yes, obviously," while tire companies and repair shops give you mixed responses. Then you have the third group: people who have just heard something from one of the first two groups. People worry about AI degrading the quality of information over time by learning from itself. We're basically already there, thanks to people caring more about being right than being correct and never doing their own legwork.
I find that forums are often some of the most useful results. The posts are typically written by genuine people, many of whom are passionate about the subject matter.
Reliability can be an issue, but the medium at least provides a number of context clues to help. Seeing how individuals write and interact is helpful when judging how much weight to give to their words.
Converting those forum posts into a generic and overly confident interface strips away useful information.
Yes, but maybe 5 times out of 10 the forum has your question and not your answer. Add to that the guy on the Apple support forums who has destroyed millions of lives by repeating the same answer in every thread opened for the last 15 years.
People as a whole say a lot of things, correct and incorrect. But ChatGPT is a single thing with a fairly impressive rate of reliability, and yet once you get into a certain level of detail on certain topics, it'll just spit out false information that's indistinguishable from the correct stuff. I wouldn't expect a human to do that: trick me into thinking they're an expert with encyclopedic, verifiably correct knowledge of a topic, then confidently start lying about that same topic in the same conversation. It's much harder to vet, or to know when you need to vet.