Hacker News

The problem is it's still a computer. And that's okay.

I can ask the computer "hey I know this thing exists in your training data, tell me what it is and cite your sources." This is awesome. Seriously.

But what that means is you can ask it for sample code, or to answer a legal question, but fundamentally you're getting a search engine reading something back to you. It is not a programmer and it is not a lawyer.

The hype train really wants to exaggerate this to "we're going to steal all the jobs" because that makes the stock price go up.

They would be far less excited about that if they read a little history.




> "we're going to steal all the jobs"

It won't steal them all, but it will have a major impact by taking over the lower-level jobs, which are more routine in nature. The problem is that those lower-level jobs are how people gain the experience needed to reach the higher-level ones.

Nor will it eliminate any job category entirely, but it will greatly reduce the number of people needed to do a given job. So the impact on certain trades -- translators, paralegals, journalists, etc. -- will be significant.


I find it fascinating that I can achieve about 85-90% of what I need for simple coding projects in my homelab using AI. These projects often involve tasks like scraping data from the web and automating form submissions.

My workflow typically starts with asking ChatGPT to analyze a webpage where I need to authenticate. I guide it to identify the username and password fields, and it accurately detects the credential inputs. I then inform it about the session cookie that maintains login persistence. Next, I show it an example page with links, often paginated with numbered navigation at the bottom, and ask it to recognize the pattern for traversing pages. It does so effectively.
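The login step described above could be sketched roughly like this with only the standard library. The URL path and form field names ("username", "password") are assumptions standing in for whatever the model would identify on the real page:

```python
# Hypothetical sketch of the authenticated-session step: POST the credential
# fields, and let a cookie jar hold the session cookie for later requests.
import http.cookiejar
import urllib.parse
import urllib.request

def make_session():
    # A CookieJar-backed opener: the session cookie set at login persists
    # across every subsequent request made through this opener.
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    return opener, jar

def login(opener, base_url, user, password):
    # Field names are what the model would have detected; assumed here.
    data = urllib.parse.urlencode({"username": user, "password": password}).encode()
    return opener.open(base_url + "/login", data=data)
```

In practice the same flow is usually written with a `requests.Session`, which handles the cookie jar automatically.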

I further highlight the layout pattern of the content, such as magnet links or other relevant data presented by the CMS. From there, I instruct it to generate a Python script that spiders through each page sequentially, navigates to every item on those pages, and pushes magnet links directly into Transmission. I can also specify filters, such as only targeting items with specific media content, by providing a sample page for the AI to analyze before generating the script.
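The parsing core of a script like the one described might look like the sketch below, using only the standard library so the pattern is visible. The pagination pattern (`page=` in the href) is an assumption about the CMS; the real script would fetch each page over the authenticated session and push the collected magnet links into Transmission via transmission_rpc:

```python
# Minimal sketch: collect magnet links and numbered-pagination links from
# a page's HTML, which is the per-page work the spider would repeat.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.magnets = []      # magnet: URIs found on the page
        self.next_pages = []   # hrefs of pagination links to visit next

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if href.startswith("magnet:"):
            self.magnets.append(href)
        elif "page=" in href:  # assumed pagination URL pattern
            self.next_pages.append(href)

def scrape(html):
    parser = LinkCollector()
    parser.feed(html)
    return parser.magnets, parser.next_pages
```

The spider loop would then call `scrape` on each fetched page, enqueue `next_pages` it hasn't seen, and hand each magnet link to the Transmission client.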

This process demonstrates how effortlessly AI enables coding without requiring prior knowledge of libraries like beautifulsoup4 or transmission_rpc. It not only builds the algorithm but also allows for rapid iteration. Through this exercise, I assume the role of a manager, focusing solely on explaining my requirements to the AI and conducting a code review.


The thing that makes the smarter-search use case interesting is how LLMs compute their results: dynamically, and over amounts of metadata that were previously impossible to correlate.

LLM-as-search is essentially the hand-tuned expert systems AI vs deep learning AI battle all over again.

Between natural language understanding and multiple correlations, it's going to scale a lot further than previous search approaches.





