If labeling a claim as "tired" made it false, not a single fact in the world could be considered backed by evidence. I'm not flipping anything around either, because again, it's squarely on you to provide proof for your claims, not on those who question them. You're essentially claiming that transformers can reverse a non-reversible function. That's like saying you can reverse a hash even though multiple inputs can produce the same hash. That's not even "unbacked claims" territory; it defies logic.
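To make the hash analogy concrete, here's a minimal Python sketch (the toy bucket() function and the sample inputs are my own, purely for illustration): once two distinct inputs map to the same output, no true inverse exists, and the most you can recover is the whole preimage set, not the original input.

    # Toy many-to-one "hash": maps arbitrary strings onto 16 buckets.
    # Any function whose output space is smaller than its input space must
    # collide (pigeonhole principle), so it cannot have a true inverse.
    def bucket(s: str) -> int:
        return sum(s.encode()) % 16

    inputs = ["abc", "acb", "bca"]            # same bytes, different order
    print({s: bucket(s) for s in inputs})     # {'abc': 6, 'acb': 6, 'bca': 6}

    # Given only the output 6, nothing can tell you which input produced it.
    # The most any "inverse" can return is the entire preimage set:
    print([s for s in inputs if bucket(s) == 6])   # ['abc', 'acb', 'bca']

The same goes for any lossy mapping: if information is discarded on the way in, nothing on the way out can restore it uniquely.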

I'm still not convinced LLMs are mere abstractions in the same way programming language implementations are. Even though programmers give up some control over implementation details when writing code, language implementors still decide all of those details. With LLMs, no one does. That's not an abstraction; that's chaos.


I have been careful to use language like "theoretically" throughout my posts, and to focus on leaving doors open until we know for sure they are closed. You are claiming they're already closed, without evidence. This is a big difference in how we are engaging with this subject. I'm sure we would find we agree on a number of things but I don't think we're going to move the needle on this discussion much more. I'm fine with just amicably ending it here if you'd like.