
Yeah. We're still barely past the first few pixels that make up the bottom tail of the S-curve for the kind of autonomous AI everyone imagines.

Energy-based models and other substrates are going to be key, and it has nothing to do with text at all; human intelligence existed before language. It's Newspeak to run a chatbot on what is obviously a computer and call it an intelligence like a human's. 1984-style dystopian crap.


Could you please stop creating accounts for every few comments you post? We ban accounts that do that. This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.

You needn't use your real name, of course, but for HN to be a community, users need some identity for other users to relate to. Otherwise we may as well have no usernames and no community, and that would be a different kind of forum. https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...


Running LLMs at home is a repeat of the mess we made with "run a K8s cluster at home" thinking.

You're not OpenAI or Google. Just use PyTorch, OpenCV, etc. to build the small models you need.

You don't even need Docker! You can share with friends over a simple code-based HTTP router app and pre-shared certs.
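A minimal sketch of that idea, using only the Python standard library. The handler name, route, cert filename (`server.pem`), and port are all assumptions for illustration; `run_model` is a stand-in for whatever small local model you've built:

```python
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer


def run_model(payload: bytes) -> bytes:
    # Placeholder: call into your small PyTorch/OpenCV model here.
    return payload.upper()


class ModelHandler(BaseHTTPRequestHandler):
    # Hypothetical router: only POST /predict is served.
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        result = run_model(self.rfile.read(length))
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(result)


def serve(cert_path: str = "server.pem", port: int = 8443):
    # Wrap the socket with a self-signed cert that friends receive
    # out of band and pin on their side, instead of relying on a CA.
    server = HTTPServer(("0.0.0.0", port), ModelHandler)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(cert_path)
    server.socket = ctx.wrap_socket(server.socket, server_side=True)
    server.serve_forever()


# serve()  # uncomment to start serving (needs server.pem to exist)
```

Clients just pin the same cert (e.g. `curl --cacert server.pem https://host:8443/predict`), so there's no CA, no registry, no orchestrator.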

You're recreating the patterns required to manage a massive data center in 2-3 computers in your closet. That's insane.


For me, this is essential. On principle, I won't pay money to be a software engineer.

I never paid for cloud infrastructure out of pocket, but I still became the go-to person and landed lead architect roles for cloud systems, because learning the FOSS/local tooling "the hard way" put me in a better position to understand exactly what my corporate employers can leverage with the big cash they pay the CSPs.

The same thing is shaping up in this space. Learning the nuts and bolts of wiring systems together locally, with whatever Gen AI workloads the hardware can support, and tinkering with parts of the process is the only thing that keeps me interested and able to excel on this front relative to peers who just fork out their own money to the fat cats who own billions worth of compute.

I'll continue to support efforts to keep us on the track of engineers understanding and being able to 'own' their technology from the ground up, if only at local tinkering scale.


Self hosting my own LLM setup in the homelab was what really helped me learn the fundamentals of K8s. If nothing else I'm grateful for that!


So I love Linux, and I'd like to learn DevOps in its entirety one day so I can comment on the whole post as an expert, but

I feel like they actually used Docker just for the isolation part, as a sandbox (technically they didn't use Docker but something similar for Mac: Apple containers). I don't think it has anything to do with K8s, scalability, pre-shared certs, or HTTP routers :/

