
Lol who has a $5k computer lying around like that?



Well, I work on this stuff, but I’m mostly sharing this to spread awareness that you don’t need a multimillion-dollar rack of Nvidia GPU machines to do inference with surprisingly powerful models these days. Not that long ago, you’d have needed a much more expensive multi-kilowatt workstation to run this sort of thing at a useful speed.


This is Hacker News. It’s a coin toss for every person reading this whether they have a $5K computer lying around.



