
Isn't it possible that, even with better efficiency, we'd still want them for advanced AI capabilities we could unlock in the future?



Operating costs are usually a pretty significant share of a data center's total cost. Unless power efficiency stops improving much, or demand so far outstrips supply that they can't be replaced, a bunch of 10-year-old GPUs probably aren't going to be worth running regardless.
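A rough back-of-envelope sketch of that argument, using purely illustrative numbers (the power draws, performance ratio, and electricity price below are assumptions, not real pricing), is:

    # Sketch: does an old GPU's electricity bill alone make it uncompetitive
    # with newer, more efficient hardware? All figures are assumptions.

    OLD_GPU_POWER_KW = 0.3          # assumed draw of a decade-old GPU
    NEW_GPU_POWER_KW = 0.7          # assumed draw of a current GPU
    PERF_RATIO = 10.0               # assumed: new GPU does 10x the work per hour
    ELECTRICITY_PER_KWH = 0.10      # assumed $/kWh, power only (ignores cooling)

    def energy_cost_per_unit_work(power_kw: float, relative_perf: float) -> float:
        """Electricity cost to finish one 'unit' of work (one old-GPU-hour's worth)."""
        hours_needed = 1.0 / relative_perf
        return power_kw * hours_needed * ELECTRICITY_PER_KWH

    old = energy_cost_per_unit_work(OLD_GPU_POWER_KW, 1.0)
    new = energy_cost_per_unit_work(NEW_GPU_POWER_KW, PERF_RATIO)

    print(f"old GPU: ${old:.4f} per unit of work")   # 0.0300
    print(f"new GPU: ${new:.4f} per unit of work")   # 0.0070

Under these assumed numbers the old card costs roughly 4x more in electricity per unit of work, before even counting rack space, cooling, and maintenance, which is the sense in which operating costs alone can make decade-old GPUs not worth keeping online.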





