
Sure, but a model like DeepSeek (roughly 400 GB of weights) won't fit on a consumer card.


True, but an AI shop doesn't care about that: they get more performance per dollar by going with multiple Nvidia GPUs. I have 512 GB of RAM in my PC too, with 8 memory channels, but it isn't really usable for AI workloads. Large amounts of RAM are nice to have, but increasing the batch size during training isn't going to help when compute is the bottleneck.
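The capacity-vs-speed tradeoff above can be sketched with a back-of-envelope calculation. All the figures here are illustrative assumptions (the 400 GB model size and 512 GB / 8-channel box come from the comments above; the 24 GB VRAM and ~40 GB/s per memory channel are assumed round numbers), not measurements:

```python
# Back-of-envelope: a 400 GB model fits in big system RAM but not on
# a consumer GPU -- and fitting is not the same as being fast.

def fits(model_gb: float, memory_gb: float) -> bool:
    """True if the weights alone fit in the given memory budget."""
    return model_gb <= memory_gb

MODEL_GB = 400           # DeepSeek-class weights, per the comment above
CONSUMER_VRAM_GB = 24    # assumed single high-end consumer GPU
SYSTEM_RAM_GB = 512      # the 8-channel workstation mentioned above

print(fits(MODEL_GB, CONSUMER_VRAM_GB))  # False
print(fits(MODEL_GB, SYSTEM_RAM_GB))     # True

# A rough token-rate ceiling for memory-bound inference is
# bandwidth / bytes-read-per-token (dense-model worst case,
# where every weight is touched once per token):
ram_bandwidth_gbps = 8 * 40              # assumed ~40 GB/s per channel
tokens_per_sec = ram_bandwidth_gbps / MODEL_GB
print(round(tokens_per_sec, 2))          # 0.8
```

Under these assumptions the weights fit in RAM with room to spare, but the dense worst-case ceiling is under one token per second, which is why the big-RAM box isn't a practical substitute for GPU bandwidth.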



