
I agree with you, but right now RTX 4090 cards are pushing $2,000, which doesn't leave much budget left. I'd suggest picking up a used 3090 from eBay, which currently goes for around $800. This still gives you 24 GB of VRAM, same as the 4090.


I've seen some blog posts saying that if you buy a used 3090 that was used for crypto mining, there's a risk of thermal throttling: the stock thermal pads on the VRAM aren't great, and it's worse if the card ran hot for a long time.
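A quick way to sanity-check a card when you pick it up is to watch temperature and clocks under a steady load and see whether the clocks sag as the temperature climbs. A minimal sketch, assuming the NVIDIA driver (and thus nvidia-smi) is installed; run a stress load (e.g. a Stable Diffusion batch) in another terminal:

    # Poll GPU temperature, SM clock, power, and utilization once a second.
    # If clocks.sm drops while temperature.gpu climbs at a steady 100% load,
    # the card is likely thermal throttling and wants a repaste/repad.
    import subprocess
    import time

    FIELDS = "temperature.gpu,clocks.sm,power.draw,utilization.gpu"

    for _ in range(60):  # one sample per second for a minute
        line = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
            text=True,
        ).strip()
        print(line)  # e.g. "82, 1695 MHz, 348.12 W, 100 %"
        time.sleep(1)

(As far as I know, nvidia-smi doesn't expose the GDDR6X memory-junction temperature on these cards, so this only catches core throttling; the VRAM can still be running hot even when the core looks fine.)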

Any recommendations on how to buy one? E.g., for a 24 GB card, is there a particular model that's best for running LLMs? And what's the biggest, baddest LLM you can run on a single card?

I've been thinking about it, but have stuck with cloud/Colab for experiments so far.
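
For the "biggest baddest" question, the back-of-envelope math is weights = params × bits / 8, plus some allowance for the KV cache, activations, and fragmentation. A minimal sketch (the ~2 GB overhead figure is a loose assumption, not a measurement):

    # Rough VRAM estimate: a model with P billion params at B-bit
    # quantization needs about P * B / 8 GB for the weights alone.
    def fits_in_vram(params_b: float, bits: int, vram_gb: float = 24.0) -> bool:
        weights_gb = params_b * bits / 8  # params in billions -> GB
        overhead_gb = 2.0  # assumed KV cache / activation headroom
        return weights_gb + overhead_gb <= vram_gb

    for params_b in (7, 13, 30, 70):
        for bits in (16, 8, 4):
            verdict = "fits" if fits_in_vram(params_b, bits) else "no"
            print(f"{params_b}B @ {bits}-bit: {verdict}")

By that math, a 30B-class model quantized to 4-bit is roughly the ceiling on a single 24 GB card; a 70B model needs ~35 GB at 4-bit, so it won't fit without offloading or a second card.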


The good deals are gonna be in local listings: Facebook Marketplace in most of the US.


Craigslist and eBay have some great deals.


I remember videos (likely on YouTube) of thermal paste replacement that was an upgrade over the stock card, so an average person should be able to do it. It'll cost a few dollars for the paste. I would go with a local workstation, so I don't have to think much about cost while running Stable Diffusion. Plus, if it's used from eBay, prices can't go much lower, so you'll get something back when you resell it at the end. Also, for image work the training dataset can be quite big for network transfers.


Strong endorsement here. I pick up used RTX 3090s from Facebook Marketplace and eBay for $800 maximum. I can usually find them locally for $700-750, and can typically test them in person before buying, which is reassuring (though I've had no issues yet).



