
Is a card like this useful to run Stable Diffusion, etc?



From what I gather you'd want the M40 if anything, since it's a single GPU with access to the full 24GB for SD inference. But if you're willing to settle for less VRAM, you might as well get a 3060 12GB, which will be about 3x faster.


12GB is on the lower end of what you want for SD, though. Granted, optimizations are still happening, but if you want to generate at something like 1024x1024 or higher, you should get at least 24GB.
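To see why resolution drives VRAM requirements, here's a rough back-of-the-envelope sketch. It assumes an SD-style VAE with an 8x downsampling factor and 4 latent channels (true for SD 1.x/2.x); actual VRAM use also depends on the UNet weights, attention implementation, and precision, so treat the numbers as illustrative only.

```python
def latent_tokens(width, height, downsample=8):
    """Number of latent 'pixels' the UNet's self-attention operates on."""
    return (width // downsample) * (height // downsample)

t512 = latent_tokens(512, 512)      # 64 * 64 = 4096 tokens
t1024 = latent_tokens(1024, 1024)   # 128 * 128 = 16384 tokens

# Naive self-attention memory grows with tokens squared, so going from
# 512x512 to 1024x1024 quadruples the token count and multiplies the
# attention cost by roughly 16x.
print(t512, t1024, (t1024 / t512) ** 2)  # 4096 16384 16.0
```

This quadratic blowup in attention is a big part of why 12GB cards that handle 512x512 comfortably start to struggle well before 1024x1024.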


1024x1024 doesn't work very well in SD, since it only iterates over 512x512 windows. For higher-resolution images, the best method is to generate a 512x512 SD image and then upscale it with a separate AI model.
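The two-stage generate-then-upscale workflow described above can be sketched as follows. This is a minimal illustration, not a real pipeline: `generate_sd_image` is a hypothetical stand-in for an actual Stable Diffusion inference call, and Pillow's Lanczos resampling stands in for a dedicated AI upscaler (in practice you'd use something like Real-ESRGAN or SD's latent upscaler for the second stage).

```python
from PIL import Image

def generate_sd_image(prompt):
    # Hypothetical stand-in for a real SD inference call, which would
    # render the prompt at the model's native 512x512 resolution.
    return Image.new("RGB", (512, 512))

def upscale(img, factor=2):
    # Stand-in upscaler using Lanczos resampling. In practice the
    # 512x512 image would go to an AI upscaler (e.g. Real-ESRGAN)
    # rather than asking SD itself to render at 1024x1024.
    w, h = img.size
    return img.resize((w * factor, h * factor), Image.LANCZOS)

base = generate_sd_image("a photo of a cat")   # native 512x512 pass
final = upscale(base, 2)                       # separate upscaling pass
print(final.size)  # (1024, 1024)
```

The point of the split is that each stage runs at a resolution its model was trained for, avoiding the duplicated-subject artifacts SD produces when pushed past its native window.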



