Hacker News
elorant on Feb 20, 2023 | on: Running large language models like ChatGPT on a si...
Power consumption. A Tesla T4 with 16GB RAM will consume a mere 70W. An RTX 3090 will need at least 300W, and the Titan models go up to 450W.
zargon on Feb 21, 2023
You can set the power limit of the 3090 as low as 100W. It will slow it down a lot, but it will probably still be decently faster than a T4.
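For reference, power limiting is done through NVIDIA's `nvidia-smi` utility. A minimal sketch, assuming a Linux box with the NVIDIA driver installed (the 100 W figure is the value from the comment; the valid range varies by card, and setting the limit requires root):

```shell
# Enable persistence mode so the limit holds until reboot
sudo nvidia-smi -pm 1

# Query the card's default and min/max supported power limits
nvidia-smi -q -d POWER

# Cap the board power at 100 W (must fall within the supported range)
sudo nvidia-smi -pl 100
```

The limit is enforced by the driver's power management, so clocks are throttled dynamically to stay under the cap rather than fixed at a lower frequency.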
birdyrooster on Feb 23, 2023
Every answer given is training AI; crazy to think about.