Using LLaMA with M1 Mac and Python 3.11

syntaxing on March 12, 2023:
Extremely tempted to replace my Mac Mini M1 (8GB RAM). If I do, what's my best bet to future-proof for things like this? Would a Mac Mini M2 with 24GB RAM do, or should I beef it up to an M1 Studio?
enduser on March 12, 2023:
RAM is king as far as future-proofing Apple Silicon goes. Even an M1 Ultra with 128GB of RAM can't run the 65B model unquantized.
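A quick back-of-the-envelope check supports this: at fp16 (2 bytes per parameter, the usual "unquantized" format), the 65B weights alone come to about 121 GiB, leaving essentially no headroom for the KV cache, activations, or the OS on a 128GB machine. A minimal Python sketch of the arithmetic (weights only; the dtype sizes are standard, not taken from the thread):

    # Weight-memory estimate for a 65B-parameter model (weights only;
    # real inference also needs KV cache, activations, and OS headroom).
    N_PARAMS = 65e9  # LLaMA-65B parameter count

    bytes_per_param = {
        "fp32": 4.0,
        "fp16": 2.0,   # the usual "unquantized" format
        "int8": 1.0,
        "4-bit": 0.5,  # e.g. llama.cpp-style quantization
    }

    for dtype, nbytes in bytes_per_param.items():
        gib = N_PARAMS * nbytes / 2**30
        print(f"{dtype:>5}: {gib:6.1f} GiB")

    # fp16 -> ~121.1 GiB: the weights alone nearly fill 128 GiB of RAM,
    # which is why 4-bit quantization (~30 GiB) is the practical route.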
cjbprime on March 12, 2023:
The best future-proofing might be to wait two months and get an M2 Mac Pro.
Tepix on March 13, 2023:
You could pay €300 for 128GB of RAM on a PC, and add one or two GPUs later.