smoldesu
on April 10, 2023
| on:
The LLama Effect: Leak Sparked a Series of Open So...
> Llamas are showing the world that it doesn't take monopoly-level hardware to run those things.
LLaMA was not necessarily the model that did that. A fairer attribution might be BERT or GPT-Neo.
seydor
on April 10, 2023
It was difficult to run all those models. Now gamers follow YouTube tutorials.
smoldesu
on April 10, 2023
Name 1 way GPT-Neo was harder to run than LLaMA.
chpatrick
on April 11, 2023
In my experience GPT-Neo never produced any useful output.