Hacker News
lumost on May 3, 2022 | on: OPT: Open Pre-trained Transformer Language Models
A 175 billion parameter model might be a couple hundred gigs on disk. The file is probably just too big for GitHub/other standard FB services.
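A quick back-of-the-envelope check of that figure, assuming 2 bytes per parameter (fp16/bf16); actual checkpoints may use a different precision or carry extra optimizer state:

```python
# Rough on-disk size of a 175B-parameter checkpoint.
params = 175e9
bytes_per_param = 2  # assumed fp16/bf16 storage
size_gb = params * bytes_per_param / 1e9
print(f"{size_gb:.0f} GB")  # 350 GB
```

At fp32 (4 bytes per parameter) the same checkpoint would be roughly 700 GB, so "a couple hundred gigs" is the right order of magnitude either way.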
cma on May 3, 2022
They could just torrent.
lumost on May 4, 2022 | parent
They could, and there might be a torrent in the future, but torrents lose download-tracking information. I'm sure the researchers want to know how many people are downloading their models, even if they don't care who they are.