mycall on Jan 23, 2020 | on: Talking to myself: how I trained GPT2-1.5b for rub...
> predict the next word in 40GB of Internet text
This could do wonders for lip reading correction.
menmob on Jan 23, 2020
OpenAI trained the initial 1.5B model on ~160GB of text, so I'm sure it's already going to give amazing results.
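
A minimal sketch of what "lip reading correction" with a GPT-2 language model could look like: rescore several visually confusable candidate transcripts and keep the one the model finds most plausible. This assumes the Hugging Face transformers library and the small pretrained "gpt2" checkpoint; the candidate sentences are hypothetical, not output from any real lip-reading system.

  import torch
  from transformers import GPT2LMHeadModel, GPT2Tokenizer

  tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
  model = GPT2LMHeadModel.from_pretrained("gpt2")
  model.eval()

  def sentence_loss(text):
      # Average next-token cross-entropy under GPT-2; lower means more plausible English.
      ids = tokenizer(text, return_tensors="pt").input_ids
      with torch.no_grad():
          out = model(ids, labels=ids)
      return out.loss.item()

  # Hypothetical viseme-confusable candidates a lip-reading model might emit.
  candidates = [
      "I went to the park yesterday",
      "I went to the bark yesterday",
      "I went to the mark yesterday",
  ]
  print(min(candidates, key=sentence_loss))  # picks the most fluent reading

The same rescoring idea would apply with the 1.5B model, at higher compute cost per candidate.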