Is this all speculation? Got a source for some of these big claims? Admittedly, it's a fun read.


These are all facts coming from publicly available information, such as the Llama 3 paper and recent interviews. I'd dig up the specific timestamps, but I'm working at the moment and can't spend the next hour scrolling through long interviews.

The interviews are:

"Mark Zuckerberg - Llama 3, $10B Models, Caesar Augustus, & 1 GW Datacenters": https://www.youtube.com/watch?v=bc6uFV9CJGg

"Yann LeCun: Meta AI, Open Source, Limits of LLMs, AGI & the Future of AI | Lex Fridman Podcast #416": https://www.youtube.com/watch?v=5t1vTLU7s40

The total FLOPs used in training has been reported in a bunch of places, but I can't find a good link at the moment.


Zuckerberg said the reason he bought so many GPUs was to build Reels to compete with TikTok. Is that the big claim you're asking about?

https://www.dwarkeshpatel.com/p/mark-zuckerberg



