Merging Keras into TF and trying to copy PyTorch by adopting their model was the downfall of TF. The initial TF releases were great. It was simple, easy to reason about, and solved a clear problem. But then Google wanted TF to appeal to everyone and solve all problems for industry and research and beginners and experts at the same time because people need to get promoted internally. And that never goes well. Nobody I know wants to use TF these days (but some are forced to).

Let's hope JAX won't suffer the same fate.



> The initial TF releases were great.

I object to this statement. The early releases of TF (ca. 2015-2016) were impossible to debug, and the documentation was perpetually broken - if you ever tried to follow the Seq2Seq tutorial, you know what I'm talking about. I'd argue those releases were great for the 50 people who already knew how to use it, but they were aggressively unhelpful for beginners.

PyTorch won my lab (and certainly others) because you could drop in print statements to check your dimensions, while TF forced you to build a correct computation graph all at once. Performance? Sure, TF is probably faster. But I'd argue TF's big mistake was not taking its new users into consideration.
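To make the contrast concrete, here's the kind of ad-hoc shape check eager PyTorch allows mid-computation (a minimal sketch; the shapes and ops are invented for illustration):

  import torch

  x = torch.randn(32, 10)        # batch of 32 examples, 10 features
  w = torch.randn(10, 5)
  h = torch.relu(x @ w)          # executes immediately, values exist now
  print(h.shape)                 # torch.Size([32, 5]) -- debug with a plain print

In TF 1.x, printing h at the same point would only show a symbolic Tensor("Relu:0", shape=(?, 5)); you'd see actual numbers only after wiring up the whole graph and calling sess.run.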


I agree. I truly miss the graph-mode API, especially having come from Theano before TF, but it wasn't as beginner friendly, and Google wanted to capture market share for their cloud.

At least with JAX the core library isn't adopting any of the framework-level stuff, so those pieces can evolve independently.
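That separation shows in how little JAX's core API assumes: it's array ops plus function transforms, with no Layer or Model classes (a minimal sketch; the loss function here is made up for illustration):

  import jax
  import jax.numpy as jnp

  def loss(w, x, y):                  # a plain pure function, no framework objects
      return jnp.mean((x @ w - y) ** 2)

  grad_fn = jax.jit(jax.grad(loss))   # core JAX: composable transforms over functions

  w = jnp.zeros((3,))
  x = jnp.ones((4, 3))
  y = jnp.ones((4,))
  print(grad_fn(w, x, y))             # gradient w.r.t. w, computed under JIT

Layers, optimizers, and training loops live in separate libraries (Flax, Haiku, Optax), which is exactly the framework-level stuff that can evolve on its own.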


Yup. For a lot of models I preferred graph mode. It was explicit, with no magic. I think they should've just stuck with that, even if it meant not everyone and their mom could use it.
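For reference, the explicit build-then-run flow being described looked roughly like this under the TF 1.x API (a sketch; the names and shapes are invented):

  import tensorflow as tf  # 1.x API

  # Build phase: these lines only construct graph nodes, nothing executes.
  x = tf.placeholder(tf.float32, shape=[None, 10], name="x")
  w = tf.Variable(tf.zeros([10, 1]), name="w")
  y = tf.matmul(x, w)

  # Run phase: explicitly initialize variables, feed inputs, fetch outputs.
  with tf.Session() as sess:
      sess.run(tf.global_variables_initializer())
      out = sess.run(y, feed_dict={x: [[0.0] * 10]})

Verbose, yes, but every node in the graph is something you wrote down yourself.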

Agree on the framework stuff. Please just be a library, not a collection of opinionated frameworks where I need to read the source code anyway to understand what it actually does. Once, after hours of debugging something that wasn't working, I remember looking at the number of weights in the model and thinking: wait, something can't be right here. Then I dug into the framework layers and figured out it had added slightly different things than I thought it would. It would've been much faster to just write the graph myself.
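That sanity check is cheap to reproduce: in Keras it's model.count_params(), and the PyTorch equivalent is a one-liner (a sketch with a made-up two-layer model):

  import torch.nn as nn

  model = nn.Sequential(nn.Linear(10, 5), nn.Linear(5, 1))
  n_params = sum(p.numel() for p in model.parameters())
  print(n_params)  # 10*5 + 5 + 5*1 + 1 = 61; compare against what you expected

If the number is off, some layer added weights you didn't ask for (biases, extra projections), which is exactly the kind of hidden behavior described above.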


The original TF was truly horrendous :) It was extremely unintuitive and slow to program in; that's why so many people switched to PyTorch. The merge was a good idea, it just came too late. TF is as popular as it is only because it was first to market and because of Google's megaphone. Even inside Google, researchers don't want to use it, and a large fraction have switched to JAX.


The downfall was fragmentation. They merged Keras in after TF had already grown multiple competing application libraries, each managed by different people (and each fighting for promotion, as you say); Keras just happened to be the most popular. Even after the merge, various teams (e.g., DeepMind) decided to build their own libraries.



