CASP (Critical Assessment of protein Structure Prediction) is calling it a solution. To quote from the article:
"We have been stuck on this one problem – how do proteins fold up – for nearly 50 years. To see DeepMind produce a solution for this, having worked personally on this problem for so long and after so many stops and starts, wondering if we’d ever get there, is a very special moment."
--Professor John Moult
Co-founder and chair of CASP
It's an improvement, and a big one, but not a solution to the problem. It mainly shows just how stuck the community had gotten with their techniques, and how recent improvements in DNNs and information-theoretic methods can be exploited if you have lots of TPU time.
Well, it's not. Nature does not have a committee, sorry. Proteins are delicate "machines" where even a small change in the sequence (and thus the 3D structure), as small as a few amino acids, can effectively change the structure and the function. On top of that, proteins are dynamic beasts. In any case, it's a great advance, but DeepMind, like many companies, likes a little too much to toot its own horn.
I think that missed the mark, regardless of the rest of the discussion. It's like saying that the winner of the DARPA Grand Challenge for self-driving cars "solved" autonomous driving back in 2005.
This benchmark may be solved, but simultaneously, there remain other open problems relating to protein folding which are unsolved and which may not even have benchmarks yet :)
Said differently, there's a vast space between having a great result on a specific benchmark (this) and solving all interesting problems in a scientific field.
This is an issue of the more subtle aspects of English.
"To see DeepMind produce a solution for this" does not imply something is solved. I can produce a bad solution. I can produce a really good solution. All without solving a problem.
This is a really good solution. Of course, there's still room for more research and better methods in the future, but now computational protein structure prediction can compete with experiments actually measuring the structure.
Laypersons often use the word "solution" in situations where an academic would say "method" or "approach": we did something useful, but it may not be the best possible way.
In pure math, "solution" means determining whether a logical statement is true or false. For example, in (asymptotic, worst-case) analysis of algorithms, the logical statements take the form "there exists an algorithm to compute X with asymptotic complexity O(f(n)), and no algorithm with lower complexity exists." These are crisp notions with no room for debate.
In this competition, they defined "solved" as achieving 90% accuracy. This is somewhere in between the two definitions. It's technically a valid problem statement, but it can become obsolete in a weird way. If someone else solves the problem of achieving 95% accuracy, then suddenly the 90% solution doesn't look so good. Compare to e.g. sorting. If we add the requirement of a stable sort, it becomes a new problem. Stable sorting algorithms are not automatically "better" than unstable ones.
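The sorting comparison can be made concrete. Python's built-in sort happens to guarantee stability, so it doubles as a quick illustration of why "stable sort" is a distinct problem statement rather than a strictly better one (the example data is made up for illustration):

```python
# Python's sorted() is guaranteed stable: records with equal keys
# keep their relative input order. An unstable sort makes no such
# promise, so it solves a weaker problem statement.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

by_score = sorted(records, key=lambda r: r[1])

# Among equal scores, input order is preserved:
# bob before dave, alice before carol.
print(by_score)
# → [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```

An unstable sort could legally return carol before alice, and it would still be a correct answer to the unqualified sorting problem, which is the point: adding the stability requirement changes the problem, it doesn't just raise a score threshold.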