It's really not so extraordinary: exponential reduction in logical errors when the physical error rate is below a threshold (for certain types of error-correcting codes) is well accepted on both theoretical and computational grounds.
For a review paper on surface codes, see
A. G. Fowler, M. Mariantoni, J. M. Martinis, and A. N. Cleland, “Surface codes: Towards practical large-scale quantum computation,” Phys. Rev. A, vol. 86, no. 3, p. 032324, Sep. 2012, doi: 10.1103/PhysRevA.86.032324.
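To make that scaling concrete, here is a minimal numerical sketch of the commonly quoted below-threshold approximation p_L ≈ A · (p/p_th)^((d+1)/2) for a distance-d surface code. The prefactor A and the threshold p_th below are illustrative placeholders, not values taken from the review or the Google experiment:

```python
# Illustrative only: below-threshold scaling of the logical error rate for a
# distance-d surface code, roughly p_L ~ A * (p / p_th)**((d + 1) / 2).
# A and p_th are placeholder values chosen to show the shape of the curve.

def logical_error_rate(p, d, p_th=1e-2, A=0.1):
    """Approximate logical error rate per round at physical error rate p
    and (odd) code distance d, assuming operation below threshold."""
    return A * (p / p_th) ** ((d + 1) / 2)

# Each step up in distance multiplies the error rate by the same factor,
# i.e. exponential suppression as long as p < p_th.
for d in (3, 5, 7, 9, 11):
    print(d, logical_error_rate(p=1e-3, d=d))
```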
It does. It's up to engineering to make errors uncorrelated. The Google paper being referenced actually builds an "error budget" to identify the main sources of error, and also runs tests to find sources of correlated errors.
The claim is that correlated errors lead to an "error floor": a code distance beyond which the exponential reduction in logical errors no longer applies, set by the frequency of correlated errors. See Figure 3a of the arXiv version of the paper: https://arxiv.org/abs/2408.13687
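A toy version of that argument, building on the sketch above: adding even a tiny distance-independent term for correlated failures to the same scaling formula produces a floor that eventually dominates. The p_floor value below is arbitrary, chosen only to illustrate the shape, not a number from the paper:

```python
# Sketch of how a small rate of correlated errors produces an "error floor":
# the uncorrelated part keeps shrinking with code distance, but the floor
# term does not, so beyond some distance the total stops improving.

def logical_error_with_floor(p, d, p_th=1e-2, A=0.1, p_floor=1e-9):
    uncorrelated = A * (p / p_th) ** ((d + 1) / 2)  # exponentially suppressed
    return uncorrelated + p_floor                   # floor from correlated events

# At small d the first term dominates; at large d the output flattens out
# near p_floor instead of continuing to fall exponentially.
for d in (3, 5, 7, 9, 11, 13, 15):
    print(d, logical_error_with_floor(p=1e-3, d=d))
```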
For a rough but well-sourced overview, see Wikipedia: https://en.wikipedia.org/wiki/Threshold_theorem