To me, 2027 looks like a case of writing the conclusion first and then trying to explain backwards how it happens.
If everything goes "perfectly", then the logic works (to an extent; the increasing rate of returns baked into it is a suspicious assumption).
But everything must go perfectly for that to work, including all the productivity multipliers being independent and the USA deciding to take this genuinely seriously (not fake-seriously, with politicians saying "we're taking this seriously" and then not doing much), and therefore rushing the target no expenses spared, as if it really were an existential threat. I see no way this would be a baseline scenario.
Sure, but that's kinda what I'm saying they're doing wrong.
One of the core claims 2027 is making is, to paraphrase, that we get AI to help researchers do the research. If we just presume that this happens (which I'm saying is a mistake), then the AI helps researchers work out how to make AI self-improve. But there's no obvious reason for me to expect that.
I mean, the METR report from earlier this year showed that AI could (at the time) complete, with 80% success, only tasks that would take a domain expert about 15 minutes, and that this time horizon doubles roughly every 7 months, which would make them useful helpers for half-day to two-day tasks over the course of 2027, still far less than this kind of work needs. But even setting that narrower issue aside, there are a lot of unknowns about where we are on what may be a sigmoid of unrealised efficiency gains in such code.
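For what it's worth, here is a rough back-of-envelope of that extrapolation in Python. The inputs (a ~15-minute 80%-success horizon around the time of the METR report, doubling every ~7 months, baseline in early 2025) are the figures from the paragraph above rather than re-derived, and the endpoint is sensitive to all of them; the point is the order of magnitude, not the exact number.

```python
# Rough back-of-envelope for the time-horizon extrapolation above.
# Assumptions (taken from the comment, not re-verified): a ~15-minute
# 80%-success task horizon in early 2025, doubling every ~7 months.

BASELINE_MINUTES = 15                    # 80%-success horizon at the baseline date
DOUBLING_MONTHS = 7                      # assumed doubling period
BASELINE_YEAR, BASELINE_MONTH = 2025, 3  # rough date of the METR report

def horizon_minutes(year: int, month: int) -> float:
    """Extrapolated task-length horizon (in minutes) at a given calendar month."""
    months_elapsed = (year - BASELINE_YEAR) * 12 + (month - BASELINE_MONTH)
    return BASELINE_MINUTES * 2 ** (months_elapsed / DOUBLING_MONTHS)

if __name__ == "__main__":
    for year, month, label in [(2027, 1, "start of 2027"), (2027, 12, "end of 2027")]:
        minutes = horizon_minutes(year, month)
        print(f"{label}: ~{minutes / 60:.1f} hours "
              f"(~{minutes / (8 * 60):.1f} eight-hour workdays)")
```

With those inputs it comes out to a few hours of expert work at the start of 2027 and roughly a workday by the end of it; nudging the baseline date or the doubling time shifts that into the day-or-two range, but either way it stays far short of what autonomous research help would need.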