Shocking! It's not like there weren't 4070 Ti Super cards with 16GB GDDR6X at 21Gbps and 8448 CUDA cores. 28Gbps with 8960 CUDA cores?! Unbelievable! Sounds... fine. Uh, let's see what the price is.
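For a rough sense of what the 21Gbps → 28Gbps bump buys, here's the back-of-the-envelope bandwidth math (a sketch in Python; the 256-bit bus width is the 4070 Ti Super's, and carrying it over to the rumored card is an assumption on my part):

```python
# Back-of-the-envelope GDDR bandwidth:
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# 4070 Ti Super: 21 Gbps GDDR6X on a 256-bit bus
print(bandwidth_gbs(21, 256))  # 672.0 GB/s
# Rumored card: 28 Gbps on an (assumed) 256-bit bus
print(bandwidth_gbs(28, 256))  # 896.0 GB/s
```

So same bus width, ~33% more raw memory bandwidth, which matters a lot at 4K.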
It's good news, I think - more cards need to set 16GB as their baseline to future-proof them. Skipping a 12GB generation is better for everyone, and frankly I don't even think 8GB cards are that bad if you aren't playing at 4K.
More CUDA cores tends to mean more SMs, which means better performance overall. We'll have to wait until the benchmarks roll out to say anything for sure, but it looks like a good upgrade (and lord only knows Nvidia will price it accordingly).
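To put numbers on the cores-to-SMs mapping: recent Nvidia architectures (Ada and, as far as the public specs suggest, Blackwell) pack 128 CUDA cores per SM, so the rumored core count divides out cleanly (the 128 figure is an architectural assumption, not confirmed for the new part):

```python
# Recent Nvidia architectures (Ada-class) carry 128 CUDA cores per SM,
# so published core counts map directly to SM counts.
CORES_PER_SM = 128  # assumed to hold for the rumored card as well

def sm_count(cuda_cores: int) -> int:
    return cuda_cores // CORES_PER_SM

print(sm_count(8448))  # 66 SMs (4070 Ti Super)
print(sm_count(8960))  # 70 SMs (rumored card)
```

Four extra SMs plus the faster memory is a modest but real uplift on paper.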
Yep, there are a number of different 12GB Nvidia cards; the 3060 and the 2060 also had 12GB options.
But I think those are gap filler cards, at least in Nvidia's lineup. They're intended to fill the "budget 4k" market that needs enough memory to run high-res displays (or LLMs/crypto miners) but not the compute to fully saturate it. A sort of consolation prize for people that can't afford the 2080-tier cards but want a similarly large memory pool.
A lot of outlets see those cards and end up dunking on the XX70 models just because "memory lower," and I think they miss the point. There are a lot of users, particularly at 1080p and 1440p, who would rather have less memory that runs faster. As time goes on, I expect Nvidia to phase out 4GB cards for an 8GB minimum, with 16GB midrange (XX60ti, XX70) cards and probably 32GB flagship (XX80/XX90/Titan RTX-styled) cards. Again though, it's anyone's guess.
Uh, even the 3080ti had 12GB of VRAM; in that generation only the 3090 had more (24GB). The 2080 had 8GB.
Just a nitpick; the guess is sound (though I would expect 4GB to stay for the office PC market), and obviously only time will tell what they'll really be doing.
OTOH don't underestimate the power of artificial market segmentation - "oh, you want 16GB VRAM? Too bad, get a xx60 instead of a xx50. After all it's just 25% more expensive!".