The point was different: it's way too expensive for regular gamers, the 2080 Ti likely won't be able to do 4K@60Hz (just as the 1080 Ti couldn't), and ten games featuring RTX in the near future is not a sufficient draw, especially when some of the effects now look subjectively worse with RTX on (see the shadows from multiple lights above the dancing people in Jensen's demo). So the question remains: who is the real target audience that will actually buy these cards at these prices? Did Nvidia turn into Apple and make RTX its own iPhone X?
Waiting this generation out until RTX is more widespread and better tested, and going for the next 7nm generation with, hopefully, AMD having a proper GPU to compete, seems like a better strategy for most gamers out there.
The 80 series has always been a low-volume, high-margin halo product within Nvidia's gaming range. It's dirt cheap and high-volume compared to Quadro, but top-of-the-range for gaming. Cryptomania has revealed to Nvidia that they probably priced the 1080 too low at launch - many gamers were in fact willing to pay substantially inflated prices for The Best GPU.
If the mass market decides to wait for the RTX 1060 or 1050, that's fine with Nvidia, as they face no real competition from AMD at the moment. It's very much in Nvidia's interests to make the most of their market dominance.
The 70 series is traditionally pretty popular with gamers, though. At $600, I mean... that's the point where just the graphics card costs more than one of the 4K consoles Microsoft or Sony is putting out. Obviously a PC game is going to look nicer, but someone has to start thinking about that comparison.
"someone has to START thinking about that comparison."
Sorry, but this comment is really kind of hilarious.
The "PC vs. Console" debate is something that almost predates the Internet and it has generated countless forum wars...
A high-end PC has always been the more powerful and expensive gaming machine, basically since the first 3dfx cards in the late '90s. Some people are OK with that; others prefer consoles as a perfectly acceptable alternative.
That's not really what I meant. Obviously some people have drawn their lines in the sand and will never consider switching. I don't think that's everyone. I play games on both console and PC, as I imagine do many others. If the price is too unreasonable, or the PC version doesn't work right without a bunch of configuration, or whatever, I can't be bothered with it and will just go to the console version of something.
> I mean... that's at the point where just the graphics card is more than one of the 4K consoles Microsoft or Sony is putting out.
The console might output 4K but that doesn't mean the GPU inside can handle higher settings than a 1060. The $600 GPU is irrelevant to that comparison.
> The console might output 4K but that doesn't mean the GPU inside can handle higher settings than a 1060
Which is kinda irrelevant too, since console games are highly optimized for exactly one graphics card and one fixed setup. You get all the detail the hardware is capable of rendering smoothly, nothing more or less. No fiddling with tons of settings that most people have no clue about.
I don't get this often-used argument that games look better on PC than on consoles. Yes, some UI details look better and textures have higher resolution, but after 30 minutes of playing it really doesn't matter at all; quality of gameplay and smoothness of experience are king. Of course, if one pours 5x the money into a PC that would otherwise be spent on a console, there needs to be some inner justification. But it still doesn't make sense to me.
This is the view of a person who plays only on PC and has never had a console.
If you set it to medium you should get a smooth experience. You don't have to do anything I'd call "fiddling", and being optimized for a specific GPU is overrated (and not even true with the current consoles). Especially when you have a much better CPU.
> Of course if one pours 5x the amount of money into PC that would be spent on console, there needs to be some inner justification.
You can get a prettier and smoother experience if you do that and don't put the settings on super-ultra.
But also, if you're already going to have a computer, popping in a better-than-console GPU is cheaper than a console.
To the extent that's true it kind of works against your argument, doesn't it? I doubt that PC sales look better in India because Indians all have top-of-the-line Alienware rigs.
Top-of-the-line rigs are what you need to play new titles (sold at $50-60+) on Ultra settings.
With hardware comparable in price to what you'd find in a console (or using something that doesn't make much sense to mine with, like a GTX 780 Ti), you can easily play a 3-5-year-old game at 1/2 to 1/4 of its original price, which might be reduced further by 50% to 90% during a Steam sale.
But it does open up the system's library of exclusive titles, which makes it seem compelling to someone considering a video card purchase who already has an older one that does OK with games.
I think the cryptomania revealed that people (young people, gamers, who got into cryptocurrencies) could earn a few bucks or so back on their investment. Some of them used mom 'n pop's electricity for that purpose, and if they had to pay that back at all, it was likely at 0% interest.
I mean you're not countering his point in any way. He didn't say nobody would buy it, but it's a simple fact that most people can't afford or justify a thousand dollar GPU.
The vast majority of people buying Nvidia GPUs in the 10xx generation were going for 1060s and 1070s.
Yet, going by a scan of people on the train last time I caught it, heaps of people seem to find money for iPhone Xs at almost the same price point.
If you're sufficiently dedicated, even with limited funds, going for every second or third iteration of halo products can be a great strategy. That way, when you get it, you'll have the absolute best there is, and a couple of years down the track it'll still be great, though maybe it won't quite do Ultra anymore on the latest titles.
The 1080 Ti transformed my experience of my X34 Predator, which does 3440x1440 @ 95Hz (the card even paid for itself through timely sale of mining profits). I certainly wouldn't mind the new card, but I'll wait for the next one after that, minimum.
Don't most people get expensive phones because of subsidies from carriers? Or at least, they pay monthly installments for these devices, through data plans (basically).
Do people really take out loans to get super expensive video cards?
People seem to be missing something in this particular point of the conversation. It's not a function of absolute price. It's a function of price-per-performance. Sure, a lot of the crowd here can afford the 1080 or the Titan, but the bang-for-buck favors the lower-end cards.
'Most people can't afford or justify'. Come on. People buy cars and other stuff. Someone working full-time and buying a $1k cheaper car can already afford and justify a $1k graphics card.
We are already talking about a small percentage of people who want to buy a graphics card.
For those people, it is easily justifiable to spend less on a car, a holiday, or rent and instead have a nicer gaming rig. If you spend a lot of time playing games, why not?
Modern technology is way cheaper than the previous/old status symbols.
I'm thinking about buying a car, and one simple calculation is still what else I could do with that money.
And yes, in Munich, where I live right now, there are enough people with a car who could use public transport and don't use it.
The target group of a $1k graphics card is not someone who can barely afford the car they need every day and would not be able to earn anything if the car breaks down...
If you played seriously then you knew enough to turn the settings lower, not higher.
The last thing you want is your field of view obscured by colorful explosions, bloom, debris, etc. while your opponent has a crystal clear view of you.
But that is a completely orthogonal point to the question of whether Doom runs well at 4K with all features on. Just because I am asking that question doesn't mean I would play deathmatch that way. But I might indeed go through the single-player campaign that way.
Indeed, I didn't play Doom at 4K at all because, as I said, it felt like garbage at 4K on a 1070, no matter the settings.
Doesn't matter whether they're "kids" anymore or not; there's a reason AMD focused hard on the $200-300 range for graphics cards: that's where most buyers' budgets are. There are people who spend more, but even many enthusiasts are shopping in the $400-500 segment for a card to support high-refresh-rate gaming or higher resolutions like 1440p. The number of people who blow $800+ on a GPU like a 1080 Ti is few and far between in comparison.
Certainly not, but the infuriating part is that, historically, the performance of those cards trickles down to the lower tiers at more reasonable prices as the generations go by. The GTX 1070 beat the GTX 980 Ti at an MSRP $200 lower just one generation later; meanwhile, at least by pure TFLOPS numbers, the RTX 2080 is less powerful than the GTX 1080 Ti while costing around the same.
One would be forgiven for expecting roughly GTX 1080 Ti performance in the RTX 2070 at around $449-499 USD.
This all sounds like normal r&d and market forces. New stuff is low volume and premium prices. Once it becomes more common and more production lines are switched, the prices fall and the features get included into other models. This applies to virtually every product.
Or did you want to highlight something else I missed?
The new products are launching at the same price point that similarly specced parts from the previous generation have been selling at. "Low volume" doesn't really apply when you're talking silicon manufacturing: when you spend millions of dollars to make a mask, you want to get ROI on it quickly. The GTX 1070, for example, launched at over $200 less than the GTX 980 Ti.
When you sell a product that actually has less compute performance (the RTX 2080) at the same price point as the higher-end part from the last generation (GTX 1080 Ti), something has gone horribly wrong. A lot of this is likely due to the Tensor/RT units on the new GPUs taking up die space without an appropriate process shrink to make up for it, but that's all the more reason these are REALLY unappealing options for anyone outside the top-end enthusiast segment (the GTX 1070 is the most popular enthusiast card this generation, because even enthusiasts have budgets, which usually means the $400-500 range for GPUs).
tl;dr: The prices here make no sense; cards with similar or less performance are selling at the same price point you could get from the previous gen, just with raytracing support added on top (so at least using this new functionality won't net you WORSE performance). I don't know who Nvidia thinks is going to buy these from a gaming perspective.
The hype around "ti" is unreal. Now they're changing it to "RTX" and "ti". /shock /awe /s
To me, it's pretty clear NVIDIA is cashing out.
Have you not noticed that the market slaps the word "gaming" on commodity hardware, along with a bunch of Christmas lights, and people happily pay a premium for it?
Gamers aren't the brightest bunch, and $1000 is the right price point when people are gladly dropping that on a mobile phone now.
Sure, compared to a few years ago I'd agree with you, but this market? This hype? No.
Nvidia has a history of just sitting on their performance lead; see the GeForce 8800 vs. the GeForce 9800.
Even in the marketing material Nvidia initially released, the 8800 GTX had obviously far better raw specs than the 9800 GTX. It took them a couple of days to change the material to compare percentage performance in different games instead.
But the 9800 GTX was actually a slower card than the one-year-older 8800 GTX due to lower memory bandwidth and capacity. As such, it was really competing against the previous generation's mid-range cards, like the 8800 GTS 512.
NVDA also has a significantly higher market share than AMD right now, but that doesn't change the fact that $200-300 is still the most common price point for consumer GPU purchases.
Current Steam user survey results (now that the over-counting issue has been fixed) show the GTX 1060 as the single most popular GPU among active Steam users with a 12.5% share, while the GTX 1050 Ti and 1050 take second and third place with ~9.5% and ~6% respectively. That means nearly 30% of Steam users have a current-gen GPU in the $200-300 price range.
So yes, volume != profit, but consumers obviously trend toward more reasonably priced cards. Cards at the Titan price point that Nvidia is trying to sell the RTX 2080 Ti at are so uncommon that they get lumped into the 'other' category of the Steam survey. And since I highly doubt magic like doing integer operations in tandem with FP32 operations is going to bring much of a performance improvement to the majority of gaming workloads, given the really weak raw numbers of the just-announced cards (fewer TFLOPS on the 2080 than the 1080 Ti selling in the same price bracket), it's obvious Nvidia is really taking the piss with their pricing. You're paying more for raytracing, that's it, and while it's certainly a cool feature, I don't see gamers caring that much until it becomes usable at that $200-300 price point.
Thanks. Not sure where they got it from. The AnandTech article linked there does contain some TFLOPS numbers, but I think they derived those from the CUDA core count somehow, so they could well be inaccurate.
I guess it's simply twice the number of CUDA cores times the operating frequency, so it's accurate as far as it goes, but a lot more goes into the gaming performance of a GPU.
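For reference, here's a rough back-of-the-envelope sketch of that formula in Python. The core counts and boost clocks below are the commonly quoted spec-sheet values (treat them as assumptions), and peak FP32 throughput says little about real gaming performance:

    # Peak FP32 TFLOPS ~= 2 FLOPs per core per clock (fused multiply-add) * cores * clock (GHz) / 1000
    def peak_tflops(cuda_cores, boost_clock_ghz):
        return 2 * cuda_cores * boost_clock_ghz / 1000.0

    # Approximate published specs, assumed rather than measured:
    print(f"GTX 1080 Ti: {peak_tflops(3584, 1.58):.1f} TFLOPS")  # ~11.3
    print(f"RTX 2080:    {peak_tflops(2944, 1.71):.1f} TFLOPS")  # ~10.1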
They're really not. Compared to many other popular hobbies, gaming's yearly cost is really low. Things like audio, photography, cars, Warhammer, travelling, and winter sports each have yearly costs that make gaming seem cheap as hell.
Football (soccer) is cheap, so is basketball, traveling can be done on a budget (backpacking, hitchhiking). Board games or card games (not the collectible or trade-able cash-grabbing variety) are also cheap.
There's many expensive hobbies, but also a ton of cheap ones.
I agree with you, aliasing is obviously less noticeable at higher resolutions, simply because the pixels are smaller (or the pixel density is higher, whichever way you want to see it).
Which ones, though? Tbh I do have a 4K G-Sync monitor, so that does help when fps drops below 60. I find anything from 50-60 fps smooth, and below 50 it starts to get too choppy. It also helps with input lag to run G-Sync at 58 fps. The most recent game I run on Ultra is Far Cry 5; it's a pleasure at 4K.
Ultra is ridiculous and unnecessary. I play 4K60 on an RX 480 with "smart" settings — max textures/meshes, min shader effects, no ambient occlusion, no dynamic reflections, etc.
The only point I'd like to make is that the only reason you "can't do" 4K@60 is that devs decided to tune their effects scaling in such a way that this is the case.
This doesn't affect the argument that you're making. I just think it's actually incredibly absurd to complain as though it's the hardware's fault for not being able to keep up with 4k@60, when it's the devs who you should be looking at when you're disappointed with the performance on a given piece of hardware.
Oh yes it’s the developer’s fault for not “tuning” something the right way. Sure.
You can “tune” something all you want, but you're always going to get a lower-quality representation if you want better performance. The hardware should give developers the possibility of getting better-quality graphics at higher resolutions. We can play the original Doom at 4K and probably even 8K without much of a problem, but that isn't because they “tuned” it better; it's because hardware has gotten better, and hardware will always be the limiting factor for games.
I think the point is that with PCs they make less effort to eke performance out of the hardware. When you've got a console, you know exactly what you'll be optimising for and work really hard to get the most out of it. With a PC release, I think devs tend to make far less effort and simply up the requirements.
They are likely incentivized to jack up the price on personal purchases so the big manufacturers can have more overhead integrating it with their consoles or pre-built gaming PCs.
They demonstrated real time super resolution on top of hybrid rendering. The meaning of 4k@60Hz has changed. They can render that just fine -- it's just a question of how many of the pixels are imagined by the model and how good the model is.
With my 1080 (not a 1080 Ti) I play most games (just a couple of exceptions, really) at 60 FPS on a 4K monitor. And look at the 2080, with the same price tag as the 1080: gamers will wipe it out of stock in seconds.
We're talking about two different things here... You mentioned that the card is capable of outputting 8K HDR @ 60Hz, i.e. your Windows desktop can happily run at 7680x4320@60. I mentioned that running demanding games smoothly at 4K@60Hz or 3K@144Hz might not be possible, and that many gamers expected that from the new generation.
You can slow down any hardware with a sufficiently inefficient program (or a sufficiently detailed scene, if we're talking about GPUs). You can easily make any video card run at 0.001 FPS if your scene is heavy enough. So it only depends on the game developers; it's unfair to blame Nvidia for that. GPU progress is astonishing, at least compared with CPU progress.
Ehm, if your server runs a reasonably modern x264, you should get significantly lower bandwidth at "transparent" quality than the GPU's hardware encoder is even capable of reaching. The reason is that the hardware encoder can't run some of the features x264 implements fast enough to make them worthwhile at that sort of time investment.
Please don't measure any lossy computation by speed alone; always make sure the required quality can even be reached, and even if it can, check that the performance benefit still holds once you've turned those features up far enough.
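As a rough illustration of that "quality first, then speed" point, here's a minimal Python sketch (filenames and the target bitrate are placeholders, and the exact encoder options available depend on your ffmpeg build). It encodes the same clip with software x264 and with NVENC at the same bitrate, then compares each result against the source with ffmpeg's PSNR filter:

    import subprocess

    SRC = "input.mp4"   # placeholder source clip
    BITRATE = "6M"      # same target bitrate for both encoders

    # Software x264: the slow preset trades encode time for better quality per bit
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                    "-preset", "slow", "-b:v", BITRATE, "-an", "x264.mp4"], check=True)

    # NVENC hardware encoder at the same bitrate
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "h264_nvenc",
                    "-preset", "slow", "-b:v", BITRATE, "-an", "nvenc.mp4"], check=True)

    # Compare each encode against the source; higher PSNR at equal bitrate = better encoder
    for out in ("x264.mp4", "nvenc.mp4"):
        subprocess.run(["ffmpeg", "-i", out, "-i", SRC,
                        "-lavfi", "psnr", "-f", "null", "-"], check=True)

PSNR is only a crude proxy for "transparent"; a perceptual metric would be fairer, but the shape of the test is the same: match the bandwidth, then compare quality.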
It's not about blame; it's about being realistic that 4K means 4x the pixels of 1080p and 8K means 16x, and while these cards represent a lot of progress, they're nowhere near that level.
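For concreteness, a quick sanity check of those pixel counts (standard 16:9 resolutions assumed):

    # Pixel counts relative to 1080p
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
                   "4K": (3840, 2160), "8K": (7680, 4320)}
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels ({w * h / base:.1f}x 1080p)")
    # 4K -> 8,294,400 pixels (4.0x); 8K -> 33,177,600 pixels (16.0x)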