kevinc313
turboD
I am surprised not a single person pointed out that the high 3090 numbers shown in some of the first-page links come from a hard-modded card (shunt mod) at max overclock.
Without any hard mods, the 3090 is only ~10% faster than the 3080. There is no generational leap here. Wait until you see the official reviews.
Only about +30% better than the 2080 Ti or Titan RTX, adjusted for TDP. What the heck? Ampere is a flop, aside from the firesale MSRP of the 3080.
+1
And EVGA is price-gouging with the 3090 FTW3 simply because they anticipate demand for it.
For example: the 3080 FTW3 is $789, i.e. +$89 over the $700 FE.
The 3090 FTW3 (same PCB, same features, same iCX3 sensors, dual-BIOS fuse and cooler) is $1750 and $1800, i.e. +$250 and +$300 over the $1500 FE.
Then consider that Igor's Lab valued the FE cooler on just the 3080, to say nothing of the bigger 3090 cooler, at $155. I highly doubt the FTW3 cooler costs $155 to make; it probably costs between $50 and $75.
Let's just say $55 for the sake of argument.
That's roughly $100 in cooler savings for EVGA, so add it to the price difference: the 3090 FTW3 is effectively $350 and $400 more expensive than the FE.
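For anyone who wants to check the markup math, here it is as a quick Python sketch (the $155 FE cooler figure is Igor's Lab's estimate; the $55 FTW3 cooler cost is my own guess):

```python
# Effective FTW3 markup over FE once cooler cost is factored in.
fe_3090 = 1500
ftw3_3090_prices = [1750, 1800]     # the two 3090 FTW3 SKUs
cooler_savings = 155 - 55           # FE cooler ($155, per Igor's Lab) vs FTW3 (~$55 guess)

for price in ftw3_3090_prices:
    effective_markup = price - fe_3090 + cooler_savings
    print(effective_markup)         # prints 350, then 400
```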
Is this a binned chip, a Kingpin card?
No.
It's FTW3.
They have lost their minds, both NV and EVGA.
This is the very definition of hubris.
Not only are they doing nothing about the Bounce Alert bots, they actually encouraged them by not allowing us to pre-order or queue up for a purchase (this isn't rocket science, it CAN be done).
1. Buying a 3080 on eBay absolves both NV and the AIB partners of warranty responsibility.
2. When a bot buys 40 3080 XC3s in a row from EVGA, it isn't using any 5% associate code.
3. Product scarcity drives up hysteria and demand and generates press noise, which further drives up hysteria and demand: "I HAVE TO HAVE IT, I WILL PAY ANYTHING".
In reality the 3080 is ~10% faster than 2080 Ti at 1440p at the same power draw!
Here's my Timespy GPU score of 16,750. Compare that to 17,500: it's not even 10%! Same power draw! Same thing in games!
NV either handsomely rewards Digital Foundry or has acquired ownership of them; they pump out these carefully curated "benchmarks" showing the 3080 some 90% faster than the 2080 and 65% faster than the 2080 Ti, when in actuality it's only 20% faster at 1440p and 30% faster at 4K, at a higher power draw! What a joke!
They think we don't know that AMD is releasing a large-die Navi GPU in November that will sit between the 3080 and 3090 at less than 300w on TSMC's 7nm node, which is head and shoulders above this Samsung 8nm mediocrity, and that it will have either 16 or 24GB of video memory for $1000.

We also need to remember that both next-gen consoles are the same architecture, RDNA 2, and both will have path tracing, AI super-sampling and their equivalent of RTX IO (AMD showed this tech off first, actually; both next-gen consoles were demonstrating it and then, boom, it's added to Ampere by Nvidia).

Take a console port running on Unreal Engine optimized for RDNA 2: the engine will accomplish RDNA 2-based path tracing and AI super-sampling NATIVELY and therefore more efficiently (the 6900 XT may be slower than the 3090 on paper but may run console ports way faster). Nvidia can either try to emulate AMD's path tracing and AI super-sampling in pursuit of efficiency, or try to run it on top of the engine. Either way, it's axiomatic that if a developer builds a game for either of the consoles, it will be highly optimized to make full use of the underlying architecture, RDNA 2.
NGreedia is actually in deep kaka. Don't let the fancy cooler and market hysteria fool you.
Think about it.
There will be a $1000 card that is faster than the 3080 at less than 300w with 16 or 24GB of video memory, and it will run console ports better. This may seem trivial, but 90% of the games on PC are console ports or simultaneous releases. Developers make more money selling games on consoles, so games are developed console-first and then brought to PC, even for a simultaneous release.
This is why NGreedia is acquiring ARM. They need to diversify their business because their gaming division may be in trouble.
I'm upset because, coming from 1080 Ti, the $1200 2080 Ti was the only upgrade path, a path I somewhat foolishly took about 6 months ago with a gently used XC2 for $900 local sale (no tax, no shipping). I mean I love this card and it's easily 50% faster in Timespy GPU and nearly all of my demanding titles (10,600 vs 16,700 Timespy GPU, both cards under water block and overclocked to 2000 and 2100 MHz respectively). This was around $1050 with the water block for a solid 50% gain, and I can use DLSS and RT.
Now the proposition is $1800 before taxes, another $230 for a decent water block and back-plate, or around $2300 altogether including shipping, for a 25% bump at 1440p (45% if it's a DLSS title), 35% bump at 4K (55% if it's a DLSS title)?
16,750 vs 20,500 Timespy GPU, a 23% gain?
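Put the upgrade math side by side (my own costs and Timespy runs from above) and the value collapse is obvious. A rough cost-per-percent sketch:

```python
# Dollars paid per percent of Timespy GPU gain, using my figures above.
prev_cost, prev_gain = 1050, 50     # used 2080 Ti + water block, ~50% over my 1080 Ti
new_cost = 2300                     # 3090 + block + backplate, shipped

new_gain = round((20500 / 16750 - 1) * 100)   # ~22-23% raw Timespy gain

print(prev_cost / prev_gain)        # ~$21 per percent gained last time
print(new_cost / new_gain)          # ~$105 per percent gained this time
```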
I've already done the analysis comparing vmanuelgm's shunted 3090 @ 550w over in the 3090 Owners Club thread on overclock.net.
At 3840x2160, in the Metro Exodus bench, the shunted 3090 is 45% faster without DLSS on both cards and 67% faster with DLSS on both cards.
But we need to normalize to the same TDP: that additional 160w is only good for a ~10% overclock on the 3090, because it's already at the edge of its power-efficiency curve! At 390w it's 10% slower, so we deduct 10% from the 45% at 4K without DLSS, and another 10% from the 67% with DLSS.
So the 3090 is 35% faster than 2080 Ti at the same power draw (2080 Ti @ 373w, 3090 @ 390w) and 55% faster with DLSS enabled on both cards.
But I'm not at 4K! And the 3080 (320w) is 20% faster than the 2080 Ti (260w) at 1440p and 30% faster at 4K (only 10% and 20% faster when you run the 2080 Ti at the same power draw).
So we deduct another 10%, because Ampere's lead over the 2080 Ti at the same power draw is ~10 points smaller at 1440p than at 4K.
So a whopping 25% faster at 1440p without DLSS and 45% faster with DLSS for $2300!
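To recap the deduction chain in one place (all the percentages are my own estimates from the Metro Exodus comparison, rounded to the nearest 5):

```python
# Normalizing the shunted-3090 gains back to stock power and 1440p.
gain_4k, gain_4k_dlss = 45, 67      # shunted 3090 @ 550w vs 2080 Ti, at 4K

power_penalty = 10                  # ~10% lost dropping from 550w back to 390w
res_penalty = 10                    # the lead shrinks another ~10 points at 1440p

gain_1440 = gain_4k - power_penalty - res_penalty             # 25
gain_1440_dlss = gain_4k_dlss - power_penalty - res_penalty   # 47, call it ~45
print(gain_1440, gain_1440_dlss)    # prints 25 47
```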
What kind of value proposition is this?
This is a Ray Tracing title with DLSS in comparison!
90% of the games in my library have neither! Any future game with DLSS, sure, will be a 45% gain at 1440p. Worse, none of the VR titles in my library have DLSS, so the gain there is the smaller non-DLSS gain (HP Reverb G2, 2160x2160 per eye)!
The question is, how many new titles will have DLSS support?
Either way 45% isn't enough for $2300!
For $2300 I'm going to need a card that is as fast as 2080 Ti SLI with perfect scaling, i.e. 100% faster!
Ampere is indeed a joke.
The big joke is how fast TSMC 7nm is.
The APU in the Xbox Series X is as fast as a Ryzen 7 3700X and an RTX 2080 combined at only 170w.
Let that sink in for a moment. 12 TFLOPS.
AMD will have a 300w card dropping in November.
And guess what? EVGA doesn't make AMD cards.
So their price-gouging strategy of marking up the FTW3 $400 over the FE?
Not a winning strategy with at least this informed consumer.
Not only am I NOT buying this overpriced mediocrity but I am also encouraging others considering it to wait for November.
All empires fall. This is what a monopoly looks like: it started with the $1200 80 Ti card last generation, and now the rebadged 80 Ti is $1500-1800 and offers half the performance gain (25% vs 50% at 1440p without DLSS).
Don't support this mediocrity, just let it die.
post edited by vulcan1978 - 2020/09/22 09:24:28