EVGA

Dear EVGA, Your 3080 Ti XC3 Cards Suck. [Locked]

Showing page 4 of 11
Author
talon951
FTW Member
  • Total Posts : 1026
  • Reward points : 0
  • Joined: 2020/10/06 02:41:19
  • Status: offline
  • Ribbons : 3
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 19:39:19 (permalink)
I assume you flashed the 385W Zotac BIOS? That's higher than what EVGA intended for the XC3, so it would clock higher. Assuming that's what you flashed, you can't expect the EVGA BIOS to match it if they intended a lower power limit.

The real issue is that $1200 or more for a video card is insane in the first place. Have you considered that this entire issue is ultimately driven by Covid? I know that sounds dumb on the surface, but that's how we got to this point. The high demand for and shortage of GPUs is at least partly due to the pandemic.

In a way I agree with you in that paying $1500 (HC version) is INSANE for that card. Could it be that you are upset because it cost sooo much money and is at best average? I do understand that. But is it really EVGA's fault or are they just operating in this crazy environment we're in right now?

I've bought $3000 in video cards this generation. I would have laughed my ass off a year ago at anyone who suggested I would do that, but here I am. A middling Zotac 3090 and a 3080 Ti FTW3. The Zotac was VERY much like the 3080 Ti XC3 in that it underperformed relative to other 3090s. But that's all I could get last year. I made the most of it with a custom loop and the KP XOC BIOS. I bought the 3080 Ti as a spare card, in part to ensure I'd have a 30-series card in case my 3090 died.

Anyway some perspective that will probably be lost but that's how I see this. Difficult times that we have to make the best of.
#91
jboud47
Superclocked Member
  • Total Posts : 118
  • Reward points : 0
  • Joined: 2021/05/10 14:49:25
  • Location: Silver Spire
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:10:44 (permalink)
talon951
But is it really EVGA's fault



Yes. 100%. Lesser products are not having the same power issues. A card that costs 2/3rds of this one is outperforming it.

None of the rest of what you said is justification to sell a product that doesn't operate as advertised. Especially one that's priced at an inflated MSRP. And then on top of that to not address the issue and provide inadequate customer support.

"It was I who helped the Prime Evils mastermind their own exile to your world. The plan we set in motion so long ago cannot be stopped by any mortal agency. Hell, itself, is poised to spill forth into your world like a tidal wave of blood and nightmares. You and all your kind... are doomed."
#92
kongfra
Superclocked Member
  • Total Posts : 135
  • Reward points : 0
  • Joined: 2015/06/01 06:30:48
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:11:11 (permalink)
gsrcrxsi
kongfra
ty_ger07
kongfra
Have you actually tried to RMA it and see what EVGA says? Maybe you just have a bad card?

There seem to be many bad cards. Should everyone RMA their card at their own expense and hope the replacement performs as expected, or should EVGA be made aware of the issue and provide some other fix?
Mine hits 348W in the most demanding games like Red Dead Redemption 2 and Battlefield 5, without any overclocking or messing with any settings except the fan curve. It also stays very cool, 69-72°C under full load, and this is the XC3 base model.

Good for you. You got a good card. You are one of the few who have reported such.

It's hit and miss and comes down to component manufacturing tolerances.



 
What I am saying is: if the card is not within spec, has anyone tried to RMA it and see what EVGA says?

I RMA’d my first 3080 Ti XC3 HC. But not for the low performance issue. The official RMA reason was the RGB stopped working completely. So it was a bit of an open and shut case. I paid (2x MSRP) collateral for a cross-ship RMA, and got a brand new replacement.

My first card DID have the low power and low clock issues, and they were quite severe. FurMark would only pull 270W with clocks under 1000MHz. Most other tests had low power draw and low clocks, very often less than the advertised 1725MHz boost clock. Before the RGB failed, I did attempt to troubleshoot the issue with EVGA, but suffice it to say, EVGA’s tech support team seems like a bunch of regulars just reading a script. They know as much about their own product as any random person on Reddit. They didn’t have access to their own BIOS files to send me so I could try reflashing the card, and they couldn’t offer any explanation of why it was acting the way it was. Just the boilerplate stuff: what are the system specs, what are the voltages, etc. They had no interest in escalating it to actual technicians.

Being able to have a valid reason for RMA was a bit of a godsend. The replacement card acts much better (it at least pulls 300W in FurMark, with higher power draw and clocks across the board), but it’s still not acting the way it should, and it exhibits strange behavior that it honestly shouldn’t: a lower enforced power limit with increased memory controller load.

But I’m not going to RMA it a second time just for this reason. I can only hope they admit the issue and try to fix it via BIOS.



Stupid question, because I don't pay enough attention to the technical side: what are the GPU clock and memory clock supposed to be for a regular Ti XC3? If I understand correctly, the GPU should be 1725 and the memory 1975?
 
Here is a snapshot of HWiNFO after playing about 45 minutes of Red Dead Redemption 2 with all settings maxed. The screenshot was taken 10-15 min after exiting the game, so the averages might be skewed, but you can see what I hit at the max. I really don't know if this is good or bad, or what my clocks should be. I am not overclocking and haven't touched anything in Precision other than ramping up the fan curve.
 
Is HWiNFO accurate?
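 
Not an authoritative answer, but if anyone wants a second opinion on HWiNFO's numbers, here's a minimal sketch that polls the driver directly through NVML. It assumes the nvidia-ml-py package (imported as pynvml) is installed and that the card is GPU index 0:

```python
# Minimal sketch: poll NVML directly to sanity-check what HWiNFO reports.
# Assumes the nvidia-ml-py package (imported as pynvml) is installed.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # ~30 seconds of 1 Hz samples while the game runs
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports mW
        sm_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM)
        mem_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
        print(f"{power_w:6.1f} W  core {sm_mhz} MHz  mem {mem_mhz} MHz  "
              f"gpu {util.gpu}%  memctrl {util.memory}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

One caveat: different tools report the memory clock on different bases (base clock vs. effective GDDR6X data rate), so compare like with like when checking against HWiNFO.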
 
 
post edited by kongfra - 2021/07/25 20:21:42

3080 TI XC3, i9-10850K, Noctua NH-D15S, Gigabyte Z590 Aorus Elite, Crucial Ballistix 32 GB Ram DDR4-3200 CL16 , EVGA Supernova G6 1000W 80+ Gold, Windows 10 Pro,  LG 27GP83B-B with Dual Dell S2716DG Monitor, 2 TB Crucial MX500 SSD, Phanteks Enthoo Pro Case
#93
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:17:18 (permalink)
talon951
I assume you flashed the 385w Zotac bios? That's higher than what EVGA intended for the XC3. So it would clock higher. Assuming that's what you flashed, you can't expect the evga bios to match that if they intended a lower power limit.

The real issue is that $1200 or more for a video card is insane in the first place. Have you considered this entire issue is actually driven ultimately by Covid? I know that sounds dumb on the surface but that's how we got to this point. The high demand and shortage of gpus is at least in part due to the pandemic.

In a way I agree with you in that paying $1500 (HC version) is INSANE for that card. Could it be that you are upset because it cost sooo much money and is at best average? I do understand that. But is it really EVGA's fault or are they just operating in this crazy environment we're in right now?

I've bought $3000 in video cards this generation. I would have laughed my ass off a year ago at anyone that suggested I would do that but here I am. A middling Zotac 3090 and a 3080Ti FTW3. The Zotac was VERY much like the 3080ti XC3 in that it underperformed relative to other 3090s. But that's all I could get last year. I made the most of it with a custom loop and KP XOC bios. I bought the 3080ti as a spare card in part to insure I'd have a 30 series card in case my 3090 died.

Anyway some perspective that will probably be lost but that's how I see this. Difficult times that we have to make the best of.

My card cost over $1,500, not $1,200.
 
jboud47
talon951
But is it really EVGA's fault



Yes. 100%. Lesser products are not having the same power issues. A card that costs 2/3rds of this one is outperforming it.

None of the rest of what you said is justification to sell a product that doesn't operate as advertised. Especially one that's priced at an inflated MSRP. And then on top of that to not address the issue and provide inadequate customer support.

Bingo!
#94
Dante_Tel
New Member
  • Total Posts : 39
  • Reward points : 0
  • Joined: 2021/06/03 19:24:41
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:29:33 (permalink)
ty_ger07
Dante_Tel
ty_ger07
Dante_Tel
Aruzedragon
Dante_Tel
So Kram, I took a look at your Motherboard as mentioned in your specs from these sites. Are you aware that your mobo only goes up to PCIe 3.0? For max speed and performance these cards require no lower than the current PCIe 4.0 to run without hindrance. I mean, you can run a 4.0 card in a 3.0 slot but you're going to see varied levels of performance drop due to your slot not being up to spec. That's honestly my best guess after running all the other numbers and specs myself, the problem isn't the cards, it's your Motherboard not being up to spec.


I am having the same issue, and I have PCIe 4.0 as well. Not to mention, PCIe 3.0 has more than enough bandwidth for gaming applications. This isn't about PCIe 3 vs 4 at all.



I mean, I've checked his numbers, and it matches what I've seen in the past (i.e., trying to socket a GPU into a system that isn't up to spec). This problem can be caused by any number of mismatches in the system though, not just the PCIe slot itself.

I disagree. In this instance, the rest of the system is inconsequential. There is no configuration mismatch that is going to cause the card to power limit at 290 watts instead of 350 watts. When the card says it is power limited at a power consumption way lower than expected, that is the problem we are focusing on, and it isn't related to the rest of the system.


... (wattage and overall performance was down) ...

I hope that you can see and understand the difference.
 
Consider the card's wattage being down, performance being down, and the perfcap reason being "util". Then consider the card's wattage being down, performance being down, and the perfcap reason being "pwr". Can you see and understand the difference? Do you understand the implications?
 
It's very easy to see the difference between a card which is being bottlenecked versus a card which is simply not performing as it should.  Kram's cards are not performing as they should.  They are not bottlenecked.  It is not a configuration problem in his case.


If you say so.

#95
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: online
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:45:12 (permalink)
kongfra
stupid question because I don't pay attention to these technical enough, what is the GPU Clock and Memory Clock suppose to be for a regular TI XC3? If I understand GPU should be 1725 and memory 1975?
 
Here is a snapshot of HWINFO after about playing 45 of Red Dead Redemption 2 with all settings maxed , but this screenshot was taken 10-15 min after exiting game so average might be skewed but you can see I what I hit in the max, I really don't know if this is good or bad or what my clocks should be, I am not overclocking nor touched anything in precision other then ramping up the fan curve
 
Is hwinfo accurate?  



Clock speeds will ultimately depend on what power limit is enforced for any given situation.

I could play a light game or run a light load and the GPU hits 2000+MHz. But I could also hit a very heavy load where the power limit drops to 285W and it'll only manage 1680MHz (or, in the case of FurMark, something like 1050MHz at 300W).

So the answer is, it depends.

But I don’t think the normal behavior should be different power limits for different levels of memory controller load, which is what I’ve observed so far with my card, and not something seen on the FTW3 card.

Before anyone starts slinging nonsense: I do NOT expect the XC3 card to perform the same as an FTW3, or to be able to pull the same 400+ watts that model does. What I DO expect is to be able to actually use all of the power (350-366W) and not have the clocks throttled below the advertised boost clocks while the card is pulling less than 90% of TDP and telling me it's power limited. If the card is pulling 300W and running 1700MHz, it should allow me to add some more clocks and use the TDP overhead it still has. But right now, it holds you back except in certain situations that don't need high levels of memory controller utilization.
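
If anyone wants to check what their own card is reporting, here's a rough sketch (assuming the pynvml package) that prints the enforced power limit next to the actual draw, along with whether the driver is currently flagging the software power cap as the throttle reason. A card that flags the power cap while sitting well under its reported limit matches what's being described in this thread:

```python
# Rough sketch: show the enforced power limit and the driver's current
# throttle ("perfcap") reason while a load is running. Assumes pynvml.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(20):
        draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0
        limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0
        sm_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM)
        reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        pwr_capped = bool(reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap)
        print(f"draw {draw_w:6.1f} W / limit {limit_w:6.1f} W  "
              f"core {sm_mhz} MHz  pwr-capped={pwr_capped}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```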

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

#96
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 20:45:56 (permalink)
Why are there people here trying to say this is okay? EVGA ADVERTISED this card as having a 350W TDP. It is not okay if people's cards are averaging 280-300W. That's not even within 5-10% of 350W.
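For perspective on the numbers: 300W is 300/350 ≈ 86% of the advertised TDP, i.e., about 14% under it, and 280W is a full 20% under.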
#97
tyranus7
Superclocked Member
  • Total Posts : 198
  • Reward points : 0
  • Joined: 2018/07/14 20:54:24
  • Location: Old Silent Hill
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/25 21:45:34 (permalink)
Dante_Tel
So Kram, I took a look at your Motherboard as mentioned in your specs from these sites. Are you aware that your mobo only goes up to PCIe 3.0? For max speed and performance these cards require no lower than the current PCIe 4.0 to run without hindrance. I mean, you can run a 4.0 card in a 3.0 slot but you're going to see varied levels of performance drop due to your slot not being up to spec. That's honestly my best guess after running all the other numbers and specs myself, the problem isn't the cards, it's your Motherboard not being up to spec.




You're just plain wrong. PCIe 3.0 vs. 4.0 makes no difference at all for the 30 series.

Desktop: | Gigabyte Z490 Vision-G | Intel Core i9 10850K | GPU: COLORFUL iGame GeForce RTX 3080 Ti Advanced OC-V | 16 GB RAM DDR4@4400 MHz CL18 | 1 TB NVMe + 10 TB HDD | Intel 2.5 Gbps Ethernet | 1080p @144Hz | EVGA 750W Supernova P2 |
 
12G-P5-3967-KR     6/3/2021 7:48:19 AM PT    Yes (ordered/cancelled by EVGA)

12G-P5-3968-KR     6/3/2021 7:51:30 AM PT    No
12G-P5-3953-KR     6/30/2021 6:30:34 AM PT   Yes (skipped)

 
 
#98
-Tax-
Superclocked Member
  • Total Posts : 110
  • Reward points : 0
  • Joined: 2011/04/08 14:12:32
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 07:53:24 (permalink)
Same issue with my 12G-P5-3953-KR: nowhere near the 350 watts stated, usually under 300. I have put the card in the Step-Up queue; I don't feel sending it in for a replacement will fix the issue, with so many of these cards not performing as expected.
 
LG27GN950 4k 144hz monitor
MSI Z390 Gaming Pro Carbon AC
I9-9900k 
16 GB 3200 RAM
EVGA RTX 3080 Ti XC3 Gaming
EVGA SuperNOVA 850 G2, 80+ GOLD 850W 
1TB M.2 Windows 10 Home 64 bit
500GB SSD
Lian Li  pc-011-dynamic case
 
Windows up to date
Drivers up to date
Bios up to date
 
#99
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 07:56:52 (permalink)
-Tax-
Same issue with my 12G-P5-3953-KR, no where near the 350 watts stated, usually under 300. I have put the card in the Step Up Queue, I don't feel sending it in for a replacement will fix the issue. With so many of these cards not performing as expected.
 
LG27GN950 4k 144hz monitor
MSI Z390 Gaming Pro Carbon AC
I9-9900k 
16 GB 3200 RAM
EVGA RTX 3080 Ti XC3 Gaming
EVGA SuperNOVA 850 G2, 80+ GOLD 850W 
1TB M.2 Windows 10 Home 64 bit
500GB SSD
Lian Li  pc-011-dynamic case
 
Windows up to date
Drivers up to date
Bios up to date
 


Hoping by the time your Step-Up hits, EVGA will address this issue and fix it.
-Tax-
Superclocked Member
  • Total Posts : 110
  • Reward points : 0
  • Joined: 2011/04/08 14:12:32
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:02:21 (permalink)
kram36
 
Hoping by the time your Step-Up hits, EVGA will address this issue and fix it.




Totally agree EVGA should figure out the issue and address it. Step-Up shouldn't be the "fix" but it sure is the best route I can see currently.
-Tax-
Superclocked Member
  • Total Posts : 110
  • Reward points : 0
  • Joined: 2011/04/08 14:12:32
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:02:21 (permalink)
Double posted for some reason lol
post edited by -Tax- - 2021/07/26 08:03:59
Jstandaert
Superclocked Member
  • Total Posts : 243
  • Reward points : 0
  • Joined: 2021/04/10 16:36:16
  • Status: offline
  • Ribbons : 2
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:04:07 (permalink)
kram36
-Tax-
Same issue with my 12G-P5-3953-KR, no where near the 350 watts stated, usually under 300. I have put the card in the Step Up Queue, I don't feel sending it in for a replacement will fix the issue. With so many of these cards not performing as expected.
 
LG27GN950 4k 144hz monitor
MSI Z390 Gaming Pro Carbon AC
I9-9900k 
16 GB 3200 RAM
EVGA RTX 3080 Ti XC3 Gaming
EVGA SuperNOVA 850 G2, 80+ GOLD 850W 
1TB M.2 Windows 10 Home 64 bit
500GB SSD
Lian Li  pc-011-dynamic case
 
Windows up to date
Drivers up to date
Bios up to date
 


Hoping by the time your Step-Up hits, EVGA will address this issue and fix it.


My daily attempt to sound smart-
 
Doesn't the FTW series have the opposite problem? J2C was doing a video on New World and set his power cap to 100%, but the card was showing like 120%. Could this be EVGA being overly cautious so the card isn't a time bomb like the 3090s?

 
 
-Tax-
Superclocked Member
  • Total Posts : 110
  • Reward points : 0
  • Joined: 2011/04/08 14:12:32
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:07:49 (permalink)
Jstandaert
My daily attempt to sound smart-
 
Doesn't the FTW series have the opposite problem? J2C was doing a video on new world and set his power cap to 100% but the card was showing like 120%. could this be EVGA being overly cautious so the card isn't a time bomb like the 3090's?



I saw that video as well, anything is feasible until EVGA responds to the situation.
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:15:02 (permalink)
Jstandaert
kram36
-Tax-
Same issue with my 12G-P5-3953-KR, no where near the 350 watts stated, usually under 300. I have put the card in the Step Up Queue, I don't feel sending it in for a replacement will fix the issue. With so many of these cards not performing as expected.
 
LG27GN950 4k 144hz monitor
MSI Z390 Gaming Pro Carbon AC
I9-9900k 
16 GB 3200 RAM
EVGA RTX 3080 Ti XC3 Gaming
EVGA SuperNOVA 850 G2, 80+ GOLD 850W 
1TB M.2 Windows 10 Home 64 bit
500GB SSD
Lian Li  pc-011-dynamic case
 
Windows up to date
Drivers up to date
Bios up to date
 


Hoping by the time your Step-Up hits, EVGA will address this issue and fix it.


My daily attempt to sound smart-
 
Doesn't the FTW series have the opposite problem? J2C was doing a video on new world and set his power cap to 100% but the card was showing like 120%. could this be EVGA being overly cautious so the card isn't a time bomb like the 3090's?


EVGA released these cards before that game started destroying 3090 cards. There is no correlation between the issues.
kongfra
Superclocked Member
  • Total Posts : 135
  • Reward points : 0
  • Joined: 2015/06/01 06:30:48
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:18:02 (permalink)
kram36
kongfra
Have you actually tried to RMA and see what EVGA says?  Maybe you just have bad card?  mine hits 348W in most demanding games like Red Dead Redemption 2 and Battlefield 5 without any overclocking or messing with any settings except the fan curve.  It also stays very cool 69-72 under full load and this is the XC3 base model


Post links to Time Spy, Port Royal and Fire Strike Extreme.




 
Since I only have the 3DMark demo, I only have Time Spy.
 
Here are my results: 19,003.
 
This is on an air-cooled Ti XC3 as well, all stock.
 
https://www.3dmark.com/3dm/64211712?

3080 TI XC3, i9-10850K, Noctua NH-D15S, Gigabyte Z590 Aorus Elite, Crucial Ballistix 32 GB Ram DDR4-3200 CL16 , EVGA Supernova G6 1000W 80+ Gold, Windows 10 Pro,  LG 27GP83B-B with Dual Dell S2716DG Monitor, 2 TB Crucial MX500 SSD, Phanteks Enthoo Pro Case
speedysloth
Superclocked Member
  • Total Posts : 132
  • Reward points : 0
  • Joined: 2021/06/09 18:23:46
  • Status: offline
  • Ribbons : 1
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 08:24:38 (permalink)
-Tax-
Same issue with my 12G-P5-3953-KR, no where near the 350 watts stated, usually under 300. I have put the card in the Step Up Queue, I don't feel sending it in for a replacement will fix the issue. With so many of these cards not performing as expected.
 
LG27GN950 4k 144hz monitor
MSI Z390 Gaming Pro Carbon AC
I9-9900k 
16 GB 3200 RAM
EVGA RTX 3080 Ti XC3 Gaming
EVGA SuperNOVA 850 G2, 80+ GOLD 850W 
1TB M.2 Windows 10 Home 64 bit
500GB SSD
Lian Li  pc-011-dynamic case
 
Windows up to date
Drivers up to date
Bios up to date
 


Is it possible for you to post Time Spy run results and maybe a picture of the entire HWiNFO64 GPU section after the Time Spy run finishes, showing temperatures and power draw? Thanks.
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 09:50:26 (permalink)
Just a thought: we need to find someone with an MSI 3080 Ti Ventus or a Gigabyte Eagle.
 
I've noticed that the Ventus and Eagle models always have the same TDPs as XC3s. 
 
https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842
 
If the Ventus and Eagle have the same problem, then it's likely a hardware-level issue for cards designed around 2x 8-pin with a 350W TDP in mind. 
 
This would be similar to the 3080 Tis with TDPs around 450W that hover at just 400W. 
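 
A quick way for Ventus/Eagle owners (or anyone here) to report apples-to-apples numbers would be to dump the power limits their vBIOS actually exposes. A minimal sketch, assuming the pynvml package is installed:

```python
# Quick sketch: print the default and min/max board power limits the vBIOS
# exposes, plus the currently enforced limit, so owners of different
# 350W-class cards can compare. Assumes pynvml.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

name = pynvml.nvmlDeviceGetName(gpu)
if isinstance(name, bytes):          # older pynvml versions return bytes
    name = name.decode()
default_w = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu) / 1000.0
min_w, max_w = (v / 1000.0 for v in
                pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu))
enforced_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0

print(f"{name}: default {default_w:.0f} W, range {min_w:.0f}-{max_w:.0f} W, "
      f"currently enforced {enforced_w:.0f} W")
pynvml.nvmlShutdown()
```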
post edited by KingEngineRevUp - 2021/07/26 09:52:38
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 10:41:44 (permalink)
KingEngineRevUp
Just a thought, we need to find someone with a MSI 3080 Ti Ventus or Gigabyte Eagle
 
I've noticed that the Ventus and Eagle models always have the same TDPs as XC3s. 
 
https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842
 
If the Ventus and Eagle have the same problem, then it's highly likely it might be a hardware level issue for cards designed with 2X 8-Pin with a 350W TDP in mind. 
 
This would be similar to 3080 Tis with TDPs around 450W that hover around just 400W only. 


The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.
 
With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.
post edited by kram36 - 2021/07/26 10:49:01
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: online
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 10:55:32 (permalink)
kram36
 
The 3080 Ti FE card is a 2X 8-Pin card and has no issue pulling 350w same with the Nvidia 3090 FE card, it's a 2X 8-Pin card.
 
With that game that started to kill 3090 cards. EVGA is really going to ignore this issue even more. What a bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 10:58:30 (permalink)
kram36
KingEngineRevUp
Just a thought, we need to find someone with a MSI 3080 Ti Ventus or Gigabyte Eagle
 
I've noticed that the Ventus and Eagle models always have the same TDPs as XC3s. 
 
https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842
 
If the Ventus and Eagle have the same problem, then it's highly likely it might be a hardware level issue for cards designed with 2X 8-Pin with a 350W TDP in mind. 
 
This would be similar to 3080 Tis with TDPs around 450W that hover around just 400W only. 


The 3080 Ti FE card is a 2X 8-Pin card and has no issue pulling 350w same with the Nvidia 3090 FE card, it's a 2X 8-Pin card.
 
With that game that started to kill 3090 cards. EVGA is really going to ignore this issue even more. What a bad timing.




Yes, but remember those cards do not follow the same reference design; they are more customized than most, if not all, other GPUs. 
 
I have a suspicion that these other 350W cards have the same behavior as your XC3, but we won't know until we find these owners. 
 
If they have the same issue, it sounds like a problem that needs to be taken up with Nvidia, and then more groups can join together to try and see if there is a software-level fix for this. 
 
I made a post on nvidia sub-reddit, waiting for it to get approved.
 
But I have observed a very common pattern. XC3, Ventus, Eagle and almost all of Zotac's cards all share the same TDPs
 
The Gigabyte Gaming, Vision, TUF and several other models usually follow a similar TDP
 
The Suprim X, FTW3, Strix, Master and Elite cards usually follow around the same TDPs
 
I've seen posters complain about similar TDP behaviors for the 3080s and 3090s between these cards. It can be a similar case for the 3080 Ti. This is my hypothesis.
 
1. There is a low TDP reference design
2. There is a middle TDP reference design that always pushes TDP around the FE
3. There is a high TDP reference design, usually the 3x 8-pin cards
 
Nvidia approves these reference designs; they probably made and sent out the reference designs for 1 and 2, and other partners just copied them and slightly modified the parts on the bill of materials. 
 
It's just a hunch for now.
 
Edit: In the ideal case, the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match their competitors. 
post edited by KingEngineRevUp - 2021/07/26 11:09:42
atfrico
Omnipotent Enthusiast
  • Total Posts : 12753
  • Reward points : 0
  • Joined: 2008/05/20 16:16:06
  • Location: <--Dip, Dip, Potato Chip!-->
  • Status: offline
  • Ribbons : 25
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:03:59 (permalink)
kram36
KingEngineRevUp
Just a thought, we need to find someone with a MSI 3080 Ti Ventus or Gigabyte Eagle
 
I've noticed that the Ventus and Eagle models always have the same TDPs as XC3s. 
 
https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842
 
If the Ventus and Eagle have the same problem, then it's highly likely it might be a hardware level issue for cards designed with 2X 8-Pin with a 350W TDP in mind. 
 
This would be similar to 3080 Tis with TDPs around 450W that hover around just 400W only. 


The 3080 Ti FE card is a 2X 8-Pin card and has no issue pulling 350w same with the Nvidia 3090 FE card, it's a 2X 8-Pin card.
 
With that game that started to kill 3090 cards. EVGA is really going to ignore this issue even more. What a bad timing.

Give them time kram. EVGA will address it. 😼

KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:07:03 (permalink)
atfrico
kram36
KingEngineRevUp
Just a thought, we need to find someone with a MSI 3080 Ti Ventus or Gigabyte Eagle
 
I've noticed that the Ventus and Eagle models always have the same TDPs as XC3s. 
 
https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842
 
If the Ventus and Eagle have the same problem, then it's highly likely it might be a hardware level issue for cards designed with 2X 8-Pin with a 350W TDP in mind. 
 
This would be similar to 3080 Tis with TDPs around 450W that hover around just 400W only. 


The 3080 Ti FE card is a 2X 8-Pin card and has no issue pulling 350w same with the Nvidia 3090 FE card, it's a 2X 8-Pin card.
 
With that game that started to kill 3090 cards. EVGA is really going to ignore this issue even more. What a bad timing.

Give them time kram. EVGA will address it. 😼



Or they won't. Several of us 3080 Ti FTW3 owners have made a stink about the FTW3 not being able to push and sustain its 450W TDP; we all average 400W and just touch 450W for a few milliseconds at a time.
 
We just got e-mails back saying "it works as intended." We never got a response as to why it works the way they claim. 
B0baganoosh
CLASSIFIED Member
  • Total Posts : 2365
  • Reward points : 0
  • Joined: 2009/08/04 04:27:18
  • Status: offline
  • Ribbons : 39
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:11:07 (permalink)
So, thinking about who has a "good" card and who has a "bad" card here, has anybody put together a table of the motherboards, CPUs (w/ speed), memory (w/ speed + timings), and PCIe config (gen? lanes?) that they're running with their cards?
 
I doubt it is strictly related, specifically because you're all seeing the Pwr limit flag, but it might help put to bed some of the other theories if you can show no correlation with things like "slow" memory or PCIe gen 3 (or x8 lanes only) usage, etc. 
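 
To make that concrete, here's a sketch of the kind of tally being suggested. The CSV filename and its columns (user, board, pcie_gen, ram_mhz, sustained_watts) are made up for the example, and the correlation call needs Python 3.10+:

```python
# Sketch of a community tally: collect user reports in a CSV (hypothetical
# columns) and check whether PCIe gen or memory speed actually tracks the
# sustained power draw people are seeing.
import csv
import statistics
from collections import defaultdict

with open("reports.csv", newline="") as f:   # hypothetical file of user reports
    rows = list(csv.DictReader(f))

by_gen = defaultdict(list)
for r in rows:
    by_gen[r["pcie_gen"]].append(float(r["sustained_watts"]))

for gen, watts in sorted(by_gen.items()):
    print(f"PCIe gen {gen}: n={len(watts)}, "
          f"median sustained {statistics.median(watts):.0f} W")

# Pearson correlation between RAM speed and sustained watts (Python 3.10+).
ram = [float(r["ram_mhz"]) for r in rows]
wattage = [float(r["sustained_watts"]) for r in rows]
if len(rows) >= 2:
    print(f"corr(ram_mhz, sustained_watts) = "
          f"{statistics.correlation(ram, wattage):+.2f}")
```

A correlation near zero for RAM speed and no separation between PCIe gen groups would put those theories to bed, as suggested above.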

 
i9 13900k - EVGA Z690 Classy - Nvidia RTX 4090 FE - G.Skill 32GB DDR5-6000  - WD SN850 2TB NVMe Gen4 - Be Quiet! Straight Power 12 1200W - Be Quiet! Dark Base 900 Pro. MO-RA3 420 Pro. Dark Palimpsest MODS RIGS post for build notes.
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:19:46 (permalink)
Nike_7688
So thinking about who has "good" card and "bad" cards here, has anybody put together a table of motherboards, CPUs (w/speed), memory (w/speed+timings), PCI-e config (gen? lanes?) that they're running with their card?
 
I doubt it is strictly related, specifically because you're all seeing the Pwr limit flag, but it might help put to bed some of the other theories if you can show no correlation between things like "slow" memory or PCI-e gen 3 (or 8x lanes only) usage, etc. 


There is not "good card." They're all not able to use the extra 50W to sustain higher clock speeds. 
 
OC silicon potential is not the same as being gimped and held back. If there's a good silicon card, it to would benefit from being able to access the 50W (63W total with max PL) more. 
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: online
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:24:04 (permalink)
KingEngineRevUp
 
 
1. There is a low TDP reference design
2. There is a middle TDP reference design that always pushes TDP around the FE
3. There is a high TDP reference design, usually the 3-Pin cards
 
Nvidia approves these reference designs and they probably made and sent the reference designs out for 1 and 2 and other partners just copied and slightly modified the parts on the bill of materials. 
 
It's just a hunch for now.
 
Edit: The ideal case, the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match their competitors. 




I've said it over and over.
 
This is not a case where the GPU will never reach a certain power draw. It's entirely use-case dependent.
 
I can make my card hit 350W... in the DX12 raytracing benchmark, and probably other loads that see very low memory controller utilization. The problem is that too many people are not standardizing their test plans: you have one person comparing their performance and power draw under X conditions, while another person compares under Y conditions, and yet another compares under Z conditions.
 
So does that mean our cards are "fine" if we can trick the card into running full power draw under one niche benchmark test? I don't think so. In my opinion the power limit should be enforced to the same value regardless of use case, but that's not what's happening, and clearly some cards are performing better than others. I showed this in my pre- and post-RMA testing, eliminating the excuse of "oh, it's just system-to-system differences and CPU/RAM differences." If that were the case, I wouldn't see different behavior between my original card and my RMA, yet I did. I also eliminated the software-based excuses of "OS or driver" problems by demonstrating the same behavior on both Linux and Windows.
 
The best correlation I've shown with my own card is that memory controller load and the observed effective power limit exhibit an inverse relationship: high mem load = low power limit, low mem load = higher power limit.
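
If anyone wants to try reproducing that correlation on their own card, here's a rough sketch (pynvml assumed; statistics.correlation needs Python 3.10+) that logs memory controller load against power draw during a run and prints the Pearson correlation. A clearly negative number would support the inverse relationship described above:

```python
# Rough sketch: sample (memory controller load, power draw) pairs during a run
# and check whether they move inversely. Assumes pynvml and Python 3.10+.
import time
import statistics
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

mem_loads, watts = [], []
try:
    for _ in range(120):                      # ~2 minutes of 1 Hz samples
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
        draw = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0
        mem_loads.append(util.memory)         # memory controller load, %
        watts.append(draw)
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

# Negative value = higher memory controller load goes with lower power draw.
print(f"corr(mem controller load, power draw) = "
      f"{statistics.correlation(mem_loads, watts):+.2f}")
```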

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

UnknownOCPlayer
New Member
  • Total Posts : 3
  • Reward points : 0
  • Joined: 2021/04/08 14:29:07
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:26:22 (permalink)
Has anyone asked about the positioning of the cards, and whether MSI Afterburner is also being used, etc.?
 
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:31:24 (permalink)
gsrcrxsi
KingEngineRevUp
 
 
1. There is a low TDP reference design
2. There is a middle TDP reference design that always pushes TDP around the FE
3. There is a high TDP reference design, usually the 3-Pin cards
 
Nvidia approves these reference designs and they probably made and sent the reference designs out for 1 and 2 and other partners just copied and slightly modified the parts on the bill of materials. 
 
It's just a hunch for now.
 
Edit: The ideal case, the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match their competitors. 




I've said it over and over.
 
this is not a case where the GPU will never reach a certain power draw. it's entirely use-case dependent.
 
I can make my card hit 350W... in the DX12 Raytracing benchmark. and probably other loads that see very low memory controller utilization. the problem is that too many people are not standardizing their test plans, and you have one person comparing their performance and power draw under X conditions, while another person compares under Y conditions, and yet another compares under Z conditions.
 
so does that mean that our cards are "fine" if we can trick the card into running full power draw under one niche benchmark test? I don't think so. In my opinion the power limit should be enforced to the same value, regardless of use-case. but that's not what's happening. and clearly some cards are performing better than others. I showed this in my pre and post RMA testing, eliminating the excuse of "oh it's just system to system differences and CPU/ram differences". If that was the case, I wouldnt see different behavior between my original card and my RMA, yet I did. and I also eliminated the software based excuses of "OS or driver" problems by demonstrating this same behavior on both Linux and Windows.
 
the best correlation I've shown with my own card is that memory controller load and observed effective power limit exhibit an inverse relationship. high mem loads = low power limit, and low mem load = higher power limit.


There is a difference between "hitting" a number and "sustaining" it.
 
I want EVGA to explain the scenarios for both. Not you or me. It's time for them to be transparent with us.
post edited by KingEngineRevUp - 2021/07/26 11:32:31
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: online
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:31:57 (permalink)
Nike_7688
So thinking about who has "good" card and "bad" cards here, has anybody put together a table of motherboards, CPUs (w/speed), memory (w/speed+timings), PCI-e config (gen? lanes?) that they're running with their card?
 
I doubt it is strictly related, specifically because you're all seeing the Pwr limit flag, but it might help put to bed some of the other theories if you can show no correlation between things like "slow" memory or PCI-e gen 3 (or 8x lanes only) usage, etc. 




these things don't matter to this issue. people have taken the same card and put it in different PCs and seen the same issue.
 
and conversely, I don't know about anyone else, but at least I have. tried different examples of the same exact card (same sku, different serial number), in the same exact system, and observed different behavior with some cards pulling more power and clocking higher.
 
cards having different performance in the same sku is normal and expected. some cards naturally clock higher. different cards of the same sku seeing different effective power limits is not normal though.

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: online
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:32:26 (permalink)
KingEngineRevUp
gsrcrxsi
KingEngineRevUp
 
 
1. There is a low TDP reference design
2. There is a middle TDP reference design that always pushes TDP around the FE
3. There is a high TDP reference design, usually the 3-Pin cards
 
Nvidia approves these reference designs and they probably made and sent the reference designs out for 1 and 2 and other partners just copied and slightly modified the parts on the bill of materials. 
 
It's just a hunch for now.
 
Edit: The ideal case, the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match their competitors. 




I've said it over and over.
 
this is not a case where the GPU will never reach a certain power draw. it's entirely use-case dependent.
 
I can make my card hit 350W... in the DX12 Raytracing benchmark. and probably other loads that see very low memory controller utilization. the problem is that too many people are not standardizing their test plans, and you have one person comparing their performance and power draw under X conditions, while another person compares under Y conditions, and yet another compares under Z conditions.
 
so does that mean that our cards are "fine" if we can trick the card into running full power draw under one niche benchmark test? I don't think so. In my opinion the power limit should be enforced to the same value, regardless of use-case. but that's not what's happening. and clearly some cards are performing better than others. I showed this in my pre and post RMA testing, eliminating the excuse of "oh it's just system to system differences and CPU/ram differences". If that was the case, I wouldnt see different behavior between my original card and my RMA, yet I did. and I also eliminated the software based excuses of "OS or driver" problems by demonstrating this same behavior on both Linux and Windows.
 
the best correlation I've shown with my own card is that memory controller load and observed effective power limit exhibit an inverse relationship. high mem loads = low power limit, and low mem load = higher power limit.


There is a difference between "hitting" and there is a difference between "sustaining."

I want EVGA to explain the scenarios for both. Not you.



I can sustain 345-350W for the duration of the DX12 test.
 
I only reference sustained numbers as being noteworthy values. I don't give much attention to spikes.
 
einstein = 285W sustained
furmark = 304W sustained
timespy = 320-330W sustained
DX12 RT = 345-350W sustained
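 
As a side note on hitting vs. sustaining: if you log power at 1 Hz (for example with nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits -l 1 > power.log), a short sketch like this separates a momentary spike from a number the card actually holds. The log filename and the 30-second window are just assumptions for the example:

```python
# Sketch: distinguish "hitting" a wattage from "sustaining" it, from a simple
# 1 Hz power log (one power.draw sample in watts per line).
import statistics

with open("power.log") as f:                  # hypothetical log file
    samples = [float(line) for line in f if line.strip()]

window = 30  # seconds per "sustained" window
rolling_medians = [
    statistics.median(samples[i:i + window])
    for i in range(max(1, len(samples) - window + 1))
]

print(f"peak single sample  : {max(samples):.1f} W  (a spike)")
print(f"best {window}s sustained: {max(rolling_medians):.1f} W")
```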
post edited by gsrcrxsi - 2021/07/26 11:34:42

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V
