EVGA

Dear EVGA, Your 3080 Ti XC3 Cards Suck. (Locked)

Showing page 5 of 11
Author
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:35:06 (permalink)
gsrcrxsi
KingEngineRevUp
gsrcrxsi
KingEngineRevUp


1. There is a low-TDP reference design
2. There is a mid-TDP reference design that lands around the FE's TDP
3. There is a high-TDP reference design, usually the 3x 8-pin cards

Nvidia approves these reference designs; they probably made and sent out the reference designs for 1 and 2, and other partners just copied them and slightly modified the parts on the bill of materials.

It's just a hunch for now.

Edit: The ideal case is that the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match its competitors.




I've said it over and over.

This is not a case where the GPU will never reach a certain power draw; it's entirely use-case dependent.

I can make my card hit 350W in the DX12 Raytracing benchmark, and probably in other loads that see very low memory-controller utilization. The problem is that too many people are not standardizing their test plans: one person compares performance and power draw under X conditions, another under Y conditions, and yet another under Z conditions.

So does that mean our cards are "fine" if we can trick the card into running at full power draw under one niche benchmark test? I don't think so. In my opinion the power limit should be enforced at the same value regardless of use case, but that's not what's happening, and clearly some cards are performing better than others. I showed this in my pre- and post-RMA testing, eliminating the excuse of "oh, it's just system-to-system differences and CPU/RAM differences". If that were the case, I wouldn't see different behavior between my original card and my RMA, yet I did. I also eliminated the software-based excuses of "OS or driver" problems by demonstrating this same behavior on both Linux and Windows.

The best correlation I've shown with my own card is that memory-controller load and observed effective power limit exhibit an inverse relationship: high memory load = low power limit, and low memory load = higher power limit.


There is a difference between "hitting" and "sustaining."

I want EVGA to explain the scenarios for both. Not you.



I can sustain 345-350W for the duration of the DX12 test.


But like you said, you have to "trick" the card into doing it, right?

Edit: I saw you edited your post.

Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China or Taiwan?
post edited by KingEngineRevUp - 2021/07/26 11:40:54
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:40:19 (permalink)
KingEngineRevUp
gsrcrxsi
KingEngineRevUp
gsrcrxsi
KingEngineRevUp


1. There is a low-TDP reference design
2. There is a mid-TDP reference design that lands around the FE's TDP
3. There is a high-TDP reference design, usually the 3x 8-pin cards

Nvidia approves these reference designs; they probably made and sent out the reference designs for 1 and 2, and other partners just copied them and slightly modified the parts on the bill of materials.

It's just a hunch for now.

Edit: The ideal case is that the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match its competitors.




I've said it over and over.

This is not a case where the GPU will never reach a certain power draw; it's entirely use-case dependent.

I can make my card hit 350W in the DX12 Raytracing benchmark, and probably in other loads that see very low memory-controller utilization. The problem is that too many people are not standardizing their test plans: one person compares performance and power draw under X conditions, another under Y conditions, and yet another under Z conditions.

So does that mean our cards are "fine" if we can trick the card into running at full power draw under one niche benchmark test? I don't think so. In my opinion the power limit should be enforced at the same value regardless of use case, but that's not what's happening, and clearly some cards are performing better than others. I showed this in my pre- and post-RMA testing, eliminating the excuse of "oh, it's just system-to-system differences and CPU/RAM differences". If that were the case, I wouldn't see different behavior between my original card and my RMA, yet I did. I also eliminated the software-based excuses of "OS or driver" problems by demonstrating this same behavior on both Linux and Windows.

The best correlation I've shown with my own card is that memory-controller load and observed effective power limit exhibit an inverse relationship: high memory load = low power limit, and low memory load = higher power limit.


There is a difference between "hitting" and "sustaining."

I want EVGA to explain the scenarios for both. Not you.



I can sustain 345-350W for the duration of the DX12 test.


But like you said, you have to "trick" the card into doing it, right?



"trick" in the sense of running the right workload to put the card under the exact right conditions to actually give the advertised power limit
 
power draw can absolutely vary based on workload, however power LIMIT should not vary like that.
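To make that distinction concrete, here is a minimal sketch of the kind of check being described. The numbers are synthetic, not measurements from any real card, and the capped flag stands in for the "Pwr" PerfCap reason that tools like GPU-Z report: take the maximum draw seen while power-capped in each workload as that workload's effective limit. On a healthy card, every power-capped workload should converge on roughly the same number.

```python
from collections import defaultdict

# Synthetic samples for illustration: (workload, draw in W, power-capped?).
# The capped flag mirrors the "Pwr" PerfCap reason reported by GPU-Z;
# the wattages here are made up, not real measurements.
samples = [
    ("dx12_rt", 349.0, True), ("dx12_rt", 348.5, True),
    ("compute", 301.2, True), ("compute", 299.8, True),
    ("idle",     45.0, False),
]

def effective_limits(samples):
    """Max draw observed while power-capped, per workload."""
    limits = defaultdict(float)
    for workload, draw, capped in samples:
        if capped:
            limits[workload] = max(limits[workload], draw)
    return dict(limits)

print(effective_limits(samples))
```

On this synthetic data the DX12 workload tops out near 349W while the compute workload is capped around 301W, which is exactly the varying effective limit being complained about here.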

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:42:34 (permalink)
gsrcrxsi
kram36
 
The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).


So they are essentially 2x 6-pin cards?
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:42:43 (permalink)
Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China vs. Taiwan?
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:45:06 (permalink)
kram36
gsrcrxsi
kram36
 
The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).


So they are essentially 2x 6-pin cards?


It's a new plug of its own; think of it as that, not as 2x 6-pin.

https://www.techpowerup.c...ith-nvidia-ampere-gpus

It has to follow a higher standard, with better pins and a thicker minimum wire gauge.
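For rough context, a back-of-the-envelope on the connector budgets being argued about. The slot and 8-pin numbers are the usual PCIe CEM spec values; the per-pin Micro-Fit rating is an assumption taken from Molex's published figures, not anything EVGA or Nvidia has confirmed:

```python
# Commonly cited spec values; the Micro-Fit per-pin rating is an
# assumption from Molex's published figures, not a vendor-confirmed number.
PCIE_SLOT_W = 75          # PCIe CEM spec: power from the x16 slot
EIGHT_PIN_W = 150         # PCIe CEM spec: per 8-pin connector
MICROFIT_PIN_AMPS = 8.5   # Molex Micro-Fit 3.0 per-circuit rating (approx.)
PAIRS_12PIN = 6           # the 12-pin carries 6x 12V pins + 6x ground
VOLTS = 12

budget_2x8 = PCIE_SLOT_W + 2 * EIGHT_PIN_W
budget_12pin = PCIE_SLOT_W + PAIRS_12PIN * MICROFIT_PIN_AMPS * VOLTS

print(f"2x 8-pin + slot: {budget_2x8} W")        # 375 W
print(f"12-pin + slot:   {budget_12pin:.0f} W")  # 687 W
```

Either way, a 350W TDP already fits comfortably inside the 375W that 2x 8-pin plus the slot provides, so the 12-pin's higher rating is about headroom, not about whether 350W is reachable.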
post edited by KingEngineRevUp - 2021/07/26 11:46:45
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:46:14 (permalink)
KingEngineRevUp
Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China vs. Taiwan?

Both my cards came from Taiwan.
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:48:19 (permalink)
KingEngineRevUp
kram36
gsrcrxsi
kram36
 
The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).


So they are essentially 2x 6-pin cards?


It's a new plug of its own; think of it as that, not as 2x 6-pin.

https://www.techpowerup.c...ith-nvidia-ampere-gpus

Doesn't matter; they use 2x 8-pin cables. If Nvidia can support a 3090 from that, then EVGA should be able to support its advertised 350W on the 3080 Ti XC3 card.
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:48:24 (permalink)
kram36
KingEngineRevUp
Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China vs. Taiwan?

Both my cards came from Taiwan.


I'm wondering where both of gsrcrxsi's cards were made, since he has one that is noticeably better.

It could possibly be that the factories are using different controllers and/or other parts.

kram36
KingEngineRevUp
kram36
gsrcrxsi
kram36

The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).


So they are essentially 2x 6-pin cards?


It's a new plug of its own; think of it as that, not as 2x 6-pin.

https://www.techpowerup.c...ith-nvidia-ampere-gpus

Doesn't matter; they use 2x 8-pin cables. If Nvidia can support a 3090 from that, then EVGA should be able to support its advertised 350W on the 3080 Ti XC3 card.


Read my edit. I'm just trying to explain to you the difference between the plugs. The standard for the Nvidia one is higher (pin quality, minimum wire gauge, etc.).

Your last point is true and I'm not disagreeing with it.

As for why the XC3 is different from the FE: it doesn't use the same design. Different components, board layout, wire routing, etc. PNY designed them a different board than the one Nvidia themselves designed.
post edited by KingEngineRevUp - 2021/07/26 11:50:45
B0baganoosh
CLASSIFIED Member
  • Total Posts : 2366
  • Reward points : 0
  • Joined: 2009/08/04 04:27:18
  • Status: offline
  • Ribbons : 39
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:49:14 (permalink)
gsrcrxsi
Nike_7688
So, thinking about who has "good" cards and "bad" cards here: has anybody put together a table of the motherboards, CPUs (w/ speed), memory (w/ speed+timings), and PCIe configs (gen? lanes?) that they're running with their card?

I doubt it is strictly related, specifically because you're all seeing the Pwr limit flag, but it might help put to bed some of the other theories if you can show no correlation with things like "slow" memory or PCIe gen 3 (or x8 lanes only) usage, etc.




These things don't matter to this issue. People have taken the same card, put it in different PCs, and seen the same issue.

And conversely, I don't know about anyone else, but at least I have tried different examples of the same exact card (same SKU, different serial number) in the same exact system, and observed different behavior, with some cards pulling more power and clocking higher.

Cards having different performance within the same SKU is normal and expected; some cards naturally clock higher. Different cards of the same SKU seeing different effective power limits is not normal, though.




I agree. I wasn't suggesting you'll find an "oh, it's not the card" answer. I just thought it might help put it to bed for good lol. Your original test vs. RMA was the most helpful to me as it was a very specific apples to apples test in the same system.

6Q6CPFHPBPCU691 is a discount code anyone can use.
 
i9 13900k - EVGA Z690 Classy - Nvidia RTX 4090 FE - G.Skill 32GB DDR5-6000  - WD SN850 2TB NVMe Gen4 - Be Quiet! Straight Power 12 1200W - Be Quiet! Dark Base 900 Pro. MO-RA3 420 Pro. Dark Palimpsest MODS RIGS post for build notes.
talon951
FTW Member
  • Total Posts : 1026
  • Reward points : 0
  • Joined: 2020/10/06 02:41:19
  • Status: offline
  • Ribbons : 3
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:52:42 (permalink)
KingEngineRevUp
kram36
KingEngineRevUp
Just a thought: we need to find someone with an MSI 3080 Ti Ventus or a Gigabyte Eagle.

I've noticed that the Ventus and Eagle models always have the same TDPs as the XC3.

https://www.techpowerup.com/gpu-specs/msi-rtx-3080-ti-ventus-3x-oc.b8767
https://www.techpowerup.com/gpu-specs/gigabyte-rtx-3080-ti-eagle.b8842

If the Ventus and Eagle have the same problem, then it's likely a hardware-level issue for cards designed around 2x 8-pin with a 350W TDP in mind.

This would be similar to the 3080 Tis with TDPs around 450W that hover around just 400W.


The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Yes, but remember those cards do not follow the same reference design; they are more customized than most, if not all, other GPUs.

I have a suspicion that these other 350W cards have the same behavior as your XC3, but we won't know until we find those owners.

If they have the same issue, it sounds like a problem that needs to be taken up with Nvidia, and then more groups can join together to see if there is a software-level fix for this.

I made a post on the Nvidia subreddit; waiting for it to get approved.

But I have observed a very common pattern: the XC3, Ventus, Eagle, and almost all of Zotac's cards share the same TDPs.

The Gigabyte Gaming, Vision, TUF, and several other models usually follow a similar TDP.

The Suprim X, FTW3, Strix, Master, and Elite cards usually follow around the same TDPs.

I've seen posters complain about similar TDP behaviors between these cards for the 3080s and 3090s. It could be a similar case for the 3080 Ti. This is my hypothesis:
 
1. There is a low-TDP reference design
2. There is a mid-TDP reference design that lands around the FE's TDP
3. There is a high-TDP reference design, usually the 3x 8-pin cards

Nvidia approves these reference designs; they probably made and sent out the reference designs for 1 and 2, and other partners just copied them and slightly modified the parts on the bill of materials.

It's just a hunch for now.

Edit: The ideal case is that the other 350W TDP cards can consistently go above 300W, so why can't the XC3? Then more pressure can be put on EVGA to match its competitors.


There have been people on the OC'ers forum with other brands and this same problem. There might be some that don't have it, but I suspect plenty do.

My guess is there is a limit specified, or at least recommended, by Nvidia that is being hit, which is why you see it on multiple AIBs' cards.
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:53:52 (permalink)
KingEngineRevUp
Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China vs. Taiwan?

My two cards' serial numbers differed only in the last two digits, the latter being just 13 higher; otherwise identical, and hence produced in the same factory.

So that's a non-starter.

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:55:49 (permalink)
KingEngineRevUp
kram36
KingEngineRevUp
Do you know where your first and second card were made?

https://forums.evga.com/D...l-number-m2981084.aspx

China vs. Taiwan?

Both my cards came from Taiwan.


I'm wondering where both of gsrcrxsi's cards were made, since he has one that is noticeably better.

It could possibly be that the factories are using different controllers and/or other parts.

kram36
KingEngineRevUp
kram36
gsrcrxsi
kram36

The 3080 Ti FE card is a 2x 8-pin card and has no issue pulling 350W; same with the Nvidia 3090 FE card, which is also a 2x 8-pin card.

With that game that started killing 3090 cards, EVGA is really going to ignore this issue even more. What bad timing.




Technically, it's a 1x 12-pin card that ships with a 2x 8-pin adapter, which you may or may not need (depending on whether you get the custom cable provided by your PSU manufacturer).


So they are essentially 2x 6-pin cards?


It's a new plug of its own; think of it as that, not as 2x 6-pin.

https://www.techpowerup.c...ith-nvidia-ampere-gpus

Doesn't matter; they use 2x 8-pin cables. If Nvidia can support a 3090 from that, then EVGA should be able to support its advertised 350W on the 3080 Ti XC3 card.


Read my edit. I'm just trying to explain to you the difference between the plugs. The standard for the Nvidia one is higher (pin quality, minimum wire gauge, etc.).

Your last point is true and I'm not disagreeing with it.

As for why the XC3 is different from the FE: it doesn't use the same design. Different components, board layout, wire routing, etc. PNY designed them a different board than the one Nvidia themselves designed.

EVGA advertises the card as a 350W card; I'm lucky to average 300W on both my cards. Nvidia advertises the FE as a 350W card and it gets 350W, same as the 3090, from a 2x 8-pin power source. I don't think Nvidia came up with a miracle 12-pin power plug that makes more power than is fed into it.
post edited by kram36 - 2021/07/26 11:58:47
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 11:58:30 (permalink)
talon951

There have been people on the OC'ers forum with other brands and this same problem. There might be some that don't have it, but I suspect plenty do.

My guess is there is a limit specified, or at least recommended, by Nvidia that is being hit, which is why you see it on multiple AIBs' cards.



Yep, we're both on that forum together and have noticed similar behaviors, like the 375-380W TDP people only being able to hit 350W no matter what they do, right? Are you talking about those guys?
post edited by KingEngineRevUp - 2021/07/26 12:05:50
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:01:07 (permalink)
KingEngineRevUp
talon951



There have been people on the OC'ers forum with other brands and this same problem. There might be some that don't have it, but I suspect plenty do.

My guess is there is a limit specified, or at least recommended, by Nvidia that is being hit, which is why you see it on multiple AIBs' cards.


Yep, we're both on that forum together and have noticed similar behaviors, like the 375-380W TDP people only being able to hit 350W no matter what they do, right? Are you talking about those guys?


If I could hit 350W with my XC3 cards, I would be tickled pink. Then it would be down to the silicon lottery, not under-advertised power.
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:01:09 (permalink)
I would like other 3080 Ti XC3 owners to run the DX12 RT benchmark and log their power draw, to see if they get at least close to full power-limit draw there.
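If anyone wants a repeatable way to log it, running `nvidia-smi --query-gpu=power.draw,power.limit,utilization.memory --format=csv -l 1 -f log.csv` alongside the benchmark produces a CSV; the exact header names can vary by driver version, so treat the sample rows below as an assumption. A short script to summarize such a log:

```python
import csv
import io
import statistics

# Sample rows in the shape of nvidia-smi's CSV query output; these
# values are illustrative, not measurements from a real XC3.
SAMPLE = """\
power.draw [W], power.limit [W], utilization.memory [%]
348.12 W, 350.00 W, 4 %
297.55 W, 350.00 W, 62 %
301.03 W, 350.00 W, 58 %
346.80 W, 350.00 W, 5 %
"""

def summarize(csv_text):
    """Return (min, mean, max) of the logged power draw, in watts."""
    rows = csv.DictReader(io.StringIO(csv_text), skipinitialspace=True)
    draws = [float(r["power.draw [W]"].rstrip(" W")) for r in rows]
    return min(draws), statistics.mean(draws), max(draws)

lo, avg, hi = summarize(SAMPLE)
print(f"power draw: min {lo:.1f} W, mean {avg:.1f} W, max {hi:.1f} W")
```

A max that stays well below the advertised power limit for the whole run would be the red flag, and posting min/mean/max alongside the workload name would also standardize what everyone is comparing.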

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:04:38 (permalink)
kram36


EVGA advertises the card as a 350W card; I'm lucky to average 300W on both my cards. Nvidia advertises the FE as a 350W card and it gets 350W, same as the 3090, from a 2x 8-pin power source. I don't think Nvidia came up with a miracle 12-pin power plug that makes more power than is fed into it.




Kram, let's start over. This has nothing to do with the pins but with how the board itself is designed (the controllers, etc.).

NVIDIA releases a reference board design; if a vendor fabricated the exact reference board NVIDIA sent them, they would have an NVIDIA 3080 Ti reference card. Vendors take these reference designs and either customize them slightly (these cards can even flash one another's BIOS, including a reference BIOS) or customize them fully to be very different (these cards can't flash a reference BIOS, and the reference can't flash theirs, without risk of bricking).

In the past, Nvidia would do this themselves and name them Reference or FE. They would also have other vendors produce these "reference" designs and sell them as reference boards. But that's not how it works this generation.

The Nvidia 30-series FE cards were fully customized to be completely different, like when a partner takes the reference design as a guideline and fully customizes their card to be quite unique.

So that's the difference between your XC3 and the FE.
kongfra
Superclocked Member
  • Total Posts : 135
  • Reward points : 0
  • Joined: 2015/06/01 06:30:48
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:06:00 (permalink)
gsrcrxsi
I would like other 3080 Ti XC3 owners to run the DX12 RT benchmark and log their power draw, to see if they get at least close to full power-limit draw there.


What is the DX12 RT benchmark? I will run it.

3080 TI XC3, i9-10850K, Noctua NH-D15S, Gigabyte Z590 Aorus Elite, Crucial Ballistix 32 GB Ram DDR4-3200 CL16 , EVGA Supernova G6 1000W 80+ Gold, Windows 10 Pro,  LG 27GP83B-B with Dual Dell S2716DG Monitor, 2 TB Crucial MX500 SSD, Phanteks Enthoo Pro Case
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:11:17 (permalink)
KingEngineRevUp
kram36


EVGA advertises the card as a 350W card; I'm lucky to average 300W on both my cards. Nvidia advertises the FE as a 350W card and it gets 350W, same as the 3090, from a 2x 8-pin power source. I don't think Nvidia came up with a miracle 12-pin power plug that makes more power than is fed into it.




Kram, let's start over. This has nothing to do with the pins but with how the board itself is designed (the controllers, etc.).

NVIDIA releases a reference board design; if a vendor fabricated the exact reference board NVIDIA sent them, they would have an NVIDIA 3080 Ti reference card. Vendors take these reference designs and either customize them slightly (these cards can even flash one another's BIOS, including a reference BIOS) or customize them fully to be very different (these cards can't flash a reference BIOS, and the reference can't flash theirs, without risk of bricking).

In the past, Nvidia would do this themselves and name them Reference or FE. They would also have other vendors produce these "reference" designs and sell them as reference boards. But that's not how it works this generation.

The Nvidia 30-series FE cards were fully customized to be completely different, like when a partner takes the reference design as a guideline and fully customizes their card to be quite unique.

So that's the difference between your XC3 and the FE.


I understand that; you're missing my point. I cannot get either of my two cards to hit EVGA's advertised 350W; I'm lucky to average 300W. If Nvidia can do it, then so can EVGA. I want what I paid for and I'm not getting it. My cards are crap cards.
atfrico
Omnipotent Enthusiast
  • Total Posts : 12753
  • Reward points : 0
  • Joined: 2008/05/20 16:16:06
  • Location: <--Dip, Dip, Potato Chip!-->
  • Status: offline
  • Ribbons : 25
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:11:36 (permalink)
Have you tried folding with the GPU, Kram, to see if you get the same results? I am a bit curious about the wattage 🤔

Those who abuse power, are nothing but scumbags! The challenge of power is how to use it and not abuse it. The abuse of power that seems to create the most unhappiness is when a person uses personal power to get ahead without regards to the welfare of others, people are obsessed with it. You can take a nice person and turn them into a slob, into an insane being, craving power, destroying anything that stands in their way.
 
 
Affiliate Code: 3T15O1S07G
KingEngineRevUp
FTW Member
  • Total Posts : 1030
  • Reward points : 0
  • Joined: 2019/03/28 16:38:54
  • Status: offline
  • Ribbons : 9
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:23:44 (permalink)
kram36
KingEngineRevUp
kram36


EVGA advertises the card as a 350W card; I'm lucky to average 300W on both my cards. Nvidia advertises the FE as a 350W card and it gets 350W, same as the 3090, from a 2x 8-pin power source. I don't think Nvidia came up with a miracle 12-pin power plug that makes more power than is fed into it.




Kram, let's start over. This has nothing to do with the pins but with how the board itself is designed (the controllers, etc.).

NVIDIA releases a reference board design; if a vendor fabricated the exact reference board NVIDIA sent them, they would have an NVIDIA 3080 Ti reference card. Vendors take these reference designs and either customize them slightly (these cards can even flash one another's BIOS, including a reference BIOS) or customize them fully to be very different (these cards can't flash a reference BIOS, and the reference can't flash theirs, without risk of bricking).

In the past, Nvidia would do this themselves and name them Reference or FE. They would also have other vendors produce these "reference" designs and sell them as reference boards. But that's not how it works this generation.

The Nvidia 30-series FE cards were fully customized to be completely different, like when a partner takes the reference design as a guideline and fully customizes their card to be quite unique.

So that's the difference between your XC3 and the FE.


I understand that; you're missing my point. I cannot get either of my two cards to hit EVGA's advertised 350W; I'm lucky to average 300W. If Nvidia can do it, then so can EVGA. I want what I paid for and I'm not getting it. My cards are crap cards.




Okay, you understand that part. Now let me revisit my other post, where I speculated the following: Nvidia might have 3 reference designs.

1. 350W TDP
2. 375-380W TDP
3. 400W+ TDP

They might have forced the vendors to follow a certain design where the hardware regulates things, and that is the root cause of why you can't sustain 350W whenever you want.

If this is true, and the Ventus, Eagle, and Zotac cards (which seem to share a similar board design with the XC3) suffer from your issue, then it might have to be taken up with NVIDIA. Your cause might be bigger than you know.
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:28:57 (permalink)
kongfra
gsrcrxsi
I would like other 3080 Ti XC3 owners to run the DX12 RT benchmark and log their power draw, to see if they get at least close to full power-limit draw there.


What is the DX12 RT benchmark? I will run it.


The 3DMark DX12 Raytracing benchmark; it's included with 3DMark.

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:32:20 (permalink)
kram36
I understand that; you're missing my point. I cannot get either of my two cards to hit EVGA's advertised 350W; I'm lucky to average 300W. If Nvidia can do it, then so can EVGA. I want what I paid for and I'm not getting it. My cards are crap cards.


What power draw is observed while running the DX12 RT benchmark?

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

kongfra
Superclocked Member
  • Total Posts : 135
  • Reward points : 0
  • Joined: 2015/06/01 06:30:48
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:35:33 (permalink)
gsrcrxsi
kongfra
gsrcrxsi
I would like other 3080 Ti XC3 owners to run the DX12 RT benchmark and log their power draw, to see if they get at least close to full power-limit draw there.


What is the DX12 RT benchmark? I will run it.


The 3DMark DX12 Raytracing benchmark; it's included with 3DMark.




 
Oh, I only have the 3DMark demo. I would have to purchase the full suite; I know it's only $30, but it's not worth it for me, unless someone knows where to get it cheap.

3080 TI XC3, i9-10850K, Noctua NH-D15S, Gigabyte Z590 Aorus Elite, Crucial Ballistix 32 GB Ram DDR4-3200 CL16 , EVGA Supernova G6 1000W 80+ Gold, Windows 10 Pro,  LG 27GP83B-B with Dual Dell S2716DG Monitor, 2 TB Crucial MX500 SSD, Phanteks Enthoo Pro Case
gsrcrxsi
SSC Member
  • Total Posts : 985
  • Reward points : 0
  • Joined: 2010/01/24 19:20:59
  • Status: offline
  • Ribbons : 5
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:38:31 (permalink)
It was on sale not too long ago, and I bought the whole thing for like $4, lol.

Rig1: EPYC 7V12 | [4] RTX A4000
Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
Rig4: [2] EPYC 7742 | RTX A2000
Rig5: [2] EPYC 7642
Rig6: EPYC 7551 | [4] Titan V

kongfra
Superclocked Member
  • Total Posts : 135
  • Reward points : 0
  • Joined: 2015/06/01 06:30:48
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:41:01 (permalink)
gsrcrxsi
It was on sale not too long ago, and I bought the whole thing for like $4, lol.




 
I missed it, probably before I got my new build up and running with my 3080 Ti. I haven't bought anything new in a while since my older build was barely able to play anything. I even avoided the entire Steam summer sale since I still have a huge backlog, LOL.

3080 TI XC3, i9-10850K, Noctua NH-D15S, Gigabyte Z590 Aorus Elite, Crucial Ballistix 32 GB Ram DDR4-3200 CL16 , EVGA Supernova G6 1000W 80+ Gold, Windows 10 Pro,  LG 27GP83B-B with Dual Dell S2716DG Monitor, 2 TB Crucial MX500 SSD, Phanteks Enthoo Pro Case
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:41:35 (permalink)
atfrico
Have you tried folding with the GPU Kram to see if you get the same results? I am a bit curious about the wattage🤔

Well, that's interesting; however, folding is nothing like gaming. It's only putting an 88% load on the GPU and a 4% load on the memory, which doesn't require over 264W.
 

 
phroze
SSC Member
  • Total Posts : 799
  • Reward points : 0
  • Joined: 2018/09/17 20:09:17
  • Location: WA State
  • Status: offline
  • Ribbons : 0
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:47:47 (permalink)
kongfra
gsrcrxsi
kongfra
gsrcrxsi
I would like other 3080ti XC3 owners to run the DX12 RT benchmark and log their power draw. see if they see at least close to full power limit draw there.


What is the DX12 RT benchmark? I will run it


the 3DMark DX12 Raytracing benchmark. it's included with 3DMark




 
Oh, I only have the 3DMark demo. I would have to purchase the full suite; I know it's only $30, but it's not worth it for me unless someone knows where to get it cheap.


here you go
https://www.g2a.com/3dmar...global-i10000044767004

Case: Lian Li O11 Dynamic XL
Mobo: Asrock X570 Taichi
CPU: Ryzen 5900x
GPU: EVGA RTX 3090 FTW3 Ultra
RAM: Crucial Ballistix OC to 3800 16 18 18 1:1
PSU: EVGA SuperNova G2 1600w
Cooling: Custom hardline loop: optimus blocks, primochill stuff, lian li stuff, HW Labs 60mm radiators, custom stuff
kram36
The Destroyer
  • Total Posts : 21477
  • Reward points : 0
  • Joined: 2009/10/27 19:00:58
  • Location: United States
  • Status: offline
  • Ribbons : 72
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:50:35 (permalink)
KingEngineRevUp
kram36
KingEngineRevUp
kram36


EVGA advertises the card as a 350W card; I'm lucky to average 300W on both my cards. Nvidia advertises the FE as a 350W card and it gets 350W, same as the 3090, from a 2x 8-pin power source. I don't think Nvidia came up with a miracle 12-pin power plug that makes more power than is fed into it.




Kram, let's start over. This has nothing to do with the pins but with the way the board itself is possibly designed (the controllers, etc.). 
 
NVIDIA releases a reference board design; if partners fabricated the exact reference board that NVIDIA sent them, they would have an NVIDIA 3080 Ti reference card. Vendors take these reference designs and either customize them slightly (these cards could even have their BIOS flashed to one another, including the reference BIOS) or fully customize them to be very different (these cards couldn't flash any BIOS from the reference, and the reference couldn't flash theirs, without risk of bricking). 
 
In the past, Nvidia would do this themselves and name them Reference or FE. They would also have other vendors produce these "reference" designs and sell them as reference boards. But that's not the case this generation. 
 
The Nvidia 30 series FE cards were fully customized to be completely different, like when a partner takes the reference design as a guideline and fully customizes their card to be quite unique. 
 
So that's the difference between your XC3 and the FE. 


I understand that, but you're missing my point. I cannot get either of my two cards to hit EVGA's advertised 350W; I'm lucky to average 300W. If Nvidia can do it, then so can EVGA. I want what I paid for and I'm not getting it. My cards are crap cards.




Okay, you understand that part. Now let me revisit my other post, where I speculate the following: Nvidia might have three reference designs. 
 
1. 350W TDP
2. 373-380W TDP
3. 400W+ TDP
 
They might have forced the vendors to follow a certain design where the hardware regulates power, and that is the root cause of why you can't sustain 350W whenever you want. 
 
If this is true, and the Ventus, Eagle and Zotac cards (which seem to share a similar board design to the XC3) suffer from your issue, then the issue might have to be taken up with NVIDIA. Your cause might be bigger than you know. 


Again, I can't touch 350W with either of my cards, ever; I'm lucky to average 300W. If EVGA says 350W, I want my 350W. I didn't pay over $1,500 for a 3080 Ti XC3 Hydro Copper to have it performing more like a 3080 Super card. EVGA sold me the cards saying they are 350W cards; my issue is with EVGA, not Nvidia.
post edited by kram36 - 2021/07/26 12:52:03
atfrico
Omnipotent Enthusiast
  • Total Posts : 12753
  • Reward points : 0
  • Joined: 2008/05/20 16:16:06
  • Location: <--Dip, Dip, Potato Chip!-->
  • Status: offline
  • Ribbons : 25
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:52:25 (permalink)
kram36
atfrico
Have you tried folding with the GPU Kram to see if you get the same results? I am a bit curious about the wattage🤔

Well, that's interesting; however, folding is nothing like gaming. It's only putting an 88% load on the GPU and a 4% load on the memory, which doesn't require over 264W.
 

 

Oh, I know. I just want to see how it behaves with certain apps, and I am starting to see things more clearly.

Those who abuse power, are nothing but scumbags! The challenge of power is how to use it and not abuse it. The abuse of power that seems to create the most unhappiness is when a person uses personal power to get ahead without regards to the welfare of others, people are obsessed with it. You can take a nice person and turn them into a slob, into an insane being, craving power, destroying anything that stands in their way.
 
 
Affiliate Code: 3T15O1S07G
ObscureEmpyre
SSC Member
  • Total Posts : 972
  • Reward points : 0
  • Joined: 2012/01/15 14:40:05
  • Status: offline
  • Ribbons : 7
Re: Dear EVGA, Your 3080 Ti XC3 Cards Suck. 2021/07/26 12:52:53 (permalink)
kram36
atfrico
Have you tried folding with the GPU Kram to see if you get the same results? I am a bit curious about the wattage🤔

Well, that's interesting; however, folding is nothing like gaming. It's only putting an 88% load on the GPU and a 4% load on the memory, which doesn't require over 264W.
 

 

Do you have HWiNFO installed? If so, what’s the power reading from that? I’ve heard GPU-Z can be inaccurate. Also, have you run MSI Kombustor? That maxes out my card’s power, so it’d be sure to max out yours.
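As far as I know, GPU-Z and HWiNFO both read the same driver power sensor, so rather than arguing over which tool is right, it may be more useful to check whether a logged run ever got within a couple percent of the advertised limit. The `hits_power_limit` helper below is hypothetical, just a sketch for comparing logs from any of these tools:

```python
# Given a list of logged board-power samples (watts, exported from GPU-Z,
# HWiNFO, or nvidia-smi), check whether the card ever reached within
# `tol` (2% by default) of its rated power limit.
def hits_power_limit(samples_w, limit_w, tol=0.02):
    return max(samples_w) >= limit_w * (1 - tol)

print(hits_power_limit([287.0, 301.0, 349.8], 350))  # True  (349.8 >= 343.0)
print(hits_power_limit([260.0, 290.0, 301.0], 350))  # False (never near 350 W)
```

With a check like this, kram36's complaint becomes testable: a healthy 350W card should return True in at least some heavy workloads.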

