Iamrogue
Superclocked Member
- Total Posts : 235
- Reward points : 0
- Joined: 2010/01/24 01:58:41
- Location: P(r)oland
- Status: offline
- Ribbons : 4
As in the topic: RTX 30 - is it PCIe Gen 3 or Gen 4?
Do I need to switch to AMD to get the full performance?
|
Sajin
EVGA Forum Moderator
- Total Posts : 49168
- Reward points : 0
- Joined: 2010/06/07 21:11:51
- Location: Texas, USA.
- Status: offline
- Ribbons : 199
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/02 00:09:53
(permalink)
Gen4. Nope.
|
kram36
The Destroyer
- Total Posts : 21477
- Reward points : 0
- Joined: 2009/10/27 19:00:58
- Location: United States
- Status: offline
- Ribbons : 72
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/02 02:26:40
(permalink)
It's Gen 4. We'll have to see benchmarks to know whether any performance is being left on the table running the cards in Gen 3 mode, but Digital Foundry has benchmarks of the RTX 3080 on Gen 3 and the card does very, very well.
|
ryu4000
SSC Member
- Total Posts : 649
- Reward points : 0
- Joined: 2009/01/25 13:36:22
- Location: Picayune MS
- Status: offline
- Ribbons : 0
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/02 03:12:38
(permalink)
I don't think there will be that much difference between Gen 3 and Gen 4. There's already a Gen 4 GPU out on the AMD side and it doesn't show much gain, so I wouldn't run out and grab a Gen 4 mobo unless you were already upgrading.
Case: Phanteks P500A | Mobo: MSI B550 Tomahawk | CPU: Ryzen 3700X | SSD: MX-300 525GB, Intel 512GB, WD Black NVMe 1TB | GPU: Gigabyte Vision 3080 | PSU: EVGA SuperNOVA 1300 G2 | Monitor: LG 34GK950F-B 34" | RAM: Corsair Vengeance RGB Pro 32GB | HDTV: LG CX OLED 65"
|
MatthewAMEL
Superclocked Member
- Total Posts : 164
- Reward points : 0
- Joined: 2016/07/13 23:15:40
- Status: offline
- Ribbons : 0
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/02 08:02:32
(permalink)
Sajin: Gen4. Nope.
Gen4 is right. And you absolutely will be constraining the card if it's on Intel. Games like Horizon Zero Dawn already show a 10% difference in frame rate using a 5700 XT on Gen3 vs Gen4. Go check out the Hardware Unboxed comparison from last week between the 5700 XT and 2080 Ti using Gen3 and Gen4. It's significant. I'd post the video here, but I'm not allowed.
post edited by MatthewAMEL - 2020/09/02 08:21:51
|
VVhiplash
iCX Member
- Total Posts : 392
- Reward points : 0
- Joined: 2011/06/15 11:42:22
- Location: U.S.A
- Status: offline
- Ribbons : 1
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 10:12:20
(permalink)
kram36: It's Gen 4. We'll have to see benchmarks to know if any performance is being left on the table running the cards in Gen 3 mode, but Digital Foundry has benchmarks of the RTX 3080 on Gen 3 and the card does very very very very well.
Heck freakin yes! I know corporations always advertise or present their absolute "best" and cherry picked numbers, but I'm so glad to see that these numbers don't seem like they're cherry picked. I'm even MORE excited for these graphics cards than I was before!
ll Steiger-Dynamics Maven ll Intel i7-5930k 4.2Ghz ll ASRock X99 Extreme6/AC ll Liqmax II 240 ll GeiL Super-Luce 32gb 2666mhz ll (x2)EVGA GTX 980 SC ll Samsung XP941 500gb and 850 Pro 1TB ll EVGA PS 1000W Platinum ll
|
HawkOculus
iCX Member
- Total Posts : 456
- Reward points : 0
- Joined: 2019/04/10 10:50:51
- Status: offline
- Ribbons : 1
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 10:25:39
(permalink)
MatthewAMEL
Sajin Gen4. Nope.
Gen4 is right. And you absolutely will be constraining the card if it's on Intel. Games like Horizon Zero Dawn already show a 10% difference in frame rate using a 5700XT on Gen3 vs Gen4. Go check out Hardware Unboxed comparison last week between 5700XT and 2080Ti using Gen3 and Gen4. It's significant. I'd post the video here, but I'm not allowed.
Please stop trolling. The video literally supports NONE of what you are claiming in terms of constraints on Intel. Digital Foundry already did benchmarks on a test system with a 9900k. You are an AMD shill, nothing more.
|
EVGA_JacobF
EVGA Alumni
- Total Posts : 16946
- Reward points : 0
- Joined: 2006/01/17 12:10:20
- Location: Brea, CA
- Status: offline
- Ribbons : 26
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 10:30:14
(permalink)
Gen3 should be more than enough PCIe bandwidth for most scenarios.
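For a rough sense of the numbers behind this, the theoretical x16 link bandwidth works out like so (a quick back-of-the-envelope sketch; per-lane rates of 8 GT/s for Gen3 and 16 GT/s for Gen4 with 128b/130b encoding, per the PCIe spec):

```python
# Theoretical usable PCIe x16 bandwidth, to put "more than enough" in numbers.
# Gen3 runs at 8 GT/s per lane, Gen4 at 16 GT/s; both use 128b/130b encoding.

def pcie_bandwidth_gbps(gt_per_s: float, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s after 128b/130b line-code overhead."""
    return gt_per_s * lanes * (128 / 130) / 8

gen3 = pcie_bandwidth_gbps(8)   # ~15.75 GB/s
gen4 = pcie_bandwidth_gbps(16)  # ~31.51 GB/s
print(f"Gen3 x16: {gen3:.2f} GB/s | Gen4 x16: {gen4:.2f} GB/s")
```

So a Gen3 x16 slot still gives the card roughly 15.75 GB/s each way, which current games rarely saturate.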
|
d.burnette
CLASSIFIED ULTRA Member
- Total Posts : 5496
- Reward points : 0
- Joined: 2007/03/08 13:19:32
- Status: offline
- Ribbons : 17
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 10:33:50
(permalink)
EVGA_JacobF: Gen3 should be more than enough PCIE bandwidth for most scenarios.
Good to hear thanks for the info! I was not fancying a new build anytime soon.
Don EVGA Z390 Dark MB | i9 9900k CPU @ 5.2 GHz all cores | EVGA RTX 3090 FTW3 Ultra | 32 GB G Skill Trident Z 3200 MHz CL14 DDR4 Ram | Corsair H150i Pro Cooler | EVGA T2 Titanium 1000w Power Supply | Samsung 970 Pro 1TB m.2 Nvme | Samsung 970 Evo 1TB m.2 Nvme | Samsung 860 Evo 1TB SATA SSD | EVGA DG 87 Case |
|
Airikay
Superclocked Member
- Total Posts : 235
- Reward points : 0
- Joined: 2020/08/21 11:08:25
- Status: offline
- Ribbons : 1
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 10:49:17
(permalink)
MatthewAMEL
Sajin Gen4. Nope.
Gen4 is right. And you absolutely will be constraining the card if it's on Intel. Games like Horizon Zero Dawn already show a 10% difference in frame rate using a 5700XT on Gen3 vs Gen4. Go check out Hardware Unboxed comparison last week between 5700XT and 2080Ti using Gen3 and Gen4. It's significant. I'd post the video here, but I'm not allowed.
There are a ton of other reasons why AMD is outperforming Intel in Decima engine games, and it has nothing to do with PCIe 3 vs PCIe 4. The Decima engine scales with threads. You can see it in Death Stranding benchmarks too as they increase the thread count up to about 24. I just watched the Hardware Unboxed video and it showed no difference (or a 1 fps difference) in all but the 2 Decima engine games. DS had a 4% difference and HZD was 2% (not 10%).
post edited by Airikay - 2020/09/03 12:20:36
|
MatthewAMEL
Superclocked Member
- Total Posts : 164
- Reward points : 0
- Joined: 2016/07/13 23:15:40
- Status: offline
- Ribbons : 0
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 13:08:30
(permalink)
Airikay
MatthewAMEL
Sajin Gen4. Nope.
Gen4 is right. And you absolutely will be constraining the card if it's on Intel. Games like Horizon Zero Dawn already show a 10% difference in frame rate using a 5700XT on Gen3 vs Gen4. Go check out Hardware Unboxed comparison last week between 5700XT and 2080Ti using Gen3 and Gen4. It's significant. I'd post the video here, but I'm not allowed.
There are a ton of other reasons why AMD is outperforming Intel on Decima engine games and it has nothing to do with PCIe 3 vs PCIe 4. The Decima engine scales with threads. You can also see it in Death Stranding benchmarks too as they increase the threads up to about 24. Just watched the hardware unboxing and it showed no difference/1 frame rate on all but the 2 Decima Engine games. DS had a 4% difference and HZD was 2% (not 10%)
HZD was 5% on the 5700 XT between Gen3 and Gen4. 5% in F1, 13% in Wolfenstein: Youngblood, 5% in Death Stranding. They simply went into the BIOS and switched Gen4 off. The Decima benchmarks were already running on AMD, so more threads was already factored into the numbers. If it's that big a delta on a card that's half as fast as a 3080, imagine what it will look like a year from now when more and more games are using the Gen4 bandwidth. When developers really start to leverage RTX IO (NVCache) you will easily need the Gen4 pipe. The best way to answer the question, IMHO, is: you are losing ~5% of a 3080's performance in current-gen games, and potentially much more in next-gen games.
|
Airikay
Superclocked Member
- Total Posts : 235
- Reward points : 0
- Joined: 2020/08/21 11:08:25
- Status: offline
- Ribbons : 1
Re: RTX 30 - is it pcie gen3 or gen4?
2020/09/03 16:07:53
(permalink)
MatthewAMEL HZD was 5% on the 5700XT between Gen3 and Gen4. 5% on F1, 13% in Wolfenstein:Youngblood, 5% in Death Stranding. They simply went into the BIOS and switched Gen4 off. The Decima benchmarks were already running on AMD. So more threads was already factored into the numbers. If it's that big a delta on a card that is half as fast as a 3080...imagine what that will look like a year from now when more and more games are using the Gen4 bandwidth. When developers really start to leverage RTX IO (NVCache) you will easily need the Gen4 pipe. The best way to answer the question, IMHO, is 'you are losing ~5% of a 3080's performance on current gen games. Potentially much more with next gen games'.
What are you talking about? Are you watching the same video? You pointed me to Hardware Unboxed: https://youtu.be/PAwIh1nSOQ8. HZD's largest difference was at 1080p: it went from 133 to 137, which is 3%, and it's literally the same at 4K. F1 has a 3 fps difference at both 1080p and 1440p; the larger percentage is at 1440p because of the lower frame rate, 124 to 127, about a 2.5% difference. Wolfenstein is literally 1 frame, which is under 1%. Death Stranding goes from 140 to 145, about 3.5%, at 1080p. I have no idea where you are getting the 10% for HZD, like you originally stated, or the 13% for Wolfenstein. All of those are within margin of error too, meaning you could run the test multiple times and it might show less than 1% at times.
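For what it's worth, the percent deltas for the Gen3 vs Gen4 FPS pairs quoted above work out like this (a quick sketch using only the numbers cited in this post, not independently re-benchmarked):

```python
# Gen3 -> Gen4 percent gain for the FPS pairs cited in this post.

def pct_gain(gen3_fps: float, gen4_fps: float) -> float:
    """Percent FPS gain of Gen4 over Gen3."""
    return (gen4_fps - gen3_fps) / gen3_fps * 100

pairs = {
    "HZD 1080p": (133, 137),             # ~3.0%
    "F1 1440p": (124, 127),              # ~2.4%
    "Death Stranding 1080p": (140, 145),  # ~3.6%
}
for game, (g3, g4) in pairs.items():
    print(f"{game}: {pct_gain(g3, g4):.1f}%")
```

All single-digit, which matches the "within margin of error" reading rather than a 10-13% gap.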
|