EVGA

Low GPU utilization RTX 3090

Author
kRoOkEd OnE
New Member
  • Total Posts : 11
  • Reward points : 0
  • Joined: 2013/11/29 09:55:36
  • Status: offline
  • Ribbons : 0
2021/11/13 18:20:55 (permalink)
Hi, I just bought the EVGA XC3 ULTRA 3090 and figured I'd get more frames in games, but really it's doing the same as my 2080 Ti.
Battlefield 2042, Far Cry 6, Cyberpunk... it uses 70 or 80 percent of my GPU, and when I lower settings it uses even less lol. I updated the firmware via Precision, enabled Resizable BAR, used DDU in Safe Mode, and tried other things. I feel like I just wasted my money. Has ANYONE figured this out yet?

AMD Ryzen 9 5950X 16c/32t
32 GB DDR4-3600 RAM (4 sticks)
ASRock X570 Taichi
1000W Revolt PSU
EVGA 3090 XC3 ULTRA
Windows 11
Latest NVIDIA drivers and Windows updates
1080p 240Hz monitor
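
A minimal sketch for logging the GPU utilization numbers being described here, assuming nvidia-smi is on PATH (it ships with the NVIDIA driver); run it, play for a minute, and stop it with Ctrl+C:

import subprocess

# Poll nvidia-smi once per second; utilization.gpu, clocks.sm, and power.draw
# are standard --query-gpu fields.
proc = subprocess.Popen(
    ["nvidia-smi",
     "--query-gpu=utilization.gpu,clocks.sm,power.draw",
     "--format=csv,noheader",
     "-l", "1"],
    stdout=subprocess.PIPE, text=True)
try:
    for line in proc.stdout:
        print(line.strip())  # e.g. "78 %, 1905 MHz, 310.25 W"
except KeyboardInterrupt:
    proc.terminate()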
#1

22 Replies

    ty_ger07
    Insert Custom Title Here
    • Total Posts : 16598
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 271
    Re: Low GPU utilization RTX 3090 2021/11/13 18:26:30 (permalink)
    You are going to be CPU limited with such a high-end video card with such a low resolution monitor.
    Get a higher resolution monitor to match your high-end card.

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium

    #2
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/13 18:57:18 (permalink)
    ty_ger07
    You are going to be CPU limited with such a high-end video card with such a low resolution monitor.
    Get a higher resolution monitor to match your high-end card.


    That doesn't make sense. So you're telling me that these cards are limited in pushing out frames per second? What you're saying is that instead of giving you more frames per second, graphics cards all of a sudden scale down now?

    Since when are a powerful CPU and a powerful graphics card paired together to get mediocre results?
    #3
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Low GPU utilization RTX 3090 2021/11/13 18:57:33 (permalink)
    W11, 5950X and 240hz probably isn't a great combo.
    #4
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/13 19:05:49 (permalink)
    kevinc313
    W11, 5950X and 240hz probably isn't a great combo.


    Aside from the Ryzen bug they patched, Windows 11 has been fine. Unless it's the actual issue and I just don't know it yet.
    #5
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Low GPU utilization RTX 3090 2021/11/13 19:16:48 (permalink)
    kRoOkEd OnE
    Aside from the Ryzen bug they patched, Windows 11 has been fine. Unless it's the actual issue and I just don't know it yet.

    https://www.techpowerup.c...-ryzen-9-5950x/15.html
    #6
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/13 23:28:54 (permalink)
    kevinc313
    https://www.techpowerup.c...-ryzen-9-5950x/15.html

    I'm trying to understand what I'm looking at. I see benchmarks.
    #7
    Brendruis
    New Member
    • Total Posts : 12
    • Reward points : 0
    • Joined: 2008/07/19 12:32:39
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/14 02:55:59 (permalink)
    You're just CPU limited at those resolutions. Pick up a 1440p or 4K monitor; they are pretty affordable now, and you can game at high refresh too. They have high-refresh 4K monitors, I believe; I know my TV can receive 4K HDR @ 120Hz, which is fast enough for me. I know there are 144Hz 1440p monitors... not sure if there are 4K monitors faster than 120 yet.
    post edited by Brendruis - 2021/11/14 02:57:42
    #8
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 16598
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: offline
    • Ribbons : 271
    Re: Low GPU utilization RTX 3090 2021/11/14 06:24:51 (permalink)
    kRoOkEd OnE
    ty_ger07
    You are going to be CPU limited with such a high-end video card with such a low resolution monitor.
    Get a higher resolution monitor to match your high-end card.


    That doesn't make sense. So you're telling me that these cards are limited in pushing out frames per second? What you're saying is that instead of giving you more frames per second, graphics cards all of a sudden scale down now?

    Since when are a powerful CPU and a powerful graphics card paired together to get mediocre results?

    You are CPU bottlenecked. This isn't a new thing that all of a sudden started existing. You can read about CPU bottlenecking going back decades. Why would you buy a 3090 for 1080p?

    The video card "scales down" waiting for the CPU. The video card isn't its own fully autonomous thing. It is told what to draw by the CPU. While the CPU is busy calculating things, the GPU waits doing nothing.

    Your computer setup is unbalanced for gaming. Not just the monitor, but also the CPU. Your CPU has a lot of threads, but many games won't be able to utilize many threads effectively.

    I suggest that you research how to optimize your 5950X for gaming, and consider getting a monitor that more appropriately matches your 3090.
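
    As a toy illustration of that hand-off (every per-frame cost below is a made-up number, not a measurement):

    # Each frame needs a CPU stage (game logic, draw calls) and a GPU stage
    # (rendering); whichever stage is slower sets the frame time, and the GPU
    # sits idle whenever the CPU stage is the slower one.
    def simulate(cpu_ms, gpu_ms):
        frame_ms = max(cpu_ms, gpu_ms)
        return 1000.0 / frame_ms, gpu_ms / frame_ms

    # Hypothetical: 7 ms of CPU work per frame, GPU cost growing with resolution.
    for res, gpu_ms in [("1080p", 4.0), ("1440p", 6.5), ("4K", 12.0)]:
        fps, busy = simulate(cpu_ms=7.0, gpu_ms=gpu_ms)
        print(f"{res}: {fps:.0f} fps, GPU ~{busy:.0%} busy")
    # 1080p: 143 fps, GPU ~57% busy  <- CPU-bound, like the 70-80% reported above
    # 1440p: 143 fps, GPU ~93% busy
    # 4K:     83 fps, GPU ~100% busy <- now the GPU is the limit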
    post edited by ty_ger07 - 2021/11/14 07:13:02

    ASRock Z77 • Intel Core i7 3770K • EVGA GTX 1080 • Samsung 850 Pro • Seasonic PRIME 600W Titanium

    #9
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Low GPU utilization RTX 3090 2021/11/14 08:08:11 (permalink)
    kRoOkEd OnE
    I'm trying to understand what I'm looking at. I see benchmarks.

    Yep.  Those are techpowerup 720P resolution benches, which means they are about the clearest indication of CPU performance in games that you can get.  The 5950X is comparable to a 10700K or an overclocked 9900K; however, on some multi-core-optimized games you may get better performance, and on some low-core games worse.  To get better gaming FPS on a 5950X, people typically do things like a fat OC, fast tight RAM, disabling cores, and disabling hyperthreading, then testing at 720P or 1080P low.   I'd try either half the cores disabled or hyperthreading disabled, synced FCLK and memory speed, a fixed-speed or PBO OC, and some 3600CL14 RAM.  Or just sell it off and get an 11900K or 12900K with a good MB, an OC, and fast RAM.  Before changing anything, I'd run some benches at 720P compared to 1080P to see if you are in fact CPU limited, which is pretty quick to do.
     
    Here are the tests at 1080P; looks like they tested with a 2080 Ti and a 3090 and got the same results:
     
    https://www.techpowerup.com/review/amd-ryzen-9-5950x/16.html
     
    However, their latest test suite of games has the 5950X doing better against the 10700K and newer:
     
    https://www.techpowerup.c...-lake-12th-gen/15.html
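
    As a sketch of that 720P-vs-1080P check (fps_720 and fps_1080 are averages you measure yourself in the same scene; the 5% tolerance is an arbitrary threshold, not a standard):

    # If cutting the pixel count barely raises fps, the CPU is the ceiling;
    # if fps scales up at the lower resolution, the GPU was the limiter.
    def likely_cpu_limited(fps_720, fps_1080, tolerance=0.05):
        return fps_720 <= fps_1080 * (1 + tolerance)

    print(likely_cpu_limited(fps_720=165, fps_1080=160))  # hypothetical -> True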
    post edited by kevinc313 - 2021/11/14 08:20:05
    #10
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/14 10:40:39 (permalink)
    kevinc313
    Yep. Those are techpowerup 720P resolution benches, which means they are about the clearest indication of CPU performance in games that you can get. [...] Before changing anything, I'd run some benches at 720P compared to 1080P to see if you are in fact CPU limited, which is pretty quick to do.

    Thank you for helping, especially with the results and a solution!
    #11
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/14 10:47:06 (permalink)
    ty_ger07
    You are CPU bottlenecked. This isn't a new thing that all of a sudden started existing. [...] I suggest that you research how to optimize your 5950X for gaming, and consider getting a monitor that more appropriately matches your 3090.

    I've done research as a gamer. I've looked at all the benchmarks, and they all put the 5950X at or near the top at all resolutions. That's why I went with it. I have no desire for 4K gaming.
    It's just a bit disappointing how these game engines cause bottlenecks by not utilizing the CPU. In Battlefield V I gained over 80 fps because it's so beautifully optimized for the CPU.

    I've been building computers for 20 years, but this problem seems a bit reversed compared to what it used to be. The CPU was never the thing limiting results; now it's the other way around. Sucks. I really hope they start developing for better CPU usage as a standard real soon.
    #12
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Low GPU utilization RTX 3090 2021/11/14 11:01:32 (permalink)
    kRoOkEd OnE
    I really hope they start developing for better CPU usage as a standard real soon.



    I'd say they've been developing for multicore for at least 5 years, but it's hit or miss.  Some newer games still only put full load on a few cores.  Then there's the issue of optimization across all cores.
    #13
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: Low GPU utilization RTX 3090 2021/11/14 11:23:41 (permalink)
    kRoOkEd OnE
    ty_ger07
    You are going to be CPU limited with such a high-end video card with such a low resolution monitor.
    Get a higher resolution monitor to match your high-end card.

    That doesn't make sense. So you're telling me that these cards are limited in pushing out frames per second? What you're saying is that instead of giving you more frames per second, graphics cards all of a sudden scale down now?

    It's not scaling down. The guy you quoted is correct. Each frame that your card puts out has to be prepared by your CPU, which is also thinking about all sorts of other things that it has to track in game, like all the other players around you, what they're doing, etc. All of this info is being communicated to your game by the game server, so there is lots of network traffic with lots of updates per second showing you all the changes that are happening around you that your client can't predict, because they're dictated by the actions of others. The sheer, brute power of the GPU is only one factor.
     
    At such a low resolution, your CPU and GPU are already showing you as much as they can within the limitations of network updates from the server and other such things. The way you're going to see your GPU power unleashed and see it showing you more is to upgrade your monitor to a higher resolution.
     
    Dude, you already put a 5950X and a 3090 in your machine. You clearly have money to burn and appreciate having a fast, beastly machine. Why are you still playing at 1080p? A 1440p monitor, even at the same refresh rate as your 1080p, will not cost that much, at least not in the context of what you've already demonstrated you can afford and are willing to spend. You've already got so much CPU and GPU power at your fingertips that at this point you really are limited in how that power can be expressed by playing at such a low resolution. A 1080 Ti had already mastered 1080p four years ago.
     
    For that matter, don't stop at standard 1440p, which is 2560x1440. If you really want to treat yourself, get one of the excellent 3440x1440 ultrawides, or even one of the new 5120x1440 super ultrawides. The Samsung G9 Neo something-or-other can do 5120x1440 at up to 175Hz, and if any currently available CPU/GPU combination can get the most out of that, your 5950X/3090 combo will do it. I upgraded to a 3440x1440/120Hz a couple of years ago and it's really awesome, and the available refresh rates of this size of monitor have increased from there. I've been playing BF 2042 for a couple of days (since around 15 seconds after it went live Friday morning), and with the ray-traced ambient occlusion turned off but pretty much everything else on Ultra, my framerates are generally at or exceeding my monitor's refresh rate. I'll see up to 50% CPU utilization on my 5900X during that time, with my GPU (3080 Ti, very lightly OCed) pegged.

    Since when are a powerful CPU and a powerful graphics card paired together to get mediocre results?

    Since CPU/GPU pairs that could max out 1080p came out a while ago, and you then upgraded to a CPU and GPU pair at least three generations newer. Your monitor really is the weak link in your rig now. Don't let this advice go in one eye and out the other. You may not think you're interested in anything higher than 1080p right now, but that's just because you haven't experienced the awesomeness that is 1440p-based ultrawides. I went from 1080p to 4K back in 2015, and my 1080 Ti could barely keep FPS up to my monitor's 60Hz refresh, and I really wanted higher refresh. I side-graded to this 3440x1440/120Hz, and while I lost some resolution compared to 4K, the ultrawideness of it was fantastic, the resolution increase over 1080p was extremely noticeable and greatly enjoyed, and the lower pixels per frame of 3440x1440 compared to 3840x2160 really allowed my framerates to rise to the occasion and actually utilize my monitor's 120Hz refresh. My 5900X and 3080 Ti combo can maximize this monitor. Your 5950X/3090 combo would do a little better, and you'd probably be able to max out a 3440x1440/175Hz monitor with judicious settings choices. That would be a beastly setup indeed. You'd love it.
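
    For concreteness, the pixels-per-frame arithmetic behind those comparisons (exact counts, nothing assumed):

    # Pixels per frame at each resolution, relative to 4K (3840x2160).
    for name, w, h in [("1920x1080", 1920, 1080), ("2560x1440", 2560, 1440),
                       ("3440x1440", 3440, 1440), ("3840x2160", 3840, 2160)]:
        px = w * h
        print(f"{name}: {px:>9,} px ({px / (3840 * 2160):.0%} of 4K)")
    # 1920x1080: 2,073,600 px (25% of 4K)
    # 3440x1440: 4,953,600 px (60% of 4K) <- where the framerate headroom over 4K comes from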
    post edited by sethleigh - 2021/11/14 11:34:01

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #14
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/14 11:48:07 (permalink)
    sethleigh
    It's not scaling down. The guy you quoted is correct. Each frame that your card puts out has to be prepared by your CPU [...] Your monitor really is the weak link in your rig now. [...] You may not think you're interested in anything higher than 1080p right now, but that's just because you haven't experienced the awesomeness that is 1440p-based ultrawides. [...]

    Honestly, I do love ultrawide. My first ultrawide was the Acer Predator X34, I think back in 2016 if I'm not mistaken. I loved the wider field of view more than anything, but I went back to 1080p because of content creation. I stream on Twitch. Most of the time I'm playing PvP shooters, so 4K was always out of the question because I favored the higher frame rate.

    It really sucks that Twitch doesn't support UW in its video player, or it would be an easy choice for me.
    post edited by kRoOkEd OnE - 2021/11/14 13:11:38
    #15
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: Low GPU utilization RTX 3090 2021/11/14 12:08:59 (permalink)
    kRoOkEd OnE
    Honestly, I do love ultrawide. My first ultrawide was the Acer Predator X34, I think back in 2016 if I'm not mistaken. I loved the wider field of view more than anything, but I went back to 1080p because of content creation. I stream on Twitch. Most of the time I'm playing PvP shooters, so 4K was always out of the question because I favored the higher frame rate.

    I haven't streamed, so I don't know what's possible in terms of whether you stream at the same resolution you're playing at or what have you. I do think the resolution improvement from 1080p to 1440p is extremely noticeable, and the ultrawide aspect ratio is hugely nice for all sorts of things, gaming included. I'm really appreciating the step up in refresh rate from the 60Hz I got in ye olden days to the 120Hz my current monitor does. If I had to guess, the gains get more and more marginal from there. Given that 3440x1440 monitors exist that can do up to 175Hz, I'd speculate that the difference between the 240Hz you have now and 175Hz wouldn't be all that noticeable, but the increase in resolution and width will be. I think it would be a worthy upgrade for your current hardware. How that might affect streaming I don't know.
     
    Given that my 5900X (fairly aggressively OCed) and RAM (3800/CL14, also fairly aggressively pushed) and 3080 Ti (only very lightly OCed for now, until my water block comes in) can exceed my monitor's 120Hz refresh rate except when ray tracing is turned on, while nearly everything else is still on Ultra, I have no doubt that this setup could push a 175Hz monitor without too much compromise in the settings, and your 3090 slightly more so.
    post edited by sethleigh - 2021/11/14 12:10:28

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #16
    kRoOkEd OnE
    New Member
    • Total Posts : 11
    • Reward points : 0
    • Joined: 2013/11/29 09:55:36
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/14 13:25:14 (permalink)
    With streaming, it shrinks the picture and adds thick black bars on the top and bottom to squeeze the UW aspect ratio in without any distortion. Honestly, I may just say f it and do it anyway lol, UW is just way too good
    #17
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: Low GPU utilization RTX 3090 2021/11/14 13:49:05 (permalink)
    kRoOkEd OnE
    With streaming, it shrinks the picture and adds thick black bars on the top and bottom to squeeze the UW aspect ratio in without any distortion. Honestly, I may just say f it and do it anyway lol, UW is just way too good.

    Good plan.  Btw, I don't know if you've been tracking the various resolutions out there, but when I was looking around the other day I saw that a new "intermediate" resolution (compared to 4K, anyhow), 3840x1600, may end up being a sweet spot with high-end hardware. It's still lower res (hence better framerate) than 4K, but offers the same PPI as the 3440x1440 34" ultrawides in a 38" diagonal size. This one can do up to 160Hz refresh. I'd be truly curious how my 5900X/3080 Ti or your 5950X/3090 would perform on a beast like this. A resolution like that would truly separate the wheat of your 3090 from the chaff of any lower GPU.

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #18
    kevinc313
    CLASSIFIED ULTRA Member
    • Total Posts : 5004
    • Reward points : 0
    • Joined: 2019/02/28 09:27:55
    • Status: offline
    • Ribbons : 22
    Re: Low GPU utilization RTX 3090 2021/11/14 14:13:52 (permalink)
    kRoOkEd OnE
    With streaming, it shrinks the picture and adds thick black bars on the top and bottom to squeeze the UW aspect ratio in without any distortion. Honestly, I may just say f it and do it anyway lol, UW is just way too good



    AFAIK some streamers/youtubers who use ultrawides play in a window at a normal aspect ratio.  Or you could just get a conventional 1440P or 4K screen.  Or a 48" OLED and play in a 1440P window when streaming.
    #19
    F1Aussie
    New Member
    • Total Posts : 30
    • Reward points : 0
    • Joined: 2020/11/26 05:06:50
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/16 01:11:02 (permalink)
    I have a 5950X, a 3090, and a 38" 3840x1600. It kind of sits around 3K resolution, if there were such a thing.
    I mainly do sim racing, and that can still tank the 3090's fps in some situations, but in any first-person shooter this combo destroys frame rates at the highest settings, ray tracing on and DLSS on ultra quality; well, in what I play it does anyway.
    #20
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: Low GPU utilization RTX 3090 2021/11/16 07:47:01 (permalink)
    F1Aussie
    I have a 5950X, a 3090, and a 38" 3840x1600. It kind of sits around 3K resolution, if there were such a thing.
    I mainly do sim racing, and that can still tank the 3090's fps in some situations, but in any first-person shooter this combo destroys frame rates at the highest settings, ray tracing on and DLSS on ultra quality; well, in what I play it does anyway.

    Might I ask what specific 3840x1600 monitor you have, and its max refresh rate? Until recently I'd have assumed a 3080 Ti should be able to crush that resolution, but I've been playing Battlefield 2042 since last Thursday night (I guess it was really Friday morning, but I hadn't gone to sleep yet, so still "Thursday night" to me...), and that game can absorb massive CPU and GPU power like no game I've seen. I'll be at ~50% utilization on my 5900X, so six full cores' worth, while framerates are typically at or above my monitor's 120Hz refresh rate with most settings on Ultra but with ray tracing off, but not massively over 120 fps.
     
    I'd assume that in most games my 5900X/3080 Ti FTW3 could crush 3840x1600 at higher than 120Hz framerates, but in Battlefield 2042 I don't know, and it would probably require a lot more compromise in the settings, or some help from DLSS.

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #21
    F1Aussie
    New Member
    • Total Posts : 30
    • Reward points : 0
    • Joined: 2020/11/26 05:06:50
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/17 05:12:39 (permalink)
    sethleigh
    Might I ask what specific 3840x1600 monitor you have, and its max refresh rate? [...]

    I have the Predator X38. Very expensive, but a nice monitor; the default is 144Hz, but it can be OC'd to 175Hz.
    #22
    rckrz6
    New Member
    • Total Posts : 82
    • Reward points : 0
    • Joined: 2018/12/16 09:37:02
    • Status: offline
    • Ribbons : 0
    Re: Low GPU utilization RTX 3090 2021/11/18 11:00:21 (permalink)
    As already stated, you can buy the best graphics card in the world, but if the game runs into CPU limitations, that's all you're gonna get outta it.
    #23