EVGA

Success at getting 3090 in SLI on DX11

Author
xanthus1
New Member
  • Total Posts : 97
  • Reward points : 0
  • Joined: 2009/03/09 11:52:50
  • Status: offline
  • Ribbons : 2
2021/04/14 11:58:06 (permalink)
I got two 3090s in SLI working on older titles. I have tried this with a few other titles, and although I only really play ESO nowadays, it works on everything I have tested. Let me start off by saying I was very upset with Nvidia's decision to basically abandon and kill SLI.
 
My experience with SLI goes back 20+ years with these following configs:
 
Scan-Line Interleave (3dfx SLI)
******************************
1999 2x Voodoo2’s
2000 Quantum 3d 3dfx SLI single slot card
 
Scalable Link Interface (Nvidia SLI)
******************************
2008 3x MSI Ultra 8800 (3-way SLI)
2009 2x EVGA GTX 295s (Quad SLI)
2010 4x EVGA GTX 480s (Quad SLI) (water cooled)
2011 4x EVGA GTX 580s (Quad SLI) (water cooled)
2013 4x EVGA first gen Titans (Quad SLI) (water cooled)
2017 2x Nvidia Titan Xp (SLI) (water cooled)
 
NVlink (Nvidia SLI)
******************************
2021 2x ASUS 3090 (SLI) (water cooled)
 
In that time, I have upgraded Nvidia hardware repeatedly and have always been amazed at how much faster the next SLI group was. This last upgrade, however, was, shall we say, a little lackluster: going from dual Titan Xp's to essentially a single 3090 was in many cases at best only 5 to 15% faster in most of my titles on a 5K monitor. Considering how much I spent on these 3090s in today's market, that was nuts. I read everything I could find; most everyone talked about how crappy SLI was to begin with, and I thought to myself how upset I had been when SLI was downgraded in games from 3-way and quad to only a pair. I have also never experienced the issues people describe with SLI. The biggest issues I faced were power and cooling.
 
When I played my main game, ESO, I noticed that GPU2, the first GPU of my NVLink pair, would throttle up and back down, while GPU4, the second GPU of my 3090 NVLink pair, would do the same but much less. This reminded me of an issue I had back when I was running quad-SLI 580s on some of the first drivers released. Back then I was running 3x 120Hz Acer monitors in an SLI Surround configuration, and if I had the monitors plugged into GPUs 1, 2, and 3, the 4th GPU would sit almost idle; but if I plugged the monitors into GPUs 1, 2, and 4 and left GPU 3 alone, they would all balance the load equally, including GPU 3. The net result was about a 17% improvement in my game's FPS at the time.
 
This got me thinking. My current main monitor is a Samsung C49RG90, a single 5120x1440 ultra-wide 5K panel. It has a productivity feature called picture-by-picture, which is really a monitor bifurcation setting that lets you connect two PCs to a single monitor at 2560x1440 per connection. I set the monitor to picture-by-picture, connected one input to each 3090, then went into the Nvidia Control Panel and chose "Span displays with Surround." Contrary to popular misconception, this does allow SLI as long as the monitors are in a single link group.
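For what it's worth, the pixel math behind that split checks out: the two picture-by-picture inputs exactly tile the native panel, so each card drives half the pixels. A quick sketch, using only the resolutions quoted above:

```python
# The C49RG90 in picture-by-picture mode exposes two independent
# 2560x1440 inputs that together cover the native 5120x1440 panel.
native = (5120, 1440)
pbp_half = (2560, 1440)

assert pbp_half[0] * 2 == native[0]  # the two halves tile the full width

pixels_per_gpu = pbp_half[0] * pbp_half[1]
total_pixels = native[0] * native[1]
print(pixels_per_gpu, total_pixels)  # 3686400 7372800
```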
 
I loaded up ESO, and as the pictures below show, on my one 5K monitor with every option cranked to the max my FPS went from 113 fps with a single 3090 to 158 fps with 3090s in SLI on DX11. These frame rates of course fluctuate by about ±5%.
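A quick back-of-the-envelope on those quoted numbers (a sketch, using only the fps figures above):

```python
# Sanity check on the quoted frame rates: 113 fps with a single 3090,
# 158 fps with both cards driving the spanned display.
single_fps = 113
dual_fps = 158

# Percentage improvement over a single card.
improvement = (dual_fps - single_fps) / single_fps * 100

# Scaling efficiency: fraction of a theoretical 2x speedup achieved.
efficiency = dual_fps / (2 * single_fps) * 100

print(f"improvement: {improvement:.1f}%")        # improvement: 39.8%
print(f"efficiency: {efficiency:.1f}% of ideal")  # efficiency: 69.9% of ideal
```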
 
[Screenshots: ESO frame rates with a single 3090 vs. two 3090s in SLI]
I hope this helps others with 3090s wanting to run them in SLI on older titles. Also, before anybody says I am using too much power: I have a 15.04 kW solar system on my roof and get a check every month from the power company.
 
  
#1

46 Replies Related Threads

    rjbarker
    CLASSIFIED Member
    • Total Posts : 3214
    • Reward points : 0
    • Joined: 2008/03/20 10:07:05
    • Location: Vancouver Isle - Westcoast Canada
    • Status: offline
    • Ribbons : 21
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 18:16:59 (permalink)
    My experience with SLi goes back to 8800 GTX.(around 2006)..then GTX 280 and every series after that. Even played around with Tri SLi 580's n 680's....truth is that new Cards are now so powerful there simply is no requirement for multi Card systems anymore....unless you're running 3 x 4K monitors perhaps.or 3 x Ultra wide 3440*1440p monitors....otherwise its not justified anymore and no way nv will continue to support "old tech"....1080Ti SLi is the last SLi system for me...........and will be the last ;)
    Single 3080 Card on a single 3440*1440p monitor is just right !

    I9 12900K EK Velocity2 / ROG Z690 Apex/ 32G Dominator DDR5 6000/ Evga RTX 3080Ti FTW3  EK Vector / 980 Pro 512G / 980 Pro 1TB/ Samsung 860 Pro 500G/ WD 4TB Red / AX 1600i /  Corsair 900D & XSPC 480 * 360 * 240 Rads   XSPC Photon 170 Rez-Vario Pump Combo - Alienware 3440*1440p 120Hz/ W11
     
    #2
    Sajin
    EVGA Forum Moderator
    • Total Posts : 49164
    • Reward points : 0
    • Joined: 2010/06/07 21:11:51
    • Location: Texas, USA.
    • Status: offline
    • Ribbons : 199
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 21:19:14 (permalink)
    Nice, but what do users do if they don't own that monitor?
    #3
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 21:31:52 (permalink)
    Sajin
    Nice, but what do users do if they don't own that monitor?


    Get Nvidia to do the right thing and unlock it for everyone.
    #4
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 21:40:47 (permalink)
    rjbarker
    My experience with SLi goes back to 8800 GTX.(around 2006)..then GTX 280 and every series after that. Even played around with Tri SLi 580's n 680's....truth is that new Cards are now so powerful there simply is no requirement for multi Card systems anymore....unless you're running 3 x 4K monitors perhaps.or 3 x Ultra wide 3440*1440p monitors....otherwise its not justified anymore and no way nv will continue to support "old tech"....1080Ti SLi is the last SLi system for me...........and will be the last ;)
    Single 3080 Card on a single 3440*1440p monitor is just right !




    It is not “old tech”; using multiple processing units to work on a single task just makes logical sense. But from a business point of view it does not make sense for them. First, they need to spend resources to make SLI work, and by doing so power users who buy SLI systems may skip a generation or two before their next upgrade. That means they don’t have a consistent revenue stream of new cards from that segment of the user base. It is a no-win situation for Nvidia.
     
    #5
    Sajin
    EVGA Forum Moderator
    • Total Posts : 49164
    • Reward points : 0
    • Joined: 2010/06/07 21:11:51
    • Location: Texas, USA.
    • Status: offline
    • Ribbons : 199
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 22:31:28 (permalink)
    xanthus1
    Sajin
    Nice, but what do users do if they don't own that monitor?


    Get Nvidia to do the right thing an unlock it for everyone.


    I wish.
    #6
    Hoggle
    EVGA Forum Moderator
    • Total Posts : 10101
    • Reward points : 0
    • Joined: 2003/10/13 22:10:45
    • Location: Eugene, OR
    • Status: offline
    • Ribbons : 4
    Re: Success at getting 3090 in SLI on DX11 2021/04/14 22:36:56 (permalink)
    xanthus1
    Sajin
    Nice, but what do users do if they don't own that monitor?


    Get Nvidia to do the right thing an unlock it for everyone.


    Or it could be more support in DirectX for multiple GPU configurations. DirectX would at this point be better ground-level support for SLI than NVIDIA having to make a profile for every game.

     
     
    #7
    rjbarker
    CLASSIFIED Member
    • Total Posts : 3214
    • Reward points : 0
    • Joined: 2008/03/20 10:07:05
    • Location: Vancouver Isle - Westcoast Canada
    • Status: offline
    • Ribbons : 21
    Re: Success at getting 3090 in SLI on DX11 2021/04/15 08:31:18 (permalink)
    xanthus1
    rjbarker
    My experience with SLi goes back to 8800 GTX.(around 2006)..then GTX 280 and every series after that. Even played around with Tri SLi 580's n 680's....truth is that new Cards are now so powerful there simply is no requirement for multi Card systems anymore....unless you're running 3 x 4K monitors perhaps.or 3 x Ultra wide 3440*1440p monitors....otherwise its not justified anymore and no way nv will continue to support "old tech"....1080Ti SLi is the last SLi system for me...........and will be the last ;)
    Single 3080 Card on a single 3440*1440p monitor is just right !




    It is not “old tech” using multiple processing units to work on a single task just makes logical sense.  But from a business point of view does not make sense for them.  Frist, they need to spend resources to make SLI work, and by doing so power users that buy SLI systems may skip a generation or two for their next upgrade.  This means they don’t have a consistent revenue stream with new cards for that segment of the user base.  It is a no-win situation for Nvidia.  
     




    Sorry, but SLi is "old tech" (going back at least 15 yrs now); that was when a single Card was out of the question to push 1080p or 1200p gaming to max frames w ultra settings.......and very few "power users" still use SLi (compared to even the previous gen)...especially now that it is limited to the 3090 only.
    What's your definition of a power user, as even a single GPU set-up with all the greatest HW is way beyond "mainstream"?
     
    Agreed though, it would be nice if it was an option.....I would have considered 3080 SLi, but in all honesty I was finding that more n more games did not support SLi, so found myself playing games w only a single 1080Ti being utilized....if there's no support for SLi from software / game devs... it's dead in my books!
    ....with that being said I will hang onto my 3 slot EK Terminal Block for awhile ;)
     

    I9 12900K EK Velocity2 / ROG Z690 Apex/ 32G Dominator DDR5 6000/ Evga RTX 3080Ti FTW3  EK Vector / 980 Pro 512G / 980 Pro 1TB/ Samsung 860 Pro 500G/ WD 4TB Red / AX 1600i /  Corsair 900D & XSPC 480 * 360 * 240 Rads   XSPC Photon 170 Rez-Vario Pump Combo - Alienware 3440*1440p 120Hz/ W11
     
    #8
    CraptacularOne
    Omnipotent Enthusiast
    • Total Posts : 14533
    • Reward points : 0
    • Joined: 2006/06/12 17:20:44
    • Location: Florida
    • Status: offline
    • Ribbons : 222
    Re: Success at getting 3090 in SLI on DX11 2021/04/15 08:43:33 (permalink)
    Interesting, I have an ultrawide monitor that also allows picture by picture. I'll have to grab my other RTX 3090 out of my render station and see if I can verify this workaround. Have you tested other games and measured an appreciable difference with 1 card vs 2?

    Intel i9 14900K ...............................Ryzen 9 7950X3D
    MSI RTX 4090 Gaming Trio................ASRock Phantom RX 7900 XTX
    Samsung Odyssey G9.......................PiMax 5K Super/Meta Quest 3
    ASUS ROG Strix Z690-F Gaming........ASUS TUF Gaming X670E Plus WiFi
    64GB G.Skill Trident Z5 6800Mhz.......64GB Kingston Fury RGB 6000Mhz
    MSI MPG A1000G 1000w..................EVGA G3 SuperNova 1000w
    #9
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/15 17:36:14 (permalink)
    CraptacularOne
    Interesting, I have a ultrawide monitor that also allows for picture by picture. I'll have to grab my other RTX 3090 out of my render station and try and see if I can verify this work around. Have you tested other games and measured a appreciable difference with 1 card vs 2? 


     
    I have only tried it on Fallout 4, Skyrim, and ESO.  It works on all of them.  The improvement is about 25%.



    post edited by xanthus1 - 2021/04/15 17:39:26
    #10
    Lumo841
    New Member
    • Total Posts : 20
    • Reward points : 0
    • Joined: 2011/10/01 05:59:12
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/04/15 19:48:11 (permalink)
    SLI is old tech from 3dfx, which Nvidia acquired when 3dfx went bankrupt.
    #11
    rjbarker
    CLASSIFIED Member
    • Total Posts : 3214
    • Reward points : 0
    • Joined: 2008/03/20 10:07:05
    • Location: Vancouver Isle - Westcoast Canada
    • Status: offline
    • Ribbons : 21
    Re: Success at getting 3090 in SLI on DX11 2021/04/16 07:53:19 (permalink)
    ^^^^ Old or not it worked great running SLi optimized games!!! In some cases scaling was up to 100% improvement.
    Had SLi been available for 3080 I would have been working on getting a second Card......

    I9 12900K EK Velocity2 / ROG Z690 Apex/ 32G Dominator DDR5 6000/ Evga RTX 3080Ti FTW3  EK Vector / 980 Pro 512G / 980 Pro 1TB/ Samsung 860 Pro 500G/ WD 4TB Red / AX 1600i /  Corsair 900D & XSPC 480 * 360 * 240 Rads   XSPC Photon 170 Rez-Vario Pump Combo - Alienware 3440*1440p 120Hz/ W11
     
    #12
    sKutDeath
    New Member
    • Total Posts : 7
    • Reward points : 0
    • Joined: 2021/04/02 03:10:55
    • Status: offline
    • Ribbons : 1
    Re: Success at getting 3090 in SLI on DX11 2021/04/18 12:35:08 (permalink)
    Yes, I have been doing SLI since the 6800 GT, not even the Ultra. It really went wild with the 8800 GTX's. But yes, I sit here in the same boat with 2 x 3090's... I am finding more possibilities though, and since Resizable BAR, well that just opened up another level... I keep hoping they (NVIDIA ****s) put the NVLink connectors on for a reason...

    CPU: AMD Ryzen Threadripper 3970X @ 4,1 GHz
    Board: Gigabyte TRX40 AORUS XTREME
    RAM: 65536 MB Type Quad Channel (256 bit) DDR4-SDRAM
    Frequency 1899.6 MHz (DDR4-3800)
    GPU: 2 x EVGA NVIDIA GeForce RTX 3090 FTW3 Ultra SLI / NVLink
    SSD 1&2: 2 x GIGABYTE GP-AG42TB (Bus: NVMe)
    SSD 3: Samsung SSD 970 EVO Plus 1TB (Bus: NVMe)
    Display: LG Electronics ( 38GL950G (GSM7735)
    38.3 inches (97.3 cm) / 3840 x 1600 pixels @ 175 Hz



     
    #13
    CraptacularOne
    Omnipotent Enthusiast
    • Total Posts : 14533
    • Reward points : 0
    • Joined: 2006/06/12 17:20:44
    • Location: Florida
    • Status: offline
    • Ribbons : 222
    Re: Success at getting 3090 in SLI on DX11 2021/04/19 12:26:02 (permalink)
    xanthus1
    CraptacularOne
    Interesting, I have a ultrawide monitor that also allows for picture by picture. I'll have to grab my other RTX 3090 out of my render station and try and see if I can verify this work around. Have you tested other games and measured a appreciable difference with 1 card vs 2? 


     
    I have only tried it on Fallout 4, Skyrim, and ESO.  It works on all of them.  The improvement is about 25%.





    I have my second 3090 in my system right now, set up as you described using picture by picture and then "Span displays with Surround" in the Nvidia CP. I can confirm that it does indeed seem to improve performance in some games, while in others it just starts acting erratic and crashing. I'll get screen tearing, or what looks like desync between the cards, in the middle of the screen. One game in particular is Bioshock Remastered; the game just seems to crash/hang after a few minutes with the above-mentioned symptoms.


    It does seem to be working fine in Fallout 4, which I have heavily modded, as well as Skyrim. The performance increase isn't really that good, however, and I don't recommend anyone do this for the roughly 20-30% bump I've seen in games where it does work and doesn't crash. This also requires a monitor that supports picture by picture on top of everything else. If you already have two 3090s and a capable monitor, I don't see the harm in trying this; just be warned that your mileage may vary and your games may be unstable.

    Intel i9 14900K ...............................Ryzen 9 7950X3D
    MSI RTX 4090 Gaming Trio................ASRock Phantom RX 7900 XTX
    Samsung Odyssey G9.......................PiMax 5K Super/Meta Quest 3
    ASUS ROG Strix Z690-F Gaming........ASUS TUF Gaming X670E Plus WiFi
    64GB G.Skill Trident Z5 6800Mhz.......64GB Kingston Fury RGB 6000Mhz
    MSI MPG A1000G 1000w..................EVGA G3 SuperNova 1000w
    #14
    gsrcrxsi
    SSC Member
    • Total Posts : 985
    • Reward points : 0
    • Joined: 2010/01/24 19:20:59
    • Status: offline
    • Ribbons : 5
    Re: Success at getting 3090 in SLI on DX11 2021/04/19 13:10:16 (permalink)
    Sounds like you're not actually using SLI. You're just having each card render half of the image for every frame, vs. what SLI does: render the whole image, every other frame.
     
    I can see how this works, but I can imagine you might get some slight distortion or tearing in the middle of the screen when the cards are rendering different images.
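The distinction can be sketched as a toy work-assignment scheme (illustrative only, not real driver code; the function names are made up for this example):

```python
# Toy illustration of the two load-splitting schemes discussed above:
# the picture-by-picture trick gives each GPU a fixed half of every
# frame (like split-frame rendering), while classic SLI AFR gives
# each GPU alternating whole frames.

def split_frame(frame_ids, width=5120):
    """Each GPU renders its own half of every frame."""
    return [(f, {"gpu0": (0, width // 2), "gpu1": (width // 2, width)})
            for f in frame_ids]

def alternate_frame(frame_ids):
    """GPUs take turns rendering whole frames (AFR)."""
    return [(f, f"gpu{f % 2}") for f in frame_ids]

frames = [0, 1, 2, 3]
print(split_frame(frames)[0])   # frame 0: gpu0 covers columns 0-2559, gpu1 covers 2560-5119
print(alternate_frame(frames))  # [(0, 'gpu0'), (1, 'gpu1'), (2, 'gpu0'), (3, 'gpu1')]
```

With split-frame-style work there is no shared frame pacing between the cards, which is consistent with the mid-screen desync reported above.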

    Rig1: EPYC 7V12 | [4] RTX A4000
    Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
    Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
    Rig4: [2] EPYC 7742 | RTX A2000
    Rig5: [2] EPYC 7642
    Rig6: EPYC 7551 | [4] Titan V

    #15
    CraptacularOne
    Omnipotent Enthusiast
    • Total Posts : 14533
    • Reward points : 0
    • Joined: 2006/06/12 17:20:44
    • Location: Florida
    • Status: offline
    • Ribbons : 222
    Re: Success at getting 3090 in SLI on DX11 2021/04/19 13:19:54 (permalink)
    gsrcrxsi
    sounds like you're not actually using SLI. you're just having each card rendering half of the image for every frame, vs what SLI does, rendering the whole image, every other frame.
     
    i can see how this works, but I can imaging you might get some slight distortion or tearing in the middle of the screen when they are rendering different images.


    Yes, that's exactly what's happening here, and as I said I'm seeing desync heavily in at least one game. It's not really "SLI" and really doesn't scale well at all (not that official SLI scales particularly well, but this is even worse); this is just a "sort of" way to get some benefit from two 3090 series cards. 
     
    Having said that, I will be returning my other 3090 to my render station for some real "use" out of it 

    Intel i9 14900K ...............................Ryzen 9 7950X3D
    MSI RTX 4090 Gaming Trio................ASRock Phantom RX 7900 XTX
    Samsung Odyssey G9.......................PiMax 5K Super/Meta Quest 3
    ASUS ROG Strix Z690-F Gaming........ASUS TUF Gaming X670E Plus WiFi
    64GB G.Skill Trident Z5 6800Mhz.......64GB Kingston Fury RGB 6000Mhz
    MSI MPG A1000G 1000w..................EVGA G3 SuperNova 1000w
    #16
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/20 06:41:27 (permalink)
    CraptacularOne
    gsrcrxsi
    sounds like you're not actually using SLI. you're just having each card rendering half of the image for every frame, vs what SLI does, rendering the whole image, every other frame.
     
    i can see how this works, but I can imaging you might get some slight distortion or tearing in the middle of the screen when they are rendering different images.


    Yes that's exactly whats happening here and as I said I'm seeing desync heavily in at least one game. It's not really "SLI" and really doesn't scale well at all (not that official SLI scales particularly well but this is even worse) and this is just a "sort of" way to get some benefit from 2 3090 series cards. 
     
    Having said that, I will be returning my other 3090 to my render station for some real "use" out of it 




    Thank you for running the tests.  I get about the same 20 to 30% improvement in performance; however, I have no screen tearing whatsoever.  Looking at the two motherboards you have listed, which one are you using for the test? Both appear to have x16 and then x4 PCIe lanes on the primary and secondary slots.  In contrast, I was testing with a TRX40 Aorus Xtreme, which runs x16/x8/x16/x8, and both my 3090s are in x16 slots.  Also, I use an open-loop cooling system with over 3.9L of coolant and a 9x9 radiator in a 120mm push/pull fan configuration, with separate power supplies for the push/pull pumps at 13.8V and the 18x Delta fans from a Cisco 6509 at 7.2V.  I am also running the EVGA SuperNOVA 2000 on an L6-30 twist-lock 248V circuit to make sure the system can run indefinitely without throttling down.  I don't believe they are doing AFR, but it is still SLI nevertheless (using two cards to work on a single workload); also, when I think of what I have spent in the past on a 25% improvement, it does not seem that out of line.
     
    Your listed boards PCI lanes:
    ****************************************************
    Z490 AORUS ELITE AC (rev. 1.0)
                    1 x PCI Express x16 slot, running at x16 (PCIEX16)
                    * For optimum performance, if only one PCI Express graphics card is to be installed, be sure to install it in the PCIEX16 slot.
                    1 x PCI Express x16 slot, running at x4 (PCIEX4)
                    2 x PCI Express x1 slots
                    (All of the PCI Express slots conform to PCI Express 3.0 standard.)
     
     
    ASUS TUF Gaming X570 WiFi
                    3rd Gen AMD Ryzen™ Processors
                    1 x PCIe 4.0 x16 (x16 mode)
                    2nd Gen AMD Ryzen™ and 3rd Gen AMD Ryzen™ with Radeon™ Graphics Processors
                    1 x PCIe 3.0 x16 (x16 mode)
                    2nd and 1st Gen AMD Ryzen™ with Radeon™ Vega Graphics Processors
                    1 x PCIe 3.0/2.0 x16 (x8 mode)
                    AMD X570 chipset
                    1 x PCIe 4.0 x16 (max at x4 mode)
                    2 x PCIe 4.0 x1
     
     
    The board I am using PCI lanes:
    ************************************************** 
    TRX40 AORUS XTREME (rev. 1.0)
                    2 x PCI Express x16 slots, running at x16 (PCIEX16_1, PCIEX16_2)
                    2 x PCI Express x16 slots, running at x8 (PCIEX8_1, PCIEX8_2)
                    (The PCIEX16 and PCIEX8 slots conform to PCI Express 4.0 standard.)
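For context on why slot width and generation matter here, the theoretical per-direction link rates for the configurations listed can be computed from the PCIe signaling rates (a sketch; these are raw link numbers, not measured throughput):

```python
# Rough per-direction bandwidth for the slot configs listed above.
# PCIe 3.0 signals at 8 GT/s and PCIe 4.0 at 16 GT/s, both with
# 128b/130b encoding (128 payload bits per 130 bits on the wire).

def pcie_bandwidth_gbps(gen, lanes):
    gt_per_s = {3: 8, 4: 16}[gen]
    per_lane = gt_per_s * (128 / 130) / 8  # GB/s per lane after encoding
    return lanes * per_lane

print(f"Gen3 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")  # 15.75
print(f"Gen3 x4:  {pcie_bandwidth_gbps(3, 4):.2f} GB/s")   # 3.94
print(f"Gen4 x16: {pcie_bandwidth_gbps(4, 16):.2f} GB/s")  # 31.51
```

An x4 secondary slot gives a quarter of the bandwidth of the x16 slot feeding the first card, which could plausibly contribute to the uneven behavior reported between the two test systems.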
     
     
    post edited by xanthus1 - 2021/04/20 06:48:18
    #17
    gsrcrxsi
    SSC Member
    • Total Posts : 985
    • Reward points : 0
    • Joined: 2010/01/24 19:20:59
    • Status: offline
    • Ribbons : 5
    Re: Success at getting 3090 in SLI on DX11 2021/04/20 16:01:25 (permalink)
    It’s not SLI.

    You could remove the bridge and it would still work.

    Rig1: EPYC 7V12 | [4] RTX A4000
    Rig2: EPYC 7B12 | [5] 3080Ti + [2] 2080Ti
    Rig3: EPYC 7B12 | [6] 3070Ti + [2] 3060
    Rig4: [2] EPYC 7742 | RTX A2000
    Rig5: [2] EPYC 7642
    Rig6: EPYC 7551 | [4] Titan V

    #18
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/20 21:07:54 (permalink)
    gsrcrxsi
    It’s not SLI.

    You could remove the bridge and it would still work.



    I am curious how you came to this conclusion.  When I remove the SLI bridge on the RTX 3090s I can no longer span the monitors across GPUs in a monitor group.  Does yours still work?
     

     

    #19
    CraptacularOne
    Omnipotent Enthusiast
    • Total Posts : 14533
    • Reward points : 0
    • Joined: 2006/06/12 17:20:44
    • Location: Florida
    • Status: offline
    • Ribbons : 222
    Re: Success at getting 3090 in SLI on DX11 2021/04/20 21:39:32 (permalink)
     
    xanthus1
    CraptacularOne
    gsrcrxsi
    sounds like you're not actually using SLI. you're just having each card rendering half of the image for every frame, vs what SLI does, rendering the whole image, every other frame.
     
    i can see how this works, but I can imaging you might get some slight distortion or tearing in the middle of the screen when they are rendering different images.


    Yes that's exactly whats happening here and as I said I'm seeing desync heavily in at least one game. It's not really "SLI" and really doesn't scale well at all (not that official SLI scales particularly well but this is even worse) and this is just a "sort of" way to get some benefit from 2 3090 series cards. 
     
    Having said that, I will be returning my other 3090 to my render station for some real "use" out of it 




    Thank you for running the tests.  I have about the same 20 to 30% improvement in performance however, I have no screen taring whatsoever.  Looking at your two motherboards you have listed which one are you using for the test? Both listed appear to have 16x and then 4x PCIe lanes on the primary and secondary slots.  In contrast I was testing with an Trx40 Aorus Extreme which uses 16x/8x/16x/8x and both my 3090’s are in 16x slots.  Also, I use an open loop cooling system with over 3.9L of cooling fluid with a 9x9 radiator using 120mm push/pull fan configuration with separate power supplies for the push/pull pumps at 13.8volt and the 18x delta fans from a Cisco 6509 at 7.2volts.  Also, I am using the EVGA supernova 2000+ in an L6-30 twist lock 248v power to make sure the system can run indefinitely without throttle down.  I don’t not believe they are doing AFR but it is still SLI never the less (using two cards to work on a single workload), also when I think of what I have spent in the past on a 25% improvement it does not seem that out of line.
     
    Your listed boards PCI lanes:
    ****************************************************
    Z490 AORUS ELITE AC (rev. 1.0)
                    1 x PCI Express x16 slot, running at x16 (PCIEX16)
                    * For optimum performance, if only one PCI Express graphics card is to be installed, be sure to install it in the PCIEX16 slot.
                    1 x PCI Express x16 slot, running at x4 (PCIEX4)
                    2 x PCI Express x1 slots
                    (All of the PCI Express slots conform to PCI Express 3.0 standard.)
     
     
    ASUS TUF Gaming X570 WiFi
                    3rd Gen AMD Ryzen™ Processors
                    1 x PCIe 4.0 x16 (x16 mode)
                    2nd Gen AMD Ryzen™ and 3rd Gen AMD Ryzen™ with Radeon™ Graphics Processors
                    1 x PCIe 3.0 x16 (x16 mode)
                    2nd and 1st Gen AMD Ryzen™ with Radeon™ Vega Graphics Processors
                    1 x PCIe 3.0/2.0 x16 (x8 mode)
                    AMD X570 chipset
                    1 x PCIe 4.0 x16 (max at x4 mode)
                    2 x PCIe 4.0 x1
     
     
    The board I am using PCI lanes:
    ************************************************** 
    TRX40 AORUS XTREME (rev. 1.0)
                    2 x PCI Express x16 slots, running at x16 (PCIEX16_1, PCIEX16_2)
                    2 x PCI Express x16 slots, running at x8 (PCIEX8_1, PCIEX8_2)
                    (The PCIEX16 and PCIEX8 slots conform to PCI Express 4.0 standard.)
     
     


    Tests were run in my X299 board as well; I probably should have clarified that. I decided to run them in the X299 system that I use for rendering and video editing, which has an i9 9980XE in it. Yes, it's PCIe 3.0, but that doesn't really matter, as two x16 PCIe 3.0 slots aren't going to limit the cards. 
     
    Full specs of that system would be as follows. 
     
    i9 9980XE 
    Palit RTX 3090 (& the PNY RTX 3090 from my main game rig) 
    MSI X299 SLI Plus
    128GB 3200Mhz
    1300w Seasonic Prime
     
    Nothing in that system is overclocked; the CPU is cooled by a Noctua NH-D15 and the cards were run with their stock coolers. 
    post edited by CraptacularOne - 2021/04/21 08:01:34

    Intel i9 14900K ...............................Ryzen 9 7950X3D
    MSI RTX 4090 Gaming Trio................ASRock Phantom RX 7900 XTX
    Samsung Odyssey G9.......................PiMax 5K Super/Meta Quest 3
    ASUS ROG Strix Z690-F Gaming........ASUS TUF Gaming X670E Plus WiFi
    64GB G.Skill Trident Z5 6800Mhz.......64GB Kingston Fury RGB 6000Mhz
    MSI MPG A1000G 1000w..................EVGA G3 SuperNova 1000w
    #20
    jd760
    New Member
    • Total Posts : 65
    • Reward points : 0
    • Joined: 2016/03/30 22:52:18
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/04/20 22:18:53 (permalink)
    How did you actually get Nvidia Control Panel to enable SLI?  I have 2 3090's w/ the NVLink and a Samsung Odyssey G9 monitor. Everything is up to date (drivers, BIOS, etc). Even a clean install of Windows. I also did DDU in Safe Mode (as someone else had suggested on another thread) and tried re-installing the cards & NVLink... but no matter what, I can't get it to switch from "Disable SLI" in Nvidia Control Panel.
    I tried reverting back to an older driver from January and was able to enable SLI that way... but once I update to a newer (latest) driver, it just gets stuck at "Disable SLI".
    #21
    xanthus1
    New Member
    • Total Posts : 97
    • Reward points : 0
    • Joined: 2009/03/09 11:52:50
    • Status: offline
    • Ribbons : 2
    Re: Success at getting 3090 in SLI on DX11 2021/04/21 06:40:03 (permalink)
    jd760
    How did you actually get Nvidia Control Panel to enable SLI ?  I have 2 3090's w/ the NVLink and a Samsung Odyssey G9 monitor. Everything is up to date(drivers, BIOS, etc). Even a clean install of Windows. Also did that DDU in Safe Mode(as someone else had suggested on another thread) and tried re-installing the cards & NVlink.. but no matter what, I can't get it to switch from "Disable SLI" in Nvidia Control Panel.
    I tried reverting back to an older driver from January and was able to enable SLI that way.. but once I update to a newer(latest) driver, it just gets stuck at "Disable SLI".




    As you stated, you had SLI working before with an old driver, so I am assuming you have a prosumer motherboard and CPU capable of this type of SLI.
     
     
    Some things to check:
                    Make sure you don’t have PCIe bifurcation set on the slots you are using.
                    If you added an NVMe card to your system, you might have bifurcated your PCIe slot unintentionally.
                    If you have another card between your two 3090s, make sure it allows the seating of the SLI bridge.
                    Make sure you have “Above 4G Decoding” set in your BIOS.  (You could have a logical PCIe resource issue.)
     
     
    Below is a picture of mine with the April driver that I am using.  Use GPU-Z to verify you have the expected lanes per PCIe slot for your cards.
     
     
     
    *****UPDATE*****
     
    I read another one of your posts and it looks like you are using the "MEG X570 GODLIKE" motherboard.
    Your video cards must go into PCI_E1 and PCI_E3
     
    https://download.msi.com/.../mb/M7C34v1.2-EURO.pdf (see: page 21 "Your video cards must go into PCI_E1 and PCI_E3")
     
    Note: if you are using the M.2 XPANDER-Z GEN 4 card, you will need to remove it, as it is only supported in slot 3 and you need that slot for your second RTX 3090.
     
    Please check GPU-Z to make sure no motherboard M.2 NVMe drive caused bifurcation of PCI_E1 or PCI_E3; from what I read in your manual, it does not look like it.  However, Ryzen 3000 has a maximum of 24 lanes, so the slot is likely bifurcated if the M.2 NVMe is used.
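The lane budget being referred to can be sketched with simple arithmetic (an illustration assuming the commonly cited mainstream Ryzen 3000 allocation of 16 graphics lanes, 4 NVMe lanes, and a 4-lane chipset link):

```python
# Hypothetical lane-budget sketch for a mainstream Ryzen 3000 CPU:
# 24 usable PCIe 4.0 lanes split as 16 graphics + 4 NVMe + 4 chipset.
cpu_lanes = 24
chipset_link = 4
nvme_direct = 4
graphics = cpu_lanes - chipset_link - nvme_direct  # lanes left for GPU slots

# With two cards installed, the x16 graphics allocation is typically
# bifurcated into x8/x8 across the two CPU-connected slots.
per_gpu = graphics // 2

print(f"graphics lanes: {graphics}, per GPU in a dual-card config: x{per_gpu}")
```

This is why consuming CPU-attached M.2 slots can change how the graphics slots are carved up on these platforms.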
     
    Lastly, it sounds like you are using the EVGA bridge from your other post; the actual physical bridge PCB is encapsulated in a plastic clamshell. In my case I wanted to run another video card under the bridge, and I could not seat it correctly until I modified the bridge with a heat gun and a guitar pick, liberating the PCB from the plastic clamshell. (I am sure this voids the bridge's warranty.) This gave it the required clearance to pass over the middle card.
     
    I am not sure if EVGA lets you PM, but I can explain more if needed; you are right on the verge of making this work, and you have the right kind of monitor.
     
     
    Before removal of plastic clamshell
     

     
    After removal of plastic clamshell
     

    post edited by xanthus1 - 2021/04/21 10:08:58
    #22
    jd760
    New Member
    • Total Posts : 65
    • Reward points : 0
    • Joined: 2016/03/30 22:52:18
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/04/22 11:10:52 (permalink)
    Thank you for the reply. I am using the EVGA bridge and EVGA cards. I do have the cards installed in PCI_E1 & PCI_E3.
    Additionally, there are 3 M.2 slots. I have SSDs installed in M2_1 & M2_2 (M2_3 is unused). According to the manual, M2_1 & M2_2 run off of the chipset, while M2_3 runs off of the CPU instead.
    I need what's on both of the SSDs, so I went ahead and ordered a bigger SSD that I'm going to install into M2_3, leaving M2_1 & M2_2 unused. This new SSD should be arriving today. As for the monitor, I am using the Samsung Odyssey G9.
    #23
    jd760
    New Member
    • Total Posts : 65
    • Reward points : 0
    • Joined: 2016/03/30 22:52:18
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/04/22 16:54:42 (permalink)
    UPDATE:
     
    I removed the SSDs from M2_1 & M2_2 and installed a single new SSD into M2_3. I also updated to the latest beta BIOS from MSI's website. It still will not allow me to enable SLI in Nvidia Control Panel. This is what GPU-Z is reporting:
     

    #24
    SleepyEs
    Superclocked Member
    • Total Posts : 101
    • Reward points : 0
    • Joined: 2021/05/01 06:03:49
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/05/09 12:32:47 (permalink)
    congratz dude
    #25
    wharrus5
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2020/12/26 12:39:15
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/05/26 06:44:43 (permalink)
    I am really interested in this topic. I have 2x 3090 Kingpin in SLI, and I'm starting to regret getting the second one. It was a big effort to grab two cards, both time-wise and financially; I also had to change the PC case (I have a Corsair 1000D now) so the GPUs and the fans could fit.
    So if it's possible to make DX11 games work with both cards, it would be amazing.
    Maybe we can tip some dev to make a patch or something like that...
    #26
    srfonden
    New Member
    • Total Posts : 19
    • Reward points : 0
    • Joined: 2007/06/01 11:37:35
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/05/30 11:44:55 (permalink)
    I really appreciate the detail that you went into in the OP; it motivated me to buy a second high-speed DP cable and give it a shot today with my dual 3090s and G9 monitor. Here is what I found with a few titles I have on hand, running on mostly ultra settings across the board:
     

    ----------------------------------------------
                        |             PBP | Single
    ----------------------------------------------
    Title      |     DX | GPU Usage | FPS | FPS
    ----------------------------------------------
    Cyberpunk  |     12 |  99% / 2% |  40 |  50
    Division 2 |     12 |  99% / 4% |  77 |  87
    Grim Dawn  |     11 |     Equal | 120 | 120
    Breakpoint | Vulkan |     Equal |  86 | 103
    ----------------------------------------------

    My conclusion is that when using PBP in an SLI setup for DirectX 11 or Vulkan titles, you are definitely getting full use of both GPUs independently driving the displays. With DirectX 12 titles, however, it seems that one GPU is doing all of the work and that the video is passing through the second card at a penalty to performance. It is also worth pointing out that I experienced noticeable mid-screen tearing in both Cyberpunk (DX12) and Breakpoint (Vulkan) with PBP enabled, and mouse-look was noticeably sluggish in Cyberpunk. Given this experience, I feel that I am better off with PBP disabled, in spite of the minor usage improvement in these titles.
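For reference, here is a quick sketch of how the relative cost of PBP works out from my numbers above (a positive percentage means the single-GPU run was faster):

```python
def pct_faster(pbp_fps, single_fps):
    """Percent advantage of the single-GPU run over the PBP run."""
    return round((single_fps - pbp_fps) / pbp_fps * 100, 1)

# (PBP FPS, single-GPU FPS) from the table above
results = {
    "Cyberpunk (DX12)":    (40, 50),
    "Division 2 (DX12)":   (77, 87),
    "Grim Dawn (DX11)":    (120, 120),
    "Breakpoint (Vulkan)": (86, 103),
}

for title, (pbp, single) in results.items():
    print(f"{title}: {pct_faster(pbp, single):+.1f}% for single GPU")
```

So the DX12 and Vulkan titles paid a double-digit penalty under PBP, while the DX11 title broke even.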
     
    When I bought both 3090s, I had interpreted Nvidia's communication as meaning they would not provide SLI profiles for new games; I assumed that existing SLI profiles would continue to work and that independent parties could still hand-craft profiles for new games, but I later realized that SLI profiles were disabled altogether at the driver level. I've held on to the second card hoping that a third party might figure out a hack to get them working again, but I'm finally coming to the conclusion that I'm better off selling it, especially given the crazy second-hand prices these cards are fetching these days.
    post edited by srfonden - 2021/05/30 11:49:43
    #27
    miguelmotocross075
    New Member
    • Total Posts : 38
    • Reward points : 0
    • Joined: 2020/05/19 21:14:03
    • Location: Puerto Rico
    • Status: offline
    • Ribbons : 0
    Re: Success at getting 3090 in SLI on DX11 2021/07/06 20:32:24 (permalink)
    I got the new Z590 FTW with two Kingpins for SLI. I installed everything, but I'm unable to activate SLI in the Nvidia Control Panel; the option does not appear. I tried reinstalling the driver a few times, but nothing. Is there anything that I have to activate in the motherboard BIOS? I would appreciate any help. Thanks.



    post edited by miguelmotocross075 - 2021/07/06 20:36:47

    Attached Image(s)



    EVGA RTX 3090 K|NGP|N HYBRID X²
    EVGA SuperNova 1600 T2
    Intel i9-10900K
    G.SKILL TridentZ RGB Series 64GB 4000MHz
    EVGA Z590 FTW
    EVGA Z20 RGB Optical Mechanical Gaming Keyboard
    EVGA X17 Gaming Mouse
     
    #28
    Sajin
    EVGA Forum Moderator
    • Total Posts : 49164
    • Reward points : 0
    • Joined: 2010/06/07 21:11:51
    • Location: Texas, USA.
    • Status: offline
    • Ribbons : 199
    Re: Success at getting 3090 in SLI on DX11 2021/07/06 20:51:04 (permalink)
    Clearly you didn't read the first post. You need a monitor that supports picture-by-picture mode to do the SLI that is described in this thread. However, I think you should still be able to activate SLI for benchmarking. Are both your cards running at x8 according to GPU-Z?
    #29
    telehog
    iCX Member
    • Total Posts : 413
    • Reward points : 0
    • Joined: 2018/12/05 13:48:52
    • Status: offline
    • Ribbons : 1
    Re: Success at getting 3090 in SLI on DX11 2021/07/06 20:53:20 (permalink)
    If the NVLink bridge is not seated all the way down, it won't work.
    #30