2016/09/16 09:00:04
drmarc
So this has been an interesting topic regarding DX12 and using SLI vs. a single Titan XP.  I'm in the market for a 4K gaming build, and I'm planning on getting 2 hybrid 1080s, knowing that for $200 more, the combined VRAM will exceed a single Titan's once SLI and DX12 are optimized.  But this brings up the point - how many games actually exceed 8GB of VRAM?  Sure, if you turn everything up at 4K you can push it.  Do you really need it? Absolutely not.  Iluv2raceit made a great point - one can argue AA is a pretty minimal improvement in visuals at a huge performance cost (even more so if you play on a TV rather than a monitor where your face is right up to the pixels).
 
In regards to some 1080 SLI vs. Titan XP OC videos...take a look at this https://www.youtube.com/watch?v=QWtqGNmWS3Y and https://www.youtube.com/watch?v=BcdYzsFLXds.
 
I'd be interested to know more about the Witcher 3 video.  Although the Titan XP looks smoother in most scenarios, the frame counters show the Titan in the 50s in town while the SLI setup is around 70.  2:35 to 2:40 is where you see the significant difference - the Titan XP drops into the 40s while the 1080s push through the load as you transition from town to open-world rendering.  The GTA V video similarly shows the 1080s outperforming the Titan XP, although the frame rate appears smoother on the Titan XP in the recording.
 
I don't doubt that Titan XP owners have smooth experiences, but I have a hard time believing it outperforms two 1080s based on the two videos I linked.  Also, I'm not sure whether the recording software affects what we see in the final video, which is why I prefer to look at the actual frame counters shown during recording.
2016/09/17 21:27:24
stalinx20
Iluv2raceit

Oh ye of little faith.  It's already been proven:  http://venturebeat.com/2016/03/18/heres-a-pc-running-amd-and-nvidia-video-cards-at-the-same-time/
 
It's just a matter of time before more games support the capability.



Maybe so, but give it time, we all know Nvidia and AMD will do their shenanigans  
 
Just a little slip of that Nvidia "driver" while running AMD and Nvidia together would conveniently cause the drivers to crash every time. Companies aren't stupid. If Nvidia can make it so the PhysX drivers won't work alongside AMD, then surely they can make it so that pairing an AMD card with an Nvidia card "reduces" the performance gains compared to running two Nvidia cards. You know it's going to happen... Does anybody else not see this happening when this magic occurs?
2016/09/19 06:24:27
Iluv2raceit
stalinx20
Iluv2raceit

Oh ye of little faith.  It's already been proven:  http://venturebeat.com/2016/03/18/heres-a-pc-running-amd-and-nvidia-video-cards-at-the-same-time/
 
It's just a matter of time before more games support the capability.



Maybe so, but give it time, we all know Nvidia and AMD will do their shenanigans  
 
Just a little slip of that Nvidia "driver" while running AMD and Nvidia together would conveniently cause the drivers to crash every time. Companies aren't stupid. If Nvidia can make it so the PhysX drivers won't work alongside AMD, then surely they can make it so that pairing an AMD card with an Nvidia card "reduces" the performance gains compared to running two Nvidia cards. You know it's going to happen... Does anybody else not see this happening when this magic occurs?


Fair enough.  You do make valid points.  I'm just trying to remain optimistic about the whole thing and hope that AMD and Nvidia realize that, in the end, making the consumer happy means much higher profits for them.  If better DX12 and Vulkan performance is to be had by AMD owners buying an Nvidia card to run together, and Nvidia owners buying an AMD card, it would be better for both parties to get onboard and make sure their future drivers fully support the capability.  I am an Nvidia owner, but I wouldn't hesitate a second to buy an AMD card to run in tandem if it means better gaming performance.
2016/09/19 06:30:49
Iluv2raceit
drmarc
So this has been an interesting topic regarding DX12 and using SLI vs. a single Titan XP.  I'm in the market for a 4K gaming build, and I'm planning on getting 2 hybrid 1080s, knowing that for $200 more, the combined VRAM will exceed a single Titan's once SLI and DX12 are optimized.  But this brings up the point - how many games actually exceed 8GB of VRAM?  Sure, if you turn everything up at 4K you can push it.  Do you really need it? Absolutely not.  Iluv2raceit made a great point - one can argue AA is a pretty minimal improvement in visuals at a huge performance cost (even more so if you play on a TV rather than a monitor where your face is right up to the pixels).
 
In regards to some 1080 SLI vs. Titan XP OC videos...take a look at this https://www.youtube.com/watch?v=QWtqGNmWS3Y and https://www.youtube.com/watch?v=BcdYzsFLXds.
 
I'd be interested to know more about the Witcher 3 video.  Although the Titan XP looks smoother in most scenarios, the frame counters show the Titan in the 50s in town while the SLI setup is around 70.  2:35 to 2:40 is where you see the significant difference - the Titan XP drops into the 40s while the 1080s push through the load as you transition from town to open-world rendering.  The GTA V video similarly shows the 1080s outperforming the Titan XP, although the frame rate appears smoother on the Titan XP in the recording.
 
I don't doubt that Titan XP owners have smooth experiences, but I have a hard time believing it outperforms two 1080s based on the two videos I linked.  Also, I'm not sure whether the recording software affects what we see in the final video, which is why I prefer to look at the actual frame counters shown during recording.


"...the VRAM will exceed a single Titan when SLI and DX12 is optimized"  Huh?  Where are you getting that from?  A single Titan XP has 12GB of GDDR5X, and I have never seen anywhere near 12GB being used in any DX12 game.  Maybe at 5K or higher it might start getting close.  But at 4K and below - even with AA turned on - I just don't see the VRAM buffer getting saturated.  Conventional SLI doesn't change the picture, since the VRAM is mirrored across cards rather than combined.  If you are referring to split graphics duties in DX12, running an Nvidia SLI pair alongside a single AMD card, the VRAM still would not run out, because DX12 would allocate resources to each GPU according to its assigned functions.  If anything, an Nvidia SLI configuration in a mixed-card solution would probably use less VRAM, because the AMD card would take over functions that are better optimized for its particular GPU architecture (e.g. asynchronous execution, high memory-bandwidth workloads - thanks to HBM2, etc.).
 
And with regard to your concerns about 2 x GTX1080s vs. a single Titan XP, I can tell you now that gameplay is smoother when not using SLI.  At least for me.  I am very sensitive to microstutter and immediately notice it in any game that exhibits the issue when SLI is enabled.  I previously owned two GTX980Ti cards and that was one of the major drawbacks.  I upgraded to a single Titan XP and can tell you firsthand that gameplay is much smoother and I never have to worry about enabling SLI.  Also, I am running a custom watercooling loop and my coolant temps dropped a full 10C vs. running the two GTX980Ti cards.  Another point to consider is cost.  Two regular air-cooled GTX1080 cards will run you around $1400 vs. $1200 for a single Titan - roughly a 17% premium.  Buy two baseline GTX1080s with no special heatsinks or coolers and the margin is almost even.  So on raw performance per dollar, two GTX1080s look like the better deal.  But the real answer is "it depends".  It depends on what games you play and whether you are willing to use more power and generate more heat for a 20-40% performance gain over a single Titan XP - and that's IF the games you play are optimized for SLI.  Then the question of microstutter comes into play and whether you notice it when gaming.  My advice is to ask yourself these questions:  1) What games do you play?  2) Are they optimized for SLI?  3) Are you sensitive to microstutter?  4) What monitor (and resolution) do you game on right now?  5) Is your current power supply adequate for two GTX1080 cards?  6) Do you want to eventually watercool the cards? - That means double the cost for fittings and waterblocks.  You get the idea.
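To make the cost comparison concrete, here's a quick back-of-the-envelope sketch in Python. The prices and the 20-40% SLI scaling range are the rough figures from this thread, not measured benchmarks:

```python
# Back-of-envelope: price premium and relative perf-per-dollar for
# 2x GTX 1080 vs. a single Titan XP. All numbers are rough street
# prices / scaling estimates from the discussion, not benchmarks.
sli_price = 1400.0    # two air-cooled GTX 1080s (approx.)
titan_price = 1200.0  # single Titan XP (approx.)

premium = (sli_price - titan_price) / titan_price
print(f"Price premium: {premium:.0%}")  # -> Price premium: 17%

# If an SLI-optimized game gains 20-40% over the single Titan XP,
# the SLI pair's perf-per-dollar relative to the Titan is:
for gain in (0.20, 0.40):
    rel = (1 + gain) / (sli_price / titan_price)
    print(f"{gain:.0%} gain -> {rel:.2f}x perf per dollar vs. Titan")
```

So even at the low end of the scaling estimate, the SLI pair roughly breaks even on perf-per-dollar, and at the high end it clearly wins - which matches the "it depends on SLI support" caveat above.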
 
For my particular needs, a single Titan XP was the perfect choice based upon all of these questions I asked myself.  Your answer may be different and you may want 2 x GTX1080s instead.  In the end, we all win because each of us will have what we want and enjoy the games the way we want ;-)
 
 
2016/09/19 08:41:39
cmoney408
He's referring to the fact that the idealized DX12 future stacks VRAM in SLI, so two 1080s would give you a total of 16GB of USABLE VRAM. 
2016/09/19 09:21:25
drmarc
cmoney408
He's referring to the fact that the idealized DX12 future stacks VRAM in SLI, so two 1080s would give you a total of 16GB of USABLE VRAM. 


This is exactly what I meant. Sorry if I was not clear. In looking at performance differences between 1080 SLI and a single Titan XP, there are so many factors to consider. Temperature is a big one, as Iluv2raceit mentioned. But there's also the fact that many games don't support, or are not optimized for, SLI, and most games aren't coded for DX12 yet. I'm hoping all of this changes soon. Two hybrid 1080s are around $1450. Whether you use a G-Sync monitor also affects smoothness, as does driver support. I've done my research, and it looks like when SLI works, 1080 SLI beats the Titan XP at 4K.

Interestingly enough, Titan XP 2-way SLI vs. 2-way 1080 SLI shows almost no difference in performance at 4K (this was based on a head-to-head article from September). A difference only appeared at 5K, which suggests that at 4K the VRAM just isn't being stressed enough to show how powerful the Titans are in SLI. I'm sure the Titan SLI setup will pull ahead, especially when hybrid Titans become available.

With all that said, I think two hybrid 1080s are the better deal (once the software side catches up to the hardware potential). I'd hope that if you're spending $1.5k on video cards, you'd cool your system appropriately and pair it with a well-suited monitor so you don't get stuttering from throttling, etc. Otherwise, like you said, both options are excellent choices for 4K in the grand scheme.

Edit: here is the article I was referring to - a very interesting read.

http://www.pcworld.com/ar...y-decadant-in-sli.html
2016/09/19 10:17:10
cmoney408
I feel like I'm in the minority, but I LOVE SLI. Either I've never had microstutter, or my eyes just don't pick it up. I play mostly AAA titles (CoD, GTA, Far Cry, Doom, JC), so most of them luckily support SLI either at release or within a month. The worst was Just Cause 3, but I heard it was just bad all around for everyone in the beginning. I was trying to run it on a 295X2 at the time - wasn't happening.
 
I had a 980Ti hybrid SLI setup, but sold one in June, thinking I was going to go with a 1080 SLI setup. But stock was an issue, and now I don't care as much. At this point, my single 980Ti will hold me over until the 1080Tis come out. I may even pick up another 980Ti now that they are dropping a bit in price.
 
I consider myself a bang-for-the-buck kind of guy. If I can get better performance for less (or more FPS per dollar), I consider that the better choice. I can handle some games needing time to get SLI support. I can even handle a game here or there never getting SLI support, since even a single 980Ti or single 1080 would still be OK. 
 
 
2016/09/19 11:43:36
Iluv2raceit
cmoney408
He's referring to the fact that the idealized DX12 future stacks VRAM in SLI, so two 1080s would give you a total of 16GB of USABLE VRAM. 


DX12 does not necessarily stack VRAM in every scenario.  It depends on the specific hardware configuration as well as how the game itself is coded.
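The distinction being drawn here can be sketched as a toy model: under classic SLI (alternate-frame rendering) every card mirrors the same resources, so usable VRAM is just one card's worth; only a game explicitly coded for DX12 multi-adapter can place resources on each GPU's memory independently. The function name and card list below are illustrative, not from any benchmark:

```python
# Toy model of effective usable VRAM (GB) under two multi-GPU modes.
# "afr_sli" = classic SLI alternate-frame rendering (resources mirrored);
# "explicit_multi_adapter" = a DX12 title coded to address each GPU's
# memory heap separately. Purely illustrative numbers.
def usable_vram(cards_gb, mode):
    if mode == "afr_sli":
        # Every card holds a copy of the same data, so the effective
        # pool is limited by the smallest card.
        return min(cards_gb)
    if mode == "explicit_multi_adapter":
        # Each GPU's memory is independently addressable, so the
        # pools add up - but only if the game is coded for it.
        return sum(cards_gb)
    raise ValueError(f"unknown mode: {mode}")

print(usable_vram([8, 8], "afr_sli"))                 # -> 8
print(usable_vram([8, 8], "explicit_multi_adapter"))  # -> 16
```

This is why the "16GB usable" figure for two 1080s is conditional: the hardware configuration sets the ceiling, but the game's code decides which mode actually applies.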
2016/09/19 11:46:26
Iluv2raceit
cmoney408
I feel like I'm in the minority, but I LOVE SLI. Either I've never had microstutter, or my eyes just don't pick it up. I play mostly AAA titles (CoD, GTA, Far Cry, Doom, JC), so most of them luckily support SLI either at release or within a month. The worst was Just Cause 3, but I heard it was just bad all around for everyone in the beginning. I was trying to run it on a 295X2 at the time - wasn't happening.
 
I had a 980Ti hybrid SLI setup, but sold one in June, thinking I was going to go with a 1080 SLI setup. But stock was an issue, and now I don't care as much. At this point, my single 980Ti will hold me over until the 1080Tis come out. I may even pick up another 980Ti now that they are dropping a bit in price.
 
I consider myself a bang-for-the-buck kind of guy. If I can get better performance for less (or more FPS per dollar), I consider that the better choice. I can handle some games needing time to get SLI support. I can even handle a game here or there never getting SLI support, since even a single 980Ti or single 1080 would still be OK. 
 
 


I would definitely recommend you pick up a 1080 rather than 'go backwards' and buy a GTX980Ti.  This is especially true if you ever plan to invest in VR in the near future.  Pascal GPUs are much better optimized for VR than the older-generation Maxwell GPUs.
2016/09/19 11:59:24
cmoney408
Iluv2raceit
 
DX12 does not necessarily stack VRAM in every scenario.  It depends on the specific hardware configuration as well as how the game itself is coded.



That's why I said "idealized". In a perfect world one could be so lucky as to have stacked VRAM! 
 
If I bought another 980Ti hybrid, it would only be because I got it at $300 or less - a price I could nearly break even on in 6-8 months when the 1080Ti comes out. 
 
I truly can't wait for finished VR. Personally, I think we are still in the early stages. I probably won't invest in VR for at least 18 months, maybe even 2 years. I have usually been an early adopter, but after years of it, I have finally realized it's disappointing. I'm tired of paying for half-finished products. I will most likely wait for Rev 2 or 3 products: better resolution, faster refresh rates, possible standards, and better prices. 
