Sajin wrote:
> rjohnson11: Try some steps posted here: #1 – Nvidia Graphics and Performance Fix – FPS Drops, Stuttering and Frame Skipping. The easiest way for Nvidia users to run the game on optimum settings and avoid all these issues is to install the driver and choose the Watch Dogs profile.

Uh... no.
Ntrain96 wrote:
> rjbarker: Okay... so I finally spent around 15-20 minutes last night optimizing settings to find the "sweet spot" with the game. Found pretty good settings for both looks and smooth gameplay; the hitching/stuttering does still appear every now and then, but certainly not to the point that it degrades the game in any way (IMHO). Playing at 2560x1440:
> - All Ultra (changed shadows and reflections from Ultra to High)
> - Textures: High
> - Motion Blur: Off (I've always hated this feature)
> - AA: MSAA x2 (or x4) - using TXAA seems to increase stuttering for sure
> - Vsync: Off
> - Frame Buffering: left at 3 (anyone play with this setting?)
> That's about it... I certainly don't see the justification for all the complaints in this thread. Wow! Would 6GB of VRAM let me move from MSAA x2-4 to TXAA x2-4? No idea, and I don't care; the visual difference wouldn't be noticeable to me, and this late in the Kepler generation I wouldn't trade my 780 Tis for another Kepler card. Really guys, tweak your settings a bit. I'm sure once a patch or two comes out, and perhaps the next driver, the stuttering will be reduced; I saw the same thing and the same arguments (VRAM requirements) when Tomb Raider was released. As for the PS4, enough already - I don't think it even runs this game at true 1080p, does it? It seems to be less, while many of us are going for 1440p, 1600p or Surround!

Here, I consider myself the utmost expert on settings in this game currently. At 1440p, put textures down to Medium, max out all the other settings, Vsync off, GPU rendering to 1, and AA at 1440p should be Temporal SMAA at most; regular SMAA is even a bit better. You just don't need as much AA with a 1440p screen, so keep it at Temporal SMAA or lower.
warrior10 wrote (replying to Ntrain96):
Ego much? I don't find you credible anymore, so I'll just finish this by saying that everything from EVGA, Nvidia and Ubisoft confirms what I have said. More memory does help, and you keep denying the facts.
I don't claim to be a know-it-all like you, nor will I ever, because I'm not. But I know I'm right about this. Yes, Ubisoft needs to put out a patch, but I don't see them dumbing down the game just so PC people can play at Ultra levels either. Like Ubisoft has said, you pretty much need 4GB to play this game well, along with a good GPU. By the way, PS4s can handle some things in 4K. What is not known is whether games will support it, but people can watch movies and do other things in 4K, and you still say the PS4 is weak, LOL. Again, I'm just comparing how the PC is doing things with how the next-gen systems are. Yeah, I know you will argue things to death, even though when it comes to the memory you have been proven wrong all along by high-ranking officials of EVGA and Ubisoft. I will most likely buy Watch Dogs for the 800-series GPUs. Thanks for attacking me while I've been at the hospital. :)
yasser74 wrote:
> warrior10: Is that why the PS4 can play games like Watch Dogs smoothly and most people on PCs can't? I can't waste my time arguing with a person who knows nothing about next-gen console specs, or doesn't bother to learn that their GPUs are more powerful too. All you're doing is arguing against all the proof that I and other people have put up to back up my claims and theirs. I can't and won't argue with someone who is in deep denial and just argues for argument's sake, because my IQ falls about 50 points. You're not worth the time of day.
>
> Reply: You are not going to argue because you have nothing to argue with. Consoles run this game "smoothly" because:
> 1. the game runs at 900p at most;
> 2. the game runs at High textures/details only;
> 3. the console's graphics hardware has shared memory and a different architecture than a PC.
> These three points are enough for this game to run smoothly, because it was programmed specifically for consoles in the first place, then ported to PC and not optimized well. Stop lying to yourself; if you want to be a console fanboy, you should go to other forums. Consoles are a cheap gaming platform for non-technical people, and since most people in the world are not technically skilled, the console market is a great opportunity for game developers, so that's the market they focus on. There is nothing about them that is superior to today's high-end PC, period.

I agree with what you said about Watch Dogs playing better on PS4 because of the three reasons you gave, but don't you need to be able to play a game smoothly regardless? I have a PS4 and a PC, but I am really considering getting the game on PS4, because I really can't play it on PC with GTX 780 Ti SC in SLI without stuttering and frame-rate drops! I just want to play the game. :(
warrior10 wrote:
A lot of people have higher-end monitors. It's not that simple. Whether some people believe it or not, this is a new generation of gaming, where Nvidia and ATI/AMD GPUs WILL NEED more gigabytes of VRAM, and next year will need better processing power. Ntrain, you will never know more than me; I've been doing computers for over 30 years. I just heard from E3 that Microsoft WILL be making their current GPUs more powerful, and the video power of a PS4 can go a lot higher too, but it is not known if Sony will use it. Also, the PS4 doesn't use the 8700 GPU; the PS4's chip is just made on the same process, but it is an advanced custom APU. I'm waiting for the 800 series myself, biding my time, and will buy Watch Dogs then. Unless you have a 770 with 4GB or a 780 with 6GB, you can't play the game with everything on full Ultra settings. Even the people who have posted YouTube videos claiming they can - when they go into their settings, I've seen them downgrade things. It will really be something to see the gigabytes, processing power and number of CUDA cores the 800 series will have. And if a patch does fix the VRAM issue, then that's great too.
OV3RCLK4 wrote:
1920x1080
Temporal SMAA or lower
High textures
Motion Blur off
Everything else on Ultra

With 780 Ti SLI and a 1080p 120Hz monitor, those are the settings you should be using at the moment. You should average well over 100 fps stutter-free; I get an average of 70-80 fps with a single 780 Ti, and you have two. Motion blur is for low fps. You wouldn't want to use motion blur at 120Hz; it defeats the purpose of having a high refresh rate.
warrior10 wrote (replying to OV3RCLK4):
I don't think Watch Dogs supports SLI or CrossFire yet; I don't remember for sure, because so many people have gone back and forth on it. But I'm fairly sure Watch Dogs isn't SLI-compatible yet, so that point would be moot. Unfortunately, people will have to turn down settings for now. Hopefully the patch will help without losing the optimizations that are/were good. It doesn't help, though, like I said, if people have above-1080p monitors; people with better monitors shouldn't be forced into dumbing down the game. I would like to take a brief chance to apologize for bringing up the PS4 stuff, but I know from what I've seen of upcoming games that GPUs will need more VRAM for good eye-candy games, and the games will require more hard drive space. Sorry.
AFTER 8 HRS - It's the software, not your hardware

I've tweaked, modded and manipulated my computer and the game, and so far this is the unbelievable result:

1) This game gets better frame rates, with no settings changed, with SLI disabled.

2) Changing from Ultra to High on almost every setting results in no better performance; oddly, it sometimes performs better on Ultra.

3) MSAA/TXAA/FXAA settings have little to no impact on performance. Turning them off and forcing MSAA/CSAA etc. in the driver tends to look the same or better, with higher FPS.

4) The game's writing to GamerProfile.xml is bugged. It will sometimes write the "CustomQuality" tag twice, with two different sets of settings. Initially the tag isn't even opened: when the file is first created on install, there is a /CustomQuality tag, which in code is a closing tag, but no opening CustomQuality tag (no slash). Settings changed in-game are not always correctly reflected in the file. RAWCPUScore changes randomly; at one point mine said 800-something, then later 4095-something. This makes little sense. I have a 3570K @ 4.5GHz.

5) The "RenderProfile" and "CustomQuality" tags both contain settings that reflect user choices. RenderProfile contains settings consistent with a blanket preset (High, Ultra, etc.), and CustomQuality contains the settings that differ from the preset based on user choices. The problem is that when you select something, the file often ends up with conflicting, redundant settings, which appear wrong under RenderProfile and correct under CustomQuality. MSAA is an example I've seen multiple times now.

6) After hours of monitoring different settings and seeing little performance difference and constant stutters, my Afterburner log shows a clear pattern: when the game lags, sticks, freezes and stutters, CPU, GPU, VRAM and RAM usage all drop to near nothing. This means the program isn't doing anything.

Please note: I am a programmer. I can say with certainty this means the program isn't issuing any instructions; it is temporarily non-responsive, resulting in stuttering, temporary freezing, etc. This isn't a memory, CPU or GPU issue; the program "pauses" for some reason. It should crash, but doesn't.

7) For me, turning off SLI yielded about 5-10 FPS more, but it was as stuttery as before.

8) There is a *mild* FPS increase going from driver 337.50 to 337.88, but the stuttering is the same.

9) Installing TheWorse mod and maxing settings looks great and performs about 5 FPS worse than not having it installed.

10) Turning MSAA off in-game entirely and forcing it in the driver helps, even if you use SLI MSAA 32xQ. In fact, on that setting I was able to achieve a steady 75 FPS until it would stutter. It's as though the harder you push your system with this game, the higher the FPS.

The only consistency I can see, after testing the SAME EXACT AREA over and over, is the stuttering. It happens consistently, regardless of settings, and every time, as said, CPU, GPU, RAM and VRAM usage drops to near nothing. This is a problem with the program and its resource utilization, and nothing will correct it until the developers fix it themselves. This isn't Nvidia or AMD; it's Ubisoft. And yes, I am a developer/programmer who can verify this isn't a hardware issue.
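The two corruption modes described in point 4 (a closing /CustomQuality tag with no opening tag, and a duplicated CustomQuality block) can be checked mechanically. A minimal sketch, assuming you paste in the contents of your own GamerProfile.xml; tag names come from the post, and Python's standard XML parser will reject the unmatched closing tag on its own:

```python
import xml.etree.ElementTree as ET

def check_profile(xml_text):
    """Report the two corruption modes described above: a file that is
    not well-formed (e.g. </CustomQuality> with no opening tag), and
    more than one CustomQuality block."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return ["not well-formed: " + str(e)]
    problems = []
    quality_blocks = root.findall(".//CustomQuality")
    if len(quality_blocks) > 1:
        problems.append(f"{len(quality_blocks)} CustomQuality blocks (expected 1)")
    return problems

# A duplicated-block profile, as described in the post:
bad = "<Profile><CustomQuality/><CustomQuality/></Profile>"
print(check_profile(bad))  # -> ['2 CustomQuality blocks (expected 1)']
```

In practice you would read the real file with `open(path).read()`; the structure above is just the shape the post describes, not Ubisoft's full schema.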
I've included my GamerProfile.xml file, as well as my Nvidia Inspector settings, in the Dropbox link below. Note that changes made with Inspector will be overwritten if you change them in Nvidia Control Panel AFTER applying the profile in Inspector. Nothing changed in Inspector is a setting that isn't also accessible via NCP, so I also included screenshots of my Control Panel settings if you prefer not to use Inspector. I have also disabled the Intel GPU, as it is connected to my TV and disabling it seems to improve performance a tiny bit. It's not in use during gameplay, so no harm done.

https://www.dropbox.com/sh/tr21vuyde...ahzJNLzw65zjWa

BE SURE TO EDIT GAMERPROFILE.XML TO ADJUST YOUR PROPER RESOLUTION AND REFRESH RATE - the resolution should be set in the CustomQuality area. My performance is significantly better with these settings, though I should be able to max the game out; I am using High textures. If for some reason the game doesn't load High by default (it seems to randomly select Ultra textures), you can change this, then load your game, and the rest of the settings will apply. I am getting 65-85 FPS, but I still have minor stuttering, mostly when driving, though the "strength" of the stutter has decreased a LOT.

NOTE: I am NOT using TheWorse mod at this time, but in my experience it doesn't significantly contribute to the stuttering issue, though it does cost some FPS.

The primary factors in improving performance, in my opinion, were as follows:

1) High vs. Ultra textures - honestly, at 1080p it's VERY hard to tell much of a difference except when you get very close to plants, and I don't GAF what plants look like.

Before anyone verbally assassinates me, let me point out: I am aware the PCCustomTextureQuality setting in the XML file is set to Medium. This is actually correct for HIGH, because there are only three settings in the game menu: Medium, High and Ultra. Setting the XML value to High or Ultra actually sets the in-game setting to Ultra.
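That off-by-one naming (an XML value one step below the label the in-game menu shows) is easy to trip over when hand-editing the file. A note-to-self sketch of the mapping exactly as the post describes it; what XML value, if any, yields in-game Medium isn't stated, so it's left out:

```python
# Mapping described in the post: the XML PCCustomTextureQuality value
# sits one step below the label the in-game menu displays for it.
XML_TO_INGAME = {
    "medium": "High",   # the recommended setting
    "high": "Ultra",    # per the post, High and Ultra both
    "ultra": "Ultra",   # read back as Ultra in-game
}

def ingame_texture_label(xml_value):
    """In-game texture label shown for a given XML value."""
    return XML_TO_INGAME[xml_value.lower()]

print(ingame_texture_label("Medium"))  # -> High
```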
2) Shader buffer (NCP setting) - based on Afterburner monitoring, disabling it reduced commit size by about 5GB. (This is improperly referred to as "Page File" in AB; commit size can be viewed in the "Performance" tab of Task Manager in Win 8/8.1, and is the amount of reserved system memory combined with reserved virtual memory. I have 16GB of 1866 RAM and SSDs, so I don't use a page file and therefore have no "virtual" memory, but RAM usage dropped from nearly 11GB to about 6-7GB.) It also took better advantage of cores 2/3/4 (now 85% vs. 65% usage) and noticeably reduced stuttering.

3) DX11MultithreadedRendering (XML file) - improved consistency in SLI performance. GPU usage percentages now match, and average FPS rose about 5-10 with this change.

4) HDR (XML file) - set to 0 (off). No apparent difference in quality, but it marginally reduced GPU VRAM usage as well as stuttering.

5) Vsync (XML file) - set to "off" (not 0). I did this because the FPS dips were consistent with Vsync adjustments. If Vsync sees your FPS drop too far, it drops your effective refresh to a value that is a factor of your default refresh and caps your FPS to match. In other words, if your GPU dips more than about 10 FPS below an ideal 60Hz/60FPS Vsync, it will cut your Hz/FPS to 30, and then to 15, and so on. Changed to "off", it still registers as "off" in-game (as opposed to "undefined" with an invalid value), and the dips are much smaller now.

6) Texture Filtering (NCP) - Quality set to High Performance (no apparent loss of quality in-game), combined with Threaded Optimization "on", resulted in a 10+ FPS increase and significantly reduced stuttering.
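The halving behaviour described in point 5 can be written out explicitly. This is a toy model of the 60 -> 30 -> 15 cascade as the post describes it, not Ubisoft's or the driver's actual algorithm: Vsync caps the frame rate at the largest value refresh/2^k that the GPU can still sustain.

```python
def vsync_cap(refresh_hz, sustainable_fps):
    """Largest refresh_hz / 2**k cap that the GPU's sustainable
    frame rate can still meet (toy model of the halving cascade
    described above)."""
    cap = refresh_hz
    while cap > sustainable_fps and cap > 1:
        cap //= 2
    return cap

print(vsync_cap(60, 52))  # GPU only manages 52 fps -> capped at 30
print(vsync_cap(60, 29))  # 29 fps -> capped at 15
```

This is why turning Vsync off can smooth out the dips: a GPU hovering around 50 fps renders at 50 fps uncapped, instead of being pinned to 30.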
From everything I've gathered, this isn't a graphics issue, not really anyway. It's an issue with the game failing to take full advantage of available memory. My VRAM rarely goes above 2.5GB usage, yet I have 4GB cards. My system RAM usage, game and system included, never went above 4GB; I have 16GB. When shader buffering is enabled on Nvidia, the commit size goes to over 12GB; when it's disabled, it never goes above 7.5GB (normal) and CPU usage only increases about 15%. This means the game is piling data into the shader buffer, but it isn't clearing the shader buffer, and it isn't using extra RAM for the game itself once its allocation has been maxed. What should happen is that the shader buffer reaches a maximum, gets cleared from VRAM and refilled from RAM (as you leave/enter an area), with further buffering offloaded to RAM as needed. It should exchange data between RAM and the card, buffering into RAM before it's needed. Instead, it's clearing the RAM (the commit of the shader buffer) and the card at the same time, then reloading the RAM, then the card again.
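The "usage drops to near nothing" signature reported throughout this post could be pulled out of a monitoring log automatically rather than eyeballed. A rough sketch; the column names and sample data are mine, not MSI Afterburner's actual log format, which you would parse first:

```python
def stutter_windows(samples, threshold=5.0):
    """Return indices of samples where CPU, GPU, RAM and VRAM usage
    (percent) all drop below `threshold` at once -- the idle signature
    described above, where the game stops issuing work rather than
    running out of any one resource."""
    keys = ("cpu", "gpu", "ram", "vram")
    return [i for i, s in enumerate(samples)
            if all(s[k] < threshold for k in keys)]

log = [
    {"cpu": 62, "gpu": 95, "ram": 40, "vram": 55},  # normal load
    {"cpu": 1,  "gpu": 2,  "ram": 3,  "vram": 1},   # stutter: everything idle
    {"cpu": 60, "gpu": 93, "ram": 41, "vram": 54},
]
print(stutter_windows(log))  # -> [1]
```

Requiring all four counters to drop together is the point: a VRAM shortfall would show VRAM pinned high while FPS falls, not everything going quiet at once.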