nycx360: I have 2 Strix OCs and only 1 FTW3 at the moment. I was going to flash the Strix BIOS today, but this BIOS saves me the hassle, so it's much appreciated. Now they are both 450 watt. A 3090 BIOS would be nice, since the Strix is 480 on that; however, I am still power limited. I would say the FTW3 cooler does a better job at 100% fan than the Strix cooler does, even though by looks and build the Strix OC looks more beefy and effective. I think the FTW3 card has some catching up to do compared to the Strix, but it is 40 bucks less at 809 vs 849. Hopefully we can get a true XOC BIOS.

ajreynol (replying to nycx360): Oddly, I'm in a similar boat. I accidentally ended up with a FTW3 Ultra and a Strix OC trying to rush through a checkout through the Newegg app last week. Now, with performance likely being functionally equal at worst (I haven't tried out the FTW3 yet, but I will), I'm left trying to decide how much value I place on the second HDMI 2.1 port. The Strix OC is beautiful, no doubt, but from the side both cards are beautiful. The FTW3 provides a level of peace of mind via Step Up that the Strix simply cannot, but one less HDMI port is a hard limit that I've been meditating over since the card configurations were announced. If I decide to do a 4K 120Hz TV AND a 4K high-refresh monitor next year, I'd have to buy some sort of HDMI switch or manually swap cables when I want to use one vs the other, whereas with the Strix I could have both plugged in and not worry about it at all. I guess I'll take another day or two to decide whether I prefer 2x HDMI ports or the potential Step Up to a 3090 or similar in the next 3 months. Man, if the EVGA cards had 2 HDMI ports like all the Asus and Gigabyte cards do, this would have been the easiest decision ever. SIGH.

Wallzii (replying to ajreynol): Why do you need two HDMI 2.1 ports? Wouldn't DisplayPort give you everything you need for monitor connectivity?

ajreynol (replying to Wallzii): No, DP doesn't have the bandwidth that HDMI 2.1 does. For anything above 4K 120, monitors have to reduce color bit depth, dropping image quality to compensate for the bandwidth limitations of DP, or use a compression algorithm that costs some quality. HDMI 2.1 doesn't have to make such compromises. As such, the question is one of future purchase plans more than current ones. At some point I'll want a 4K 120Hz TV and a monitor that goes even higher... or even something that can manage 8K. We expect quite a few HDMI 2.1 monitors to be announced at CES 2021 in just a couple of months, and for those, DP won't be ideal if you want to run them at their highest bitrate settings. The question is whether the day I own (or want to own) both a TV and a monitor capable of better performance than DP can offer comes before the 4000-series cards are out (that is, within the next 2 years). Other brands made a conscious decision to future-proof in this area: all the Gigabyte and Asus cards have 2 or more HDMI 2.1 ports. I'll be meditating on that question today as I compare my FTW3 Ultra to my Strix. With this BIOS update, I expect them to be equivalent in every performance metric (unlucky bin notwithstanding). So it's just 2 HDMI ports vs the likelihood that I initiate a Step Up to a 20GB 3080 or a 3090 between now and January. It's a good problem to have, but I often find myself struggling with analysis paralysis. Hopefully this doesn't end up being one of those times.
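Sanity-checking the bandwidth claim above: a back-of-the-envelope sketch in Python. The blanking overhead factor is an approximation (exact CVT-RB timings vary by mode); the link rates are the published DP 1.4 and HDMI 2.1 maxima.

```python
# Rough display-bandwidth check: DP 1.4 vs HDMI 2.1 for 4K 120 Hz 10-bit RGB.

def required_gbps(h, v, hz, bpc, blanking=1.12):
    """Uncompressed RGB bandwidth in Gbit/s, with an approximate blanking overhead."""
    return h * v * hz * (bpc * 3) * blanking / 1e9

# Effective payload after line encoding:
DP14_GBPS   = 32.4 * 8 / 10    # HBR3 x4 lanes, 8b/10b  -> 25.92 Gbit/s
HDMI21_GBPS = 48.0 * 16 / 18   # FRL 4x12 Gbps, 16b/18b -> 42.67 Gbit/s

need = required_gbps(3840, 2160, 120, 10)
print(f"4K 120 Hz 10-bit RGB needs ~{need:.1f} Gbit/s")   # ~33.4
print(f"DP 1.4:   {DP14_GBPS:.2f} Gbit/s ->", "fits" if need <= DP14_GBPS else "needs DSC or chroma subsampling")
print(f"HDMI 2.1: {HDMI21_GBPS:.2f} Gbit/s ->", "fits" if need <= HDMI21_GBPS else "does not fit")
```

The arithmetic bears out the post: roughly 33 Gbit/s needed against DP 1.4's ~26 Gbit/s payload, while HDMI 2.1's ~43 Gbit/s clears it uncompressed.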
nycx360 (replying to ajreynol): The bin on the FTW3 I got is very similar to the better Strix I got; I don't think either is binned. I find that most of the time, launch cards have better lottery chances. For the many who are asking for more power: in most cases, temps will smack your clocks before power does. Using a box fan blowing 50s-F outside air, I can maintain between 2175-2160 on my FTW3 in a bench run, and 2160-2145 on the better Strix (it runs a bit warmer at 100% fan). Awaiting blocks; with blocks I will just shunt both cards. I don't think every card will be power limited. My 2nd Strix never hit the power limit and will not clock higher than the 2080-2100 range.
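The cold-air numbers above track with how GPU Boost behaves: the core steps down roughly one bin (about 15 MHz) every few degrees as the die warms. A toy illustration, with invented bin boundaries since the exact table varies per card:

```python
# Toy GPU Boost temperature-bin model (bin start, width, and step are illustrative).

def boost_clock(cold_boost_mhz, temp_c, bin_start_c=35, bin_width_c=5, step_mhz=15):
    bins = max(0, (temp_c - bin_start_c) // bin_width_c)  # bins dropped so far
    return cold_boost_mhz - bins * step_mhz

for t in (40, 55, 70):
    print(f"{t} C -> ~{boost_clock(2175, t)} MHz")
# 40 C -> ~2160 MHz, 55 C -> ~2115 MHz, 70 C -> ~2070 MHz
```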
bkhan530: This is what makes EVGA one of the best AIBs. Most others would have ignored their customers, but EVGA listened and delivered a new BIOS. Good job guys, keep it up!

bcavanagh (replying to bkhan530): Kind of an odd statement. If they had just made the power limit what was promised to begin with, they wouldn't have needed to do this. Surely doing it right from the start is better.
jankerson: Well, first off, thanks a bunch. However, I must have the worst piece of silicon I could get... ZERO, NADA, ZIPPO improvement in clocks. My card will not do 2100, period; no matter what, it will not do it... I even lost 10 MHz on the clock in Port Royal and Superposition. This is depressing... I gained nothing... I think I got a bum card...

ty_ger07 (replying to jankerson): As I posted on the previous page: "An increased power limit won't make a previously unstable frequency suddenly stable. Never will. Never, ever. That is a silicon-quality thing, and it also depends on voltage, temperature, signal integrity, and all sorts of other things that a power limit can't fix. An increased power limit affects how much the GPU will boost and how soon it will start tapering off to lower boosts. That's it. I think you will see that at the SAME overclock which was previously stable, your benchmark scores will be higher. Not because the maximum stable frequency is higher, but because the average frequency is higher."
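That distinction — average versus maximum stable clock — can be made concrete with a toy model. The watts-per-MHz figure below is invented purely for illustration:

```python
# Toy model: a higher power limit raises the *sustained* clock under heavy load,
# while the maximum *stable* clock is fixed by the silicon.

MAX_STABLE_MHZ = 2100              # silicon limit: unchanged by any power limit

def sustained_mhz(watts_per_mhz, power_limit_w):
    power_bound = power_limit_w / watts_per_mhz   # clock the power cap allows
    return min(MAX_STABLE_MHZ, power_bound)

heavy_load = 0.20                  # ~0.20 W per MHz at 4K (made-up number)
for limit_w in (400, 450):
    print(f"{limit_w} W cap -> sustains ~{sustained_mhz(heavy_load, limit_w):.0f} MHz")
# 400 W cap -> sustains ~2000 MHz (power-limited)
# 450 W cap -> sustains ~2100 MHz (silicon-limited)
```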
TheGuz4L: While I am personally waiting for the full release before applying this, just to clear up some misconceptions about what this will do for your card: your highest stable overclock will probably not change. What will change is the sustained clock speed throughout. For instance, my card does 2100 -> 2070 (depending on how high the temp gets) rock stable. However, once I hit the 400W power limit, my card has to sacrifice those boost clocks and drops me down to 1985-1999 or so. In theory, with another 50W the card won't hit that power wall and my 2100-2070 clocks will stay stable throughout, which gives an overall average frame boost when gaming for long periods.

If you are running at 1440p or 1080p, you may not even be hitting the power limit wall and won't notice much difference. This really shines when you're running at 4K or upscaled 4K. In CoD Warzone at 1440p, I never hit higher than 370W or so, and my clocks always stay at 2100-2085. Once I cranked the resolution up to 4K, it hit the power limit and my clocks dropped to 1985-1999 as mentioned.

One question, though: I don't mess with the voltage slider. Has this helped anyone get higher clocks? I find these chips are very unstable past 2100 MHz. I am able to run benchmarks at 2150+, but that's about it.
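If you want to check whether your own card is actually bumping into the cap at a given resolution, here is a minimal monitoring sketch using NVIDIA's NVML Python bindings. It assumes the nvidia-ml-py package is installed and the card is device 0:

```python
# Log power draw vs. the enforced limit once a second while you play.
import time
from pynvml import (nvmlInit, nvmlDeviceGetHandleByIndex, nvmlDeviceGetPowerUsage,
                    nvmlDeviceGetEnforcedPowerLimit, nvmlDeviceGetClockInfo,
                    NVML_CLOCK_GRAPHICS)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
limit_w = nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000   # mW -> W

for _ in range(120):                                    # ~2 minutes of samples
    draw_w = nvmlDeviceGetPowerUsage(gpu) / 1000
    core_mhz = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)
    flag = "  <-- at the power limit" if draw_w >= 0.98 * limit_w else ""
    print(f"{draw_w:5.0f} W / {limit_w:.0f} W  @ {core_mhz} MHz{flag}")
    time.sleep(1)
```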
Reedey: No matter what I do with this BIOS, my card still seems to peg itself at 400W. Max power, max voltage, overclocked or not: 400W is all she will do.
jedixjarf: Any way we can get the actual BIOS ROM files?
Desaccorde: So, what about the temperatures under humanly acceptable fan speeds? Does the air cooler hold up, or do we need at least the Hybrid Kit?
ajreynol: And now that I've read about the Ampere + TSMC news... I'm getting the feeling I won't be able to use Step Up for a 20GB 3080. Rumors had put such a launch in December, but if they're shifting the entire process to TSMC (they are), I can't imagine those cards will be available until maybe February/March, well beyond the Step Up window for my current card. So now the question is simply: 2 HDMI ports (Strix) vs a possible Step Up to a 480W Samsung-based 3090 (EVGA). And either way... will I be looking to sell my current card to get a superior TSMC variant? Cooler, quieter cards with possibly lower power draw.
Verdalix: Here's a comparison on my setup, stock vs beta BIOS. I kept the offsets the same for each BIOS and maxed out the power/voltage limit on each. The clock was stable at a 2114 MHz average, compared to 2088 MHz previously. Results look promising and temperatures are looking good (all fans maxed out for these runs). Will see if I can overclock it a bit more and keep it stable. Core clock offset: +190. Mem clock offset: +1150.
TheGuz4L (replying to Verdalix): Is your memory really working that high? Once I go past +600 MHz, my FPS drops. It of course "works", because it's ECC RAM now, but I lose frames. Curious whether your score would be higher if you tried lower memory speeds.
bloodshot45: [GPU-Z screenshot attachment]
NexusSix (replying to TheGuz4L): It can't be; I'm beating his score with +130 on the core and +900 on the memory (above +900 it flakes out and the protection starts kicking in). For reference, I got 12486 with +130/+900. 70 more MHz on the core should be worth at least 100 points more than that, I'd imagine.
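A plausible mechanism for the FPS drop past a certain offset: GDDR6X detects failed transfers and retries them, so effective bandwidth can fall even though nothing visibly breaks. A toy model with entirely invented numbers:

```python
# Toy model: a memory overclock "works" past the sweet spot but loses throughput,
# because the retry rate climbs faster than the raw clock gains. Numbers invented.

def effective_bandwidth_gbs(offset_mhz):
    raw = 760 + offset_mhz * 0.08                         # rough linear scaling
    retry = 0.0 if offset_mhz <= 600 else (offset_mhz - 600) * 0.0005
    return raw * (1 - min(retry, 0.5))

for off in (0, 300, 600, 900, 1150):
    print(f"+{off:4d} MHz -> ~{effective_bandwidth_gbs(off):.0f} GB/s effective")
# Peaks around +600 in this model, then declines despite the higher clock.
```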
Dabadger84: I'm curious: why is it that some people's GPU-Z is showing GPU power numbers in watts... and mine looks like this? And right as I'm posting this, a GPU-Z update came up; mayhaps that'll fix it. Edit: see bloodshot45's screenshot earlier for what I'm talkin' about.
arestavo (replying to Dabadger84): Select "Nvidia BIOS" instead of "General" at the top.
ty_ger07 (replying to Reedey): Did you power cycle your computer?