2020/10/15 05:46:27
nycx360
ajreynol
Wallzii
ajreynol
nycx360

I have 2 Strix OCs and only 1 FTW3 atm. I was going to flash the Strix BIOS today, but this BIOS saves me the hassle, so it's much appreciated, now that they are both 450 watts. A 3090 BIOS would be nice, since the Strix is 480 on that. However, I am still power limited. I would say the FTW3 cooler does a better job at 100% fan than the Strix cooler does, even though by looks and build the Strix OC looks more beefy and effective. I think the FTW3 card has some catching up to do compared to the Strix, but it is 40 bucks less at $809 vs $849. Hopefully we can get a true XOC BIOS.


Oddly, I'm in a similar boat. I accidentally ended up with a FTW3 Ultra and a Strix OC trying to rush through a checkout through the Newegg app last week.


Now, with performance likely being functionally equal at worst (I haven't tried out the FTW3 yet, but I will), I'm left trying to decide how much value I place on the second HDMI 2.1 port. The Strix OC is beautiful, no doubt, but from the side both cards are beautiful. The FTW3 provides a level of peace of mind via Step Up that the Strix simply cannot. But one less HDMI port is a hard limit that I've been meditating on since the card configurations were announced.

If I decide to get a 4K 120Hz TV AND a 4K high-refresh monitor next year, I'd have to buy some sort of HDMI switch or manually swap cables whenever I want to use one vs the other with the card. Whereas with the Strix, I can have both plugged in and not have to worry about it at all.

I guess I'll take another day or two to decide if I prefer 2x HDMI ports vs the potential Step Up to a 3090 or similar in the next 3 months. Man, if the EVGA cards had 2 HDMI ports like all the Asus and Gigabyte cards do, this would have been the easiest decision ever. SIGH.


Why do you need two HDMI 2.1 ports? Wouldn't DisplayPort give you everything you need for monitor connectivity?

No, DP doesn't have the bandwidth that HDMI 2.1 does. For anything above 4K 120, monitors that offer it require the color bit depth to be reduced, dropping image quality to compensate for DP's bandwidth limitations, or they have to use a compression algorithm that reduces quality somewhat.
 
HDMI 2.1 doesn't have to make such compromises. As such, the question is more one of future purchase plans than current ones. At some point, I'll want to have a 4K 120Hz TV and a monitor that goes even higher... or even something that can manage 8K. We expect quite a few HDMI 2.1 monitors to be announced at CES 2021 in just a couple of months. For those, DP won't be ideal if you want to maximize their performance at their highest-bitrate settings. The question is whether the day when I own (or want to own) both a TV and a monitor capable of better performance than DP can offer comes before the 4000-series cards are out (i.e., in the next 2 years). Other brands made the conscious decision to future-proof in this area, as all the Gigabyte and Asus cards have 2 or more HDMI 2.1 ports.
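For what it's worth, the bandwidth math behind that claim can be sketched in a few lines. The ~8% blanking overhead and the effective link rates are rough assumptions on my part, not exact spec timings:

```python
# Rough uncompressed video bandwidth estimate.
# Assumes ~8% blanking overhead (reduced-blanking timings); real timings vary.
def gbps(width, height, hz, bits_per_channel, overhead=1.08):
    return width * height * hz * bits_per_channel * 3 * overhead / 1e9

DP14_EFFECTIVE = 25.92    # DP 1.4 HBR3 payload rate after 8b/10b encoding, Gbps
HDMI21_EFFECTIVE = 42.67  # HDMI 2.1 FRL payload rate after 16b/18b encoding, Gbps

for bpc in (8, 10):
    need = gbps(3840, 2160, 120, bpc)
    print(f"4K120 {bpc}-bit RGB needs ~{need:.1f} Gbps: "
          f"DP 1.4 {'fits' if need <= DP14_EFFECTIVE else 'needs DSC or chroma subsampling'}, "
          f"HDMI 2.1 {'fits' if need <= HDMI21_EFFECTIVE else 'needs DSC'}")
```

Under these assumptions, 4K120 at 8-bit just squeezes into DP 1.4, but 10-bit does not, which is where HDMI 2.1's extra headroom matters.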
 
I'll be meditating on that question today as I compare my FTW3 Ultra to my Strix. With this BIOS update, I expect them to be equivalent in every performance metric (unlucky bin notwithstanding). So it's just 2 HDMI ports vs the likelihood that I initiate a Step Up to a 20GB 3080 or a 3090 between now and January. It's a good problem to have, but I often find myself struggling with analysis paralysis. Hopefully this doesn't end up being one of those times.


The bin on the FTW3 I got is very similar to the better Strix I got. I don't think either is binned. I find that most of the time, launch cards have better lottery chances. For many who are asking for more power: temps will smack your clocks before power does, in most cases. Using a box fan blowing 50s F outside air, I can maintain between 2175-2160 on my FTW3 in a bench run, and 2160-2145 on the better Strix (it runs a bit warmer at 100%). Awaiting blocks. With blocks I will just shunt both cards. I don't think every card will be power limited. My 2nd Strix never hits the power limit and will not clock higher than the 2080-2100 range.
2020/10/15 05:56:00
Reedey
No matter what I do with this BIOS, my card still seems to peg itself at 400W. Max power, max voltage, overclocked or not: 400W is all she will do.
 
Last week, on the BIOS the card came with, I could get clean runs with +140 on the core using a portable A/C unit, but now I can't get a clean run in Time Spy with any kind of overclock at all; that's with the original BIOS as well as this one. It still runs games reliably with modest overclocks, but it will not hold more than 400W, and I can't get a single clean pass in 3DMark. Maybe I just lucked out in the silicon lottery and those early runs were the best this card will ever do?
 
12350 in Port Royal and a 19127 graphics score in Time Spy were good while they lasted, but nothing I do can get close to those scores now, even with this new BIOS :(
2020/10/15 06:00:02
ajreynol

Oh, I agree completely. I have ZERO doubt that the cards will be equal performers.
 
And now that I've read the Ampere + TSMC news... I'm getting the feeling I won't be able to use Step Up for a 20GB 3080. Rumors had put such a launch in December, but if they're shifting the entire process to TSMC (they are), I can't imagine those cards will be available until February/March at the earliest, well beyond the Step Up window for my current card.
 
So now the question is simply: 2 HDMI ports (Strix) vs possible Step Up to a 480W Samsung-based 3090 (EVGA). 
 
And either way...will I be looking to sell my current card to get a superior TSMC variant? Cooler, quieter cards with possibly lower power draw.
2020/10/15 06:08:57
TheGuz4L
While I am personally waiting for the full release before applying this, just to clear up some misconceptions about what this will do for your card:
 
Your highest stable overclock will probably not change. What will change is the sustained clock speed throughout. For instance, my card does 2100->2070 (depending on how high the temp gets) rock stable. HOWEVER, once I hit the 400W power limit, my card has to sacrifice these boost clocks and drops down to 1985-1999 or so.
 
In theory, having another 50W for the graphics card means I won't hit that power wall, and my 2100-2070 clocks will stay stable throughout. That will give an overall average frame-rate boost when gaming for long periods.
 
If you are running at 1440P or 1080P, you may not even be hitting the power limit wall and won't notice much difference. This really shines when you're running at 4K or upscaled 4K. In CoD Warzone at 1440P, I never hit higher than 370W or so, and my clocks always stay at 2100-2085, etc. Once I cranked the resolution up to 4K, it hit the power limit and my clocks dropped to the 1985-1999 mentioned above.
 
One question though: I don't mess with the voltage slider. Has this helped anyone get higher clocks? I found that after 2100 MHz, these chips are very unstable. I am able to run benchmarks at 2150+, but that's about it.
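That power-limit behavior can be illustrated with a toy model. This is my own sketch, not NVIDIA's actual boost algorithm; the power coefficient and the 15 MHz bin size are assumptions picked to land roughly in the range of the numbers above:

```python
# Toy model: GPU power scales roughly with V^2 * f. If the draw at the
# requested clock exceeds the power cap, the boost logic drops ~15 MHz
# bins until it fits. Constants are illustrative, not measured.
def sustained_clock(target_mhz, volts, power_limit_w, k=1.7e-7, bin_mhz=15):
    clk = target_mhz
    while k * volts**2 * (clk * 1e6) > power_limit_w and clk > 0:
        clk -= bin_mhz
    return clk

for limit in (400, 450):
    clk = sustained_clock(2100, 1.081, limit)
    print(f"{limit} W cap -> sustained ~{clk} MHz")
```

Same silicon, same requested 2100 MHz: in this sketch the 400 W cap forces the clock down into the ~2010 MHz range, while the 450 W cap holds the full clock, which is the sustained-clock gain being described.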
2020/10/15 06:32:48
ehabash1
bcavanagh
bkhan530
This is what makes EVGA one of the best AIBs. Most others would have ignored their customers, but EVGA listened and delivered a new BIOS. Good job guys, keep it up!




Kind of an odd statement. If they had just made the power limit what was promised to begin with they wouldn't have needed to do this. Surely doing it right from the start is better.


Yeah, but they promised 420... and if they had started with 420, we wouldn't have the 450 BIOS. They went above and beyond and matched the more expensive Strix card.
2020/10/15 06:35:46
SAVIAR
Yes, it provides more voltage than the default values, and it made my OC more stable. But temps will be higher due to the higher voltages.
2020/10/15 06:59:55
jankerson
ty_ger07
jankerson
Well, first off, thanks a bunch.
 
However, I must have the worst piece of silicon I could get...
 
ZERO, NADA, ZIP improvement in clocks.
 
My card will not do 2100, period. No matter what, it will not do it...
 
I even lost 10 MHz on the clock in Port Royal and Superposition.
 
This is depressing... I gained nothing...
 
I think I got a bum card...

As I posted on the previous page:
"An increased power limit won't make a previously unstable frequency suddenly stable.  Never will.  Never, ever.  That is a silicon quality thing, and also depends on voltage and temperature and signal integrity and all sorts of other things which power limit can't fix.
An increased power limit affects how much the GPU will boost and how soon it will start tapering off to lower boosts.  That's it.
I think that you will see that at the SAME overclock which was previously stable, your benchmark scores will be higher. Not because the maximum stable frequency is higher, but because the average frequency is higher."



 
I worked back from scratch on the OC.
 
I figured that out on my own: higher scores at lower clock speeds, due to more stable numbers.
 
Was working on it early this morning.
 
Still wish I could get higher clocks out of this card, though.
 
Oh well, luck of the draw I suppose.
 
Silicon lottery.
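ty_ger07's average-vs-maximum distinction is easy to see with made-up numbers (the duty-cycle split below is purely illustrative):

```python
# Same peak clock in both cases; the higher power limit just spends
# more of the run at the boost clock instead of the power-capped clock.
import statistics

# Assumed fractions of a benchmark run spent boosted vs. power-capped.
trace_400w = [2100] * 30 + [1995] * 70   # capped most of the time
trace_450w = [2100] * 80 + [2025] * 20   # mostly holds boost

print(f"avg @ 400W: {statistics.mean(trace_400w):.0f} MHz")
print(f"avg @ 450W: {statistics.mean(trace_450w):.0f} MHz")
```

The maximum clock never changes, but the average (and therefore the score) goes up, which is why scores improve at the same, or even a slightly lower, overclock.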
 
 
 
 
2020/10/15 07:10:02
ajreynol

For what it's worth, I'm gaming at 2115 MHz locked. 
 
Note the core clock. This was the end of a 2-hour gaming session. The dips are me tabbing out to check temps and clocks.
 

 
Edit: this image is really small for some reason. Here's a direct link: https://i.imgur.com/WoA88Gl.png
2020/10/15 07:27:11
turboD
Make sure that Afterburner is controlling all 3 fans on that FTW3... I've read some reports that one fan is not controlled by the Afterburner curve, so you may only have 2 fans running fast.
2020/10/15 07:27:32
johnnytimmer1991
Can you detail the new normal BIOS in comparison to the original?
