Jiberish001
Clock steps are not purely temperature based but rather voltage based, and the card determines how many volts it can handle mostly based on temps, though you can alter those steps to an extent. There are a few ways to allow more voltage, and as long as the card is stable and cool enough it will use those volts to push the clock. The steps themselves aren't evenly spaced, either; they follow a curve whose slope changes at different points. At higher voltages the curve is shallow and the steps sit further apart, while at lower voltages the curve is steeper and the steps come more frequently.
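That nonlinear stepping can be sketched as a lookup along a voltage/frequency curve. This is a toy model with made-up curve points, not values from any real card; it just shows the shape being described (big clock jumps per step at low voltage, small ones near the top):

```python
# Illustrative sketch of a boost voltage/frequency curve (invented numbers):
# steps are dense and steep at low voltage, shallow and sparse at high voltage.

def boost_clock_mhz(voltage_v):
    """Return the clock for the highest curve point at or below voltage_v."""
    # Hypothetical (voltage in V, clock in MHz) curve points
    curve = [
        (0.80, 1700), (0.85, 1785), (0.90, 1860),  # steep region: big gain per step
        (0.95, 1920), (1.00, 1965),
        (1.05, 1995), (1.07, 2010), (1.09, 2020),  # shallow region: small gain per step
    ]
    clock = curve[0][1]
    for v, mhz in curve:
        if voltage_v >= v:
            clock = mhz
    return clock

for v in (0.85, 1.00, 1.09):
    print(v, boost_clock_mhz(v))
```

Note how the last two steps (1.07 → 1.09 V) buy only 10 MHz in this sketch, while a step in the low region buys 75 MHz: that's the changing curve rate in code form.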
They are.
Point is, voltage stays the same (1.05-1.09 V range) while clocks rise a bit, 15 MHz per step.
You don't know how this boost works, yet you're making such statements...
Jiberish001
Temperature is most definitely a huge factor even in regular gaming. Your better OC and better power limit are only made possible by the lower temps. Volts give you clocks, but they are also the source of heat.
A 10 percent performance gain in high-end games is a big deal. This isn't about how I feel.
Well, just run some tests and see for yourself ;) Try to find any differences in fps or frame times (which would also help you spot freezes) between a hot and a cold card.
Once again: voltage stays the same as long as you don't hit the power limit; it is always capped at a maximum voltage in the 1.05-1.07 V range (1.093 V max) and does not depend on GPU temperature. Clocks do depend on temperature: at the same voltage (say, 1.06 V) you can get a higher frequency if you cool your GPU down, but the actual benefit from this is quite low. Frequency depends on voltage only when you are hitting the power limit: in that case the voltage will be lower, but even under those conditions the thermal boost still works, and still gives very little.
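The "quite low benefit" part is easy to see if you model the thermal offset as losing one ~15 MHz bin per temperature threshold crossed. The thresholds and the base clock here are assumed for illustration, not taken from any spec:

```python
# Toy model of the thermal boost offset at a FIXED voltage (assumed numbers):
# each temperature threshold crossed costs one ~15 MHz boost bin.

STEP_MHZ = 15                     # assumed size of one boost bin
BASE_CLOCK = 2010                 # assumed clock below the first threshold
THRESHOLDS_C = [38, 46, 54, 62]   # hypothetical temps where a bin is lost

def clock_at_temp(temp_c):
    """Clock at a given temperature, voltage held at its cap (no power limit)."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return BASE_CLOCK - bins_lost * STEP_MHZ

print(clock_at_temp(35), clock_at_temp(65))
```

With these assumed numbers, the entire 35 °C → 65 °C swing costs 60 MHz, about 3% of a ~2 GHz clock, which is why cooling the card down buys so little fps.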
In my opinion 10% is not a big deal (60 fps vs 66 fps in RDR2, yeah), and in my case that 10% came from actually swapping the GPU for a better (much better) one with much higher clocks, a better overclock and a higher power limit.
I'm just telling you that this thermal boost is not a big deal and you shouldn't worry about it; cooling the card down to 40 C will not give you much of a performance boost.
Jiberish001
My RDR2 can only run at around 75-80 fps with mixed settings where only a couple are maxed. During initial playtime temps start around 40 and rise to 50-55, but the clock never drops below 2010 because I've allowed the card to use more volts at higher temps. Even so, in dense areas the frames can drop really close to that uncomfortable zone. Not because of how I feel, but because of what I see, which is a real, tangible change in visuals, not a philosophical feeling. I could never dream of reaching 100 fps; even when I reduce settings to mediums and lows I only get around 90.
There are some settings that have a strong negative effect on performance; you could try searching for them on Google and tuning them. For example, tree tessellation, volumetric lighting and other stuff.
I can give you my settings if you want. I'm running the game at 1440p (21:9) and with my settings I get 80-90ish in Saint Denis. To arrive at them I also googled something like "rdr2 graphics settings guide" and ran the in-game benchmark with different settings to see how fps changed in different scenes and overall.
As for boost: the point is not to see that "frequency doesn't drop below 2010 MHz", but to watch the frequency step up and down while the card heats up and cools down.
That's how this boost works: it's not about frequency as a function of voltage, but frequency as a function of temperature (it's more complicated than that, but if you lock your voltage at maximum and don't hit the power limit, that's what it is).
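One way to actually watch those steps is to log temperature and SM clock once per second with `nvidia-smi --query-gpu=temperature.gpu,clocks.sm --format=csv,noheader -l 1` and then flag every sample where the clock changed bins. Here's a minimal parser for that CSV output; the sample log lines are invented for illustration:

```python
# Detect clock-bin changes in an nvidia-smi log of "temperature.gpu, clocks.sm"
# samples (CSV, one line per second). This sample data is made up; real lines
# from nvidia-smi look like "52, 2010 MHz".

log = """48, 2025 MHz
50, 2025 MHz
52, 2010 MHz
54, 2010 MHz
56, 1995 MHz
"""

def find_steps(text):
    """Return (temp_when_stepped, old_clock, new_clock) for each clock change."""
    steps = []
    prev = None
    for line in text.strip().splitlines():
        temp_s, clock_s = line.split(",")
        temp = int(temp_s)
        clock = int(clock_s.strip().split()[0])  # drop the " MHz" suffix
        if prev is not None and clock != prev:
            steps.append((temp, prev, clock))
        prev = clock
    return steps

print(find_steps(log))
```

Run the game, let the card heat up, and the step list shows exactly which temperatures cost you a bin, which is a much clearer picture than "it never drops below 2010".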