EVGA

Hot! Question about boost clock

Showing page 1 of 2
Author
MNFirstBlood
iCX Member
  • Total Posts : 311
  • Reward points : 0
  • Joined: 2007/03/04 16:05:34
  • Status: offline
  • Ribbons : 0
2020/11/12 18:08:42 (permalink)
Looking for some clarity on how boost clock works. I have an example I want to throw out there. It goes like this.

I play Red Dead Redemption 2 at mixed medium/high settings for a minimum of 60 fps in every scene at 2K.

I play Hunt: Showdown with everything on the highest settings for a minimum of 60 fps at 2K.

While playing both games, the GPU runs at 70C (which is fine), but in RDR2 my GPU clock runs at 1949 MHz and in Hunt it goes all the way down to 1885. In fact, in Hunt the clock is way more all over the place, going up and down much more frequently while the GPU temp stays the same. Is there something else that is heating up that is causing the lower clock in Hunt?

Windows 10
i7-8700K  @5.0ghz 1.4v
Asus Maximus X Hero  
Asus Ryujin 360
1080ti FTW3  
32 G.skill Trident 3400  
2x 300GB VelociRaptor 3.0 Raid 0
2x Samsung Evo 500gb Raid 0 (OS)
Samsung 970 Pro NVMe M.2 1tb 
Lian Li 011-Air
Evga Super Nova 1000 

"just because you are a character doesn't mean you have character". Wolf
#1

34 Replies

    Sajin
    EVGA Forum Moderator
    • Total Posts : 44052
    • Reward points : 0
    • Joined: 2010/06/07 21:11:51
    • Location: Texas, USA.
    • Status: online
    • Ribbons : 197
    Re: Question about boost clock 2020/11/12 18:52:17 (permalink)
    Nothing else is heating up. Usage, power limit, temps (gpu core only) & voltage (gpu core only) will affect the clock.
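In other words, GPU Boost always runs the card at the lowest clock any single limiter allows. A toy sketch of that idea (all thresholds and clock bins here are made-up numbers for illustration, not NVIDIA's actual tables):

```python
# Toy model of GPU Boost: the effective clock is capped by whichever
# limiter (power, temperature, voltage) is most restrictive.
# All numbers are hypothetical.

def boost_clock(requested_mhz, power_w, temp_c, voltage_v):
    caps = [requested_mhz]
    if power_w > 250:          # hypothetical board power limit
        caps.append(1885)      # clock bin the card falls back to
    if temp_c > 60:            # hypothetical temperature threshold
        caps.append(1936)
    if voltage_v > 1.05:       # hypothetical voltage/reliability cap
        caps.append(1949)
    return min(caps)           # most restrictive limiter wins

print(boost_clock(2000, power_w=230, temp_c=70, voltage_v=1.043))  # temp-limited: 1936
print(boost_clock(2000, power_w=260, temp_c=70, voltage_v=1.043))  # power-limited: 1885
```

Two games at the same temperature can still land on different clocks if one of them trips the power limiter and the other does not.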

    #2
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/12 21:17:48 (permalink)
    Sajin
    Nothing else is heating up. Usage, power limit, temps (gpu core only) & voltage (gpu core only) will affect the clock.


    So if the temps are identical what would be your guess on the clock behavior difference? 

    #3
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/12 21:42:45 (permalink)
    MNFirstBlood
    So if the temps are identical what would be your guess on the clock behavior difference? 


    What is the power usage you are seeing between the two? I guess one might ask, what is your purpose other than curiosity? There are ways, using the voltage curve for example, to pretty much lock your GPU to a particular clock and voltage under a full 3D load. Generally, though, games are not going to load the GPU exactly the same. A game that is primarily single-threaded may show lower GPU power usage and higher clocks versus a game that is multi-threaded and loads the GPU to 99%, where you drop clocks due to power draw or Vrel.
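The single- vs multi-threaded point above can be sketched with a toy frame-time model (all numbers hypothetical): the slower of the CPU and GPU stages paces each frame, so a CPU-bound game leaves the GPU partly idle, drawing less power and holding higher clocks.

```python
# Toy frame pipeline: each frame needs CPU work then GPU work.
# If the CPU is the bottleneck, the GPU sits partly idle, draws less
# power, and can hold higher boost clocks. Numbers are illustrative.

def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)          # slower stage paces the frame
    fps = 1000.0 / frame_ms
    gpu_util = gpu_ms / frame_ms            # fraction of the frame the GPU is busy
    return round(fps, 1), round(gpu_util * 100)

print(frame_stats(cpu_ms=16.0, gpu_ms=10.0))  # CPU-bound: GPU idles part of each frame
print(frame_stats(cpu_ms=8.0, gpu_ms=16.0))   # GPU-bound: 100% load, power limiter bites
```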

    EVGA X99 FTWK / i7 6850K 4.5GHz / GTX 1080 Ti FE Hybrid / 32GB Corsair LPX 3200MHz / Samsung 850 Pro 256GB / Corsair AX1200 / Windows 10 Pro
    Fire Strike                        24,163
    Fire  Strike Extreme        14,452 
    Fire Strike Ultra                7,711 
    Time Spy                        10,357

    #4
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 11:16:48 (permalink)
    HeavyHemi
    What is the power usage you are seeing between the two? I guess one might ask, what is your purpose other than curiosity? There are ways, using the voltage curve for example, to pretty much lock your GPU to a particular clock and voltage under a full 3D load. Generally, though, games are not going to load the GPU exactly the same. A game that is primarily single-threaded may show lower GPU power usage and higher clocks versus a game that is multi-threaded and loads the GPU to 99%, where you drop clocks due to power draw or Vrel.


    Definitely curiosity, if that is okay. I will check the power draw tonight in both games. They both run the GPU at 99% usage currently. If one is drawing more power from a watts standpoint, it will downclock? Is that what I am understanding? Like I said, I am curious about the exact behaviors so I have a better understanding overall. Thank you to anyone that takes the time to shed light on this. I have searched Google, of course, and get a lot of mixed and conflicting information.

    #5
    Data1987
    Superclocked Member
    • Total Posts : 101
    • Reward points : 0
    • Joined: 2020/11/13 10:52:16
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 11:24:25 (permalink)
    Boost clock is tied to temperature... so keep that GPU as cool as possible and boost clocks will be higher.
    #6
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 13:59:53 (permalink)
    Data1987
    Boost clock is tied to temperature... so keep that GPU as cool as possible and boost clocks will be higher.


    That was my understanding as well, but as you can see in my OP, the temps are identical. I played each game for 30 minutes just now to let the heat soak, and I looked at the performance monitor, screenshots attached below. The difference in color is HDR, which RDR2 supports and Hunt does not.
                
     RDR2 (left) | Hunt (right)
    post edited by MNFirstBlood - 2020/11/13 14:03:54

    Attached Image(s)


    #7
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/13 14:32:52 (permalink)
    MNFirstBlood
    That was my understanding as well, but as you can see in my OP, the temps are identical. I played each game for 30 minutes just now to let the heat soak, and I looked at the performance monitor, screenshots attached below. The difference in color is HDR, which RDR2 supports and Hunt does not.
                




    Okay, but see, this matches up with exactly how I explained it. Where your CPU utilization is higher, your GPU clocks are lower.
    Another thing: I'm pretty sure you intentionally picked spots for both games that would reflect similar numbers. All that shows is that you can show similar numbers with a static scene.
    What you should be looking at is the average performance as the games are running. Showing a peak of 99% for basically a static scene does not mean both games have the exact same power usage all the time. We know they do not.
     
     
    Also, temps do make a difference: the first temp bin starts at 39C, and each bin drops the clock by ~13 MHz.
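That bin behavior can be sketched as a step function. The 39C start and ~13 MHz step come from the post above; the ~5C bin width is an assumption for illustration:

```python
# Toy GPU Boost temperature binning: above a start temperature, each
# additional bin costs roughly one 13 MHz step. The 5C bin width is
# an assumed value, not a documented one.

def temp_binned_clock(max_mhz, temp_c, start_c=39, bin_width_c=5, step_mhz=13):
    if temp_c < start_c:
        return max_mhz                       # below the first bin: full clock
    bins = (temp_c - start_c) // bin_width_c + 1
    return max_mhz - bins * step_mhz         # one step per bin crossed

print(temp_binned_clock(2063, 38))  # below the first bin: 2063
print(temp_binned_clock(2063, 45))  # a couple of bins down
print(temp_binned_clock(2063, 70))  # many bins down at 70C
```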
     
     

     
    You're right in the area where just a degree or two is going to make your clocks fluctuate by ±25 MHz. This is what I have in comparison:
     

     
     That's an hour or so of Division 2 at 4K. Solid 2063 MHz, locked at 1.050 V. Standard EVGA AIO installed on a 1080 Ti FE with push/pull fans on the rad. Topped out at 45C.
    post edited by HeavyHemi - 2020/11/13 14:41:56


    #8
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 16:21:01 (permalink)
    HeavyHemi
    Okay, but see, this matches up with exactly how I explained it. Where your CPU utilization is higher, your GPU clocks are lower.
    Another thing: I'm pretty sure you intentionally picked spots for both games that would reflect similar numbers. All that shows is that you can show similar numbers with a static scene.
    What you should be looking at is the average performance as the games are running. Showing a peak of 99% for basically a static scene does not mean both games have the exact same power usage all the time. We know they do not.
    Also, temps do make a difference: the first temp bin starts at 39C, and each bin drops the clock by ~13 MHz.
    You're right in the area where just a degree or two is going to make your clocks fluctuate by ±25 MHz. This is what I have in comparison:
    That's an hour or so of Division 2 at 4K. Solid 2063 MHz, locked at 1.050 V. Standard EVGA AIO installed on a 1080 Ti FE with push/pull fans on the rad. Topped out at 45C.


     

    I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock. I did see you asked about power, and I thought the pics might help. You also said games will load the CPU differently. So the idea that CPU usage could play a factor is news to me, and why I am here asking how it works. Can you elaborate on how and when CPU usage affects boost clock behavior? Also, how do I save a graph like that so I can post it?
     
    I did not choose any scene to be specific. Like the OP states, RDR2 stays at 1949 MHz and Hunt goes all over the place and tends to stay closer to 1885. The temps don't go higher than 71C in either game. I have the full screenshots if you'd like, and I could also stream for you on Discord or whatever. As far as the area where a degree would make a difference, I'm confused by that, because the lower temp has the lower clock?
     
    Side note: that hybrid keeps it at 45C, damn that's nice.

    #9
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/13 16:44:33 (permalink)
    MNFirstBlood
    I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock. I did see you asked about power, and I thought the pics might help. You also said games will load the CPU differently. So the idea that CPU usage could play a factor is news to me, and why I am here asking how it works. Can you elaborate on how and when CPU usage affects boost clock behavior? Also, how do I save a graph like that so I can post it?
    I did not choose any scene to be specific. Like the OP states, RDR2 stays at 1949 MHz and Hunt goes all over the place and tends to stay closer to 1885. The temps don't go higher than 71C in either game. I have the full screenshots if you'd like, and I could also stream for you on Discord or whatever. As far as the area where a degree would make a difference, I'm confused by that, because the lower temp has the lower clock?
    Side note: that hybrid keeps it at 45C, damn that's nice.



    If one tends to 'go all over the place' and the other is relatively stable, then you DID choose scenes so that the numbers were almost exactly the same. It cannot be otherwise.
     
    Indeed, you're not going to see a detailed explanation of how CPU utilization affects GPU utilization. Why? If that basic concept is confusing to you, the details would be a waste of both of our time.
    And seriously, this is NOT the forum for paragraphs of MSEE jargon, etc. (I are one). Also, GPU and CPU utilization combined with GPU Boost 3.0 can give you odd results that don't reflect what you'd think the performance would be.
    Though I find it hard to believe that, as long as you have been around, you're not aware of how a game's CPU utilization affects GPU utilization. Who does not know that Crysis, for example, being SINGLE-THREADED, limits FPS vice GPU rendering power? You can take a screenshot with the AB HW window open using the Snip tool.
     
     
    post edited by HeavyHemi - 2020/11/13 16:46:35


    #10
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 17:04:42 (permalink)
    HeavyHemi
    If one tends to 'go all over the place' and the other is relatively stable, then you DID choose scenes so that the numbers were almost exactly the same. It cannot be otherwise.
    Indeed, you're not going to see a detailed explanation of how CPU utilization affects GPU utilization. Why? If that basic concept is confusing to you, the details would be a waste of both of our time.
    And seriously, this is NOT the forum for paragraphs of MSEE jargon, etc. (I are one). Also, GPU and CPU utilization combined with GPU Boost 3.0 can give you odd results that don't reflect what you'd think the performance would be.
    Though I find it hard to believe that, as long as you have been around, you're not aware of how a game's CPU utilization affects GPU utilization. Who does not know that Crysis, for example, being SINGLE-THREADED, limits FPS vice GPU rendering power? You can take a screenshot with the AB HW window open using the Snip tool.

    Oh Bud, thanks for being standard. Enjoy your forums o/

    #11
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/13 17:22:25 (permalink)
    MNFirstBlood
    Oh Bud, thanks for being standard. Enjoy your forums o/




    What did you expect? "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock."
    It is like you completely ignored what I posted, started arguing about what I was posting, then asked for details. Thanks for being such a nice guy when someone is taking their own time to help you. The world needs more argumentative folks like you. Have a nice life.


    #12
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/13 19:27:35 (permalink)
    Indeed it does. People seeking knowledge and asking questions rather than assuming they know it all. If not understanding what you were saying and asking for more detail is argumentative, then you and I have different definitions of that word. Anyway, you offered no suggestions on where to find information or where the proper place for this type of question would be, after you said this isn't the right place. So I also think your definition of help is quite different than mine as well. Thank you for the "help" though.

    #13
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/14 13:38:39 (permalink)
    MNFirstBlood
    Indeed it does. People seeking knowledge and asking questions rather than assuming they know it all. If not understanding what you were saying and asking for more detail is argumentative, then you and I have different definitions of that word. Anyway, you offered no suggestions on where to find information or where the proper place for this type of question would be, after you said this isn't the right place. So I also think your definition of help is quite different than mine as well. Thank you for the "help" though.



    Your replies were to ARGUE and question what I posted as if it was utterly wrong and made no sense.
    "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock." You go from saying you don't understand the basics of how CPU loading can affect GPU performance to a COMPLAINT that I did not PROVIDE DETAILS ABOUT HOW THIS ALL WORKS. That literally makes no sense to a mature thinking adult. You then further complain I don't provide you with LINKS. You're on the internet; you can visit Nvidia.com and read white papers on how this all works for days. You can, for FREE, join the Nvidia Developer program and get all sorts of information and cool stuff FOR FREE. You're not paying me; I don't work for FREE. Going from "what are the basics" to "WHERE IS MY DETAILED EXPLANATION OF HOW EVERYTHING WORKS" is silly. This is a fact everyone can read. Thanks again for another round of passive-aggressive insults and the opportunity to hopefully guide you to more productive discussions....with others.


    #14
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/15 11:52:40 (permalink)
    HeavyHemi
    Your replies were to ARGUE and question what I posted as if it was utterly wrong and made no sense.
    "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock." You go from saying you don't understand the basics of how CPU loading can affect GPU performance to a COMPLAINT that I did not PROVIDE DETAILS ABOUT HOW THIS ALL WORKS. That literally makes no sense to a mature thinking adult. You then further complain I don't provide you with LINKS. You're on the internet; you can visit Nvidia.com and read white papers on how this all works for days. You can, for FREE, join the Nvidia Developer program and get all sorts of information and cool stuff FOR FREE. You're not paying me; I don't work for FREE. Going from "what are the basics" to "WHERE IS MY DETAILED EXPLANATION OF HOW EVERYTHING WORKS" is silly. This is a fact everyone can read. Thanks again for another round of passive-aggressive insults and the opportunity to hopefully guide you to more productive discussions....with others.


    Where do I ask for the basics? If you don't work for free, then don't reply in a forum where someone is asking a question that may require a more detailed response. I mean, if you say you are helping but get mad that your basic help didn't actually provide any clarity, you seem a bit sensitive if I must say. I never asked about CPU and GPU basics; I asked how CPU usage affects boost clocks. I still don't see how me not understanding your explanation is an argument, but that is why you are now calling me passive-aggressive. So I replied thank you for being standard, because this is actually the most standard forum response out there. You can see so many similar threads: people asking questions, so-called people like you "helping", then getting mad because someone asked for more detail. I mean, clarity in itself is more detail, correct? So either you do not know or you don't want to help. In either case, don't reply. Super easy. As far as everyone can see, okay. I'm sure they are amused, just like they would be with any other forum thread that has gone awry.
     
    Here is the quote you say is argumentative:
    "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock. I did you see you asked about power, and I thought the pics might help you see. You also said games will load differently in the CPU. So The idea that CPU usage could play a factor is news to me and why I am here asking how it works. Can you elaborate how and when CPU usage effects boost clock behavior? Also how do I save a graph like that so I can post it?"
     
    I mean really?? If you say, hey, the reason it's different is because of CPU loads, temps, and VLL, and some single-core apps and some multi-threaded apps, that is supposed to provide me with an understanding? Sorry, my guy, you were no help and then got mad. 
      

    Windows 10
    i7-8700K  @5.0ghz 1.4v
    Asus Maximus X Hero  
    Asus Ryujin 360
    1080ti FTW3  
    32 G.skill Trident 3400  
    2x 300gb Velociraptor 3.0 Raid 0
    2x Samsung Evo 500gb Raid 0 (OS)
    Samsung 970 Pro NVMe M.2 1tb 
    Lian Li 011-Air
    Evga Super Nova 1000 

    "just because you are a character doesn't mean you have character". Wolf
    #15
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 17316
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: online
    • Ribbons : 178
    Re: Question about boost clock 2020/11/15 14:00:30 (permalink)
    MNFirstBlood
    Where do I ask for the basics?

    The CPU is the boss.  The game runs on the CPU.  The CPU gives the GPU data to work with.  The CPU load and how well the game utilizes the CPU can have a huge effect on GPU performance (FPS, load, power, temperature, etcetera) and on how consistently the GPU has work to do (fluctuations in GPU performance).
     
    That is basic.
     
     
    Additionally, what Data1987 said is not accurate.  GPU boost is based on WAY MORE than just temperature.
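    To picture the "CPU is the boss" point, here is a toy sketch (my own simplification, not anything from NVIDIA's documentation; the frame times are invented) of why a CPU-bound game leaves the GPU partly idle:

```python
# Toy model of frame pacing: the GPU can only render a frame after the
# CPU has prepared it, so the slower of the two paces the pipeline.
# All numbers below are illustrative, not measured.

def gpu_utilization(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Fraction of each frame interval the GPU spends actually rendering."""
    frame_interval_ms = max(cpu_frame_ms, gpu_frame_ms)  # slower side paces the frame
    return gpu_frame_ms / frame_interval_ms

# GPU-bound scene: the CPU always has the next frame ready.
print(gpu_utilization(cpu_frame_ms=8.0, gpu_frame_ms=12.0))   # 1.0

# CPU-bound scene: the GPU waits on the CPU a quarter of the time,
# so its load (and therefore its boost behavior) fluctuates.
print(gpu_utilization(cpu_frame_ms=16.0, gpu_frame_ms=12.0))  # 0.75
```

    In a real game the CPU frame time varies scene by scene, which is why the GPU load (and clock) can bounce around even while temperature holds steady.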
    post edited by ty_ger07 - 2020/11/15 14:02:32
    #16
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/15 17:49:17 (permalink)
    MNFirstBlood
    HeavyHemi
    MNFirstBlood
    Indeed it does. People seeking knowledge and asking questions rather than assuming they know it all. If not understanding what you were saying and asking for more detail is argumentative, then you and I have different definitions of that word. Anyways, you offered no suggestions on where to find information or where the proper place for this type of question would be, after you said this isn't the right place. So I also think your definition of help is quite different than mine as well. Thank you for the "help" though.
     
     



    Your replies were to ARGUE and question what I posted as if it was utterly wrong and made no sense.
      


     This is in fact the statement of a person who does not understand and is asking about the basics.
    "You also said games will load differently in the CPU. So The idea that CPU usage could play a factor is news to me and why I am here asking how it works. Can you elaborate how and when CPU usage effects boost clock behavior?"
    Son, you're projecting your frustration in not understanding what is being posted (your own words), and I'd say this screeching attack on my character is a personal attack. Mad? I don't care about you at all... let me revise that, I didn't, now I dislike you for obvious reasons. Grow up.
    post edited by HeavyHemi - 2020/11/15 18:33:05

    EVGA X99 FTWK / i7 6850K 4.5ghz / GTX1080 Ti FE Hybrid / 32GB Corsair LPX 3200mhz / Samsung 850Pro 256GB / Corsair AX1200 / Windows 10 Pro
    Fire Strike                        24,163
    Fire  Strike Extreme        14,452 
    Fire Strike Ultra                7,711 
    Time Spy                        10,357

    #17
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/15 21:20:55 (permalink)
     
    HeavyHemi
    Standard, so very standard. Your character is under attack? You are mighty sensitive, bud. Grow up? I cannot even say I don't understand what you are saying without you getting all up in my face. Discussions are what grown-ups have, not someone like you belittling a person because you think they should know something that you say is so simple. You have insulted me several times and I have not even cared to acknowledge it, and yet I need to grow up? You dislike me? Aww man, that hurts, it really does.
     
    Quote from you "Another thing, I'm pretty sure you intentionally picked spots for both games that would reflect similar numbers. All that shows is that you can show similar numbers with a static scene."
    I offered to have you watch me stream so you could see the behavior, and also different screenshots, even though I was thrown off by the accusation. What would be the point of static images or whatever you were getting at there? It didn't seem very helpful. 
     
    Quote from you "I guess one might ask, what is your purpose other than curiosity?" 
    What purpose do I need? Seemed very belittling right from the jump.
     
    Quote from you "Indeed you're not going to see a detailed explanation of how CPU utilization affects GPU utilization. Why? If that basic concept is confusing to you, the details would be a waste of both of our time." 
    Basically saying I couldn't digest information. Especially belittling because it was not even what I asked. The actual question was how it affects boost clock. Basic information about the CPU feeding the GPU and limiting GPU usage based on the speed of the information transfer is not what was being asked. The GPU is at 99% usage at all times, so that is why I didn't know what you were saying about multi-thread and single-thread affecting GPU usage, because it was not the case here. More detailed information is what I was looking for, obviously. 
     
    Quote from you "And seriously, this is NOT the forum for paragraphs of MSEE jargon etc ( I are one). "
    Not the forum for this conversation? I have a 1080ti from EVGA and this is the 1080ti from EVGA thread. Another attempt at belittling. 
     
    Quote from you "Also GPU and CPU utilization combined with GPU  Boost 3.0 can give you odd results that don't reflect what you'd think would be the performance."
    My favorite one. Yes, that is why I am asking for clarity. You can't even make it up, man. 
     
    So I guess I will just go grow up now. Keep up the solid "free" help. 
     
     
     

    Windows 10
    i7-8700K  @5.0ghz 1.4v
    Asus Maximus X Hero  
    Asus Ryujin 360
    1080ti FTW3  
    32 G.skill Trident 3400  
    2x 300gb Velociraptor 3.0 Raid 0
    2x Samsung Evo 500gb Raid 0 (OS)
    Samsung 970 Pro NVMe M.2 1tb 
    Lian Li 011-Air
    Evga Super Nova 1000 

    "just because you are a character doesn't mean you have character". Wolf
    #18
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 17316
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: online
    • Ribbons : 178
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/16 11:47:56 (permalink)
    MNFirstBlood
     
    Why are you spending SO MUCH TIME belittling someone trying to help you? What you did, when an attempt was made to explain the basics, was to argue the explanations. Let me quote you again:

    I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock.

     
    Who goes from saying they don't understand the basics of what is being said to complaining about not being given an EXPLANATION IN DETAIL?   Perhaps starting at the end works for you. But when someone says they DO NOT KNOW THE BASICS, as you stated as fact, I am certainly not going to waste my time STARTING with the DETAILS ON HOW THE CPU AFFECTS BOOST CLOCK.  Again, you stated that CPU utilization affecting GPU performance is a NEW thing to you.  Frankly, I don't know why you keep replying and just embarrassing yourself further.

     
     
    ty_ger07
    boring

    It is... surprising someone around that long would be so... boorish. I can be testy and Craptacular at times but sheesh. 
    post edited by HeavyHemi - 2020/11/16 11:50:33

    EVGA X99 FTWK / i7 6850K 4.5ghz / GTX1080 Ti FE Hybrid / 32GB Corsair LPX 3200mhz / Samsung 850Pro 256GB / Corsair AX1200 / Windows 10 Pro
    Fire Strike                        24,163
    Fire  Strike Extreme        14,452 
    Fire Strike Ultra                7,711 
    Time Spy                        10,357

    #20
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 17316
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: online
    • Ribbons : 178
    Re: Question about boost clock 2020/11/16 11:51:48 (permalink)
    still boring
     
    MNFirstBlood, you should probably do some research, and then start a new thread in like a week or so, if you still have questions.
    #21
    Data1987
    Superclocked Member
    • Total Posts : 101
    • Reward points : 0
    • Joined: 2020/11/13 10:52:16
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/17 21:10:42 (permalink)
    Boost...the cooler the components the more the GPU can clock....
    #22
    ty_ger07
    Insert Custom Title Here
    • Total Posts : 17316
    • Reward points : 0
    • Joined: 2008/04/10 23:48:15
    • Location: traveler
    • Status: online
    • Ribbons : 178
    Re: Question about boost clock 2020/11/17 21:24:31 (permalink)
    Data1987
    Boost...the cooler the components the more the GPU can clock....


    What about power limit?
    What about GPU utilization?
    What about silicon quality and reliability voltage?
     
    Temperature is not the only factor.
     
    The GPU won't boost as high if it is bottlenecked by the CPU and waiting for work to do.
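    One way to read that list: the boost algorithm ends up at the lowest clock allowed by all of its limiters at once, so two games at the same temperature can still land on different clocks. This is only a hand-drawn sketch; every constant and cap curve below is invented for illustration (the real GPU Boost tables are not public):

```python
# Hypothetical limiter curves. The only point being demonstrated is
# that the final clock is the MINIMUM across independent caps, so
# temperature alone doesn't decide it.

def boost_clock_mhz(temp_c: float, board_power_w: float, gpu_busy: float) -> float:
    temp_cap = 2000 - max(0.0, temp_c - 60) * 10           # shed clocks when hot
    power_cap = 2000 - max(0.0, board_power_w - 250) * 5   # shed clocks past the power limit
    util_cap = 2000 if gpu_busy > 0.95 else 1800           # drop boost bins when underfed
    return min(temp_cap, power_cap, util_cap)

# Same 70 C in both games, different power draw -> different clocks,
# matching the OP's observation that temperature alone can't explain it.
print(boost_clock_mhz(70, 240, 0.99))  # temperature-limited: 1900.0
print(boost_clock_mhz(70, 280, 0.99))  # power-limited: 1850.0
```

    Tools like GPU-Z report which limiter is active at any moment (its "PerfCap Reason" field), which is the practical way to see whether a given game is power-, thermal-, or utilization-bound.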
    #23
    tattude69
    iCX Member
    • Total Posts : 303
    • Reward points : 0
    • Joined: 2013/04/21 15:30:52
    • Location: NY
    • Status: offline
    • Ribbons : 1
    Re: Question about boost clock 2020/11/18 03:18:35 (permalink)
    https://www.3dmark.com/compare/fs/24049505/fs/24049424#
     
    Here are two benchmark runs performed at the same temps. If you look at the results you will see that as my boost clock increased, it lowered my CPU clock. Both tests were run with no overclock on the GPU and the CPU overclocked to 4.6. The test with the 26 MHz boost increase pulled 20% more power for a whopping 1.3% performance increase, with no temperature difference. Also, the 26 MHz boost on my GPU cost me 12 MHz on my CPU, and yet my physics score increased 0.9% because the system balanced out better with the higher GPU clock and lower CPU clock.
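    For anyone checking the percentages in the post above, the arithmetic is just relative deltas. The score and wattage values below are placeholders chosen to reproduce the quoted ratios; the linked 3DMark comparison has the real numbers:

```python
# Hypothetical baseline vs overclocked runs, picked so the ratios match
# the post (~1.3% more performance for 20% more power).

score_base, power_base = 24000, 250   # stock-boost run (points, watts)
score_oc, power_oc = 24312, 300       # +26 MHz boost run

perf_gain = (score_oc - score_base) / score_base * 100
power_gain = (power_oc - power_base) / power_base * 100
print(f"{perf_gain:.1f}% more performance for {power_gain:.0f}% more power")
```

    Framed that way, the efficiency tradeoff is easy to judge: a tiny score gain at a large power cost usually means the card was already near the flat part of its voltage/frequency curve.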
     
     
    post edited by tattude69 - 2020/11/18 03:45:32

    I7 4790K 4.7G
    Gigabyte Z97 Gaming 7 Motherboard
    Evga GTX 1080 Classified
    32GB Crucial DDR3 1600 MHZ 
    Evga Supernova 650 watt power supply
    Coolermaster MasterCase Pro5
    Custom Cooling Loop
    Samsung SSD Windows 10 Pro😎
     
      
    #24
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/21 18:49:10 (permalink)
    ty_ger07
    Data1987
    Boost...the cooler the components the more the GPU can clock....


    What about power limit?
    What about GPU utilization?
    What about silicon quality and reliability voltage?
     
    Temperature is not the only factor.
     
    The GPU won't boost as high if it is bottlenecked by the CPU and waiting for work to do.


    Yeah, I have been testing for 4 days now. This place used to be a good place to start with questions, but some of the super posters make it unenjoyable. People with 100k posts trolling. Very interesting: I seem to be finding that it is more linked to power limit than temperature. I won't even bother with any further explaining, but I know it wasn't strictly temps and I wanted to understand deeper. One thing I have found out is it has nothing to do with CPU load (at least in my case). 

    Windows 10
    i7-8700K  @5.0ghz 1.4v
    Asus Maximus X Hero  
    Asus Ryujin 360
    1080ti FTW3  
    32 G.skill Trident 3400  
    2x 300gb Velociraptor 3.0 Raid 0
    2x Samsung Evo 500gb Raid 0 (OS)
    Samsung 970 Pro NVMe M.2 1tb 
    Lian Li 011-Air
    Evga Super Nova 1000 

    "just because you are a character doesn't mean you have character". Wolf
    #25
    MNFirstBlood
    iCX Member
    • Total Posts : 311
    • Reward points : 0
    • Joined: 2007/03/04 16:05:34
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/21 19:04:19 (permalink)
    HeavyHemi
    MNFirstBlood
     
    HeavyHemi
    MNFirstBlood
    HeavyHemi
    MNFirstBlood
    Indeed it does. People seeking knowledge and asking questions rather than assuming they know it all. If not understanding what you were saying and asking for more detail is argumentitve than you and me have different definitions of that word. Anyways you offered no suggestions on where to find information or where the proper place for this type of question would be. After you said this isnt the right place. So I also think your defination of help is quite different than mine as well. Thank you for the "help" though.
     
    You do seem dim. Again, I never asked about how CPU affects GPU performance. I asked for clarity on your very general reply and whether you could explain in more detail how it would affect boost clock. My CPU was never in question. It was about different boost clock behaviors in seemingly very similar environments as far as usage/power was concerned. Super simple really. You made it much more. You keep posting the one reply I made, which is totally valid, but you keep interpreting it in a way that was never said. Please pull up my quote that has me asking for basic details on CPU and GPU behaviors? You make this community weaker and offer nothing of any value. 
     
     
     
     



    Your replies were to ARGUE and question what I posted as if it was utterly wrong and made no sense.
    "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock." You go from saying you don't under stand the basics of how CPU loading can affect GPU performance to a COMPLAINT I did not PROVIDE you with a DETAILES ABOUT HOW THIS ALL WORKS. That literally makes no sense to a mature thinking adult. You then further complain I don't provide you with LINKS. You're on the internet, you can visit Nvidia.com and read white papers on how this all works for days. You can, for FREE, join the Nvidia Developer program and get all sorts of information and cool stuff FOR FREE. You're not paying me,  I don't work for FREE. Going from "what are the basics to WHERE IS MY DETAILED EXPLANATION OF HOW EVERY THING WORKS is silly. This is a fact everyone can read. Thanks again for another round of passive aggressive insults and the opportunity to hopefully guide you to more productive discussions....with others.


    Where do I ask for the basics? If you don't work for free then don't reply in a forum where someone is asking a question that may require a more detailed response. I mean if you say you are helping but get mad that your basic help didn't actually provide any clarity, you seem a bit sensitive if I must say. I never asked about cpu and gpu basics. I asked how cpu usage effects boost clocks. I still don't see how me not understanding your explanation is an argument but that is why you are now calling me passive aggressive. So I replied thank you for being standard. Because this is actually the most standard forum response out there. You can see so many similar threads by people asking questions and so called people like you "helping", then getting mad because someone asked for more detail. I mean clarity in itself is more detail correct? So either you do not know or you don't want to help. In either case then don't reply. Super easy. As far as everyone can see, okay. I'm sure they are amused just like they would be with any other forum thread that has gone awry.
     
    Here is the quote you say is argumentative:
    "I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock. I did you see you asked about power, and I thought the pics might help you see. You also said games will load differently in the CPU. So The idea that CPU usage could play a factor is news to me and why I am here asking how it works. Can you elaborate how and when CPU usage effects boost clock behavior? Also how do I save a graph like that so I can post it?"
     
    I mean really?? If you say hey the reason it's different is because CPU loads, Temps, and VLL, and some single core apps and some multi threaded apps. That is supposed to provide me with an understanding?Sorry my guy, you were no help then got mad. 
      


     This is in fact the statement of a person who does not understand and is asking about the basics.
    "You also said games will load differently in the CPU. So The idea that CPU usage could play a factor is news to me and why I am here asking how it works. Can you elaborate how and when CPU usage effects boost clock behavior?"
    Son, you're projecting your frustration in not understanding what is being posted (your own words) and I'd say this screeching attacking my character is a personal attack. Mad? I don't care about you at all... let me revise that, I didn't, now I dislike you for obvious reasons. Grow up.




    standard so very standard. Your character is under attack? You are mighty sensitive bud. Grow up? I cannot even say I don't understand what you are saying without you getting all up in my face. Discussions are what grown up's have. Not someone like you belittling a person because you think they should know something that you say is so simple. You have insulted me several times and I have not even cared to acknowledge it and yet I need to grow up? You dislike me? aww man that hurts, it really does.
     
    Quote from you "Another thing, I'm pretty sure you intentionally picked spots for both games that would reflect similar numbers. All that shows is that you can show similar numbers with a static scene."
    I offered to let you watch my stream so you could see the behavior, and also offered different screenshots, even though I was thrown off by the accusation. What would be the point of static images, or whatever you were getting at there? It didn't seem very helpful. 
     
    Quote from you "I guess one might ask, what is your purpose other than curiosity?" 
    What purpose do I need? That seemed very belittling, right from the jump.
     
    Quote from you "Indeed you're not going to see a detailed explanation of how CPU utilization affects GPU utilization. Why? If that basic concept is confusing to you, the details would be a waste of both of our time." 
    Basically saying I couldn't digest the information. Especially belittling because it was not even what I asked; the actual question was how it affects the boost clock. Basic information about the CPU feeding the GPU, and limiting GPU usage based on the speed of that information transfer, is not what was being asked. The GPU is at 99% usage at all times here, which is why I didn't know what you were saying about multi-threaded and single-threaded apps affecting GPU usage: it was not the case in my situation. More detailed information is what I was looking for, obviously. 
     
    Quote from you "And seriously, this is NOT the forum for paragraphs of MSEE jargon etc ( I are one). "
    Not the forum for this conversation? I have a 1080 Ti from EVGA, and this is the EVGA 1080 Ti forum. Another attempt at belittling. 
     
    Quote from you "Also GPU and CPU utilization combined with GPU  Boost 3.0 can give you odd results that don't reflect what you'd think would be the performance."
    My favorite one. Yes, that is exactly why I am asking for clarity. You can't make it up, man. 
     
    So I guess I will just go grow up now. Keep up the solid "free" help. 
     
     
     




    Why are you spending SO MUCH TIME belittling someone trying to help you? What you did, when an attempt was made to explain the basics, was argue with the explanations. Let me quote you again:

    I am not sure exactly what you are saying. I didn't see any explanation in detail about CPU affecting boost clock.

     
    Who goes from saying they don't understand the basics of what is being said to complaining about not being given an EXPLANATION IN DETAIL? Perhaps starting at the end works for you, but when someone states as fact that they DO NOT KNOW THE BASICS, I am certainly not going to waste my time STARTING with the DETAILS of HOW THE CPU AFFECTS BOOST CLOCK. Again, you stated that CPU utilization affecting GPU performance is a NEW thing to you. Frankly, I don't know why you keep replying and embarrassing yourself further.

     
     
    ty_ger07
    boring

    It is... surprising that someone around that long would be so... boorish. I can be testy and craptacular at times, but sheesh. 





    #26
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/24 11:54:14 (permalink)
    MNFirstBlood
    ty_ger07
    Data1987
    Boost...the cooler the components the more the GPU can clock....


    What about power limit?
    What about GPU utilization?
    What about silicon quality and reliability voltage?
     
    Temperature is not the only factor.
     
    The GPU won't boost as high if it is bottlenecked by the CPU and waiting for work to do.


    Yeah, I have been testing for 4 days now. This place used to be a good place to start with questions, but some of the super posters make it unenjoyable: people with 100k posts trolling. Very interesting. I seem to be finding that it is more linked to power limit than temperature. I won't bother with any further explaining, but I know it wasn't strictly temps, and I wanted to understand it more deeply. One thing I have found is that it has nothing to do with CPU load (at least in my case). 

     
     
    If what you found out in bold is the result of your testing, your testing is obviously flawed. The caveat 'in my case' is absurd: your system is standard, and it would utilize the hardware the same as any other. Two folks here have told you a simple fact: CPU loading, or utilization, affects GPU utilization, including boost clocks. Anyone telling you otherwise is obviously in need of the same help you are. I am sorry that your psyche is so fragile that you see the glaring mistakes we have tried to patiently point out (and to which you respond with insulting character attacks, again) as trolling. Indeed, it is not strictly temps; I made that perfectly clear. Could you explain to us how you, as a fact, ELIMINATED the CPU as a factor in how the GPU boosts? That would be swell, as that is a new theory to me. You claim you came here to learn, and all you did was argue, insult those trying to help you, and end up posting disinformation. Yeah, this used to be a good place... until folks like you made me wonder why I bother to volunteer my time.
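    To make the "several limiters at once" idea in this thread concrete, here is a toy Python sketch. This is my own illustration, not NVIDIA's actual GPU Boost algorithm, and all the clock numbers are made up to mirror the figures reported earlier in the thread: the card runs at the highest clock that every limiter (max boost, temperature, power, voltage) allows, so the lowest cap wins, and a GPU starved for work by the CPU sags below every cap.

    ```python
    # Toy model of GPU Boost (an illustration, NOT NVIDIA's algorithm):
    # the card runs at the highest clock every limiter allows, so the
    # lowest cap wins. All MHz values are made up for this example.

    def boost_clock(max_boost_mhz, temp_cap_mhz, power_cap_mhz,
                    voltage_cap_mhz, gpu_busy=True):
        """Sustained core clock under whichever limiter binds first."""
        if not gpu_busy:
            # GPU waiting on the CPU for work: no reason to hold max
            # boost, so clocks sag well below every cap (illustrative).
            return voltage_cap_mhz - 130
        return min(max_boost_mhz, temp_cap_mhz, power_cap_mhz, voltage_cap_mhz)

    # Same temperature in both games, but a different binding limiter:
    print(boost_clock(1949, temp_cap_mhz=1949, power_cap_mhz=1949,
                      voltage_cap_mhz=1962))   # RDR2-like load -> 1949
    print(boost_clock(1949, temp_cap_mhz=1949, power_cap_mhz=1885,
                      voltage_cap_mhz=1962))   # Hunt-like load -> 1885
    ```

    The point of the sketch is that two games at an identical 70C can still sustain different clocks if one draws more power per frame, which matches what the OP observed.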

    EVGA X99 FTWK / i7 6850K 4.5ghz / GTX1080 Ti FE Hybrid / 32GB Corsair LPX 3200mhz / Samsung 850Pro 256GB / Corsair AX1200 / Window 10 Pro
    Fire Strike                        24,163
    Fire  Strike Extreme        14,452 
    Fire Strike Ultra                7,711 
    Time Spy                        10,357

    #27
    kageroo1
    New Member
    • Total Posts : 15
    • Reward points : 0
    • Joined: 2020/11/11 06:21:48
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/25 13:25:20 (permalink)
    there's not much to discuss: the cooler the gpu, the higher it will boost and the longer it will stay there.

    your gpu is way too hot, so it will naturally downclock.


    I'm running my 1070 at 49C max and it *never* downclocks under load from 2000 MHz.

    It's that simple really.
    #28
    HeavyHemi
    Omnipotent Enthusiast
    • Total Posts : 14749
    • Reward points : 0
    • Joined: 2008/11/28 20:31:42
    • Location: Western Washington
    • Status: offline
    • Ribbons : 107
    Re: Question about boost clock 2020/11/25 14:01:55 (permalink)
    kageroo1
    there's not much to discuss: the cooler the gpu, the higher it will boost and the longer it will stay there.

    your gpu is way too hot, so it will naturally downclock.


    I'm running my 1070 at 49C max and it *never* downclocks under load from 2000 MHz.

    It's that simple really.




    At 49C you've already dropped one clock bin; the first drop happens around ~39C on a stock curve. Using the voltage/frequency curve, I can start out at 2063MHz and hold it all the way from an idle temp of 17C up to load temps of 45C. In other words, I don't drop a temp bin until I hit ~50C, and I'm also slightly undervolting. The GPUs are designed by default to run under full load at ~80C or so; case airflow and other factors greatly influence this, of course. Since we have no idea of your setup, or what else you're doing to get lower-than-typical temps, I'm not sure how to evaluate your input other than to say it misses 90% of why the GPU will have fluctuating clocks.
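    The temp-bin behavior described above can be sketched in a few lines of Python. This is a back-of-the-envelope model only: the ~13 MHz bin size is the commonly reported Pascal step, but the ~39C first-drop threshold and the degrees-per-bin spacing are assumptions inferred from the numbers in this post, not published NVIDIA specs.

    ```python
    # Toy model of GPU Boost 3.0 temperature bins. The constants are
    # guesses inferred from the post above, not published NVIDIA specs.
    BIN_MHZ = 13          # one clock bin is roughly 13 MHz on Pascal
    FIRST_DROP_C = 39     # assumed first temp-bin drop on a stock curve
    DEGREES_PER_BIN = 11  # assumed: roughly one more bin every ~10-12C

    def temp_limited_clock(base_boost_mhz, temp_c):
        """Clock after temperature-bin drops. A flattened V/F curve, as
        described above, effectively pushes FIRST_DROP_C up toward ~50C."""
        if temp_c < FIRST_DROP_C:
            return base_boost_mhz
        bins_dropped = 1 + int((temp_c - FIRST_DROP_C) // DEGREES_PER_BIN)
        return base_boost_mhz - bins_dropped * BIN_MHZ

    print(temp_limited_clock(2063, 35))  # below first threshold -> 2063
    print(temp_limited_clock(2063, 49))  # past ~39C, one bin gone -> 2050
    ```

    Under this model a card at 49C has already shed a bin even though it never throttles in the hard, "way too hot" sense, which is the distinction being drawn in this exchange.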


    #29
    kageroo1
    New Member
    • Total Posts : 15
    • Reward points : 0
    • Joined: 2020/11/11 06:21:48
    • Status: offline
    • Ribbons : 0
    Re: Question about boost clock 2020/11/25 17:21:08 (permalink)
    no, I'm just saying the biggest factor is temps, and if you want to avoid downclocking, the easiest way is to keep the card as cool as possible. it's not rocket surgery.

    "it misses 90% of why the GPU will have fluctuating clocks."

    on the contrary, it's addressing the biggest factor behind "fluctuating clocks".
    #30