EVGA

When Did 144Hz Become the Standard for Monitors?

Author
Dukman
FTW Member
  • Total Posts : 1480
  • Reward points : 0
  • Joined: 2009/08/15 09:47:59
  • Location: They keep telling me Zion
  • Status: offline
  • Ribbons : 6
2016/09/18 10:23:51 (permalink)
I confess that I don't even try to keep abreast of all the changes happening in the computer industry. Generally I only start to research and learn about a segment when I need to add or replace something in it.

Which leads me to monitors. I've been perfectly happy with my (now apparently ancient) 60Hz Dell U2713HM monitors and actually have no plans to change them anytime soon. But I've been helping a friend who is struggling to free himself from console peasantry gather all the parts needed to build his first-ever gaming PC, so he's been doing a lot of research. He was mentioning the prices of 165Hz and 144Hz monitors, and I suggested looking at 120Hz monitors. That's when I learned that anything below 144Hz is pretty much gone. Well, except for Dell. They are still clinging tenaciously to 60Hz.
 
I know it has to do with the panels, TN vs. IPS and whatnot. But when did 144Hz become the standard minimum?
 
And since you're educating me (I hope), what are the advantages of the higher refresh rates beyond the obvious faster refresh?
 
 
Edit: Okay, maybe it's not a standard. But it seems like the 120Hz stuff just went poof, and the 60Hz stuff, while still plenty out there, is on the decline.
post edited by Dukman - 2016/09/18 10:30:45

Heatware



 
 
#1

11 Replies

    WackyWRZ
    SSC Member
    • Total Posts : 603
    • Reward points : 0
    • Joined: 2011/11/03 07:52:46
    • Location: Raleigh, NC
    • Status: offline
    • Ribbons : 7
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/18 18:34:16 (permalink)
    I think that initially a lot of the 120Hz stuff was mainly Korean monitors with overclocked panels, or panels that were strobed. I'd say 60Hz is still the standard for a NON-GAMING monitor. There are some 144Hz IPS panels out now, but the majority of them are still TN. Once G-Sync and FreeSync came out, 144Hz seems to have become the "standard" for gaming rigs, especially with cards that can easily push over 60FPS now. Some people notice the higher refresh rate more than others, but I went from 1080@60 -> 1440@60 -> 1440@144/G-Sync, and each jump was quite noticeable.
     
    When I was gaming at 60Hz my rig could easily push 100+ FPS, and I would get some screen tearing and have trouble seeing and focusing on some things, enough that I would get headaches after an hour or so of gaming. Once I got the 144Hz panels, all of that went away. The games were noticeably smoother and I could tell that my eyes weren't straining nearly as much. Now that my desktop is set to 144Hz, I can even tell the difference between scrolling web pages on my computer and on a 60Hz machine. I really can't tell much difference between 120Hz and 144Hz, but that's probably because I have G-Sync.

    CASE: Anidees AI Crystal XL | MOBO: ASUS X470 Crosshair VII | CPU: AMD Ryzen 3600 | RAM: G.Skill Trident Z DDR4-3200 2x8GB B-Die | SSD: Samsung 970 EVO Plus NVME 1TB | PSU: EVGA SuperNOVA 650 P2 | GPU: AMD RX5700 XT | MON: AOC CU34G2X 1440p/144hz 34" Ultrawide + Dell S2716DG 1440p/144Hz | Cooling: Full custom loop, Heatkiller D5 150, Aquacomputer D5 Next + Quadro, EK Supremacy Evo, EK Coolstream PE 480, Bykski 5700XT | HEAT: 20-0-0 - http://www.heatware.com/u/104452

    #2
    agent8
    CLASSIFIED ULTRA Member
    • Total Posts : 5129
    • Reward points : 0
    • Joined: 2007/03/03 20:28:24
    • Status: offline
    • Ribbons : 20
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/18 19:55:47 (permalink)
    I can see some difference between 60Hz and 120Hz, but beyond that I don't really see much. I am playing on a 1440p monitor overclocked to 90Hz. It will clock to almost 120Hz, but I didn't really see any difference.
    #3
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 01:48:36 (permalink)
    I haven't personally played a game on a monitor with a refresh rate above 60Hz, so I can't really speak with authority on the subject.
     
    When standing still in a game it obviously won't make a difference at all. I think the difference shows when you're moving fast or turning sharply. In a shooter, if you turn 180 degrees in half a second, the view changes by 6 degrees per frame update at a 60Hz refresh, which drops to 3 degrees per frame on a 120Hz monitor, 2.5 degrees per frame at 144Hz, and 2.18 degrees per frame at 165Hz. Obviously, as a percentage change there's a sharply diminishing return at the higher refresh rates.
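     
    To make that arithmetic concrete, here's a quick Python sketch of the calculation (assuming the same 180-degree turn in half a second, i.e. 360 degrees per second):
     
        # Degrees the view shifts between frame updates during a fast turn.
        # Assumes a 180-degree turn over 0.5 s (360 deg/s), per the example above.
        turn_rate_deg_per_s = 180 / 0.5
        for hz in (60, 120, 144, 165):
            print(f"{hz}Hz: {turn_rate_deg_per_s / hz:.2f} degrees per frame")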

    I see this on my machine in games that run at 60FPS on my 4K monitor. If I turn sharply, the image gets blurry while I'm turning. I think reducing how much the view changes per frame update during the turn should make it less blurry.
     
    Of course, this also depends on the monitor's pixel response time, which is the time it takes a pixel to change from one color to another. If it can't change colors fast enough, the refresh rate advantage is negated. 1000ms / 144Hz = 6.9ms, so if the monitor's response time isn't lower than 6.9ms, some of the benefit of 144Hz will be lost. I think that's right around what good monitors achieve, and faster than what cheaper monitors manage.
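     
    Here's a minimal Python sketch of that frame-budget check (the 5ms response time is a made-up example figure, not a measured spec for any particular monitor):
     
        # Frame budget (ms per refresh) vs. an assumed pixel response time.
        # The 5.0 ms response figure is hypothetical, for illustration only.
        response_ms = 5.0
        for hz in (60, 120, 144, 165):
            frame_ms = 1000 / hz
            verdict = "keeps up" if response_ms < frame_ms else "too slow"
            print(f"{hz}Hz: {frame_ms:.1f} ms per frame -> {verdict}")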
     
    Actually, I just read some really interesting articles while typing this response up. 
     
    Factors affecting PC monitor responsiveness.
    Motion blur website showing various types of blur and explaining them.
    Exploring input lag inside and out.
     
    Erma Gerd: check out this article showing measured blur between 60Hz and 120Hz.
    post edited by sethleigh - 2016/09/23 01:52:05

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #4
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 02:11:58 (permalink)
    Holy crap. After reading these articles I found a nice calibration article for my monitor brand, went into the OSD, and changed the OD (overdrive) setting on my monitor, and I could see in real time the improvement in the motion blur caused by my eyes tracking a moving image. The article I found recommended the "fastest" setting, but I found that going from "faster" to "fastest" suddenly introduced some overdrive artifacts, and that "faster" actually looked best in the UFO test. Now I can hardly wait to get into a game and see if it looks any better while running around shooting things. I'll test it in Paragon tomorrow.

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #5
    agent8
    CLASSIFIED ULTRA Member
    • Total Posts : 5129
    • Reward points : 0
    • Joined: 2007/03/03 20:28:24
    • Status: offline
    • Ribbons : 20
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 02:17:54 (permalink)
    That UFO test on my old monitor was embarrassing. I guess I had gotten used to screen tearing too, because once it was gone with my new monitor, it took me a while to adapt to the awesomeness. I didn't think people had much luck overclocking those 4K Monoprice monitors... That's cool!
    #6
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 10:36:04 (permalink)
    Yeah, I've never tried to overclock any monitor. I'm not actually sure how one does that. I could try looking it up, but it hasn't been a big deal for me.
     
    Yeah, my 10-year-old Dell 1920x1200 monitor does badly on the UFO blur test. It was revolutionary when I first bought it, as it replaced two 17" CRTs. I was kind of shocked, or at least pleasantly surprised, that my Monoprice 28" 4K monitor responded so well in the motion blur test. The motion blur itself was still there, but switching OD from Off to Faster almost completely eliminated the ghosting.

    I'm glad I responded to this thread and started doing some more research on monitor image quality. I had no idea. I thought it was all refresh rate and pixel response time. I didn't realize there were other technologies like LightBoost and the other "blinking" techniques that eliminate so many other sources of motion blur.

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #7
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 11:53:57 (permalink)
    Btw, I looked up how to overclock the monitor, and this Monoprice 4K monitor doesn't cleanly overclock at all. I tried 72Hz and it appeared to work, but it resulted in frame skipping. I used the UFO motion blur test to see this: instead of moving smoothly from left to right, the UFOs kind of hitched their way across the screen. I kept going down by 3Hz and retesting, and the hitching was still visible even at 63Hz. It's possible I might have gotten a hitch-free 61 or 62Hz, but it wouldn't be worth the bother.
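     
    For anyone who hasn't seen frame skipping, here's a little Python sketch of why the UFOs hitch instead of gliding (the every-sixth-frame skip pattern is hypothetical, just to show the effect):
     
        # A monitor that accepts a higher-Hz signal but silently drops frames
        # shows motion as hitches: the position holds, then double-steps.
        # The "every 6th frame" skip pattern here is hypothetical.
        speed_px_per_frame = 10
        shown_x = 0
        for frame in range(12):
            skipped = (frame % 6 == 5)
            if not skipped:
                shown_x = frame * speed_px_per_frame
            print(f"frame {frame:2d}: x = {shown_x:3d}" + ("  (skipped)" if skipped else ""))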

    Oh well, I think we're poised in the next year to start seeing 120Hz G-Sync DisplayPort 1.3 4K monitors. I'll have to start saving up for one. Once I get another 1080 and go SLI I could actually make good use of such a monitor, or I could do it now in older games that run well past 60FPS on my single 1080.
     

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #8
    agent8
    CLASSIFIED ULTRA Member
    • Total Posts : 5129
    • Reward points : 0
    • Joined: 2007/03/03 20:28:24
    • Status: offline
    • Ribbons : 20
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 12:49:55 (permalink)
    Yeah, that was the one bad thing about those Monoprice monitors; otherwise they are awesome. I got an eBay Korean monitor and it overclocks to 100Hz without breaking a sweat, and it was under $300 for 1440p with zero dead pixels. Kind of a gamble, but it worked out in my case.
    #9
    fearpoint
    CLASSIFIED Member
    • Total Posts : 2966
    • Reward points : 0
    • Joined: 2006/12/16 21:53:57
    • Status: offline
    • Ribbons : 3
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 13:46:02 (permalink)
    When G-Sync launched and high refresh rates stopped being largely a negative.
    #10
    sethleigh
    SSC Member
    • Total Posts : 796
    • Reward points : 0
    • Joined: 2015/08/12 11:27:56
    • Status: offline
    • Ribbons : 4
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 14:09:00 (permalink)
    agent8
    Yeah, that was the one bad thing about those Monoprice monitors; otherwise they are awesome. I got an eBay Korean monitor and it overclocks to 100Hz without breaking a sweat, and it was under $300 for 1440p with zero dead pixels. Kind of a gamble, but it worked out in my case.

    Yeah, I did a lot of looking at various monitor options and was tempted by those Korean 1440p screens, but in the end I decided that 1440p was just a temporary solution, since in two or three years, when you can buy a card with the power of a 1080 for $250, I think 4K will become the de facto gaming standard. Of course, now I realize the point was moot, because my current 60Hz 4K monitor is really just a temporary solution until they come out with 120Hz G-Sync 4K monitors, which looks like next year. Still, this 60Hz 4K monitor will make a great second monitor on the side when I end up purchasing the new higher-refresh monitor someday. I gotta tell ya, 4K looks stunning. I'd like to see the difference, though, between 60Hz and 120 or 144Hz. Now that I've read the articles I linked above, I think it's more of a difference than I gave it credit for.

    Happy EVGA customer.  Affiliate Code: 0Y7-1VU-ATW2
     
    GigaByte X570 Aorus Master, AMD Ryzen 5900x under Optimus Foundation block, 32gb G.Skill DDR4 @ 3800 MHz 14-14-14-28, EVGA 3080ti FTW3 Ultra under Optimus block, 2TB 980 Pro SSD, EVGA Supernova G6 850W PS, ASUS 34" 3440x1440p 120Hz ultrawide, Lenovo 24" 1080p secondary monitor, Win 10

    #11
    FattysGoneWild
    CLASSIFIED Member
    • Total Posts : 2660
    • Reward points : 0
    • Joined: 2011/04/24 18:45:43
    • Location: KFC
    • Status: offline
    • Ribbons : 3
    Re: When Did 144Hz Become the Standard for Monitors? 2016/09/23 21:22:02 (permalink)
    sethleigh
    agent8
    Yeah, that was the one bad thing about those Monoprice monitors; otherwise they are awesome. I got an eBay Korean monitor and it overclocks to 100Hz without breaking a sweat, and it was under $300 for 1440p with zero dead pixels. Kind of a gamble, but it worked out in my case.

    Yeah, I did a lot of looking at various monitor options and was tempted by those Korean 1440p screens, but in the end I decided that 1440p was just a temporary solution, since in two or three years, when you can buy a card with the power of a 1080 for $250, I think 4K will become the de facto gaming standard. Of course, now I realize the point was moot, because my current 60Hz 4K monitor is really just a temporary solution until they come out with 120Hz G-Sync 4K monitors, which looks like next year. Still, this 60Hz 4K monitor will make a great second monitor on the side when I end up purchasing the new higher-refresh monitor someday. I gotta tell ya, 4K looks stunning. I'd like to see the difference, though, between 60Hz and 120 or 144Hz. Now that I've read the articles I linked above, I think it's more of a difference than I gave it credit for.


     
    Put it this way: I have always owned 60Hz monitors. I currently have a Dell U2412M, a U2415, and the monitor in my sig. When you move to a 144Hz G-Sync monitor, you will be awestruck. I will never EVER go back to using a 60Hz monitor again. Everything is just so smooth now.

    HP Omen 880-160se custom ordered
    OS: Windows 10 64 bit
    MOBO: HP Tampa2
    CPU: Intel i7 8700k @4.8GHz
    RAM: 32GB DDR4 2400
    GPU: PNY XLR8 RTX 3080
    PSU: Delta 750w 80 Plus Platinum 
    NVMe M.2 SSD: Samsung 512GB MZVLW512HMJP
    SSD: 250GB Samsung 860 EVO
    HDD: 2TB Seagate Barracuda ST2000DM001
    Sound: Logitech Z623 THX 2.1 Speakers
    Monitor: Dell S2716DG 2560x1440 @144Hz G-Sync calibrated with ColorMunki Display
    Keyboard: HP Omen 1100
    Mouse: HP Omen 600
     
     
    #12