GTX 970 VideoLUT Table Entry Precision

Author
DOOM_NX
New Member
  • Total Posts : 5
  • Reward points : 0
  • Joined: 2014/10/21 22:26:38
  • Status: offline
  • Ribbons : 0
2014/10/21 22:42:12 (permalink)
Hello,
 
Does anybody know the LUT entry depth of GTX 970 cards over DVI? Is it limited to 8-bit?

I would like to upgrade from a Radeon HD 5850, and a 10-bit LUT is a feature I wouldn't want to lose for gamma calibration purposes.
 
Please note I am not talking about 10-bit per channel (deep color) output here.

Thank you very much in advance! :)
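
For anyone who wants to probe this on their own card, below is a rough C sketch using the standard Windows GDI gamma-ramp calls (GetDeviceGammaRamp / SetDeviceGammaRamp from wingdi.h, the same interface most calibration loaders use to feed the VideoLUT). The test idea is my own assumption, not a documented procedure: load a mildly gamma-adjusted ramp and inspect a smooth grayscale gradient. A LUT limited to 8-bit entries tends to show visible banding, while 10-bit entries (typically dithered down to the 8-bit DVI link) keep the gradient noticeably smoother.

/* Rough sketch, not a vendor procedure: push a mild gamma tweak into the
 * VideoLUT through the GDI gamma-ramp API and eyeball a gray gradient
 * for banding. Link against gdi32. */
#include <windows.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
    WORD saved[3][256], test[3][256];
    HDC screen = GetDC(NULL);              /* DC covering the primary display */

    GetDeviceGammaRamp(screen, saved);     /* remember the current ramp */

    for (int i = 0; i < 256; i++) {
        /* identity would be i * 257; apply a gentle 1.1 gamma correction */
        WORD v = (WORD)(pow(i / 255.0, 1.0 / 1.1) * 65535.0 + 0.5);
        test[0][i] = test[1][i] = test[2][i] = v;
    }

    if (!SetDeviceGammaRamp(screen, test))
        fprintf(stderr, "SetDeviceGammaRamp failed\n");

    printf("Test ramp loaded - inspect a grayscale gradient, then press Enter.\n");
    getchar();

    SetDeviceGammaRamp(screen, saved);     /* restore the original ramp */
    ReleaseDC(NULL, screen);
    return 0;
}

Note that Windows may reject ramps that stray too far from identity; a gentle correction like this one should pass validation.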
#1

5 Replies

    DOOM_NX
    New Member
    • Total Posts : 5
    • Reward points : 0
    • Joined: 2014/10/21 22:26:38
    • Status: offline
    • Ribbons : 0
    Re: GTX 970 VideoLUT Table Entry Precision 2014/10/23 00:54:52 (permalink)
    Well, I got a reply from the German Tech Support Manager. It might interest anyone looking for an answer.
     

    Hello,
     
    The GTX 970 does also work with a 10-bit LUT.
     
    Regards,
     
    I guess I'm gonna order one. I'll report back with my findings as well.
    #2
    TG002
    New Member
    • Total Posts : 3
    • Reward points : 0
    • Joined: 2010/04/07 21:36:40
    • Status: offline
    • Ribbons : 0
    Re: GTX 970 VideoLUT Table Entry Precision 2014/11/02 01:00:57 (permalink)
    So how does this get enabled? When I plug the HDMI cable into the 10-bit port on the TV, the TV reports an HDMI mismatch with the color depth, and flickering green pixels appear in a fairly regular arrangement across the whole screen. That does not happen when I put the cable into an 8-bit HDMI port on the TV. The only color-related setting I can find in the NVIDIA Control Panel is the Color Depth drop-down in the Change Resolution tab, and the only option it offers is Highest (32-bit). Thanks.
    #3
    DOOM_NX
    New Member
    • Total Posts : 5
    • Reward points : 0
    • Joined: 2014/10/21 22:26:38
    • Status: offline
    • Ribbons : 0
    Re: GTX 970 VideoLUT Table Entry Precision 2014/11/02 04:35:07 (permalink)
    This thread is about the gamma table resolution, not Deep Color or 30-bit output. I don't think you can get 10-bit per color channel output from a GeForce card. You would need a Quadro card.
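
    To make the distinction concrete, here is a small back-of-envelope C sketch (my own illustration, nothing from NVIDIA): it quantizes a mild gamma-correction curve to 8-bit and to 10-bit LUT entries and counts how many of the 256 input levels still map to distinct outputs. Levels that collapse together are what you see as banding; with 10-bit entries the GPU can dither the extra precision down to the 8-bit link and keep gradients smooth.

    /* Illustration only: quantize a gamma-correction curve to a given LUT
     * entry depth and count how many distinct output levels survive out
     * of 256 inputs. Collapsed levels show up as banding in gradients. */
    #include <stdio.h>
    #include <math.h>

    static int distinct_levels(int bits)
    {
        double maxv = (double)((1 << bits) - 1);
        int prev = -1, distinct = 0;
        for (int i = 0; i < 256; i++) {
            int q = (int)(pow(i / 255.0, 1.0 / 1.1) * maxv + 0.5);
            if (q != prev) { distinct++; prev = q; }
        }
        return distinct;
    }

    int main(void)
    {
        printf("8-bit LUT entries:  %d of 256 levels stay distinct\n",
               distinct_levels(8));
        printf("10-bit LUT entries: %d of 256 levels stay distinct\n",
               distinct_levels(10));
        return 0;
    }

    The 8-bit case loses levels wherever the correction curve's slope drops below one; that, not the bit depth of the DVI signal itself, is why the LUT entry precision matters for calibration.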
    #4
    pn3umatic
    New Member
    • Total Posts : 6
    • Reward points : 0
    • Joined: 2015/10/01 16:18:58
    • Status: offline
    • Ribbons : 0
    Re: GTX 970 VideoLUT Table Entry Precision 2015/10/01 16:21:58 (permalink)
    Hi DOOM_NX
    Did you find an answer to your question? I would also like to know...
    Thanks
    #5
    DOOM_NX
    New Member
    • Total Posts : 5
    • Reward points : 0
    • Joined: 2014/10/21 22:26:38
    • Status: offline
    • Ribbons : 0
    Re: GTX 970 VideoLUT Table Entry Precision 2015/10/01 18:32:08 (permalink)
    pn3umatic
    Hi DOOM_NX
    Did you find an answer to your question? I would also like to know...
    Thanks

    Not really... I decided to stay with my Radeon HD 5850 for the time being. Most probably I will upgrade to an AMD card once again, just to be safe.
    #6