2016/07/07 18:05:58
STR8_AN94BALLER
As the title says: does the GTX 1080 support 10-bit color output over a DP connection to a true 10-bit monitor?
2016/07/07 23:30:35
STR8_AN94BALLER
I found some info saying Nvidia blocks 10-bit functionality on GTX graphics cards, forcing people to buy Quadro cards.
 
However, this feature was supposedly re-enabled some months ago in a driver revision. Can anyone confirm, especially for the 1080?
2016/07/08 05:54:10
CoercionShaman
Stated in a couple of reviews.  Here is one.
 
https://www.techpowerup.c...eForce_GTX_1080/3.html
 
High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HEVC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution.
 
EDIT:  I guess nothing there specifically says it is over DP.  Apologies.
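The color-palette figures quoted in that review follow directly from the bits-per-channel arithmetic. A quick sanity check (my own illustration, not from the review):

```python
# Total colors = 2 ** (bits per channel * 3 channels for RGB)
for bpc in (8, 10, 12):
    total = 2 ** (bpc * 3)
    print(f"{bpc}-bit/channel -> {bpc * 3}-bit color -> {total:,} colors")
# 8-bit/channel  -> 24-bit color -> 16,777,216 colors        (~16.7 million)
# 10-bit/channel -> 30-bit color -> 1,073,741,824 colors     (~1.07 billion)
# 12-bit/channel -> 36-bit color -> 68,719,476,736 colors    (~68.7 billion)
```

So the "16.7 million / 1.07 billion / 68.7 billion" numbers in the review check out.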
2016/07/08 06:29:17
STR8_AN94BALLER
CoercionShaman
Stated in a couple of reviews.  Here is one.
 
https://www.techpowerup.c...eForce_GTX_1080/3.html
 
High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HEVC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution.
 
EDIT:  I guess nothing there specifically says it is over DP.  Apologies.




So if I use DP with a true 10-bit monitor, it will be compatible, right?
And I can select 10-bit like in this video?
 
https://www.youtube.com/watch?v=7-PSFc1nc2c
2016/07/08 06:51:35
ipkha
It should do 10- and 12-bit color over DP and HDMI. I forget which version of DP added HDR support, but HDMI 2.0b and the latest DP version will output it just fine.
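For what it's worth, a rough back-of-the-envelope estimate (my own calculation, ignoring blanking intervals and link-layer encoding overhead) shows why 4K @ 60 Hz at 10-bit fits within recent DP versions:

```python
# Raw pixel data rate for 4K @ 60 Hz at 10 bits per channel (RGB = 30 bpp).
# Ignores blanking intervals and link-layer encoding overhead, so the real
# signal needs somewhat more bandwidth than this.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 30
gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{gbps:.2f} Gbit/s")  # ~14.93 Gbit/s of pixel data
```

DisplayPort 1.2's HBR2 link already provides about 17.28 Gbit/s of effective bandwidth, so the raw pixel data fits even before DP 1.3/1.4, though the blanking overhead eats into the margin.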
2016/07/08 08:37:22
STR8_AN94BALLER
ipkha
It should do 10- and 12-bit color over DP and HDMI. I forget which version of DP added HDR support, but HDMI 2.0b and the latest DP version will output it just fine.



there are 12-bit displays? :0
2016/07/08 08:44:40
ipkha
Higher-end 4K TVs support it, and maybe extremely expensive monitors.
2016/07/08 08:49:45
CoercionShaman
I can select 10 bit in the drop down when I am connected to monitors that are 10 bit capable, yes.
2016/07/08 10:12:32
STR8_AN94BALLER
CoercionShaman
I can select 10 bit in the drop down when I am connected to monitors that are 10 bit capable, yes.


ok thanks
 
which monitor is this?
2016/07/08 10:46:40
CoercionShaman
We have professional-grade monitors at work that I was referring to. Some systems run the Quadro cards, others the 'normal' cards. It works on the 'normal' cards as well. Probably not something that is in most homes because of the prices.
 
My Dell U3415W allows me to select it, but it is actually 8 bit with dithering, so I don't know if there is actually any real benefit.
