Furious.George
New Member
- Total Posts : 2
- Reward points : 0
- Joined: 2016/11/14 21:26:23
- Status: offline
- Ribbons : 0
This is surprising to me because 4:2:0 should require less bandwidth.
My display (Sony XBR-65X850C) is purported to support 10-bit.
I found a comparison video over at AVS Forum to test with. The 10-bit side looks less banded than the 8-bit side, irrespective of my color depth setting; however, the 10-bit side doesn't look perfectly smooth either.
My question is twofold:
First, is there a comprehensive list of the color depths supported at each frequency and color format, similar to:
... and, secondly, can anyone devise a test for support whereby I could actually see a difference?
Thanks in advance.
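On the second question, one concrete test is a gradient with more steps than an 8-bit pipeline can carry. A minimal sketch (assuming Python with NumPy; to actually exercise the display chain, the ramp would still have to be written out as a 16-bit image and shown by a 10-bit-aware application):

```python
# Sketch of a banding test: a horizontal ramp with one column per
# 10-bit code value. A true 10-bit path preserves 1024 distinct steps;
# quantized to 8 bits the same ramp collapses to 256 levels, so each
# surviving level spans 4 adjacent columns -- visible as bands.
import numpy as np

WIDTH = 1024  # one column per 10-bit code value

ramp10 = np.arange(WIDTH, dtype=np.uint16)  # 0..1023, full 10-bit ramp
ramp8 = ramp10 >> 2                         # truncated to 8 bits

print(len(np.unique(ramp10)))  # 1024 distinct levels
print(len(np.unique(ramp8)))   # 256 distinct levels
```

If the 10-bit ramp shows the same 4-column-wide bands as the 8-bit one, some stage of the pipeline is still 8-bit.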
|
Furious.George
New Member
- Total Posts : 2
- Reward points : 0
- Joined: 2016/11/14 21:26:23
- Status: offline
- Ribbons : 0
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 01:12:25
(permalink)
I had a couple of links removed, presumably because I'm new.
For those interested:
The first was a forum post with a 10-bit display test from AVS Forums; it comes up first, or among the first results, when Googling "10 bit panel test" and similar queries.
The second was a chart from an HDMI 2.0 FAQ at HDMI.org. The row headings are resolutions@Hz and the column headings are bit depths (8/10/12/16). The fields are then populated with the supported chroma subsampling formats (e.g. 4:4:4 or 4:2:0).
Try Google searching "What are the 4K formats supported by HDMI 2.0?"
|
Brimy
Superclocked Member
- Total Posts : 220
- Reward points : 0
- Joined: 2013/02/21 07:58:57
- Status: offline
- Ribbons : 1
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 03:19:39
(permalink)
Have you tried RGB for the output color format? I feel it could be an NVIDIA driver issue, or a misunderstanding on our part, because I only get 8- and 12-bit choices for my 1080p Sony TV when the TV is 24-bit capable. https://www.dpreview.com/forums/post/53072517
post edited by Brimy - 2016/11/15 03:52:28
P630 | P8Z77-V PRO | i5 3570k@4.7Ghz | ASUS STRIX-GTX1060-6G-GAMING | CM Extreme Power Plus - 600W | 1x250GB 850EVO & 2x Seagate Barracuda 1 TB HDD | Patriot Viper 3 Series DDR3 16GB 2200MHz | XSPC Rs360 Kit | CM Storm QuickFire TK | Corsair M90 | Acer GN246HL | Canon PIXMA MG5320 | Windows 7 Ultimate 64bit
|
ksgnow2010
ACX Member
- Total Posts : 255
- Reward points : 0
- Joined: 2015/09/21 17:56:35
- Status: offline
- Ribbons : 0
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 05:56:23
(permalink)
Make sure you have an HDMI cable that is fast enough. Amazon has an "Amazon Basics Latest Revision" HDMI cable that is low-cost and works great. I had a similar issue and thought it was the card, driver, or monitor... I put in the above cable and can now do full color at 60 Hz.
|
raidflex
Superclocked Member
- Total Posts : 180
- Reward points : 0
- Joined: 2008/04/10 21:58:21
- Location: USA
- Status: offline
- Ribbons : 2
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 06:29:49
(permalink)
With my Samsung JS8500 and GTX 1050 I receive 4K 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit, and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 Pro x64. You cannot do 4K 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD Deep Color. I have my color setting on YCbCr 4:2:2 10-bit, but you can use RGB.
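The bandwidth claim checks out with simple arithmetic. A sketch (the ~14.4 Gbit/s payload figure is HDMI 2.0's 18 Gbit/s raw rate minus 8b/10b TMDS coding overhead; real HDMI 4:2:2 container packing differs slightly from this naive samples-per-pixel model):

```python
# Quick check of which 4K 60Hz formats fit in HDMI 2.0's bandwidth.
# The 4K 60Hz pixel clock, including blanking (4400 x 2250 total), is
# 594 MHz; each pattern carries a different average number of samples
# per pixel (4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5).
PIXEL_CLOCK = 4400 * 2250 * 60   # 594 MHz, 4K 60Hz timing w/ blanking
PAYLOAD = 18e9 * 8 / 10          # ~14.4 Gbit/s left after TMDS coding

SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def fits(subsampling, bits_per_sample):
    rate = PIXEL_CLOCK * SAMPLES_PER_PIXEL[subsampling] * bits_per_sample
    return rate <= PAYLOAD

for fmt, bits in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 12), ("4:2:0", 16)]:
    print(f"{fmt} {bits}-bit:", "fits" if fits(fmt, bits) else "too much")
```

The results line up with the driver menu described above: 4:4:4 tops out at 8-bit, while 4:2:2 and 4:2:0 leave headroom for 10-bit and beyond.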
post edited by raidflex - 2016/11/15 06:38:26
Gigabyte X99M-Gaming 5 i7 5820 (4.5GHz Vcore 1.27V) EVGA GTX 1080 FTW Samsung 850 Pro 120GB & 500 GB Corsair AX1200 Crucial Ballistix Sport 2400 4x4GB Caselabs Mercury S5 w/pedestal Windows 10 Pro x64 EK Supremacy Alphacool Nexxxos UT60 x2 Monsoon Series Two Premium DDC Reservoir Bitspower Compression Fittings Swiftech Helix Fans LG 34UM95 Monitor
|
Brimy
Superclocked Member
- Total Posts : 220
- Reward points : 0
- Joined: 2013/02/21 07:58:57
- Status: offline
- Ribbons : 1
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 07:02:17
(permalink)
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
Or is it 10-bit per channel, meaning 10 for red, 10 for green, and 10 for blue, making it 30-bit?
P630 | P8Z77-V PRO | i5 3570k@4.7Ghz | ASUS STRIX-GTX1060-6G-GAMING | CM Extreme Power Plus - 600W | 1x250GB 850EVO & 2x Seagate Barracuda 1 TB HDD | Patriot Viper 3 Series DDR3 16GB 2200MHz | XSPC Rs360 Kit | CM Storm QuickFire TK | Corsair M90 | Acer GN246HL | Canon PIXMA MG5320 | Windows 7 Ultimate 64bit
|
arestavo
CLASSIFIED Member
- Total Posts : 3005
- Reward points : 0
- Joined: 2008/02/06 06:58:57
- Location: Through the Scary Door
- Status: offline
- Ribbons : 8

Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 08:30:20
(permalink)
Brimy
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
or is it 10bit per channel meaning 10 for red, 10 for green and 10 for blue making it 30bit?
It's the same thing; some people use the old way of saying it and others use the new way.
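The per-channel and total figures are just two names for the same format:

```python
# "10-bit" (per channel) and "30-bit" (total, a.k.a. deep color)
# describe the same thing: three channels of N bits each.
for per_channel in (8, 10, 12, 16):
    total = per_channel * 3          # R + G + B (or Y + Cb + Cr)
    levels = 2 ** per_channel        # shades per channel
    print(f"{per_channel}-bit/channel = {total}-bit total, "
          f"{levels:,} levels per channel")
```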
EVGA affiliate code: 9ZWDWFNW6A (Don't forget to upload your invoice or no credit is given!) FOLD ON
|
Brimy
Superclocked Member
- Total Posts : 220
- Reward points : 0
- Joined: 2013/02/21 07:58:57
- Status: offline
- Ribbons : 1
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 10:37:16
(permalink)
arestavo
Brimy
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
or is it 10bit per channel meaning 10 for red, 10 for green and 10 for blue making it 30bit?
It's the same thing, just some people use the old way of saying it and others use the new way.
Okay.
P630 | P8Z77-V PRO | i5 3570k@4.7Ghz | ASUS STRIX-GTX1060-6G-GAMING | CM Extreme Power Plus - 600W | 1x250GB 850EVO & 2x Seagate Barracuda 1 TB HDD | Patriot Viper 3 Series DDR3 16GB 2200MHz | XSPC Rs360 Kit | CM Storm QuickFire TK | Corsair M90 | Acer GN246HL | Canon PIXMA MG5320 | Windows 7 Ultimate 64bit
|
HeavyHemi
Omnipotent Enthusiast
- Total Posts : 10929
- Reward points : 0
- Joined: 2008/11/28 20:31:42
- Location: Western Washington
- Status: offline
- Ribbons : 57
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 10:48:47
(permalink)
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use a compressed color format instead of RGB. While you won't notice it in games, the desktop, especially text, is objectively worse. The Windows desktop is natively 8-bit, so using 10-bit gains you nothing. I've never seen a 16-bit-per-channel setting.
EVGA E758/i7 980x 4.3ghz 1.36v 12GB Corsair Dominator 2000C9 GTX 980Ti w/GTX TITAN SC for PhysX Crucial M4 512GB SSD/2x WD RE4 Black 2TB Corsair H50 Cooler/Corsair AX1200/Window 10 Pro
|
raidflex
Superclocked Member
- Total Posts : 180
- Reward points : 0
- Joined: 2008/04/10 21:58:21
- Location: USA
- Status: offline
- Ribbons : 2
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 10:51:33
(permalink)
HeavyHemi
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use compressed color format instead of RGB. While you won't notice in games, the desktop, especially text, is objectively worse. Windows Desktop is natively 8 bit so using 10 bit gains you nothing. I've never seen a 16 bit per channel setting.
I have a separate HTPC for just Movies/TV using Kodi.
Gigabyte X99M-Gaming 5 i7 5820 (4.5GHz Vcore 1.27V) EVGA GTX 1080 FTW Samsung 850 Pro 120GB & 500 GB Corsair AX1200 Crucial Ballistix Sport 2400 4x4GB Caselabs Mercury S5 w/pedestal Windows 10 Pro x64 EK Supremacy Alphacool Nexxxos UT60 x2 Monsoon Series Two Premium DDC Reservoir Bitspower Compression Fittings Swiftech Helix Fans LG 34UM95 Monitor
|
arestavo
CLASSIFIED Member
- Total Posts : 3005
- Reward points : 0
- Joined: 2008/02/06 06:58:57
- Location: Through the Scary Door
- Status: offline
- Ribbons : 8

Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 11:01:01
(permalink)
HeavyHemi
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use compressed color format instead of RGB. While you won't notice in games, the desktop, especially text, is objectively worse. Windows Desktop is natively 8 bit so using 10 bit gains you nothing. I've never seen a 16 bit per channel setting.
RGB doesn't support 10-bit color over HDMI 2.0 at 4K 60Hz, so HDR won't look as good; hence the use of chroma subsampling instead. http://www.acousticfrontiers.com/uhd-101-v2/
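The trade-off is easy to quantify as average bits per pixel under each sampling pattern (a simplified model that ignores HDMI's exact container packing):

```python
# Average bits per pixel per chroma sampling pattern. 4:2:2 halves the
# chroma samples and 4:2:0 quarters them, so 10-bit 4:2:2 (20 bits per
# pixel) is cheaper than even 8-bit RGB/4:4:4 (24 bits per pixel) --
# the headroom that lets 10-bit HDR fit at 4K 60Hz.
SAMPLES_PER_PIXEL = {"RGB / 4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for pattern, samples in SAMPLES_PER_PIXEL.items():
    for depth in (8, 10, 12):
        print(f"{pattern} {depth}-bit: {samples * depth:g} bits/pixel")
```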
EVGA affiliate code: 9ZWDWFNW6A (Don't forget to upload your invoice or no credit is given!) FOLD ON
|
HeavyHemi
Omnipotent Enthusiast
- Total Posts : 10929
- Reward points : 0
- Joined: 2008/11/28 20:31:42
- Location: Western Washington
- Status: offline
- Ribbons : 57
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 11:01:02
(permalink)
raidflex
HeavyHemi
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use compressed color format instead of RGB. While you won't notice in games, the desktop, especially text, is objectively worse. Windows Desktop is natively 8 bit so using 10 bit gains you nothing. I've never seen a 16 bit per channel setting.
I have a separate HTPC for just Movies/TV using Kodi.
Okay, but for the vast majority of users in typical use, 8-bit RGB Full would be the correct setting.
EVGA E758/i7 980x 4.3ghz 1.36v 12GB Corsair Dominator 2000C9 GTX 980Ti w/GTX TITAN SC for PhysX Crucial M4 512GB SSD/2x WD RE4 Black 2TB Corsair H50 Cooler/Corsair AX1200/Window 10 Pro
|
HeavyHemi
Omnipotent Enthusiast
- Total Posts : 10929
- Reward points : 0
- Joined: 2008/11/28 20:31:42
- Location: Western Washington
- Status: offline
- Ribbons : 57
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 11:10:33
(permalink)
arestavo
HeavyHemi
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use compressed color format instead of RGB. While you won't notice in games, the desktop, especially text, is objectively worse. Windows Desktop is natively 8 bit so using 10 bit gains you nothing. I've never seen a 16 bit per channel setting.
RGB doesn't support 10 bit color over HDMI 2.0 at 4K, so HDR won't look as good, hence using Chroma instead. http://www.acousticfrontiers.com/uhd-101-v2/
I'm perfectly well aware of that, and well aware that neither HDTV being discussed here natively supports HDR; they use a compatibility mode. ON THE DESKTOP, for the vast majority of use, RGB FULL 8-BIT is the correct setting.
EVGA E758/i7 980x 4.3ghz 1.36v 12GB Corsair Dominator 2000C9 GTX 980Ti w/GTX TITAN SC for PhysX Crucial M4 512GB SSD/2x WD RE4 Black 2TB Corsair H50 Cooler/Corsair AX1200/Window 10 Pro
|
raidflex
Superclocked Member
- Total Posts : 180
- Reward points : 0
- Joined: 2008/04/10 21:58:21
- Location: USA
- Status: offline
- Ribbons : 2
Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 11:12:40
(permalink)
HeavyHemi
arestavo
HeavyHemi
raidflex With my Samsung JS8500 and GTX 1050 I receive 4k 60Hz 4:4:4 8-bit, 4:2:2 10/12-bit and 4:2:0 10/12/16-bit. This is using NVIDIA 375.70 and Windows 10 pro x64. You cannot do 4k 60Hz 4:4:4 10-bit because the HDMI spec does not have enough bandwidth. Also make sure that your TV is set to UHD deep color. I have my color setting on ycbcr422 10-bit, but you can use RGB.
I'm not sure why you'd use compressed color format instead of RGB. While you won't notice in games, the desktop, especially text, is objectively worse. Windows Desktop is natively 8 bit so using 10 bit gains you nothing. I've never seen a 16 bit per channel setting.
RGB doesn't support 10 bit color over HDMI 2.0 at 4K, so HDR won't look as good, hence using Chroma instead. http://www.acousticfrontiers.com/uhd-101-v2/
I'm perfectly well aware of that, and well aware that neither HDTV being discussed here natively supports HDR; they use a compatibility mode. ON THE DESKTOP, for the vast majority of use, RGB FULL 8-BIT is the correct setting.
Actually, the Samsung JS8500 does support HDR. But yes, on my LG 34-inch ultrawide I use RGB.
post edited by raidflex - 2016/11/15 11:15:35
Gigabyte X99M-Gaming 5 i7 5820 (4.5GHz Vcore 1.27V) EVGA GTX 1080 FTW Samsung 850 Pro 120GB & 500 GB Corsair AX1200 Crucial Ballistix Sport 2400 4x4GB Caselabs Mercury S5 w/pedestal Windows 10 Pro x64 EK Supremacy Alphacool Nexxxos UT60 x2 Monsoon Series Two Premium DDC Reservoir Bitspower Compression Fittings Swiftech Helix Fans LG 34UM95 Monitor
|
arestavo
CLASSIFIED Member
- Total Posts : 3005
- Reward points : 0
- Joined: 2008/02/06 06:58:57
- Location: Through the Scary Door
- Status: offline
- Ribbons : 8

Re: Disappearing 10-bit option depending on the 4K format. There with 4:2:2, not with 4:2
2016/11/15 11:25:44
(permalink)
As does the KS8000 that I've got coming in. It's a murky world out there for HDR so far: companies are touting HDR on TVs that should not be rated as HDR. I'm hoping Samsung will release a One Connect box with DP 1.3 or higher; I've emailed them requesting as much, because DisplayPort would eliminate the need for chroma subsampling.
EVGA affiliate code: 9ZWDWFNW6A (Don't forget to upload your invoice or no credit is given!) FOLD ON
|