I read through the HDR-related threads and I'm still a bit confused.
I have a late 2015 LG OLED (55EF9509), HDR Mode activates fine.
Is it confirmed that the Vero defaults to 8-bit output on all systems?
I tried forcing 444,10bit, but this resulted in playback issues with HDR content (the display keeps going black and toggling HDR off/on).
Then I researched a bit more and found this:
So it seems my TV won't support 4:4:4 at 10 bit, only 4:2:2 at 10 or 12 bit? And I should write that into /sys/class/amhdmitx/amhdmitx0/attr instead?
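For anyone following along, this is roughly how I'm setting it (a sketch, not gospel: the sysfs path is the one quoted in the thread, the value string is exactly what I'm experimenting with, and it needs root):

```shell
# Force 4:2:2 12-bit HDMI output via the amhdmitx sysfs node (needs root).
# Path and value format are taken from the thread; treat both as assumptions.
echo '422,12bit' | sudo tee /sys/class/amhdmitx/amhdmitx0/attr

# Read it back to confirm the value was accepted:
cat /sys/class/amhdmitx/amhdmitx0/attr
```

Note this only lasts until reboot, so it's handy for testing before making anything permanent.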
I tried “422,12bit” just now and it seems to work (at least compared to 444), but I don't have the time right now to really compare it against the default output.
OK, I did a bit more testing, but no logs at this time:
422,12bit: works fine in 2160p60, but everything else (1080p) has a green tint
420,12bit: works fine in 2160p60, but everything else (1080p) has a green tint
420,10bit: the display turns off and on constantly, regardless of which resolution is set
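For the record, the testing above was basically this little loop (a hypothetical helper, using the same assumed sysfs path and value strings as before; the sleep just leaves time to judge the picture at each setting):

```shell
#!/bin/sh
# Hypothetical test helper: cycle through the colour format / bit depth
# combinations by writing each to the amhdmitx attr node, pausing so the
# picture (green tint, flicker, HDR toggling) can be checked. Run as root.
ATTR=/sys/class/amhdmitx/amhdmitx0/attr

for mode in '422,12bit' '420,12bit' '420,10bit'; do
    echo "Now testing: $mode"
    echo "$mode" > "$ATTR"
    sleep 20   # time to check both 2160p60 and 1080p content
done
```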
I will try to do a bit more in-depth testing tomorrow with proper logs.
And BTW, HDMI Ultra HD Deep Colour was of course enabled on the TV for the HDMI port the Vero is connected to.