That's very true, but I think the Radiance also adds something of its own. I've ordered the Dr.hdmi; let's see if that solves it...
There seems to be some confusion about the best output format to choose when using the Radiance. As always, we STRONGLY recommend you use 4:2:2 to and from the Radiance. This gives 12 bits per color.
We do NOT support the so-called "deep color" mode, which is also 12 bits on all current HDMI chips, and so is NOT any deeper than 4:2:2, which has carried 12 bits per color since the HDMI 1.0 specification. Deep color can cause issues with cables, so we have avoided adding it: for current consumer sources it has virtually no advantage and does have disadvantages (e.g. cabling).
Concerning noise seen in test patterns with 4:4:4 output but not with 4:2:2 output: when the Radiance outputs 4:4:4 you get 8 bits per color. Since the Radiance modifies levels due to internal processing and calibration, it is important to choose 4:2:2 (when the display properly supports it) so that you get more output bits per color than the compressed 8-bit consumer input sources provide. You may simply be seeing the limits of the 8-bit 4:4:4 format, and its dithering, when you choose to output 4:4:4 from the Lumagen.
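To make the bit-depth point concrete, here is a toy Python sketch (illustrative only, and in no way the Radiance's internal code) of why an 8-bit 4:4:4 output cannot carry everything a 12-bit pipeline computes:

```python
# Illustrative only -- NOT the Radiance's internal code.
# A 12-bit pipeline carries codes 0-4095; an 8-bit 4:4:4 output
# carries only 0-255, so sixteen adjacent 12-bit codes collapse
# into one 8-bit code. After calibration shifts the levels, that
# collapse is what shows up as banding (or, once dithered, as
# fine noise) in 4:4:4 test patterns.

def to_8bit(code12):
    """Drop the low 4 bits of a 12-bit code (0-4095) -> 8 bits (0-255)."""
    return code12 >> 4

# Sixteen consecutive 12-bit codes near mid-gray:
codes12 = list(range(2048, 2064))
codes8 = {to_8bit(c) for c in codes12}
print(codes8)  # all sixteen map to the single 8-bit code 128
```

The 4:2:2 output keeps all 12 bits, so these sixteen codes stay distinct all the way to the display.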
Another factor to consider is that there is exactly one pipeline for all output formats, up to the final output stage, where the Radiance converts to the output format and, if appropriate, dithers. The 4:2:2 output matches the 12-bit internal pipeline and so is not dithered, while the 4:4:4 output is dithered to 8 bits right before being sent to the HDMI transmitter chip. This is the only difference inside the Radiance.
So, I believe whatever noise you see in 4:4:4 test patterns is a result of the 4:4:4 output limitations.
For Radiance 4:2:2 output, dither is off for "Auto," so either Auto or Off gives you the full 12 bits from our calibration pipeline. You can select dithering to 10 or 8 bits, but only do so if you see an improvement in image quality with one of these settings. Otherwise leave dither at Auto.
For Radiance RGB and 4:4:4 output, the maximum precision is 8 bits, so Auto dithers to 8 bits.
The 6-bit and 7-bit dither options are really for older plasma and LCD displays. The image will appear a bit noisier, but smoother, on these older displays.
The differences between the static, dynamic, and truncate dither settings are probably not going to be visible. They only exist because we couldn't decide which would be best for every display, and so left it as an "exercise for the reader" to determine whether one is better, or whether these settings are moot.
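For readers curious what dithering buys over plain truncation, here is a toy Python sketch. It is a generic random-dither illustration, not the Radiance's actual algorithm or any of its specific static/dynamic modes:

```python
import random

# Toy sketch of why dither exists -- NOT the Radiance's algorithm.
# Truncating a 12-bit code to 8 bits simply discards the low 4 bits;
# adding sub-LSB random noise before truncating makes nearby pixels
# land on neighboring 8-bit codes, so their *average* preserves the
# original 12-bit level: banding is traded for fine noise.

def truncate(code12):
    return code12 >> 4

def random_dither(code12, rng):
    return (code12 + rng.randrange(16)) >> 4

rng = random.Random(0)
level = 2056  # 12-bit level exactly halfway between 8-bit codes 128 and 129
samples = [random_dither(level, rng) for _ in range(10000)]
avg = 16 * sum(samples) / len(samples)

print(truncate(level))  # 128 every time: the half-step is simply lost
# avg comes out very close to 2056: the dithered average keeps it
```

On a display fed many dithered pixels, the eye does the averaging, which is why dithering can look smoother than truncation even though each individual pixel is noisier.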
Conclusion: It's probably not worth your time to do anything but leave the dither settings at their factory default. It's much more important to select the best output color format - which is normally 4:2:2 for HDMI interfaces.
Randy Freeman