• Breaking News

    Thursday, December 12, 2019

    (Anandtech) Early TSMC 5nm Test Chip Yields 80%, HVM Coming in H1 2020

    Posted: 11 Dec 2019 04:51 PM PST

    How HDR displays actually work

    Posted: 11 Dec 2019 02:45 PM PST

    I don't know about you guys, but it has puzzled me for a long time how HDR displays actually work. All the images you see on websites just give you some generic scenery photo with mountains and sunsets, except one side has more saturation or contrast than the other, and they tell you that's HDR.

    Or you might run into some people on forums pretending to know, but instead they confuse HDR displays with other HDR stuff like HDR rendering, HDR imaging, and that shitty HDR effect people put on their photos.

    Well I'm tired of all that shit, so here's an explanation of how it actually works. Obviously there's no way for me to actually show you what HDR looks like, since we're working on an SDR webpage here. But rather than just increasing saturation, we can make images that are analogous to how HDR and SDR differ in a more scientific way.

    Some questions I've had that I think would help you understand:

    1. Why can't I just increase the brightness on my SDR display to make it HDR?
    2. Why aren't all OLED displays HDR?
    3. What does contrast ratio have to do with HDR?
    4. Wouldn't a good HDR display just blind me when doing regular work?

    Well first let's just dive into the comparison:

    https://i.imgur.com/qrlUs6V.png

    Again, this is an analogy; obviously you cannot see real life, or even HDR, on your SDR display. But all the relative deficiencies of an SDR display have been replicated here, just with lots of exaggeration, so you know exactly why it's deficient. And no, it's not reduced saturation like shitty ads would have you believe.

    The first thing you'll notice is that this is a monochrome photo. We're talking dynamic range here, not color space, so color is unnecessary; we're only concerned with brightness.

    First up, the simulated HDR image. You should notice two things. The first is how it's losing some highlight detail on the walls. This is because even very bright screens cannot be as bright as things directly under sunlight, so those highlights have to be clipped to maintain a faithful representation of brightness.

    Of course they could also compress the extra brightness into the levels they have available, but then the highlights wouldn't be at the brightness they would have in real life. In practice you need a combined approach: clip some detail to pure white, and compress the rest a little so you don't turn everything pure white. This is partly what is meant by "tone mapping".
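
    To make that concrete, here's a rough sketch of a clip-plus-compress tone mapper in Python. The knee point, peak brightness, and rolloff shape are made-up illustration values, not any standard's actual operator:

        import numpy as np

        def tone_map(luminance_nits, peak=1000.0, knee=0.75):
            """Map scene luminance (nits) onto a display that tops out at
            `peak` nits: faithful below the knee, rolled off above it,
            clipped to pure white at the very top."""
            knee_nits = knee * peak
            headroom = peak - knee_nits
            rolled_off = knee_nits + headroom * (
                1.0 - np.exp(-(luminance_nits - knee_nits) / headroom)
            )
            out = np.where(luminance_nits <= knee_nits, luminance_nits, rolled_off)
            return np.minimum(out, peak)

        # Midtones pass through; sunlit detail at 5000 nits lands near the ceiling:
        print(tone_map(np.array([100.0, 900.0, 5000.0])))  # ~[100, 863, 1000]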

    The second thing you'll notice is some loss of shadow detail. This isn't because the screen can't go dark enough; it's purely because there's not enough bit depth to convey all the detail in the shadows. This gets better the more bits we throw at it: 12 bits, 14 bits, etc.
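
    You can see the bit-depth problem with a quick sketch: quantize a dark gradient at different bit depths and count how many distinct levels survive. The exact numbers depend on the range you pick; the trend is the point:

        import numpy as np

        shadow = np.linspace(0.0, 0.05, 1000)  # bottom 5% of the signal range

        def quantize(signal, bits):
            levels = 2 ** bits - 1  # e.g. 255 for 8-bit
            return np.round(signal * levels) / levels

        for bits in (8, 10, 12):
            count = len(np.unique(quantize(shadow, bits)))
            print(f"{bits}-bit: {count} distinct shadow levels")
        # roughly 14 levels at 8 bits, 52 at 10 bits, 206 at 12 bits:
        # every extra 2 bits recovers about 4x the shadow detail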

    Now when you move down to the SDR image, the two biggest issues are that the black isn't very black and the white isn't very white. Obviously good SDR screens don't actually look like this, but it's an analogy. What really happens is the black just isn't quite as deep, and the white isn't quite as blindingly bright, as on a good HDR display.

    There are of course exceptions; as you know, there are very good SDR OLED displays that can display pure black and also get very bright. So what's the deal with those? Wouldn't they just be the same as HDR displays? By now you should've noticed a third difference in the simulated SDR image: a further reduction in bits and, as a result, a loss of detail.

    https://i.imgur.com/ZKWvfQ8.png

    Here is a simpler and more direct comparison. You'll notice that to get a good HDR experience, you need several things at once:

    • high bit depth
    • high brightness
    • deep blacks / high contrast
    • software compatibility to run everything properly.

    The reason some good OLED panels still can't do HDR is that they don't have the bit depth. In VR headsets, for example, they can't do it because they just don't have the bandwidth to drive more bits. And if you can't drive more bits, then even if you can stretch the contrast really wide, you'll just get different details in different parts of the image rather than more detail overall.
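
    As a back-of-the-envelope check on the bandwidth point, here's the raw bit rate for a hypothetical dual-panel headset (uncompressed, blanking overhead ignored; the resolution and refresh rate are assumptions for illustration):

        def raw_gbps(width, height, hz, bits_per_channel, channels=3):
            """Uncompressed video bit rate in Gbit/s."""
            return width * height * hz * bits_per_channel * channels / 1e9

        # a hypothetical 2160x2160-per-eye headset at 90 Hz, two panels:
        for bpc in (8, 10):
            print(f"{bpc} bits/channel: {2 * raw_gbps(2160, 2160, 90, bpc):.1f} Gbit/s")
        # ~20 Gbit/s at 8 bpc vs ~25 Gbit/s at 10 bpc -- the extra bits
        # can blow past the link budget before you see any HDR benefit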

    So let's go back and see if we can answer our original questions:

    1. Usually because it doesn't increase contrast (you don't have the deep blacks), or because you don't have enough bit depth. In fact, when you have both (like on a really good CRT), it does start to look just like HDR.
    2. Usually because of a lack of bits or software support.
    3. Without a high contrast ratio, the extra detail from more bits would be indistinguishable: the steps between levels would be squeezed too close together to see.
    4. Not if it's implemented properly. On SDR, we usually just set white windows to 100% brightness. You wouldn't do that on a fully HDR-capable OS. Instead, you'd set white to look like white paper in the room, so instead of using all of your display's capability, you'd use just a fraction of it to show SDR content (see the sketch after this list). It's as if you had an SDR display at a comfortable brightness, except you don't have to touch the brightness controls when you want to show things that are supposed to be brighter than white paper. It's like asking whether driving a 700 hp supercar means you'll get a speeding ticket every time there's a cop. You won't, because you have fine control over the throttle (more bits and an HDR standard). If your throttle were just a 4-button switch like on an electric fan, then yes, you'd get a ticket every time with such a powerful engine.
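
    Here's a toy sketch of that SDR-in-HDR mapping. The nit values are assumptions for illustration, not any OS's actual compositor behavior:

        SDR_PAPER_WHITE_NITS = 200.0  # assumed comfortable "white paper" level
        DISPLAY_PEAK_NITS = 1000.0    # assumed HDR display capability

        def sdr_to_hdr_nits(sdr_value):
            """Place an SDR value in [0, 1] on the display's absolute scale."""
            return sdr_value * SDR_PAPER_WHITE_NITS

        # SDR white lands at 200 nits, a fraction of what the panel can do;
        # everything up to 1000 nits stays reserved for HDR highlights.
        print(sdr_to_hdr_nits(1.0))                      # 200.0
        print(DISPLAY_PEAK_NITS - sdr_to_hdr_nits(1.0))  # 800.0 nits of headroom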

    So what does all this mean for buying HDR displays?

    • LCDs, especially IPS LCDs without full-array mini-LED local dimming, aren't real HDR. Those are just SDR displays with more bits (sometimes not even that) and software support for HDR, which basically converts HDR back into SDR so the limited hardware can display it properly.
    • LCDs with VA panels can be promising because of their super high contrast ratios, but they have other drawbacks, like poor viewing angles.
    • OLEDs that can't get very bright aren't real HDR either, but at this point I think they're the best compromise you can hope for: when you look at the sun in a movie/game, it won't actually blind you and hurt your eyes.
    • The only no-compromise, full-on HDR solution today is IPS LCD + mini-LED local dimming. It's basically an LCD screen over an LED matrix screen: the LEDs give you the super brightness and deep blacks, and the LCD makes up for the lack of resolution in the LED layer. But even this depends on how good the implementation is; I don't think those $2000 G-Sync HDR gaming monitors are worth it, with their shitty LCD panels and too few dimming zones causing too much haloing.
    • The ONLY difference between a really good SDR display that also has 10 bits and an HDR display is the software support; they're otherwise the same. If you can get the SDR tone mapping adjusted properly, you can make the two displays look identical. One example would be using MadVR to play HDR videos over an SDR stream into a really good SDR display, versus playing the same video on an HDR display. With the correct settings, they can look equally good (a sketch of the PQ decoding step involved appears after this list).
    • In fact, even the bits don't matter that much in most cases. Usually when you see the HDR/SDR difference demonstrated by toggling it in-game, most of the difference comes from the TV or game having settings that aren't calibrated properly for SDR. If you just adjust your game properly, take a screenshot, put it on your OLED phone, and turn the brightness up to max, it will look basically the same as the HDR version of that game on a proper HDR screen.
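
    For reference, the decoding step any HDR10 pipeline (MadVR included) starts from is the SMPTE ST 2084 "PQ" curve, which turns signal values into absolute nits; the tone mapping down to SDR happens after this. This is the standard formula:

        # SMPTE ST 2084 (PQ) EOTF: PQ signal in [0, 1] -> luminance in nits
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

        def pq_eotf(signal):
            p = signal ** (1 / m2)
            return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

        print(pq_eotf(1.0))   # 10000.0 -- the PQ ceiling
        print(pq_eotf(0.58))  # ~202 nits, close to the 203-nit reference white
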
    submitted by /u/1096bimu

    Intel: We Aren't Stepping Back to 22nm Haswell; We Never Left

    Posted: 11 Dec 2019 07:08 AM PST

    Intel’s SGX coughs up crypto keys when scientists tweak CPU voltage

    Posted: 11 Dec 2019 04:32 PM PST

    Apple’s top-end Mac Pro costs more than Tesla Model 3

    Posted: 11 Dec 2019 04:23 AM PST

    Intel hires former GlobalFoundries, IBM chip executive

    Posted: 11 Dec 2019 02:53 PM PST

    Do any of the people complaining about the Mac Pro price actually have the compute needs that the Mac Pro is for?

    Posted: 11 Dec 2019 08:58 PM PST

    I've yet to read a comment from an industry pro (film or audio) that has a real issue with the price. Tech YouTubers don't count as professional filmmakers, even if they sometimes use similar equipment. 99% of YouTube does not require cinematic film equipment. The VAST majority of videos are viewed/streamed at 1080p or similar.

    submitted by /u/JoshRTU

    The Snapdragon 865 will make phones worse in 2020, thanks to mandatory 5G

    Posted: 11 Dec 2019 06:35 AM PST

    [VideoCardz] AMD Radeon RX 5500 XT to launch at 169 USD (4GB) and 199 USD (8GB)

    Posted: 11 Dec 2019 09:32 AM PST

    Do CPUs always create a new cache entry for each memory write operation?

    Posted: 11 Dec 2019 12:07 PM PST

    I.e., does it create the cache entry first, write to it, and then write the data to memory, or not?

    submitted by /u/tema3210

    Intel Demonstrates STT-MRAM for L4 Cache

    Posted: 11 Dec 2019 09:02 AM PST

    Andes' RISC-V SoC debuts with AI-ready VPU as Microchip opens access to its PolarFire SoC

    Posted: 11 Dec 2019 11:10 AM PST

    According to Ice Universe, Samsung has decided to use the Snapdragon 865 processor in the Galaxy S11 series in South Korea

    Posted: 11 Dec 2019 05:20 AM PST

    Ice Universe is quite well known in the Samsung community and has a very good track record of leaks there.

    https://twitter.com/universeice/status/1204668298807263238

    submitted by /u/dylan522p

    Puget Systems public beta for After Effects and Photoshop benchmarks

    Posted: 11 Dec 2019 06:40 AM PST

    You can now download free After Effects and Photoshop benchmarks from the Puget Systems website. Important note: I believe you need to have After Effects or Photoshop already installed in order to make these work.

    https://www.pugetsystems.com/labs/articles/PugetBench-for-After-Effects-1287/

    https://www.pugetsystems.com/labs/articles/PugetBench-for-Photoshop-1132/

    Puget always does a great job benchmarking systems on realistic workstation workloads. Get these and make your CPUs and GPUs nice and toasty for the holidays!

    submitted by /u/JigglymoobsMWO

    Creating the Ultimate T420

    Posted: 11 Dec 2019 07:19 AM PST

    BIOS Walk through: Asrock X570 Taichi P2.70

    Posted: 11 Dec 2019 09:11 PM PST

    Intel RealSense Lidar Camera Technology Redefines Computer Vision

    Posted: 11 Dec 2019 08:46 AM PST

    Unannounced Intel Comet Lake i7-10610U spotted in Geekbench tests powering a Chromebook Hatch

    Posted: 11 Dec 2019 08:34 AM PST
