Apple released their Apple Silicon based MacBook Pro on October 26, 2021, which includes a new mini-LED display they call XDR, with a peak brightness of 1600 nits. Three years later, it's still one of the best displays for both SDR and HDR content consumption and creation. In fact, SDR content looks so good on the new XDR display that sometimes I even mistake it for HDR content. Why is that? Let's figure it out together.
Presets
The new XDR display includes a few presets:
Apple XDR Display (P3-1600 nits)
The default one for daily usage, with a peak brightness of 1600 nits for HDR content and 500 nits for SDR content / UI.
Apple Display (P3-500 nits)
Peak brightness of 500 nits.
HDR Video (P3-ST 2084)
HDR reference mode. Brightness cannot be adjusted; it peaks at ~1000 nits for HDR content and ~100 nits for SDR content / UI.
HDTV Video (BT.709 – BT.1886)
SDR reference mode. Brightness cannot be adjusted; it peaks at ~100 nits for all content and UI.
500 nits for SDR?
Apple has been using 500 nits for SDR / UI for a very long time. Wait, shouldn't SDR be 100 nits max? Yes, in theory and in some reference modes. Modern displays have a peak brightness of 300+ nits, not to mention the latest M4 iPad Pro, which has a peak brightness of 1000 nits for SDR! By today's standards, 100 nits is too dark to watch even in a normally lit indoor environment.
Let's see how Apple displays SDR content on their XDR displays.
Setup: I created a video with a black square and increased its IRE value until it became white, in the Rec. 709 color space / gamma. Then I used an SM208 screen luminance meter to measure the brightness of the XDR display under different presets.
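For reference, here's a minimal sketch of how such a test clip can be generated (not the exact script I used; the only detail that matters is mapping IRE to video-range code values):

```python
import numpy as np

def ire_to_code(ire: float, bit_depth: int = 10) -> int:
    """Map a Rec. 709 IRE level (0-100) to a video-range ("legal") code value.

    10-bit video range: black = 64, white = 940; 8-bit: black = 16, white = 235.
    """
    black = 16 << (bit_depth - 8)
    white = 235 << (bit_depth - 8)
    return round(black + ire / 100 * (white - black))

# A 100x100 gray square at 40% IRE on a black background, as a single 10-bit frame.
frame = np.full((2160, 3840), ire_to_code(0), dtype=np.uint16)
frame[1030:1130, 1870:1970] = ire_to_code(40)
print(ire_to_code(40))  # -> 414
```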
Here are the curves of screen brightness vs. Rec. 709 IRE value:
After the test, I found that Apple XDR Display (P3-1600 nits) and Apple Display (P3-500 nits) have the same response curve for SDR content, so I only drew one line here. They (the red curve) peak around 450 nits, close to the claimed 500 nits (my screen might have degraded a bit after two and a half years); middle gray (~40% IRE) is about 77 nits, and black (0% IRE) is 0.05 nits.
In SDR reference mode (the HDTV BT.709-BT.1886 preset), the blue curve, peak brightness is 94 nits, very close to the 100-nit SDR reference. Middle gray is a little bit darker at 11 nits, and black (0% IRE) is all the way down to 0.02 nits!
I also plotted a gamma 2.4 curve for a 100-nit reference monitor (the yellow curve). You can see that it overlaps well with the SDR reference mode in the bright region (60%+ IRE) and lies between the two modes in the midtones; its dark region (<1% IRE) is brighter than both modes, and would be even brighter on a real CRT monitor, which the SDR standard was designed for.
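The yellow curve is just the plain power law L = 100 · V^2.4; a quick sanity check also reproduces the ~11-nit middle gray measured in the reference mode:

```python
def gamma24_nits(ire: float, peak_nits: float = 100.0) -> float:
    """Luminance of an ideal gamma-2.4 display: L = Lw * V^2.4."""
    return peak_nits * (ire / 100) ** 2.4

print(gamma24_nits(100))  # 100.0 nits at full white
print(gamma24_nits(40))   # ~11.1 nits, matching the ~11 nits measured in SDR reference mode
```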
For comparison, I also measured my BenQ PD3200U, which still looks great for most SDR content. In sRGB mode it peaks at 350 nits, the 1% IRE signal is clearly visible and measures 0.91 nits, and pure black is 0.34 nits, so the contrast ratio is just over 1000:1; the lack of true black is the major drawback of LCD displays. The XDR is brighter in the highlights and darker in the shadows.
The curve itself explains why properly graded SDR (Rec. 709) content looks so good on Apple's XDR displays: it tracks the gamma curve pretty well for the most part (30%+ IRE), pure white is very bright (450+ nits), pure black is very dark (0.05 nits), and the contrast ratio is around 9000:1 (13+ stops of dynamic range).
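The dynamic-range figures here are just the base-2 log of the contrast ratio; a quick check with the numbers measured above:

```python
from math import log2

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops = log2 of the contrast ratio."""
    return log2(peak_nits / black_nits)

print(stops(450, 0.05))  # ~13.1 stops (XDR / 500-nit preset, SDR content)
print(stops(94, 0.02))   # ~12.2 stops (SDR reference mode)
print(stops(350, 0.34))  # ~10.0 stops (BenQ PD3200U, ~1000:1)
```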
What about HDR?
I did a similar test for HDR, using the ITU-R BT.2100 (HLG) gamma. Here are the results:
Let's first look at the HDR reference mode (the HDR Video P3-ST 2084 preset), the yellow curve. It tracks the HLG curve very well and is a straight line (in log scale) after 50% IRE. The HDR reference mode peaks at 881 nits (100% IRE), with diffuse white at 183 nits (75% IRE); both are a little bit darker than the reference values of 1000 nits and 203 nits respectively. Middle gray is accurate though, around 26 nits (38% IRE), and black (0% IRE) is 0.02 nits! That gives us a contrast ratio of ~44000:1, or 15+ stops of dynamic range.
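The 203-nit diffuse white and ~26-nit middle gray aren't arbitrary: they fall straight out of the BT.2100 HLG formulas for a 1000-nit display. A small sketch of that calculation (standard HLG inverse OETF plus the OOTF for a neutral patch, not my measurement code):

```python
from math import exp, log

# HLG inverse OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * log(4 * A)

def hlg_signal_to_scene(e: float) -> float:
    """HLG inverse OETF: non-linear signal (0-1) -> normalized scene light (0-1)."""
    return e * e / 3 if e <= 0.5 else (exp((e - C) / A) + B) / 12

def hlg_display_nits(e: float, peak_nits: float = 1000.0) -> float:
    """HLG OOTF for a neutral patch: L = Lw * Ys^gamma, with gamma = 1.2 at 1000 nits."""
    gamma = 1.2  # exactly 1.2 for a 1000-nit reference display
    return peak_nits * hlg_signal_to_scene(e) ** gamma

print(hlg_display_nits(1.00))  # 1000 nits, peak
print(hlg_display_nits(0.75))  # ~203 nits, diffuse white
print(hlg_display_nits(0.38))  # ~26 nits, middle gray
```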
The XDR mode (red curve) is always brighter than the HDR reference mode: 2+ stops in black and deep shadows (0 ~ 10% IRE), 1+ stops for most of the range (10% ~ 85% IRE), and < 1 stop for highlights (85%+ IRE). It rolls off after 95% IRE and peaks at 1450 nits (100% IRE)! Diffuse white is 410 nits (75% IRE), close to white in SDR/UI (which is 450 nits), middle gray is around 68 nits (38% IRE), a little bit darker than in SDR mode, and black is 0.03 nits (0% IRE). That gives us a contrast ratio of ~50000:1, also 15+ stops of dynamic range.
Since HLG is a backward-compatible curve, I also tested the Apple Display (P3-500 nits) mode (the blue curve). It lies between the XDR and HDR reference modes for most of the range and clips after 90% IRE, with a peak brightness of 450 nits (same as for SDR content). Diffuse white is 219 nits (75% IRE), middle gray is 38 nits (38% IRE), and black is 0.04 nits (0% IRE).
PQ is another story
The HDR reference mode tracks the PQ curve perfectly: it's a straight line from 0 to 1000 nits and clips after that.
Both the 1600-nit and 500-nit modes didn't do a good job: they are one stop brighter in the shadows and then gradually roll off.
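For context, PQ (SMPTE ST 2084) encodes absolute luminance, and the 1000-nit clip point of the reference mode sits at roughly 75% of the PQ signal range. A sketch of the standard EOTF (just the published formula, not my test code):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_signal_to_nits(e: float) -> float:
    """PQ EOTF: non-linear signal (0-1) -> absolute luminance in nits (0-10000)."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_signal_to_nits(1.00))  # 10000 nits, the format's ceiling
print(pq_signal_to_nits(0.75))  # ~983 nits: 1000 nits sits right around 75% signal
print(pq_signal_to_nits(0.50))  # ~92 nits
```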
Conclusion
Apple's XDR Display is fantastic. It is very good in SDR mode: 450+ nits peak brightness, true blacks, and 13+ stops of dynamic range put a lot of "HDR displays" to shame. You can definitely call it HDR, since 100-nits-max, 200:1-contrast-ratio SDR has been dead for many years. HDR content in XDR mode is also great, with a peak brightness of 1450+ nits and 15+ stops of dynamic range. However, highlights are only 1.7 stops brighter than UI/SDR white. Ideally, highlights should be at least 3 stops brighter than diffuse white; since people are already used to 500 nits for UI/SDR white, that calls for a display with 4000+ nits of peak brightness. Setting diffuse white to 203 nits (recommended for HLG mastered at 1000 nits), the requirement drops to 1600 nits (it's not a coincidence); however, (diffuse) white will then look gray, since it's 1.3 stops darker than UI white.
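The headroom numbers above are simple log2 arithmetic over the measured values; spelled out:

```python
from math import log2

ui_white = 450   # nits, measured SDR/UI white on the XDR preset
xdr_peak = 1450  # nits, measured HDR peak on the XDR preset

print(log2(xdr_peak / ui_white))  # ~1.7 stops of highlight headroom over UI white
print(500 * 2 ** 3)               # 4000 nits needed for 3 stops over a 500-nit diffuse white
print(203 * 2 ** 3)               # 1624 nits needed for 3 stops over a 203-nit diffuse white
print(log2(500 / 203))            # ~1.3 stops: how far a 203-nit diffuse white sits below UI white
```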
I know it's "xxx nits" everywhere in this post; that's because human eyes are much more sensitive to luminance than to color, and a lot of "HDR" content is way over-saturated anyway!
Recently I switched from N-RAW to H.265 / HLG for a quicker HDR workflow in Final Cut Pro: basically just drop the clips into an HLG timeline and that's it, just like the iPhone's Dolby Vision (HLG HDR) footage.
I did some tests of IRE vs. stops in HLG mode and compared the results to the theoretical N-Log curve.
Setup: I exposed the 18% midgray to 36% IRE in HLG mode to match it with N-Log, then adjusted the exposure time first, and then aperture and ISO, to over- and under-expose the midgray for IRE readings.
Note: Technically, you should expose midgray to 38% IRE in HLG mode (ref); diffuse white will then be around 75% IRE, and 75%+ is reserved for specular highlights (0~3 stops over diffuse white).
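The two recommended levels are self-consistent: undoing the HLG OETF shows that 75% IRE carries about 5.5x the scene light of 38% IRE, i.e. diffuse white sits the expected ~2.5 stops (100% / 18%) above the gray card:

```python
from math import exp, log, log2

# HLG inverse OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * log(4 * A)

def hlg_signal_to_scene(e: float) -> float:
    """HLG inverse OETF: non-linear signal (0-1) -> normalized scene light."""
    return e * e / 3 if e <= 0.5 else (exp((e - C) / A) + B) / 12

gray = hlg_signal_to_scene(0.38)   # 18% gray card at 38% IRE
white = hlg_signal_to_scene(0.75)  # diffuse white at 75% IRE
print(white / gray)        # ~5.5, close to 100/18
print(log2(white / gray))  # ~2.46 stops between gray card and diffuse white
```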
Stops vs IRE
First of all, we can see that the HLG curve has more contrast than N-Log. Highlights will be clipped beyond 5 stops over midgray, which I think is OK for a lot of scenes. N-Log retains more information in the highlights (if not over-exposed), up to 6.3 stops, at the cost of noisier shadows. HLG has more detail in the shadow area; the noise floor is about 2% at ISO 400 (12.5% for N-Log).
Another interesting thing is that the HLG curve is very steep from -1 to +3 stops but quickly saturates afterwards (it's no longer linear); highlights are compressed and will clip around 96% IRE, so make sure to set the high zebra to 245 when shooting HLG.
Base ISO for HLG and N-Log
Nikon Z8's HLG has a "nominal" base ISO of 400. Under the hood, the sensor is working at ISO 100; shadows are boosted by 2 stops while highlights are left intact, so basically you are shooting at ISO 100 with 2 stops of underexposure to protect the highlights.
N-Log has a "nominal" base ISO of 800. Same as HLG, the sensor is working at ISO 100; shadows are boosted by 3 stops while highlights are left intact, so basically you are shooting at ISO 100 with 3 stops of underexposure to protect the highlights. However, you will see a lot of noise in the shadow area due to insufficient light, and 1.3 ~ 1.7 stops of "over-exposure" is usually needed to compensate and get a clean image.
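Put differently, the "nominal" base ISO just tells you how many stops of digital gain are applied on top of the ISO 100 capture:

```python
from math import log2

def digital_gain_stops(nominal_iso: int, sensor_iso: int = 100) -> float:
    """Stops of shadow boost implied by a nominal base ISO over the real sensor ISO."""
    return log2(nominal_iso / sensor_iso)

print(digital_gain_stops(400))  # 2.0 stops for HLG
print(digital_gain_stops(800))  # 3.0 stops for N-Log
```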
Conclusion
HLG strikes a pretty good balance between highlights and shadows and is very user friendly for an HDR workflow. Unlike N-Log, there is no need to over-expose (even 1 stop under looks good to me), no camera LUT is needed for restoration, and zero or minimal color grading is needed. Another advantage of HLG over N-Log is monitoring: most of the time you can rely on the screen/viewfinder (with zebra on) to adjust the exposure and only open the waveform if needed.
The Nikon official site doesn't provide the bitrate for ProRes / ProRes RAW, so I did some tests myself; here are the results. I only tested the full-frame (FX) formats.
ProRes RAW 12-bit

| Resolution / FPS | Bitrate (Mbps) | Compression Ratio |
| --- | --- | --- |
| 4140 x 2330 60p | 3350 | 1.977 |
| 4140 x 2330 50p | 2810 | 1.964 |
| 4140 x 2330 30p | 1620 | 2.044 |
| 4140 x 2330 25p | 1360 | 2.029 |
| 4140 x 2330 24p | 1280 | 2.070 |
ProRes RAW has a very low compression ratio of about 2:1, which suggests that it might be losslessly compressed. It takes more space than Nikon's lossy N-RAW: 2.5x the high quality and 4.5x the normal quality!
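As far as I can tell, the compression ratios in these tables are simply the uncompressed data rate divided by the measured bitrate; a rough estimate for the ProRes RAW 4K 60p row (the exact Mb-vs-MiB convention and container overhead shift the result slightly):

```python
def uncompressed_mbps(width: int, height: int, bits_per_pixel: float, fps: float) -> float:
    """Uncompressed data rate in Mbps (10^6 bits per second)."""
    return width * height * bits_per_pixel * fps / 1e6

# ProRes RAW stores the Bayer mosaic, i.e. one 12-bit sample per photosite.
raw_rate = uncompressed_mbps(4140, 2330, 12, 60)
print(raw_rate)         # ~6945 Mbps uncompressed
print(raw_rate / 3350)  # ~2.1:1, in line with the ~2x ratio measured above
```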
ProRes 422 HQ 10-bit

| Resolution / FPS | Bitrate (Mbps) | Compression Ratio |
| --- | --- | --- |
| 3840 x 2160 60p | 1800 | 5.273 |
| 3840 x 2160 50p | 1500 | 5.273 |
| 3840 x 2160 30p | 910 | 5.215 |
| 3840 x 2160 25p | 760 | 5.204 |
| 3840 x 2160 24p | 730 | 5.201 |
With a compression ratio of 5.2:1, ProRes is still 2x the size of normal quality N-RAW, though the former has roughly 1.5x the data to deal with (YUV 4:2:2 vs. Bayer).
I hope Nikon could provide more options for ProRes via firmware updates. A 10x~12x compression-ratio version (ProRes 422 LT) would be very useful for balancing quality and disk space.
H.265 4:2:0 10-bit

| Resolution / FPS | Bitrate (Mbps) | Compression Ratio |
| --- | --- | --- |
| 7680 x 4320 30p | 400 | 47.461 |
| 7680 x 4320 25p | 400 | 39.551 |
| 7680 x 4320 24p | 400 | 37.969 |
| 3840 x 2160 120p | 400 | 47.461 |
| 3840 x 2160 100p | 400 | 39.551 |
| 3840 x 2160 60p | 340 | 27.918 |
| 3840 x 2160 50p | 340 | 23.265 |
| 3840 x 2160 30p | 190 | 24.979 |
| 3840 x 2160 25p | 190 | 20.816 |
| 3840 x 2160 24p | 190 | 19.984 |
For H.265 there is no all-intra option, and only 4:2:0 is available. The compression ratio is much higher thanks to long-GOP encoding. One thing to note: the bitrate is variable and depends on the content, e.g. a very dark scene will have a much lower bitrate (<100 Mbps for 4K 60p).
N-RAW High Quality 12-bit

| Resolution / FPS | Bitrate (Mbps) | Compression Ratio |
| --- | --- | --- |
| 8256 x 4644 60p | 5780 | 4.555 |
| 8256 x 4644 50p | 4810 | 4.561 |
| 8256 x 4644 30p | 2890 | 4.555 |
| 8256 x 4644 25p | 2410 | 4.552 |
| 8256 x 4644 24p | 2310 | 4.559 |
| 4128 x 2322 120p | 3840 | 3.428 |
| 4128 x 2322 100p | 2900 | 3.783 |
| 4128 x 2322 60p | 1740 | 3.783 |
| 4128 x 2322 50p | 1450 | 3.783 |
| 4128 x 2322 30p | 870 | 3.783 |
| 4128 x 2322 25p | 730 | 3.757 |
| 4128 x 2322 24p | 700 | 3.761 |
First of all, given the 4.5:1 compression ratio, N-RAW is definitely a lossy codec. An interesting detail is that 8K's compression ratio is higher than 4K's: 4.5 vs. 3.7.
N-RAW Normal Quality 12-bit

| Resolution / FPS | Bitrate (Mbps) | Compression Ratio |
| --- | --- | --- |
| 8256 x 4644 60p | 3470 | 7.587 |
| 8256 x 4644 50p | 2890 | 7.591 |
| 8256 x 4644 30p | 1740 | 7.565 |
| 8256 x 4644 25p | 1450 | 7.565 |
| 8256 x 4644 24p | 1390 | 7.576 |
| 4128 x 2322 120p | 1750 | 7.522 |
| 4128 x 2322 100p | 1460 | 7.513 |
| 4128 x 2322 60p | 880 | 7.479 |
| 4128 x 2322 50p | 730 | 7.513 |
| 4128 x 2322 30p | 440 | 7.479 |
| 4128 x 2322 25p | 370 | 7.412 |
| 4128 x 2322 24p | 350 | 7.522 |
For normal quality the compression ratio is even higher, reaching 7.5:1, which is not bad for an all-intra codec.