You are given the root of a binary tree and an integer k.
Return an integer denoting the size of the kth largest perfect binary subtree, or -1 if it doesn't exist.
A perfect binary tree is a tree where all leaves are on the same level, and every parent has two children.
Example 1:
Input: root = [5,3,6,5,2,5,7,1,8,null,null,6,8], k = 2
Output: 3
Explanation:
The roots of the perfect binary subtrees are highlighted in black. Their sizes, in decreasing order, are [3, 3, 1, 1, 1, 1, 1, 1]. The 2nd largest size is 3.
Example 2:
Input: root = [1,2,3,4,5,6,7], k = 1
Output: 7
Explanation:
The sizes of the perfect binary subtrees in decreasing order are [7, 3, 3, 1, 1, 1, 1]. The size of the largest perfect binary subtree is 7.
Example 3:
Input: root = [1,2,3,null,4], k = 3
Output: -1
Explanation:
The sizes of the perfect binary subtrees in decreasing order are [1, 1]. There are fewer than 3 perfect binary subtrees.
Constraints:
The number of nodes in the tree is in the range [1, 2000].
1 <= Node.val <= 2000
1 <= k <= 1024
Solution: DFS
Write a function f() that returns the size of the perfect subtree rooted at node n, or -1 if that subtree isn't perfect.
def f(n):
    # Size of the perfect subtree rooted at n, or -1 if it is not perfect.
    if not n:
        return 0
    l, r = f(n.left), f(n.right)
    return l + r + 1 if l == r and l != -1 else -1
Time complexity: O(n + K log K), where K is the number of perfect subtrees found. Space complexity: O(n)
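For completeness, here is a minimal sketch of how f() can be wired into the full answer: collect every perfect-subtree size, sort in decreasing order, and pick the kth. The wrapper name kthLargestPerfectSubtree is mine, not necessarily the required signature.

from typing import List, Optional

class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def kthLargestPerfectSubtree(root: Optional[TreeNode], k: int) -> int:
    sizes: List[int] = []  # sizes of all perfect subtrees found

    def f(n: Optional[TreeNode]) -> int:
        # Size of the perfect subtree rooted at n, or -1 if not perfect.
        if not n:
            return 0
        l, r = f(n.left), f(n.right)
        if l == r and l != -1:
            sizes.append(l + r + 1)
            return l + r + 1
        return -1

    f(root)
    sizes.sort(reverse=True)  # O(K log K) for K perfect subtrees
    return sizes[k - 1] if k <= len(sizes) else -1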
I was using Google's Nest Wifi Pro (three pack). Since we don't have ethernet cables in our home, we use the wireless mesh setting. The network speed and coverage are OK: ~100 Mbps in most places, dropping to 10-20 Mbps in some corners. The only annoying bit is Chromecast / AirPlay: it worked when I first got the devices, became unstable after a couple of months, and is now almost unusable.
I had to retire them and upgrade to the TP-Link BE series, more specifically the BE11000, tri-band / WiFi 7, also a three pack from Costco, on sale for $399.99! It's a night and day difference: casting is seamless, and speed and coverage also improved, 300-500 Mbps everywhere, even in the garage and restrooms. I'm paying Xfinity $135/month for the 1 Gbps plan (measured 900 Mbps down and 30 Mbps up, which is ridiculous). Three months of fees vs a couple of years of usage!
You have to use the Deco app to set up the device, which is pretty smooth and friendly for newbies, but some pro users might find it less feature-packed compared to a web-based management system. I have a lot of IoT devices connected to the system, and none of them has had a single problem. Just make sure you use the same SSID and password as before; resetting all your IoT devices can be a nightmare!
It features WiFi 7, but who cares? I don't even have a single device that supports it, and 500 Mbps is more than enough for someone like me who mostly streams. Yesterday I downloaded Black Myth: Wukong, all 128 GB of it, in about half an hour (roughly 570 Mbps sustained). I'd hope it could be faster!
Apple released their Apple Silicon based MacBook Pro on October 26, 2021, which includes a new mini-LED display they call XDR, with a peak brightness of 1600 nits. Three years later, it's still one of the best displays for both SDR and HDR content consumption and creation. In fact, SDR content looks so good on the new XDR displays that sometimes I even think I'm looking at HDR content. Why is that? Let's figure it out together.
Presets
The new XDR display includes a few presets:
Apple XDR Display (P3-1600 nits)
The default one for daily usage, with a peak brightness of 1600 nits for HDR content and 500 nits for SDR / UI.
Apple Display (P3-500 nits)
Peak brightness of 500 nits.
HDR Video (P3-ST 2084)
HDR reference mode. The brightness cannot be adjusted; it peaks at ~1000 nits for HDR content and ~100 nits for SDR content / UI.
HDTV Video (BT.709 – BT.1886)
SDR reference mode. The brightness cannot be adjusted; it peaks at ~100 nits for all content and UI.
500 nits for SDR?
Apple has been using 500 nits for SDR / UI for a very long time. Wait, shouldn't SDR be 100 nits max? Yes, in theory and in some reference modes. But modern displays have a peak brightness of 300+ nits, not to mention the latest M4 iPad Pro, which has a peak brightness of 1000 nits for SDR! By today's standards, 100 nits is too dark to watch even in a normally lit indoor environment.
Let's see how Apple displays SDR content on their XDR displays:
Setup: I created a video with a black square and increased its IRE value until it became white in the Rec. 709 colorspace / gamma, then used an SM208 screen luminance meter to measure the brightness of the XDR display under different presets.
Here are the curves of screen brightness vs. Rec. 709 IRE values:
After the test, I found that Apple XDR Display (P3-1600 nits) and Apple Display (P3-500 nits) have the same response curve for SDR content, so I only drew one line here. They (the red curve) peak around 450 nits, close to the claimed 500 nits (my screen might have degraded a bit after two and a half years); middle gray (~40% IRE) is about 77 nits, and black (0% IRE) is 0.05 nits.
In SDR reference mode (the HDTV BT.709-BT.1886 preset), the blue curve, peak brightness is 94 nits, very close to the 100-nit SDR target. Middle gray is a little darker at 11 nits, and black (0% IRE) is all the way down to 0.02 nits!
I also plotted a Gamma 2.4 curve for a 100-nit reference monitor (the yellow curve). You can see that it overlaps well with the SDR reference mode in the bright part (60%+ IRE) and lies in between the two modes elsewhere; its dark region (<1% IRE) is brighter than both modes, and would be even brighter on a real CRT monitor, which the SDR standard was designed for.
For comparison, I also measured my BenQ PD3200U, which still looks great for most SDR content. In sRGB mode, it peaks at 350 nits; the 1% IRE signal is clearly visible and measured at 0.91 nits, pure black is 0.34 nits, so the contrast ratio is just over 1000:1. No true black is the major drawback of LCD displays. The XDR is brighter in the highlights and darker in the shadows.
The curve itself explains why properly graded SDR (Rec. 709) content looks so good on Apple's XDR displays: it tracks the gamma curve pretty well for the most part (30%+ IRE), pure white is very bright (450+ nits), and pure black is very dark (0.05 nits); the contrast ratio is around 9000:1 (13+ stops of dynamic range).
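If you want to sanity-check these numbers, here's a small Python sketch; the gamma value and my measured luminances are the only inputs, nothing here is Apple-specific:

import math

def gamma_24_luminance(ire: float, peak_nits: float = 100.0) -> float:
    """Expected luminance of a pure gamma-2.4 display at a given IRE (0-100)."""
    return peak_nits * (ire / 100.0) ** 2.4

# Middle gray on the 100-nit reference curve (~40% IRE):
print(gamma_24_luminance(40))        # ~11.1 nits, matching the blue curve

# Contrast ratio and dynamic range from my XDR measurements:
white, black = 450.0, 0.05           # measured nits
print(white / black)                 # 9000:1
print(math.log2(white / black))      # ~13.1 stops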
What about HDR?
I did a similar test for HDR, using the ITU-R BT.2100 (HLG) gamma. Here are the results:
Let's first look at the HDR reference mode (the HDR Video P3-ST 2084 preset), the yellow curve. It tracks the HLG curve very well and is a straight line (in log scale) after 50% IRE. The HDR reference mode peaks at 881 nits (100% IRE), with diffuse white at 183 nits (75% IRE); both are a little darker than the reference values of 1000 nits and 203 nits respectively. Middle gray is accurate though, around 26 nits (38% IRE), and black is down at 0.02 nits (0% IRE)! That gives us a contrast ratio of ~44000:1, 15+ stops of dynamic range.
The XDR mode (red curve) is always brighter compared to the HDR reference mode: 2+ stops in black and deep shadows (0~10% IRE), 1+ stops for most of the range (10%~85% IRE), and <1 stop in the highlights (85%+ IRE). It curves / saturates after 95% IRE and peaks at 1450 nits (100% IRE)! Diffuse white is 410 nits (75% IRE), close to white in SDR/UI (which is 450 nits); middle gray is around 68 nits (38% IRE), a little darker than SDR mode; black is 0.03 nits (0% IRE). That gives us a contrast ratio of ~50000:1, also 15+ stops of dynamic range.
Since HLG is a backward-compatible curve, I also tested the Apple Display (P3-500 nits) mode (the blue curve). It lies between XDR and the HDR reference mode for most of the range and clips after 90% IRE, with a peak brightness of 450 nits (same as for SDR content); diffuse white is 219 nits (75% IRE), middle gray is 38 nits (38% IRE), and black is 0.04 nits (0% IRE).
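For reference, the nominal HLG response is easy to compute from the BT.2100 definitions. This sketch applies the inverse OETF plus the 1.2 display gamma of a 1000-nit reference monitor, which appears to be what the HDR reference mode is tracking:

import math

# BT.2100 HLG constants
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_inverse_oetf(signal: float) -> float:
    """HLG signal (0-1) -> relative scene-linear light (0-1)."""
    if signal <= 0.5:
        return signal * signal / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def hlg_display_nits(ire: float, peak_nits: float = 1000.0) -> float:
    """Nominal display luminance for a given IRE on an HLG reference monitor."""
    # System gamma is 1.2 at 1000 nits (BT.2100: 1.2 + 0.42*log10(peak/1000)).
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    return peak_nits * hlg_inverse_oetf(ire / 100.0) ** gamma

print(hlg_display_nits(100))  # ~1000 nits (peak)
print(hlg_display_nits(75))   # ~203 nits (diffuse white)
print(hlg_display_nits(38))   # ~26 nits (middle gray)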
PQ is another story
The HDR reference mode tracks the PQ curve perfectly: it's a straight line from 0 to 1000 nits and clips after that.
Both the 1600-nit and 500-nit modes didn't do a good job; they are one stop brighter in the shadows and then gradually roll off.
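Unlike HLG, PQ encodes absolute luminance, so the reference behavior follows directly from the SMPTE ST 2084 EOTF; a quick sketch:

import math

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf_nits(signal: float) -> float:
    """PQ signal (0-1) -> absolute luminance in nits (0-10000)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# 1000 nits sits at a PQ signal of ~0.752, which is where the
# reference mode starts clipping on this display:
print(pq_eotf_nits(0.752))  # ~1000 nits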
Conclusion
Apple's XDR Display is fantastic. It's very good in SDR mode: 450+ nits peak brightness, true blacks, and 13+ stops of dynamic range put a lot of "HDR displays" to shame. You can definitely call it HDR, since 100-nits-max, 200:1-contrast-ratio SDR has been dead for many years. HDR content in XDR mode is also great, with a peak brightness of 1450+ nits and 15+ stops of dynamic range. However, highlights are only 1.7 stops brighter than UI/SDR white. Ideally, highlights should be at least 3 stops brighter than diffuse white; since people are already used to 500 nits for UI/SDR white, that calls for a 4000+ nits peak brightness display. Setting diffuse white to 203 nits (recommended for HLG mastered at 1000 nits), the requirement drops to 1600 nits (not a coincidence); however, (diffuse) white would then look gray, since it's 1.3 stops darker than the UI.
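Spelling out the stops arithmetic in that paragraph (inputs are my measurements plus the HLG reference values):

import math

ui_white, hdr_peak = 450.0, 1450.0     # measured nits in XDR mode
print(math.log2(hdr_peak / ui_white))  # ~1.7 stops of highlight headroom

# 3 stops of headroom over a 500-nit diffuse white:
print(500 * 2 ** 3)                    # 4000 nits needed

# 3 stops over the HLG reference diffuse white of 203 nits:
print(203 * 2 ** 3)                    # 1624 -> the familiar 1600 nits

# And why a 203-nit diffuse white looks gray next to a 500-nit UI:
print(math.log2(500 / 203))            # ~1.3 stops darker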
I know, it's "xxx nits" everywhere, since human eyes are much more sensitive to luminance than to color, and a lot of "HDR" content is way oversaturated!
Recently I switched from N-RAW to H.265 / HLG for a quicker HDR workflow in Final Cut Pro: basically just drop the clips into an HLG timeline and that's it, just like the iPhone's Dolby Vision (HLG HDR) footage.
I did some tests of IRE vs stops in HLG mode and compared the results to the theoretical N-Log curve.
Setup: I exposed 18% midgray at 36% IRE in HLG mode to match N-Log, then adjusted the exposure time first, and aperture / ISO after that, to over- and under-expose the midgray for IRE readings.
Note: Technically, you should expose midgray to 38% IRE in HLG mode (ref); diffuse white will then be around 75% IRE, and 75%+ is reserved for specular highlights (0~3 stops over diffuse white).
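Those two numbers are self-consistent under the HLG OETF: put diffuse white at 75% IRE, scale the scene light by 18%, and you land almost exactly on 38% IRE. A quick check using the BT.2100 formulas:

import math

A, B, C = 0.17883277, 0.28466892, 0.55991073  # BT.2100 HLG constants

def hlg_oetf(e: float) -> float:
    """Relative scene-linear light (0-1) -> HLG signal (0-1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

def hlg_inverse_oetf(signal: float) -> float:
    if signal <= 0.5:
        return signal * signal / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

diffuse_white = hlg_inverse_oetf(0.75)   # scene light at 75% IRE
print(hlg_oetf(0.18 * diffuse_white))    # ~0.38 -> 38% IRE for midgray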
Stops vs IRE
First of all, we can see that the HLG curve is more contrasty than N-Log. Highlights will be clipped beyond 5 stops over midgray, which I think is OK for a lot of scenes. N-Log retains more information in the highlights (if not over-exposed), up to 6.3 stops, at the cost of noisier shadows. HLG has more detail in the shadow area; the noise floor is about 2% at ISO 400 (12.5% for N-Log).
Another interesting thing is that the HLG curve is very steep from -1 to +3 stops but quickly saturates afterwards (it's no longer linear): highlights are compressed and will clip around 96% IRE (245/255), so make sure to set the high zebra to 245 when shooting HLG.
Base ISO for HLG and N-Log
The Nikon Z8's HLG has a "nominal" base ISO of 400. Under the hood, the sensor is working at ISO 100; shadows are boosted by 2 stops while highlights are left intact, so basically you are shooting at ISO 100, 2 stops under, to protect the highlights.
N-Log has a "nominal" base ISO of 800. Same as HLG, the sensor is working at ISO 100; shadows are boosted by 3 stops while highlights are left intact, so basically you are shooting at ISO 100, 3 stops under, to protect the highlights. However, you will see a lot of noise in the shadow area due to insufficient light, and 1.3~1.7 stops of "over-exposure" is usually needed to compensate and get a clean image.
Conclusion
HLG strikes a pretty good balance between highlights and shadows and is very user friendly for an HDR workflow. Unlike N-Log, there is no need to over-expose (even 1 stop under looks good to me), no camera LUT is needed for restoration, and zero or minimal color grading is required. Another advantage of HLG over N-Log is monitoring: most of the time you can rely on the screen/viewfinder (with zebra on) to adjust the exposure and only open the waveform if needed.
The Nikon official site doesn't provide bitrates for ProRes / ProRes RAW, so I did some tests myself; here are the results. I only tested full-frame (FX) at ~4K resolution.
ProRes RAW 12-bit
Resolution / FPS      Bitrate (Mbps)   Compression Ratio
4140 x 2330 60p       3350             1.977
4140 x 2330 50p       2810             1.964
4140 x 2330 30p       1620             2.044
4140 x 2330 25p       1360             2.029
4140 x 2330 24p       1280             2.070
ProRes RAW has a very low compression ratio of ~2, which suggests that it might be losslessly compressed. It takes more space than Nikon's lossy N-RAW: 2.5x the high quality setting and 4.5x the normal quality!
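For reference, this is how I estimate the compression ratio: uncompressed Bayer data is one 12-bit sample per photosite, so the ratio is just raw bits over recorded bits. The exact recorded frame size may differ slightly from the nominal one, so treat this as a rough check:

def compression_ratio(width, height, bits_per_px, fps, bitrate_mbps):
    """Uncompressed bits per second / recorded bits per second."""
    raw_mbps = width * height * bits_per_px * fps / 1e6
    return raw_mbps / bitrate_mbps

# ProRes RAW: 12-bit Bayer, one sample per photosite
print(compression_ratio(4140, 2330, 12, 60, 3350))   # ~2.07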
ProRes 422 HQ 10-bit
Resolution / FPS      Bitrate (Mbps)   Compression Ratio
3840 x 2160 60p       1800             5.273
3840 x 2160 50p       1500             5.273
3840 x 2160 30p       910              5.215
3840 x 2160 25p       760              5.204
3840 x 2160 24p       730              5.201
With a compression ratio of 5.2, ProRes is still 2x the size of normal quality N-RAW, though it has about 1.5x the number of samples to deal with (YUV 4:2:2 vs Bayer).
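The same rough estimate works here, assuming 10-bit 4:2:2 averages 20 bits per pixel (a full-rate Y sample plus half-rate Cb and Cr), reusing compression_ratio() from the sketch above:

# ProRes 422 HQ: 10-bit 4:2:2 ~= 20 bits per pixel on average
print(compression_ratio(3840, 2160, 20, 60, 1800))   # ~5.5, close to the measured 5.27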
I hope Nikon can provide more options for ProRes via firmware updates. A 10x~12x compression ratio version (ProRes 422 LT) would be very useful for balancing quality and disk space.