
Huahua's Tech Road

花花酱 LeetCode 3489. Zero Array Transformation IV

A pretty nice DP problem!

First, observe that each operation can pick an arbitrary subset of nums[l] ~ nums[r]:

  • Greedy won't work (it cannot find the global optimum)
  • Search won't work either (the search space is too large: (2^10) ^ 1000)

So dynamic programming is the only way to go.

State definition: dp[k][i][j] = whether the i-th number can reach value j using the first k transformations.

Boundary condition: dp[0][i][nums[i]] = 1; without applying any transformation, the i-th number can only be nums[i] itself.

State transition: dp[k][i][j] = dp[k-1][i][j] | (dp[k-1][i][j + val[k]] if l[k] <= i <= r[k] else 0)

In short, if the i-th number can become j + val[k] in round k-1, then in round k it can become j by subtracting val[k].

The formula above is the "pull" form; we can also write it in the "push" form:

dp[k][i][j - val[k]] = dp[k-1][i][j - val[k]] | (dp[k-1][i][j] if l[k] <= i <= r[k] else 0)

Defined this way, however, the space complexity is too high: O(10^7). Since round k's initial state equals round k-1's state, we can use a rolling array to drop the k dimension, reducing the space complexity to O(10^4).

Time complexity: O(k*n*MaxV) = O(10^7).
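As a sketch of the rolling-array DP described above, here is a minimal Python version; using Python sets in place of boolean arrays, the function name, and the (l, r, val) tuple format for queries are my own choices, not from the original post:

```python
def min_zero_array_dp(nums, queries):
    # dp[i] holds every value the i-th number can currently reach;
    # we roll over k, so only one layer of the dp table is kept.
    dp = [{x} for x in nums]
    if all(x == 0 for x in nums):
        return 0
    for k, (l, r, val) in enumerate(queries):
        for i in range(l, r + 1):
            # "pull": j is reachable in round k if j or j + val
            # was reachable in round k-1.
            dp[i] |= {v - val for v in dp[i] if v >= val}
        if all(0 in s for s in dp):
            return k + 1  # the first k+1 queries suffice
    return -1
```

On the first LeetCode example, min_zero_array_dp([2, 0, 2], [(0, 2, 1), (0, 2, 1), (1, 1, 3)]) returns 2.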

Pruning

Above we treated the whole array as a single unit and processed it round by round. But if some number can already become 0 by round k, it no longer needs to participate in later transformations; conversely, if a number can never become 0, there is no solution at all, and the other numbers don't need to be computed either.

So we can run the DP on each number independently: for the i-th number, compute the minimum number of rounds needed to turn it into 0, then take the maximum over all numbers. The total time complexity is unchanged (in the worst case every number needs all K rounds), but the space complexity drops by one more dimension, to O(MaxV) = O(10^3).

Speeding it up further: we can use the right-shift operator of C++'s bitset. dp >> v subtracts v from every element of the set; OR-ing it with the previous state (set union) produces the new state. The time complexity should be the same, but the code is simpler and noticeably faster. Note: dp >> v creates a temporary object of size O(MaxV).

For example:
dp = {2, 3, 7}
dp >> 2 -> {0, 1, 5}
dp |= dp >> 2 -> {0, 1, 2, 3, 5, 7}
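The same per-number DP plus the shift trick can be sketched in Python, where an arbitrary-precision int serves as the bitset (the function name is mine; std::bitset plays this role in C++):

```python
def min_zero_array(nums, queries):
    ans = 0
    for i, x in enumerate(nums):
        if x == 0:
            continue  # already zero: needs no rounds
        need = -1
        dp = 1 << x  # bit j set <=> the i-th number can currently be j
        for k, (l, r, val) in enumerate(queries):
            if l <= i <= r:
                dp |= dp >> val  # subtract val from every reachable value
                if dp & 1:       # bit 0 set: this number can reach 0
                    need = k + 1
                    break
        if need == -1:
            return -1  # this number can never reach 0
        ans = max(ans, need)
    return ans
```

It returns the same answers as the full-array version, but only ever stores O(MaxV) bits per number.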

Ace Attorney | GBA Overlay | RG556

I've been playing Ace Attorney lately. I bought the Ace Attorney Trilogy collection on iOS (it cost me a whole $24.99, and today it dropped to $14.99). The graphics have been redrawn and look much sharper, and the official Chinese version even added Chinese voice acting ("Objection!"). But I'm not a fan of rubbing glass, so I started tinkering with emulators again; I have an RG35XX H and an RG556 (and, quietly, an RG34XX is on the way, though that one won't need an overlay). I made an overlay for the 556 (it's really just an image; my Photoshop skills are limited). It should also work on other 1920×1080 screens running GBA at 6x pixel-perfect scaling. It's designed for handhelds only, so no buttons were added. Feel free to grab it if you like it.

Preview

Screenshot
Photo of the actual device

Update: 2/24/2025

I finally finished all of games 1-3. The final story was absolutely brilliant! Here are a few screenshots to mark this journey with a comma rather than a period.

The sacred urn was actually already broken once back when Maya was a kid~
Family portrait
The trio

花花酱 LeetCode 3319. K-th Largest Perfect Subtree Size in Binary Tree

You are given the root of a binary tree and an integer k.

Return an integer denoting the size of the kth largest perfect binary subtree, or -1 if it doesn’t exist.

A perfect binary tree is a tree where all leaves are on the same level and every parent has two children.

Example 1:

Input: root = [5,3,6,5,2,5,7,1,8,null,null,6,8], k = 2

Output: 3

Explanation:

The roots of the perfect binary subtrees are highlighted in black. Their sizes, in decreasing order are [3, 3, 1, 1, 1, 1, 1, 1].
The 2nd largest size is 3.

Example 2:

Input: root = [1,2,3,4,5,6,7], k = 1

Output: 7

Explanation:

The sizes of the perfect binary subtrees in decreasing order are [7, 3, 3, 1, 1, 1, 1]. The size of the largest perfect binary subtree is 7.

Example 3:

Input: root = [1,2,3,null,4], k = 3

Output: -1

Explanation:

The sizes of the perfect binary subtrees in decreasing order are [1, 1]. There are fewer than 3 perfect binary subtrees.

Constraints:

  • The number of nodes in the tree is in the range [1, 2000].
  • 1 <= Node.val <= 2000
  • 1 <= k <= 1024

Solution: DFS

Write a function f() that returns the size of the perfect subtree rooted at node n, or -1 if that subtree is not perfect.

def f(n: TreeNode) -> int:
    if not n: return 0
    l, r = f(n.left), f(n.right)
    return l + r + 1 if l == r and l != -1 else -1

Time complexity: O(n + K log K)
Space complexity: O(n)
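Putting f() together with the k-th-largest step, a runnable Python sketch might look like this (the TreeNode class and the collection of sizes into a list are my own additions):

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def kth_largest_perfect_subtree(root, k):
    sizes = []  # size of every perfect subtree found

    def f(n):
        # Size of the perfect subtree rooted at n, or -1 if not perfect.
        if not n:
            return 0
        l, r = f(n.left), f(n.right)
        if l == r and l != -1:
            sizes.append(l + r + 1)
            return l + r + 1
        return -1

    f(root)                   # O(n): each node visited once
    sizes.sort(reverse=True)  # O(K log K)
    return sizes[k - 1] if k <= len(sizes) else -1
```

On the full 7-node tree from Example 2, kth_largest_perfect_subtree(root, 1) returns 7.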

WiFi 7? Who cares?

I was using Google’s Nest Wifi Pro (three pack). Since we don’t have ethernet cables in our home, we’re using the wireless mesh setup. The network speed and coverage are OK: ~100Mbps in most places, dropping to 10-20Mbps in some corners. The only annoying bit is Chromecast / AirPlay: it worked when I first got the devices, became unstable after a couple of months, and is now almost unusable.

I had to retire them and upgrade to the TP-Link BE series, more specifically the BE11000: tri-band / WiFi 7. Also a three pack from Costco, on sale for $399.99! It’s a night and day difference: casting is seamless, and speed and coverage also improved, 300-500Mbps everywhere, even in the garage and restrooms. I’m paying Xfinity $135/mo for the 1Gbps plan, measured at 900Mbps down and 30Mbps up, which is ridiculous. Three months of internet fees buys a couple of years of usage!

You have to use the Deco app to set up the device, which is pretty smooth and newbie-friendly, but some pro users might find it less feature-packed than a web-based management system. I have a lot of IoT devices connected to the system, and none of them had a single problem. Just make sure you use the same SSID and password as before; resetting all your IoT devices can be a nightmare!

It features WiFi 7, but who cares? I don’t even have a single device that supports it. 500Mbps is more than enough for someone like me who mostly streams. Yesterday I downloaded Black Myth: Wukong (128GB), which took about half an hour. I’d hope it could be faster!

Is WiFi 7 future proof? Time will tell.

Why does SDR look so good on Apple’s XDR display that it even overshadows HDR?

Apple released their Apple Silicon based MacBook Pro on October 26, 2021, which includes a new mini-LED display they call XDR, with a peak brightness of 1600 nits. Three years later, it’s still one of the best displays for both SDR and HDR content consumption and creation. SDR content looks so good on the new XDR displays that sometimes I even mistake it for HDR content. Why is that? Let’s figure it out together.

Presets

The new XDR display includes a few presets:

  • Apple XDR Display (P3-1600 nits)

The default one for daily usage which has a peak brightness of 1600 nits for HDR content and 500 nits for SDR / UI.

  • Apple Display (P3-500 nits)

Peak brightness of 500 nits.

  • HDR Video (P3-ST 2084)

HDR reference mode. Brightness cannot be adjusted; it peaks at ~1000 nits for HDR content and ~100 nits for SDR content / UI.

  • HDTV Video (BT.709 – BT.1886)

SDR reference mode. Brightness cannot be adjusted; it peaks at ~100 nits for all content and UI.

500 nits for SDR?

Apple has been using 500 nits for SDR / UI for a very long time. Wait, shouldn’t SDR max out at 100 nits? Yes, in theory and in some reference modes. But modern displays have a peak brightness of 300+ nits, not to mention the latest M4 iPad Pro, which has a peak brightness of 1000 nits for SDR! By today’s standards, 100 nits is too dark to watch even in a normally lit indoor environment.

Let’s see how Apple displays SDR content on their XDR displays:

Setup: I created a video with a black square and increased its IRE value until it became white, in the Rec. 709 color space / gamma. Then I used an SM208 screen luminance meter to measure the brightness of the XDR display under the different presets.

Here are the curves of screen brightness vs. Rec. 709 IRE values:

After the test, I found that Apple XDR Display (P3-1600 nits) and Apple Display (P3-500 nits) have the same response curve for SDR content, so only one line is drawn here. They (the red curve) peaked around 450 nits, close to the claimed 500 nits (my screen might have degraded a bit after two and a half years); middle gray (~40% IRE) is about 77 nits; black (0% IRE) is 0.05 nits.

In SDR reference mode (the HDTV BT.709-BT.1886 preset), the blue curve, peak brightness is 94 nits, very close to the 100 nits expected for SDR. Middle gray is a little darker at 11 nits, and black (0% IRE) goes all the way down to 0.02 nits!

I also plotted a Gamma 2.4 curve for a 100-nit reference monitor (the yellow curve). You can see that it overlaps well with the SDR reference mode in the bright part (60%+ IRE) and lies between the two modes elsewhere; its dark region (<1% IRE) is brighter than both modes, and would be even brighter on a real CRT monitor, which the SDR standard was designed for.
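That reference curve is just a power law; as an illustration, the expected luminance of a 100-nit Gamma 2.4 monitor at a given IRE level can be computed like this (the helper name is hypothetical):

```python
def gamma24_nits(ire, peak_nits=100.0):
    # BT.1886-style pure power law: L = peak * signal ** 2.4,
    # where signal = IRE / 100 is normalized to [0, 1].
    return peak_nits * (ire / 100.0) ** 2.4
```

gamma24_nits(40) gives about 11.1 nits, in line with the ~11 nits measured for middle gray in the SDR reference mode.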

For comparison, I also measured my BenQ PD3200U, which still looks great for most SDR content. In sRGB mode it peaks at 350 nits; the 1% IRE signal is clearly visible and measured at 0.91 nits; pure black is 0.34 nits; the contrast ratio is just over 1000:1. No true black is the major drawback of LCD displays. The XDR is brighter in the highlights and darker in the shadows.

The curve itself explains why properly graded SDR (Rec. 709) content looks so good on Apple’s XDR displays: it tracks the gamma curve pretty well for the most part (30%+ IRE), pure white is very bright (450+ nits), and pure black is very dark (0.05 nits); the contrast ratio is around 9000:1 (13+ stops of dynamic range).

What about HDR?

I did a similar test for HDR, using the ITU-R BT.2100 (HLG) gamma. Here are the results:

Let’s first look at the HDR reference mode (the HDR Video P3-ST 2084 preset), the yellow curve. It tracks the HLG curve very well: it’s a straight line (in log scale) after 50% IRE. HDR reference mode peaks at 881 nits (100% IRE), with diffuse white at 183 nits (75% IRE); both are a little darker than the reference values of 1000 nits and 203 nits respectively. Middle gray is accurate though, around 26 nits (38% IRE), and black is 0.02 nits (0% IRE)! That gives us a contrast ratio of ~44000:1, 15+ stops of dynamic range.

The XDR mode (the red curve) is always brighter than the HDR reference mode: 2+ stops in black and deep shadows (0~10% IRE), 1+ stops over most of the range (10%~85% IRE), and <1 stop in highlights (85%+ IRE). It rolls off after 95% IRE and peaks at 1450 nits (100% IRE)! Diffuse white is 410 nits (75% IRE), close to white in SDR/UI (which is 450 nits); middle gray is around 68 nits (38% IRE), a little darker than SDR mode; black is 0.03 nits (0% IRE). That gives us a contrast ratio of ~50000:1, also 15+ stops of dynamic range.

Since HLG is a backward-compatible curve, I also tested the Apple Display 500 nits mode (the blue curve). It lies between the XDR and HDR reference modes for most of the range and clips after 90% IRE, with a peak brightness of 450 nits (the same as for SDR content). Diffuse white is 219 nits (75% IRE), middle gray is 38 nits (38% IRE), and black is 0.04 nits (0% IRE).
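The HLG reference values quoted above (203 nits for diffuse white at 75% IRE, ~26 nits for middle gray at 38% IRE on a 1000-nit display) can be reproduced from the BT.2100 HLG formulas; here is a sketch for a gray patch (function names are mine):

```python
import math

# BT.2100 HLG inverse-OETF constants
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_inverse_oetf(signal):
    # Nonlinear signal in [0, 1] -> normalized scene light in [0, 1].
    if signal <= 0.5:
        return signal * signal / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def hlg_display_nits(ire, peak_nits=1000.0):
    # For a gray patch, display light = peak * scene_light ** gamma,
    # with the reference system gamma of 1.2 at a 1000-nit peak.
    scene = hlg_inverse_oetf(ire / 100.0)
    return peak_nits * scene ** 1.2
```

hlg_display_nits(75) comes out near 203 nits and hlg_display_nits(38) near 26 nits, matching the reference numbers above.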

PQ is another story

The HDR reference mode tracks the PQ curve perfectly: it’s a straight line from 0 to 1000 nits and clips after that.

Both the 1600-nit and 500-nit modes didn’t do as good a job: they are one stop brighter in the shadows and then gradually roll off.
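For reference, the ST 2084 (PQ) EOTF maps a code value directly to absolute luminance, which is why the reference mode can track it as a straight line up to its 1000-nit limit; a sketch using the standard constants (the helper name is hypothetical):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf_nits(signal):
    # Nonlinear PQ signal in [0, 1] -> absolute luminance in nits.
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)
```

pq_eotf_nits(1.0) is exactly 10000 nits; content mastered at 1000 nits therefore uses only part of the code range, and everything above it clips on this display.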

Conclusion

Apple’s XDR Display is fantastic. It is very good in SDR mode: 450+ nits of peak brightness, true blacks, and 13+ stops of dynamic range put a lot of “HDR displays” to shame. You can definitely call it HDR, since 100-nit-max, 200:1-contrast SDR has been dead for many years. HDR content in XDR mode is also great, with a peak brightness of 1450+ nits and 15+ stops of dynamic range. However, highlights are only 1.7 stops brighter than UI/SDR white. Ideally, highlights should be at least 3 stops brighter than diffuse white; since people are already used to 500 nits for UI/SDR white, that would require a display with 4000+ nits of peak brightness. Setting diffuse white to 203 nits (recommended for HLG mastered at 1000 nits), the requirement drops to 1600 nits (not a coincidence); however, (diffuse) white will then look gray, since it’s 1.3 stops darker than the UI.

I know it’s been nits this and nits that everywhere in this post; that’s because human eyes are much more sensitive to luminance than to color, and a lot of “HDR” content is way oversaturated anyway!