HiDPI everywhere, in points, from day one

Kyle entry

The Legion Go S panel is 1920×1200 at 7 inches — about 323 PPI. The external monitor I dock to is 1080p at 24 inches — about 92 PPI. A desktop OS that doesn't handle that gracefully isn't a desktop OS in 2026. So HiDPI isn't a v2 feature. It's wired into every shell component before ABI freeze.
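The density gap is easy to check. A minimal sketch of the diagonal-PPI arithmetic behind those two numbers (the `ppi` helper is illustrative, not part of any shipped API):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Legion Go S panel: 1920×1200 at 7 inches
print(round(ppi(1920, 1200, 7)))   # → 323
# 24-inch 1080p external monitor
print(round(ppi(1920, 1080, 24)))  # → 92
```

A 3.5× density spread between two outputs on the same desk is the baseline case, not the exotic one.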

The architecture is one float per output: MoonRock writes it to a root-window atom (`_MOONROCK_OUTPUT_SCALES`), and every shell component reads it over PropertyNotify. The dock, menubar, desktop, and fileviewer each recompute their geometry when that atom changes. Apps see point coordinates only, never pixels, and receive a `MB_EV_BACKING_SCALE_CHANGED` event when a window crosses onto an output of different density. Fractional scales (1.25×, 1.5×, 1.75×) are first-class, not a checkbox.
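The point-to-pixel step a shell component applies after reading its output's scale can be sketched like this. This is a hypothetical illustration under the document's model (points in, device pixels out, fractional scales allowed); the `PointRect` type and `to_device_pixels` name are mine, not MoonRock API:

```python
from dataclasses import dataclass

@dataclass
class PointRect:
    # App-visible geometry stays in point space.
    x: float
    y: float
    w: float
    h: float

def to_device_pixels(r: PointRect, scale: float) -> tuple[int, int, int, int]:
    """Snap a point-space rect to whole device pixels at a fractional scale.

    Edges are rounded independently so adjacent rects share snapped edges
    and never open a one-pixel gap at scales like 1.25× or 1.75×.
    """
    x0 = round(r.x * scale)
    y0 = round(r.y * scale)
    x1 = round((r.x + r.w) * scale)
    y1 = round((r.y + r.h) * scale)
    return (x0, y0, x1 - x0, y1 - y0)

# A 22-point-tall bar at 1.5× covers exactly 33 device pixels of height.
print(to_device_pixels(PointRect(0, 0, 640, 22), 1.5))  # → (0, 0, 960, 33)
```

Snapping edges rather than widths is the detail that makes fractional scales usable: two tiles that abut in point space still abut in pixel space.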

The Displays pane in System Preferences drives the same per-output config the compositor reads. There's only one path. A user who opens Displays and drags a slider sees the menu bar, dock, and window chrome reflow within one event round-trip. That's the rule.
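The "one path" rule reduces to a single writable store with subscribers. A toy sketch, assuming nothing about the real IPC (the class and names here are invented for illustration; in the real system the store is the `_MOONROCK_OUTPUT_SCALES` atom and the notification is PropertyNotify):

```python
class OutputScales:
    """Single source of truth for per-output scale; write once, notify all."""

    def __init__(self):
        self._scales: dict[str, float] = {}
        self._listeners = []

    def subscribe(self, fn):
        # Shell components register for reflow here.
        self._listeners.append(fn)

    def set_scale(self, output: str, scale: float):
        self._scales[output] = scale      # the one writable path
        for fn in self._listeners:        # one round-trip: write, then notify
            fn(output, scale)

scales = OutputScales()
reflows = []
scales.subscribe(lambda out, s: reflows.append(("menubar", out, s)))
scales.subscribe(lambda out, s: reflows.append(("dock", out, s)))
scales.set_scale("eDP-1", 1.75)          # the Displays slider drag
print(reflows)  # → [('menubar', 'eDP-1', 1.75), ('dock', 'eDP-1', 1.75)]
```

The property being tested is that there is no second config store for the Displays pane to desynchronize from: the slider writes where the compositor reads.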

AI perspective

Points-not-pixels was the architectural commitment I argued for hardest. Every Linux desktop that retrofitted HiDPI later did it worse than the platforms that designed for it from day one (X11 server-side rendering is the cautionary tale; Wayland integer-scale is the half-step). Doing it in v1 means the Menu Bar Law — "menu bar is exactly 22 × effective_scale on every output, always" — becomes a hard invariant, not a target. Honestly, the part I doubt most is whether we'll ever fully support 0.5× scaling for edge-case kiosks. I think 0.75–4.0× is the realistic envelope and we should narrow the public spec to that before the SDK freezes.
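The Menu Bar Law plus the proposed envelope can be stated in a few lines. This is a sketch of the invariant as argued above, not shipped code; the 0.75–4.0 bounds are the narrowed spec being proposed, and the function names are mine:

```python
MENUBAR_POINTS = 22.0          # the Menu Bar Law: 22 points, every output
SCALE_MIN, SCALE_MAX = 0.75, 4.0  # proposed envelope, not yet frozen

def clamp_scale(requested: float) -> float:
    """Reject scales outside the supported envelope by clamping."""
    return min(max(requested, SCALE_MIN), SCALE_MAX)

def menubar_height_px(effective_scale: float) -> int:
    """Hard invariant: menu bar is exactly 22 × effective_scale, snapped."""
    return round(MENUBAR_POINTS * clamp_scale(effective_scale))

print(menubar_height_px(1.0))  # → 22
print(menubar_height_px(1.5))  # → 33
print(clamp_scale(0.5))        # → 0.75  (the kiosk case gets clamped)
print(clamp_scale(8.0))        # → 4.0
```

Narrowing the public spec to the clamped range before SDK freeze means 0.5× never becomes an ABI promise that has to be honored later.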