
CSS Paint Worklets

Background - CSS Houdini

Houdini is the umbrella project for a collection of APIs exposing selected parts of the browser's CSS engine.

The @property rule, the CSS Properties and Values API and the Typed OM API are all about directly giving the existing CSS property infrastructure the ability to make assumptions about var(--foo-property)-style bindings, and providing the kind of semantics we're used to when doing things like calc(1rem + 10px - 1cm) (i.e. mixing units).
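For a flavour of what that means in practice, here's a minimal sketch of registering a typed custom property through the JS side of the Properties and Values API (the property name is made up for illustration, not one this blog actually uses):

// Tell the engine that '--band-width' is a real <length>: it can now
// validate assignments, interpolate the value in animations, and expose
// it to worklets as a typed value instead of a raw string.
// '--band-width' is an illustrative name.
if ('registerProperty' in CSS) {
  CSS.registerProperty({
    name: '--band-width',
    syntax: '<length>',
    inherits: false,
    initialValue: '4px',
  });
}

The same registration can be done declaratively in a stylesheet with an @property rule.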

Layout Worklets are extremely promising, since they would allow polyfilling all future additions to the CSS spec that affect layout (e.g. -webkit-line-clamp is unlikely to ever be stabilized at this point, but would be trivially polyfillable with a Layout Worklet).

Animation Worklets are the same deal: super-promising, and they allow for performant polyfills.

Today, though, we're dealing with Paint Worklets, snippets of JS that allow overriding the penultimate stage of the CSS pipeline (paint) - the developer gets a canvas rendering context (2D) restricted to the bounding box of the element.
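The skeleton of a worklet is small; a sketch (the worklet name and the fill are mine, not this blog's actual source):

// Runs in the paint worklet scope, not on the page.
// 'demo-fill' is a placeholder name; CSS refers to it as paint(demo-fill).
registerPaint('demo-fill', class {
  paint(ctx, size) {
    // ctx is the restricted 2D context; size is the element's bounding box.
    ctx.fillStyle = 'rebeccapurple';
    ctx.fillRect(0, 0, size.width, size.height);
  }
});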

Demo

To try this API out, I (somewhat wastefully) chose the header of this very blog to do it - it depicts (based on hardcoded values and a few judicious styling overrides) the visible emission spectrum of hydrogen, known as the Balmer series.

The mechanics are pretty simple, and map exactly to drawing to a 2D canvas rendering context - this being the business end of things:

ctx.fillStyle = spectrumValues[index];
// Apply this line's canvas settings (shadow, stroke, line width) from its configuration.
for (const key of ['shadowBlur', 'shadowColor', 'strokeStyle', 'lineWidth']) {
  ctx[key] = currentSettings[key] ?? '';
}
// If a stroke is configured, draw a vertical line down the middle of the band.
if (currentSettings['strokeStyle']) {
  ctx.beginPath();
  ctx.moveTo(startX + currentSettings.width / 2, 0);
  ctx.lineTo(startX + currentSettings.width / 2, size.height);
  ctx.stroke();
}
// Fill the band itself across the element's full height.
ctx.fillRect(startX, 0, settings[index].width, size.height);
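For completeness, the page-side wiring is tiny; a sketch, assuming the worklet module lives at /spectrum.js and registers itself as 'spectrum' (both names are assumptions, not this blog's actual file layout):

// Main thread: load the worklet module, then point CSS at it.
if ('paintWorklet' in CSS) {
  CSS.paintWorklet.addModule('/spectrum.js').then(() => {
    // Equivalent to writing `background-image: paint(spectrum);` in a stylesheet.
    const header = document.querySelector('header');
    if (header) header.style.backgroundImage = 'paint(spectrum)';
  });
}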

Enhancements

As currently written, the rendered result could just be replaced, for the same effect, with a one-shot rendering at excess resolution (say, rescaled to 4K pixels wide) in a format with reasonably good compression (e.g. AVIF or WebP), since the bandwidth is static (300nm-780nm).

Dynamic bandwidth would make things a little more interesting - zooming in on the δ-η lines would be handy for distinguishing them properly.
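One way to get there (a sketch with invented property names, not the current implementation): register the wavelength window as custom properties and let the worklet read them via inputProperties, so zooming becomes a plain CSS change or transition.

// '--spectrum-min-nm' / '--spectrum-max-nm' are made-up property names.
registerPaint('spectrum-zoom', class {
  // Declaring inputProperties makes the worklet repaint whenever these change.
  static get inputProperties() {
    return ['--spectrum-min-nm', '--spectrum-max-nm'];
  }
  paint(ctx, size, props) {
    const min = parseFloat(props.get('--spectrum-min-nm')) || 300;
    const max = parseFloat(props.get('--spectrum-max-nm')) || 780;
    // Map a wavelength (in nm) to an x coordinate inside the element's box.
    const toX = (nm) => ((nm - min) / (max - min)) * size.width;
    // ...then draw each emission line at toX(line.wavelength), as in the snippet above.
  }
});

Setting --spectrum-min-nm: 380 and --spectrum-max-nm: 440 on the header would then zoom straight into the crowded violet end.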

Range-based filters would be another thing, along with extended spectra (there's a bunch in the UV region after all).

A full-blown model of the entire spectrum, inclusive of fine structure and knobs for simulating the Zeeman and Stark effects, would be awesome too.