Spectrimage’s palette extractor uses K-means++ in Oklab with four iterative structural passes to extract five human-feeling swatches from any photo.
Key Takeaways
OKLCH/Oklab replaces HSL because chroma (C) is a true distance from the achromatic axis, not a ratio that blows up near black.
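To see the difference concretely, here is a minimal sketch (not Spectrimage's code; the Oklab matrices are Björn Ottosson's reference conversion from linear sRGB) comparing HSL saturation with Oklab chroma for a near-black pixel:

```python
import math

def linear_srgb_to_oklab(r, g, b):
    """Björn Ottosson's reference conversion from linear sRGB to Oklab."""
    l = 0.4122214708*r + 0.5363325363*g + 0.1051445005*b
    m = 0.2119034982*r + 0.6806995451*g + 0.1073969566*b
    s = 0.0883024619*r + 0.2817188376*g + 0.6299787005*b
    l_, m_, s_ = (x ** (1/3) for x in (l, m, s))
    return (0.2104542553*l_ + 0.7936177850*m_ - 0.0040720468*s_,
            1.9779984951*l_ - 2.4285922050*m_ + 0.4505937099*s_,
            0.0259040371*l_ + 0.7827717662*m_ - 0.8086757660*s_)

def hsl_saturation(r, g, b):
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:
        return 0.0
    light = (mx + mn) / 2
    # The denominator shrinks toward zero near black, inflating S.
    return (mx - mn) / (mx + mn) if light < 0.5 else (mx - mn) / (2 - mx - mn)

# A pixel that is visually almost black:
r, g, b = 0.02, 0.01, 0.01
L, a, b_ = linear_srgb_to_oklab(r, g, b)
chroma = math.hypot(a, b_)   # true distance from the achromatic axis
sat = hsl_saturation(r, g, b)
# HSL rates this pixel roughly 33% saturated; Oklab chroma stays tiny.
```

The ratio-based HSL saturation calls this near-black pixel a third saturated, while Oklab chroma, being an absolute distance from the gray axis, stays well under the 0.05 floor used later in the pipeline.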
Overshooting to K=14 and then merging down beats targeting K=5 directly; the spare clusters catch small chromatic accents in near-grayscale images, such as the bicycle test case.
Hue-weighted merge distance (chromatic plane counts 2x vs. lightness) prevents closest-pair from collapsing distinct hues before it collapses near-duplicate reds.
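The overshoot-then-merge-down pass with the hue-weighted distance might look like the following sketch (an assumed cluster representation of `(L, a, b, pixel_weight)` tuples, not the published implementation; the 2x chromatic weighting appears as a factor of 4 under the square root):

```python
import math

def merge_distance(c1, c2):
    # Chromatic plane (a, b) counts 2x relative to lightness L,
    # so near-duplicate hues merge before distinct hues do.
    dL = c1[0] - c2[0]
    da = c1[1] - c2[1]
    db = c1[2] - c2[2]
    return math.sqrt(dL*dL + 4.0*(da*da + db*db))

def merge_down(clusters, target=5):
    """Repeatedly merge the closest pair until `target` clusters remain.

    clusters: list of (L, a, b, pixel_weight) tuples.
    """
    clusters = list(clusters)
    while len(clusters) > target:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = merge_distance(clusters[i], clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        ci, cj = clusters[i], clusters[j]
        w = ci[3] + cj[3]
        # Weighted centroid of the merged pair.
        clusters[i] = tuple((ci[k]*ci[3] + cj[k]*cj[3]) / w
                            for k in range(3)) + (w,)
        clusters.pop(j)
    return clusters

# Two near-duplicate reds plus one blue: the reds merge first.
palette = merge_down([(0.50, 0.20, 0.00, 10),
                      (0.50, 0.21, 0.00, 5),
                      (0.40, -0.20, 0.10, 8)], target=2)
```

With the plain Euclidean distance a small lightness gap between two distinct hues could beat a small hue gap between near-duplicates; quadrupling the chromatic terms biases the closest-pair search the other way.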
A phantom guard drops clusters below 2.5% pixel weight and chroma < 0.05 before forced collapse, removing shadow-pocket noise without merge-distance tuning.
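A sketch of the phantom guard under the same assumed `(L, a, b, pixel_weight)` representation; note that both conditions must hold for a cluster to be dropped:

```python
import math

def phantom_guard(clusters, weight_floor=0.025, chroma_floor=0.05):
    """Drop tiny, near-achromatic clusters before forced merging.

    A cluster is a phantom only if BOTH conditions hold: under 2.5%
    of total pixel weight AND chroma below 0.05. A tiny but strongly
    chromatic accent survives; so does a large gray region.
    """
    total = sum(c[3] for c in clusters)
    return [c for c in clusters
            if not (c[3] / total < weight_floor
                    and math.hypot(c[1], c[2]) < chroma_floor)]

clusters = [(0.20, 0.00, 0.01, 1),   # tiny and achromatic: dropped
            (0.50, 0.15, 0.00, 1),   # tiny but chromatic: kept
            (0.70, 0.00, 0.00, 98)]  # achromatic but dominant: kept
kept = phantom_guard(clusters)
```

Because the guard runs before the merge-down collapse, shadow-pocket noise never gets a chance to absorb a real swatch, and the merge distance itself needs no extra tuning.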
Achromatic slot allocation reserves dark/mid/light buckets proportional to image mass, freeing slots for chromatic accents on near-grayscale images.
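One plausible reading of the slot-allocation pass, as a sketch; the lightness band cutoffs (0.35, 0.7) and the chroma floor here are illustrative assumptions, not published values:

```python
import math

def allocate_achromatic_slots(clusters, total_slots=5, chroma_floor=0.05):
    """Reserve dark/mid/light achromatic slots proportional to pixel mass.

    clusters: (L, a, b, pixel_weight) tuples. Flooring the proportional
    share (rather than rounding) leaves leftover slots free for
    chromatic accents. Band cutoffs (0.35, 0.7) are illustrative.
    """
    bands = {"dark": 0.0, "mid": 0.0, "light": 0.0}
    total = sum(c[3] for c in clusters)
    for L, a, b, w in clusters:
        if math.hypot(a, b) < chroma_floor:
            key = "dark" if L < 0.35 else "light" if L > 0.7 else "mid"
            bands[key] += w
    reserved = {k: math.floor(total_slots * v / total)
                for k, v in bands.items()}
    free = total_slots - sum(reserved.values())  # left for chromatic accents
    return reserved, free

# Near-grayscale image: 98% achromatic mass, one small chromatic accent.
reserved, free = allocate_achromatic_slots([(0.20, 0.00, 0.00, 40),
                                            (0.50, 0.00, 0.01, 30),
                                            (0.80, 0.00, 0.00, 28),
                                            (0.50, 0.20, 0.00, 2)])
```

Even though the chromatic accent holds only 2% of the mass, flooring the achromatic shares guarantees it a slot rather than letting three shades of gray claim the whole palette.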
Hacker News Comment Review
Commenters note the algorithm ignores spatial pixel position entirely; one suggested phantom detection might improve if cluster pixel proximity in 2D image space were factored in.
Commenters surfaced David Aerne’s okpalette.color.pizza and RYBitten as related prior art; no one offered a technical critique of relative quality.
At least one commenter ported the algorithm into their own open-source imgstat utility, reporting that it produces visibly better palettes than simpler baselines.
Notable Comments
@indigo945: Notes that no heuristic uses pixel spatial position; sorting pixels before running the program would produce identical results, which may matter for phantom detection.
@cududa: “This might be the best color palette generator I’ve ever seen”; frames the problem as one that even well-funded OS teams have historically struggled with.