I have recently been implementing a highly parallel bitmap grayscale kernel as part of another project (more on this later).
Interestingly, almost all sources on the web, including Wikipedia (http://en.wikipedia.org/wiki/Grayscale), still quote the formula:
Y = (r * 0.3) + (g * 0.59) + (b * 0.11)
This was the correct weighting for older displays, but improvements in display technology (LCD, plasma, etc.) mean it is no longer the best choice.
The more correct form for modern displays is:
Y = (r * 0.2125) + (g * 0.7154) + (b * 0.0721)
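To make the difference concrete, here is a minimal Python sketch of both weightings as quoted above (the function names are my own, purely illustrative):

```python
def luma_old(r, g, b):
    # Older coefficient set quoted by most sources.
    return r * 0.3 + g * 0.59 + b * 0.11

def luma_new(r, g, b):
    # Coefficient set quoted above for modern / HD displays.
    return r * 0.2125 + g * 0.7154 + b * 0.0721

# A saturated green pixel illustrates how far apart the two can be:
# the newer weights push green considerably brighter.
print(luma_old(0, 255, 0))
print(luma_new(0, 255, 0))
```

Both sets of coefficients sum to 1, so a pure white pixel (255, 255, 255) maps to 255 under either formula; the difference shows up in how the individual channels are weighted.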
As soon as I find the link to the paper / author behind this research I will post it here, as it is not immediately findable on Google.
The two equations produce markedly different results, with the second producing a much fuller, smoother luminance range on my monitors.
I wonder how many legacy (or new) video and image editing software suites still use the older version? With the proliferation of LCD displays, it's well worth a quick update.
Interestingly, Wikipedia currently does cite these figures in its coverage of the luma coefficients for HDTV.
Well spotted - perhaps they should amend the entry on grayscaling to show this is the "correct" version for HD.
Of course "correct" is a very subjective term.
The cited paper, http://www.poynton.com/notes/PU-PR-IS/Poynton-PU-PR-IS.book.pdf, makes for an interesting read.