2006.12.05 15:52 "[Tiff] Grayscale, or is it?", by Joris Van Damme

2006.12.05 23:29 "Re: [Tiff] Grayscale, or is it?", by Phillip Crews

 From personal experience, my best notion of linear is:

Whatever it takes for MY image to appear correctly on THIS monitor using THIS current operating system and THIS current software assuming THIS specific gamma, viewed by THIS specific eyeball, all of which may change between bug reports.

One prescribed "real" notion of linear is the proportion of photons emitted vs. the possible proportion of photons emitted within the given area. Of course, this still depends upon the range and sensitivity of each of the entities mentioned previously.

- Phillip

> From the extract of the spec you have posted, I understand "linear" in its usual meaning in most papers from the CIE or ICC, that is, a linear combination of XYZ, hence at a gamma distance of 2.2 from sRGB, 1.8 from Apple, 3.0 from L....

That's entertainment :-)

It is true that the notion of "linear" is not well defined. A video engineer's notion of "linear" differs from a 3D renderer's notion of "linear". To the broadcast engineer, gamma-corrected video can be considered "linear", since each quantization step appears proportionally brighter to the human viewer. But for purposes of rendering and sensing, "linear" means "linear light".
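To make the distinction concrete, here is a small sketch (not from the thread) of the standard sRGB decoding function, which maps gamma-encoded component values to "linear light" values proportional to photon intensity. The function name is my own; the constants are those published in the sRGB specification (IEC 61966-2-1).

```python
def srgb_to_linear(c: float) -> float:
    """Convert one sRGB component in [0, 1] to linear light.

    sRGB uses a short linear toe near black plus a power-law
    segment with exponent 2.4; together they approximate an
    overall gamma of about 2.2.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4


# Mid-gray in gamma-encoded sRGB is far brighter numerically
# than the linear-light value it represents:
print(srgb_to_linear(0.5))  # roughly 0.21, not 0.5
```

This is why averaging or blending pixel values directly in gamma-encoded space (the broadcast engineer's "linear" steps) gives visibly wrong results for a renderer, which must do its arithmetic on linear-light values.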