But why did Apple change, and more importantly why now?
John Nack, Adobe Photoshop’s Product Manager and blogger, asked a very qualified colleague, Adobe Principal Scientist Lars Borg, to answer that question. Lars has spent the past 20 years at Adobe defining & driving color management solutions, and lately he’s been focused on digital cinema standards. Here’s what he said:

The Macintosh, in 1984, introduced us to desktop publishing and to displays with shades of gray. Publishing at that time meant printing presses, and the dot gain of a typical press (then and now) corresponds to a gamma of 1.8. As color management was non-existent at the time (the first color management solutions did not appear until the early 1990s, when color displays became more widely available), Apple’s pick of a 1.8 display gamma enabled Macintosh displays to match the press.

In the early 1990s, the TV industry developed the high-definition TV capture standard known as ITU Recommendation 709, using a net gamma of around 2. Later, in 1996, the IEC put forth a CRT-based display standard (sRGB) for the Web that would match the HDTV capture standard, with a net gamma of around 2.2. sRGB was adopted slowly, first in the PC display market and then in the burgeoning digital camera market, and 2.2 became the dominant display gamma.

Is 2.2 the ultimate gamma? No. In 2005, leveraging color science research, the movie studios’ Digital Cinema Initiative selected a gamma of 2.6 as providing the best perceptual quality for 12-bit cinema projection. Today, few can afford a true Digital Cinema display at home, but as always, prices are falling. Yes, that’s what I’ll have in my next home theater.

But recall VHS versus Betamax: the VHS format finally died only with the last video tape. Gamma 2.2 will not be unseated easily. However, calibrated displays and functional color management will make gamma a moot point. Gamma will be for the Luddites.
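To make those gamma numbers concrete, here is a minimal sketch (mine, not Lars’s or Adobe’s) of what a display gamma does to pixel values, using a pure power law. Real sRGB and Rec. 709 transfer functions add a small linear segment near black, so the “net gamma” figures quoted above are approximations.

```python
# Sketch only: pure power-law gamma encoding/decoding, ignoring the linear
# toe segment that actual sRGB and Rec. 709 curves include.

def encode(linear, gamma):
    """Encode a linear-light value (0..1) for a display with the given gamma."""
    return linear ** (1.0 / gamma)

def decode(encoded, gamma):
    """Convert an encoded pixel value (0..1) back to linear light."""
    return encoded ** gamma

mid_gray = 0.18  # a typical 18% "middle gray" scene value
for gamma in (1.8, 2.2, 2.6):
    print(f"gamma {gamma}: 18% gray encodes to {encode(mid_gray, gamma):.3f}")

# Showing a gamma-1.8 image on an uncorrected gamma-2.2 display darkens it,
# which is why a consistent gamma (or color management) matters:
shown = decode(encode(mid_gray, 1.8), 2.2)
print(f"1.8-encoded gray on a 2.2 display comes out as linear {shown:.3f}")
```

Running the sketch shows the same 18% gray landing at different encoded values under each gamma, and the last line illustrates the mismatch problem: an image prepared for a 1.8 display looks noticeably darker on a 2.2 display unless color management corrects for it.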