I'm going to respond to the last part first, since it was an important mistake on my part:
What I said was misleading. (Hmm... glass houses and stones come to mind for some reason [Originally Posted by Chuck Lee].) I did not mean that sRGB is only for novices, or that everyone who uses sRGB is a novice. I use sRGB, and I don't consider myself a novice. I only meant that out of all the color spaces available for a novice to use, sRGB is the best one. That is, one should not consider using other color spaces until one is no longer a novice. And even then, one may decide to continue using sRGB, just as I have.
[Originally Posted by Chuck Lee]
Yes. I totally agree with the three statements you pulled out of his article. My "I agree that sRGB is best for novices" statement was a poor attempt to restate #2 ("If you really know what you're doing and working in publishing, go right ahead and use it. If you have to ask, don't even try it.").
Ken got those ones right. But there are other parts of his article that I have a problem with. I'll cover some of the inaccuracies below.
[Originally Posted by Chuck Lee]
There are several good reasons to use AdobeRGB, including accurate color on the display (irrespective of printing), better use of the color range the printer is capable of, and more accurate print proofing.
The selection of quality sRGB displays is dwindling rapidly. In fact, as of right now I cannot find even one high quality IPS sRGB display. (I'm sure one is out there somewhere, like a needle in the haystack of wide gamut monitors.) Most displays are now being built as "wide gamut", i.e. AdobeRGB. Such monitors are not physically capable of portraying sRGB accurately. No matter how well the conversion is done in software, or how many bits the monitor's internal LUT has, they all share the same bottleneck: 8-bit data from the video card to the display. That is not nearly enough resolution to map each value to the appropriate color.
If we had 10-bit video cards, cables, displays, and operating systems, it would be possible to send a 10-bit value, which *does* have enough resolution to be mapped to the appropriate color. Then it would not matter that all the quality monitors are AdobeRGB, because they would be able to portray sRGB accurately. Unfortunately, right now we are stuck in Purgatory: one foot is in the Heaven of accurate wide gamut displays, and the other is stuck in the 8-bit display path. Until we step out of 8-bit, AdobeRGB will be useful for anyone who has a wide gamut display.
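To make that concrete, here's a rough Python sketch of the quantization problem. The 70% figure is just an assumption for illustration (roughly how much of a wide gamut panel's native range one sRGB channel might need); the point is only that squeezing 8-bit data into part of an 8-bit output collapses levels, while a 10-bit output keeps them all distinct:

```python
# Rough sketch: how many distinct output levels survive when an 8-bit sRGB
# ramp is rescaled into a fraction of a wide gamut panel's native range.
def distinct_levels(in_bits, out_bits, scale):
    in_max, out_max = 2 ** in_bits - 1, 2 ** out_bits - 1
    # rescale every input code value into the reduced range, then quantize
    return len({round(v / in_max * scale * out_max) for v in range(in_max + 1)})

print(distinct_levels(8, 8, 0.70))    # ~179 of 256 levels survive -> visible banding
print(distinct_levels(8, 10, 0.70))   # 256 -> every input level stays distinct
```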
Another good reason for AdobeRGB is to take better advantage of the color range possible in printers. sRGB just doesn't contain the full saturation that's possible with a quality printer. If you have any photographs with such colors in them (sunsets, flowers, etc.), it would be nice if they could be printed instead of clipped or mapped to a different color. The thread that you and I participated in just last week had a photograph with just such colors:
http://community.the-digital-picture.com/forums/t/1667.aspx?PageIndex=1
The true colors in the scene can't be represented in sRGB, so raw converters have to map them to the nearest color sRGB can hold (or let them clip). AdobeRGB is a little better, but there are even larger spaces, such as ProPhotoRGB and BetaRGB, that can hold essentially any color you're likely to capture.
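To illustrate the clipping problem, here's a quick Python sketch using the commonly published D65 matrices for Adobe RGB and sRGB (rounded values, so treat the output as approximate). A fully saturated Adobe RGB green, in linear light, converts to sRGB channel values outside the 0-1 range, which is exactly the situation where a converter has to clip or remap:

```python
import numpy as np

# Commonly published D65 matrices (rounded); linear-light values throughout.
ADOBE_RGB_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2974, 0.6274, 0.0753],
                             [0.0270, 0.0707, 0.9911]])
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

adobe_green = np.array([0.0, 1.0, 0.0])            # fully saturated Adobe RGB green
srgb_linear = XYZ_TO_SRGB @ ADOBE_RGB_TO_XYZ @ adobe_green
print(srgb_linear)   # roughly [-0.40, 1.00, -0.04]: negative channels = outside sRGB
```

Any channel that lands outside 0-1 has to be clipped or gamut mapped before the image can be shown or saved as sRGB.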
Those two factors combine to give higher accuracy in print proofing: if the print is made from AdobeRGB to take advantage of the wider gamut, and the monitor is also AdobeRGB, it becomes possible to attain a much higher level of proofing accuracy.
That is not to say that you should switch to another color space in your workflow, or that I am going to. I'm sticking to sRGB for now, especially since I bought a very nice high end sRGB display (NEC 2490Wuxi) before they replaced it with an AdobeRGB model.
Getting back to the inaccuracies in Ken's article, here are some of his mistakes:
[Originally Posted by Ken Rockwell]
No.

[Originally Posted by Ken Rockwell]
That is clearly false, given all the factual errors in the article, but I guess he's entitled to his own puffed-up opinion.

[Originally Posted by Ken Rockwell]
Vain conceit.

[Originally Posted by Ken Rockwell]
True.

[Originally Posted by Ken Rockwell]
False.

[Originally Posted by Ken Rockwell]
False. That only occurs when "grave errors" are made, as he stated above, such as using the color space for the wrong purpose (e.g. posting photos on the web).

[Originally Posted by Ken Rockwell]
True.

[Originally Posted by Ken Rockwell]
Agreed.

[Originally Posted by Ken Rockwell]
Not false, just a little too extreme. I would just leave it at "if you know what you're doing".

[Originally Posted by Ken Rockwell]
True.

[Originally Posted by Ken Rockwell]
True.

[Originally Posted by Ken Rockwell]
Doubtful. I can't imagine how that would be. I won't contest it in detail, but I will say that some software makes it so easy that it's automatic. Look at Lightroom (which the OP uses). Newbies who print from Lightroom are using an even *more* advanced color space than AdobeRGB, and they'll get all the extra colors in their print without ever even knowing about it!

[Originally Posted by Ken Rockwell]
True, but it borders on fear mongering, IMHO.

[Originally Posted by Ken Rockwell]
True.

[Originally Posted by Ken Rockwell]
False. I've never found a photo lab that doesn't support AdobeRGB. It's *hard* to not support AdobeRGB, because all the software reads the profile embedded in the image.
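For example, here's a minimal sketch of that with Pillow (the file names are hypothetical, and it assumes the JPEG was saved with its AdobeRGB profile embedded):

```python
import io
from PIL import Image, ImageCms

img = Image.open("adobergb_photo.jpg")      # hypothetical file name
icc_bytes = img.info.get("icc_profile")     # the embedded profile, if present

if icc_bytes:
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    dst = ImageCms.createProfile("sRGB")
    # Convert using the embedded profile, just as a lab's software would.
    srgb = ImageCms.profileToProfile(
        img, src, dst, renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
    srgb.save("srgb_photo.jpg", quality=95)
```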
[Originally Posted by Ken Rockwell]
I'll let him have his own opinion, but in my tests the difference is clear.

[Originally Posted by Ken Rockwell]
Anyone can download the profiles for a bunch of printer/ink/paper combinations and see that almost all of them contain colors outside of sRGB.

[Originally Posted by Ken Rockwell]
False. He's trying to refer to something that happens *only* in 8-bit images, but he has it backwards. AdobeRGB code values are *spaced out more* to cover a wider range. This results in lower precision than sRGB over the same range. It does not make the colors duller; it just means there is a higher potential for slightly less accuracy. sRGB, on the other hand, is "squeezed" into a smaller range, giving it the advantage of higher precision. In 16-bit mode, the difference goes away entirely. Keep in mind that Ken does not shoot raw, just JPEG.
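Here's a back-of-the-envelope Python sketch of that spacing argument. The 1.2x per-channel range is purely an assumed number for illustration; the real point is that going from 8-bit to 16-bit shrinks the step between adjacent code values by a factor of 256, so the spacing penalty stops mattering:

```python
# Assumed, for illustration only: Adobe RGB spreads its code values over a
# per-channel range ~1.2x wider than sRGB (the exact figure isn't the point).
GAMUT_RATIO = 1.2

for bits in (8, 16):
    levels = 2 ** bits
    srgb_step = 1.0 / (levels - 1)            # spacing between adjacent sRGB codes
    adobe_step = GAMUT_RATIO / (levels - 1)   # same code count over a wider range
    print(f"{bits:>2}-bit: sRGB step {srgb_step:.7f}, AdobeRGB step {adobe_step:.7f}")
```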
At this point I am tired and will not debunk the rest of his article.