My previous graphics card was an Nvidia GeForce 430. This is where I have found the most conflicting data on the web. Perhaps the best information was from Nvidia themselves, in this response to a question. In it, they state that GeForce has supported 10 bit since the 200 series, but that programs such as Adobe Photoshop (and I am going to assume Lightroom) need OpenGL for 10 bits per color, which (in 2011) was only available on their Quadro series of GPUs. What is interesting is that the GeForce 430 does support OpenGL (4.2). But "OpenGL" might not be the same thing as "OpenGL 10 bit"; those could be different things. What is certain is that the GeForce 430 is not a Quadro card; that is a separate professional line.

Also, I opened the Nvidia control panel and saw the following display:

Attachment 2622

A color depth of 32-bit is greater than 30-bit, which made me think the GPU was capable of 10 bits per channel. But then I found a reference to the same screen with more information: the "32-bit" figure means 8 bits each for red, green, and blue plus an 8-bit alpha channel, while "30-bit" means 10 bits each for red, green, and blue.
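For what it's worth, the arithmetic works out like this. This is just my own sketch, assuming the standard channel layouts rather than anything read back from the driver:

```python
# Total framebuffer bits for the two modes, assuming the standard layouts:
# "32-bit" desktop color = 8 bits each for R, G, B plus 8 bits of alpha,
# "30-bit" deep color    = 10 bits each for R, G, B (no alpha).
modes = {
    "32-bit (RGBA 8-8-8-8)": [8, 8, 8, 8],
    "30-bit (RGB 10-10-10)": [10, 10, 10],
}
for name, channels in modes.items():
    print(f"{name}: total = {sum(channels)} bits, "
          f"{max(channels)} bits per color channel")
# The 32-bit mode has more total bits but only 8 per color channel,
# so a bigger total does not imply deeper color.
```

In other words, the 32-bit number that looked reassuring is actually the 8-bit-per-channel mode.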

Then I had dinner with a computer scientist who also happens to be a photographer running dual BenQ SW2700PT monitors (he shoots Sony). He wasn't sure whether the GeForce 430 would support 10-bit, but he knew that the newer GeForce 1050/1060/1070 cards would, even though they aren't Quadros (that Nvidia quote was from 2011, after all).

So I brought in a GeForce 1050 Ti, installed it in my computer this morning, and now I have the following options in the Nvidia control panel:

Attachment 2623

Under #3, "Output color depth" had defaulted to 8 bit per channel. I did have to go in and change it to 10 bpc. But, now I am pretty confident that I am running 10 bits in Adobe RGB on my monitor.

As Mike was saying when we discussed it, it is the whole chain that you have to look at when you start making changes: the monitor, the connection, and the graphics card. Each link has to be compatible (I know, in the end it is obvious). I might have been able to use DVI, but I have switched to DisplayPort, and with the new GPU I think I am getting what I originally wanted.