Thread: Monitor Calibration

  1. #14
    Senior Member
    Join Date: Dec 2008
    Location: Ottawa, ON
    Posts: 1,445
    Re: Kayaker's points.

    Color spaces. Another important one is P3 (DCI-P3). It's what the new iMacs use, likely what the new MacBooks use, what the HDR 4K TVs target, and what digital projectors in theatres use. That means these panels will drop in price and images in this color space will become more and more common. I'd seriously consider P3 vs. AdobeRGB.
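
    For a concrete sense of how much wider P3 is than sRGB, here's a quick sketch (my own illustration, not from anything linked here) that takes the pure Display P3 red primary and expresses it in linear sRGB; the out-of-range result is a color an sRGB-only monitor simply cannot show. The matrices are the commonly published Display P3-to-XYZ and XYZ-to-sRGB (D65) values, so treat the exact digits as approximate.

    [CODE]
    #include <cstdio>

    // 3x3 matrix times vector, row-major.
    static void mul3(const double m[3][3], const double v[3], double out[3]) {
        for (int r = 0; r < 3; ++r)
            out[r] = m[r][0] * v[0] + m[r][1] * v[1] + m[r][2] * v[2];
    }

    int main() {
        // Linear Display P3 -> CIE XYZ (D65), commonly published values.
        const double p3_to_xyz[3][3] = {
            { 0.4865709, 0.2656677, 0.1982173 },
            { 0.2289746, 0.6917385, 0.0792869 },
            { 0.0000000, 0.0451134, 1.0439444 },
        };
        // CIE XYZ (D65) -> linear sRGB.
        const double xyz_to_srgb[3][3] = {
            {  3.2404542, -1.5371385, -0.4985314 },
            { -0.9692660,  1.8760108,  0.0415560 },
            {  0.0556434, -0.2040259,  1.0572252 },
        };

        const double p3_red[3] = { 1.0, 0.0, 0.0 };  // pure P3 red, linear light
        double xyz[3], srgb[3];
        mul3(p3_to_xyz, p3_red, xyz);
        mul3(xyz_to_srgb, xyz, srgb);

        // Prints roughly R=1.22 G=-0.04 B=-0.02 -- outside 0..1, meaning
        // this color does not exist anywhere inside the sRGB gamut.
        printf("P3 red in linear sRGB: R=%.3f G=%.3f B=%.3f\n",
               srgb[0], srgb[1], srgb[2]);
        return 0;
    }
    [/CODE]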

    Bits. I agree with your assessment. sRGB can be done in any number of bits: original VGA was 6 bits per channel, and Commodore Amigas were 4 bits per channel. A larger color space needs more bits per channel to produce step-free gradients, because the same number of steps has to span a wider range of colors.
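
    To put numbers on that, here's a small sketch (again mine, just for illustration) of how many tonal levels each of those bit depths gives per channel, and how large the jump between adjacent levels is as a fraction of the channel's range. The punchline is in the final comment: the step is a fraction of whatever range the channel covers, so stretching 8 bits across a wider gamut makes each step a bigger visible jump.

    [CODE]
    #include <cstdio>

    int main() {
        // The bit depths mentioned above: Amiga, original VGA,
        // today's common desktop, and "deep color" / 30-bit.
        const int depths[] = { 4, 6, 8, 10 };
        for (int bits : depths) {
            int levels = 1 << bits;            // 2^bits code values per channel
            double step = 1.0 / (levels - 1);  // gap between adjacent values,
                                               // as a fraction of the channel range
            printf("%2d-bit: %4d levels, step = %.4f of the range\n",
                   bits, levels, step);
        }
        // The step is relative to whatever range the channel spans. If that
        // range is a wider gamut (P3, AdobeRGB), each 8-bit step is a larger
        // jump in actual color than the same step inside sRGB -- which is why
        // wide-gamut work benefits from 10 bits per channel.
        return 0;
    }
    [/CODE]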

    About 32-bit display options. This is just an optimization on the graphics card: it is far easier for the programmer and the hardware to deal with a pixel in a single 32-bit word than to worry about 24-bit words, so "32-bit" is still only 8 bits per channel. Typically graphics data will be X8R8G8B8 or A8R8G8B8, with "X" being unused and "A" being alpha (transparency). I'm not up on the latest GPUs, but a quick look at the nVidia site doesn't show even their Titan X mentioning 10-bit color, and AMD mentions something about Deep Color, but I don't know enough to comment on what that is.
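
    To illustrate what those four-letter format names mean, here's a sketch (function names are mine) that packs one pixel into a single 32-bit word, first as A8R8G8B8 and then as the A2R10G10B10 layout used for 10-bit-per-channel output. The channel order and shift positions are one common convention, not the only one, so check your API's documentation before relying on them.

    [CODE]
    #include <cstdint>
    #include <cstdio>

    // A8R8G8B8: 8 bits of alpha plus 8 bits per color channel, one 32-bit word.
    uint32_t pack_a8r8g8b8(uint8_t a, uint8_t r, uint8_t g, uint8_t b) {
        return (uint32_t(a) << 24) | (uint32_t(r) << 16) |
               (uint32_t(g) << 8)  |  uint32_t(b);
    }

    // A2R10G10B10: 10 bits per color channel plus 2 bits of alpha, still
    // exactly one 32-bit word -- same memory cost, four times the tonal levels.
    uint32_t pack_a2r10g10b10(uint32_t a2, uint32_t r10, uint32_t g10, uint32_t b10) {
        return ((a2 & 0x3u) << 30) | ((r10 & 0x3FFu) << 20) |
               ((g10 & 0x3FFu) << 10) | (b10 & 0x3FFu);
    }

    int main() {
        printf("50%% grey, 8 bits/channel : 0x%08X\n", pack_a8r8g8b8(255, 128, 128, 128));
        printf("50%% grey, 10 bits/channel: 0x%08X\n", pack_a2r10g10b10(3, 512, 512, 512));
        return 0;
    }
    [/CODE]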

    edit:
    Here's a relevant 2016 post from someone who has tried it: http://forums.evga.com/FindPost/2510424
    nVidia GTX cards allow 10-bit output from full-screen DirectX, but not in windowed/desktop mode. A Quadro with OpenGL can do 10-bit in windowed/desktop mode. They tested with a GTX 1080.
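
    If you want to poke at this yourself, below is a minimal, untested sketch of asking Direct3D 11 for a 10-bit-per-channel back buffer (DXGI_FORMAT_R10G10B10A2_UNORM) in exclusive full-screen mode, the case the EVGA post says GTX cards honour. It assumes a window handle (hwnd) already exists, and even if the call succeeds, whether the extra bits actually reach the panel still depends on the GPU, driver, and monitor connection.

    [CODE]
    // Windows only; link with d3d11.lib. Error handling and window creation
    // are omitted, and the HWND is assumed to come from elsewhere.
    #include <d3d11.h>

    HRESULT create_10bit_swapchain(HWND hwnd,
                                   ID3D11Device** device,
                                   ID3D11DeviceContext** context,
                                   IDXGISwapChain** swapChain) {
        DXGI_SWAP_CHAIN_DESC desc = {};
        desc.BufferCount = 2;
        desc.BufferDesc.Width = 1920;                           // match the display mode
        desc.BufferDesc.Height = 1080;
        desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per channel
        desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.OutputWindow = hwnd;
        desc.SampleDesc.Count = 1;
        desc.Windowed = FALSE;  // exclusive full-screen: the mode GeForce reportedly requires

        return D3D11CreateDeviceAndSwapChain(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            nullptr, 0, D3D11_SDK_VERSION,
            &desc, swapChain, device, nullptr, context);
    }
    [/CODE]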

    Looking on the AMD side (since Macs now support 10-bit DCI-P3 and are using AMD GPUs), I see that they've supported 10-bit for years, starting with the HD 7000 series. I haven't seen any detail about full-screen vs. windowed on the AMD side though, so it could have the same issues as nVidia.

    Last, and likely most important, this post about 10-bit display support coming to Photoshop CC on Mac at least tells you where to find the option to enable "30-bit" mode. It should be the same or very similar on PC. Maybe it does work on your new GPU, and nobody realized that Adobe had hidden the feature away in a preferences menu instead of just using your current display mode.
    https://petapixel.com/2015/12/04/ado...-to-enable-it/
    Last edited by DavidEccleston; 02-21-2017 at 04:29 AM.
    On Flickr - Namethatnobodyelsetook on Flickr
    R8 | R7 | 7DII | 10-18mm STM | 24-70mm f/4L | Sigma 35mm f/1.4 | 50mm f/1.8 | 85mm f/1.8 | 70-300mm f/4-5.6L | RF 100-500mm f/4-5-7.1L
