
Thread: Monitor Calibration

  1. #1
    Super Moderator Kayaker72
    Join Date: Dec 2008 | Location: New Hampshire, USA | Posts: 5,565

    Monitor Calibration

    Hi Everyone,

    Just looking to tap into the forum's expertise and solicit thoughts.

    So, as part of my winter system reorg, I purchased a new monitor. I selected the BenQ SW2700PT. Not 4K, but higher resolution than FHD, 10 bit, 99% Adobe RGB coverage, and well reviewed.

    I have it up and running (I am typing this on it right now). And, overall, I am very impressed. Great levels of control, pictures on screen almost look as good as prints, etc. But I have run into a single issue. The monitor comes with its own calibration software, and I have created a couple of profiles using that software and my Spyder5 calibration device. After calibration, the BenQ software lets you "Validate" the calibration. And my monitor keeps failing the validation step.

    In looking at it, the validation test wants the average dE to be less than 2 and the maximum dE to be less than 4. I am new to this, but dE (Delta E) is explained here; it appears to be a measurement of color accuracy/reproduction. The lower the dE, the closer the color the Spyder5 measures is to the color the software expects.
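
    (For anyone curious what dE actually computes, here is a minimal sketch of the original CIE76 Delta E, which is just the straight-line distance between two colors in Lab space. The BenQ/Spyder software may well use a newer formula such as dE2000, and the color values below are made up, so treat this purely as an illustration of the idea.)

    import math

    def delta_e_76(lab1, lab2):
        # CIE76 Delta E: Euclidean distance between two (L*, a*, b*) colors
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

    target   = (50.0, 20.0, -10.0)   # color the software asked the panel to show (hypothetical)
    measured = (50.8, 21.1, -9.4)    # color the colorimeter actually read (hypothetical)

    print(round(delta_e_76(target, measured), 2))   # ~1.49

    # Validation repeats this over many test patches, then checks that the
    # average dE is < 2 and the worst (max) dE is < 4.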

    The monitor came pre-calibrated and those values were avg dE = 0.45 and max dE = 1.97.

    After my first calibrations, I was in the 6 and 9 range for each. But my most recent values, after making some adjustments in the "advanced" calibration setup, are avg dE = ~2.2 (right above the "pass" threshold) and max dE = 3.4 (technically a pass, but higher than the factory calibration). Looking at online reviews, most reviewers were able to achieve avg and max dE values very similar to my monitor's factory calibration values.

    One thing I noticed was that all the online reviewers used the X-Rite i1 Display Pro, whereas I have the Spyder5. In looking online, it does seem that the X-Rite i1 is better: more consistent, more sensitive, and more accurate (example here).

    So, my question to those that know more than I about this topic:
    • Do you think it is possible that my monitor is fine and the issue is the Spyder5?
    • Do you think I might need to return/replace my monitor?


    And I am also aware that, for an enthusiast photographer like me, this is probably still amazing. But right now my new toy prints a big red "Failed" when I try to validate my calibration, and I would like to fix that.

    By the way, I have tried to contact BenQ. Thus far their response was to check whether the cable was properly connected. It is.

    Thanks in advance.
    Last edited by Kayaker72; 02-20-2017 at 09:52 AM.

  2. #2
    Senior Member Photog82
    Join Date: Mar 2012 | Location: Maine, USA | Posts: 321
    I have the same monitor and it's great. I have not yet purchased the x-rite but have edited photos on my screen, ordered prints and they came back great. I did end up having to turn the brightness down to about 70% though.

    I can't be of much help yet.

  3. #3
    Senior Member Photog82
    Join Date: Mar 2012 | Location: Maine, USA | Posts: 321
    My X-Rite i1 Display Pro came in tonight; I usually edit in complete dark so I had it measure my screen in that setting. These were my settings pre-calibration:

    Brightness: 65
    Contrast: 47
    Sharpness: 5
    Temp: 6500K
    Gamma: 2.2
    sRGB

    My prints from Mpix were coming back pretty good (I usually don't have them color correct but sometimes I do) and sometimes they'd come back too dark. So I thought I'd try calibrating.

    Post Calibration in Dark:
    Brightness: 32

    The screen is sooo dark now and the prints are a tad washed out. I'll have to order some prints; some of my photos look a bit dark. I am wondering if it's just due to me being used to such a bright screen?

    What happens if I want a particular photo to be brighter and I bump the exposure up, and then send it to a client whose screen is un-calibrated and it's totally over-exposed on their screen?

    The colors look great though; my screen had a green tint to it and I had never noticed before.

    When I save a JPG in PS do I export my custom profile?
    Last edited by Photog82; 02-16-2017 at 02:50 AM.

  4. #4
    Super Moderator Kayaker72
    Join Date: Dec 2008 | Location: New Hampshire, USA | Posts: 5,565
    Quote Originally Posted by Photog82 View Post
    Post Calibration in Dark:
    Brightness: 32

    What happens if I want a particular photo to be brighter and I bump the exposure up, and then send it to a client whose screen is un-calibrated and it's totally over-exposed on their screen?

    The colors look great though; my screen had a green tint to it and I had never noticed before.
    From what I've read, most monitors should be 80 cd/m2 to 120 cd/m2. I've seen this a number of places, but here is a link I just found that has that reference.

    Going by the BenQ, 120 cd/m2 is actually equivalent to ~28-29 "Brightness"...so you might still be too bright. With that said, I think the proof is comparing professional-level prints to what you see on the monitor.

    Glad to hear about the colors. I've been surprised to see minor changes after calibration.

    As to your other statement, thinking about how other people view our output is pretty amazing. Ultimately, we live in an uncalibrated, sRGB, 8-bit world. For now.

  5. #5
    Senior Member Photog82
    Join Date: Mar 2012 | Location: Maine, USA | Posts: 321
    I always check my edits on:
    iPad, iPhone, Android devices, 2 laptops, my BenQ of course, and one at work. The photos pretty much look the same with the exception of a slight color shift but as far as exposure goes it looks good.

    I worked with Mpix; they asked me to upload 5 photos, which they are going to print without touching and send out to me, and I will compare against my settings:

    120cd/m2, 55 brightness, 50 contrast and newly tweaked sRGB color profile (which btw looks better than before).

    I had ordered a photobook through Mpix just to show our clients what some of the nicer products would look like; the pages all came back perfect, just as they had looked on my monitor, which makes me happy.

  6. #6
    Super Moderator Kayaker72
    Join Date: Dec 2008 | Location: New Hampshire, USA | Posts: 5,565
    Quote Originally Posted by Photog82 View Post
    I always check my edits on:
    iPad, iPhone, Android devices, 2 laptops, my BenQ of course, and one at work. The photos pretty much look the same with the exception of a slight color shift but as far as exposure goes it looks good.

    I worked with Mpix; they asked me to upload 5 photos, which they are going to print without touching and send out to me, and I will compare against my settings:

    120cd/m2, 55 brightness, 50 contrast and newly tweaked sRGB color profile (which btw looks better than before).

    I had ordered a photobook through Mpix just to show our clients what some of the nicer products would look like; the pages all came back perfect, just as they had looked on my monitor, which makes me happy.
    James,

    It sounds as if you have yours dialed in. It will be interesting to hear what you think of the prints post-calibration compared to pre-calibration.

  7. #7
    Super Moderator Kayaker72
    Join Date: Dec 2008 | Location: New Hampshire, USA | Posts: 5,565
    So, these are topics in which, at depth, I am admittedly ignorant. But there are many aspects of photography where I can learn more, which is something I actually like. So, if you understand this better than I do, please feel free to add to my understanding.

    I decided to upgrade my monitor this year. Mostly this was because my current monitor was ~6 years old, and there were likely much better monitors out there within my budget. The main upgrade I wanted was an IPS screen. My understanding is that IPS screens allow for better viewing angles and less light bleed around the pixels. That has held up with my new monitor (it is freakish how far to the side I can go without image degradation). Talking to Mike, he claims OLED is even better. I also wanted better control, which I have. And I wanted the Adobe RGB color space as well as a monitor that could handle 10 bits and slightly higher resolution.

    But, as we live in a world where most monitors are Full HD (1,920 x 1,080), 8 bit and sRGB, I have stepped out of the norm. As I have been reading about all of this on the web, I have found a lot of conflicting statements and assumptions about what it takes to make these systems work, so I thought I would comment.

    While I have been looking into graphics cards, colorimeters, etc, I am going to start with color space, 8 bit vs 10 bit and cables.

    Color Space


    [Image: AdobeRGB-vs-sRGB_936px.jpg — AdobeRGB vs sRGB color gamut comparison]

    I like the above figure as the wavelength of the visible spectrum is labeled along the edge and it also displays color temperature. Monitors do not present the full spectrum of colors visible to the human eye. Most monitors display only colors found in the sRGB (red triangle) color space. To an extent, since that is how most images are displayed, that is a critical output. But, I did want to start viewing my images on a monitor that would display more of the color spectrum, thus my current monitor is rated for 99% Adobe RGB (blue triangle).

    Of course, there are color spaces (ProPhoto, Epson, etc.) that cover even more of the visible color spectrum. But Adobe RGB is as good as I have found for monitors in my price range.


    8 bit vs 10 bit

    Most monitors (and JPEGs, for that matter) are 8 bit ("bit" stands for "binary digit"). Each color channel (red, green and blue) is defined by an 8-bit value. With binary codes, each bit is a 0 or a 1, so for 8 bits there are 2^8, or 256, potential values for each color channel. As there are 3 color channels, there is the potential for 256 x 256 x 256, or 16,777,216, color combinations in an 8-bit system. For 10 bits, there are 2^10, or 1,024, potential values for each color channel. Combining the three color channels within a single pixel, this results in 1,024^3, or about 1.07 billion, different potential colors in a 10-bit system.
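
    (Just to sanity-check that arithmetic, here is a quick, purely illustrative sketch in Python:)

    # Levels per channel and total color combinations for 8-bit vs 10-bit
    for bits in (8, 10):
        levels = 2 ** bits        # values per channel (R, G or B)
        colors = levels ** 3      # combinations across the three channels
        print(f"{bits}-bit: {levels:,} levels per channel, {colors:,} colors")

    # 8-bit:  256 levels per channel,   16,777,216 colors
    # 10-bit: 1,024 levels per channel, 1,073,741,824 colors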

    My understanding is that these bit depths are used to subdivide a given color space. For example, if an 8-bit system is used to define the sRGB color space, that color space would have over 16 million potential combinations of colors. But if a 10-bit system were used to define the sRGB color space, that same color space would have over 1 billion different potential colors. The primary advantage is in color tone and smoother transitions between colors within that given color space. I like the following image as an illustration:

    [Image: 10-bit-chart.jpg — illustration of tonal transitions at 8-bit vs 10-bit]


    Bottom line: more variations of the same colors are used to define the same color space, so you get smoother transitions.

    However, this is where I have seen conflicting assumptions made by different authors. I believe the above is correct, but I have seen authors assume that the sRGB system is 16 million colors (8 bit) and the Adobe RGB system is 1 billion colors (10 bit). That assumption implies that, instead of further dividing a given color space into smaller/finer units, each "color" is the same size, and with more "colors" the area has to be larger. Like I said, this is out there, but I believe it is incorrect.

    One observation, if my assumption is correct: if you stay with the same number of colors (say 16M, or 8 bit) and move to a larger color space (sRGB to Adobe RGB), then each color unit actually has to represent a larger area of that color space. Same number of units, larger area = larger units. So, if you go to the Adobe RGB color space but stay at 8 bit, you are actually having a negative impact on color tone.

    Thus, if you go to Adobe RGB, it is likely best that you also go to 10 bit. But, with 10 bits, you are moving more information.

    Cables and Connections

    Easy right? Cables, you just hook them together and things work, right? Any connection that is on a device should be able to run that device as intended, right? In working through this all, I am not sure that last assumption is 100% accurate.

    First, cables are intended to transmit information. But how and how much information they transmit does vary.

    Quickly going through how much information needs to be transmitted, below are a few calculations for typical monitors (a short script reproducing these numbers follows the list):


    • FHD: 1,920 x 1,080 = 2,073,600 pixels. If you are transmitting 8 bits in 3 colors per pixel, that is 24 bits per pixel. If you want a refresh rate of 60 Hz, then you have 2,073,600 x 24 x 60, which equals ~3 Gigabits per second (Gbit/s).
    • My new monitor: 2,560 x 1,440 = 3,686,400 pixels. 10 bits in 3 colors = 30 bits per pixel. At 60 Hz = 6.64 Gbit/s.
    • 4K UHD: 3,840 x 2,160 = 8,294,400 pixels, 30 bits per pixel, 60 Hz = 14.9 Gbit/s.
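
    For anyone who wants to play with the numbers, here is the same arithmetic as a few lines of Python. This only counts raw pixel data (no blanking intervals or encoding overhead), so real cables need a bit more headroom than this:

    # Uncompressed video data rate: pixels x bits per pixel x refresh rate
    def gbit_per_s(width, height, bits_per_channel, fps):
        bits_per_pixel = bits_per_channel * 3      # R, G and B
        return width * height * bits_per_pixel * fps / 1e9

    print(gbit_per_s(1920, 1080,  8, 60))   # FHD, 8 bit:              ~2.99 Gbit/s
    print(gbit_per_s(2560, 1440, 10, 60))   # SW2700PT native, 10 bit: ~6.64 Gbit/s
    print(gbit_per_s(3840, 2160, 10, 60))   # 4K UHD, 10 bit:          ~14.93 Gbit/s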



    So, what are the different types of cables and what are they capable of? (A quick comparison sketch follows this list.)


    • VGA. Analog signal. I see several references to it being obsolete, but there is still a VGA port on my laptop from work. So, maybe not completely obsolete.
    • DVI. DVI can be a digital or analog signal. In addition, I see references that it writes each pixel sequentially, something that seems similar to a rolling shutter. There are two important types of DVI:
      • Single link DVI can move 24 bits of information (3 colors/8 bits per color) and maxes out at nearly 4 Gbit/s.
      • Dual link DVI is reported to be able to move more than 8 bits per color by splitting the extra bits of information across the second link. It supports a maximum resolution of 2,560 x 1,600 and up to 7.9 Gbit/sec (with overhead removed).

    • HDMI. Ubiquitous with TVs; the version of HDMI you have matters. It is digital. Cables can run ~10 meters in length.
      • Version 1.3 can move up to 8.16 Gbit/sec of data (with overhead removed). I have seen some references stating that this can do 4K UHD, others saying you need version 1.4.
      • Version 1.4. Definitely allows 4K UHD video at 30 fps but not at 60 fps.
      • Version 2.0 will support 4K UHD at 60 fps as it allows 18 Gbit/sec.
      • Version 2.1 will support 48 Gbit/s. The spec states that it will do 8K and even 10K video, but that is likely only with compression, as 8K video at 10 bits and 60 fps is ~60 Gbit/s. So I expect compression, fps, and bit depth will vary with increasing resolution.

    • DisplayPort. DisplayPort seems to be the newest and, perhaps, the best. The only issue I have read about is that it does not travel significant distances (~3 meters), but from a computer to a monitor that is not an issue. It sends the information digitally in packets, which seems more like a global shutter as I read about it. The version of DisplayPort also matters.
      • Version 1.2: 17.28 Gbit/s
      • Version 1.3: 32.2 Gbit/s
      • Version 1.4: Compression (3:1) that allows 8K video in that 32.2 Gbit/s stream
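
    Putting the two lists together, here is a small sketch that checks which of these connections has headroom for a given monitor. The data rates are just the approximate figures quoted above; actual limits depend on overhead, cable quality and the specific hardware:

    # Approximate usable data rates (Gbit/s), taken from the list above
    links = {
        "Single-link DVI":   4.0,
        "Dual-link DVI":     7.9,
        "HDMI 1.3/1.4":      8.16,
        "HDMI 2.0":         18.0,
        "DisplayPort 1.2":  17.28,
    }

    needed = 6.64   # 2,560 x 1,440, 10 bit, 60 Hz (from the calculation above)

    for name, capacity in links.items():
        verdict = "enough" if capacity >= needed else "too slow"
        print(f"{name:16s} {capacity:5.2f} Gbit/s -> {verdict}")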


    This is relevant to me as the BenQ SW2700PT came with dual-link DVI-D, HDMI 1.4 and DisplayPort 1.2 ports, while my graphics card has dual-link DVI-D and mini-HDMI ports. So DisplayPort, the only connection with significant headroom over my monitor's 6.6 Gbit/s requirement, isn't supported by my graphics card.

    While I am sitting here running my monitor through the dual-link DVI, I am not sure if I have 10-bit color (actually, I strongly suspect I do not, even though the monitor is capable). Even if I do, the extra bits are being split up and recombined to flow through the dual-link DVI connection, and I am at 84% of its throughput capacity. But I actually think that my graphics card is intentionally limited to 8 bits. Thus, I am about to log off and install my new graphics card...
    Last edited by Kayaker72; 02-20-2017 at 05:45 PM.

  8. #8
    Super Moderator Kayaker72
    Join Date: Dec 2008 | Location: New Hampshire, USA | Posts: 5,565
    My previous graphics card was an Nvidia GeForce 430. This is where I have found more conflicting data on the web. Perhaps the best information was from Nvidia themselves in this response to a question. In it, they state that GeForce has supported 10 bit since the 200 series, but that programs such as Adobe Photoshop (and I am going to assume Lightroom) need OpenGL for 10 bits per color, which was only available on their Quadro series of GPUs (as of 2011). What is interesting is that the GeForce 430 does have OpenGL (4.2). But "OpenGL" might not be the same as "10-bit OpenGL"; those could be different things. For sure, though, the GeForce 430 is not a Quadro card. Those are more recent.

    Also, I opened the Nvidia control panel and saw the following display:

    Attachment 2622

    A color depth of 32-bit is greater than 30-bit, making me think the GPU was capable of 10 bits per channel. But then I found a reference to the same screen that had more information.

    Then I had dinner with a computer scientist who also happens to be a photographer running dual BenQ SW2700PT monitors (he shoots Sony). He wasn't sure whether the GeForce 430 would support 10 bit, but he knew that the new GeForce 1050/1060/1070 cards would, even though they aren't Quadros (that quote was from 2011).

    So, I brought in a GeForce 1050 Ti. I installed it in my computer this morning, and now I have the following options in the Nvidia control panel:

    Attachment 2623

    Under #3, "Output color depth" had defaulted to 8 bit per channel. I did have to go in and change it to 10 bpc. But, now I am pretty confident that I am running 10 bits in Adobe RGB on my monitor.

    As Mike was saying as we discussed it, it is the whole chain that you have to look at when you start making changes: the monitor, the connection, and the graphics card. Each has to be compatible (I know, in the end it is obvious). I might be able to use DVI-D, but I have switched to DisplayPort, and with the new GPU I think I am getting what I originally wanted.

  9. #9
    Senior Member
    Join Date: Jan 2013 | Location: Sainte Angele De Monnoir, Quebec | Posts: 478
    Did you use a calibrator before and after, Brant? I am curious to know if the color output changed after upgrading the graphics card. I bought a monitor that was advertised at 1 billion colors, but when I calibrate, I am only getting 74% Adobe RGB but 100% sRGB. I have an older graphics card though.
    Stuart Edwards
    1DX Mark II , 6D , Samyang 14mm f2.8 ,Sigma 85mm f1.4A , 24-105mm f/4L IS , 70-200mm f/2.8L IS II ,100-400 f5.6L II , 300mm f/2.8L II , EF 1.4x III , EF 2x III, 430EX II

  10. #10
    Senior Member Photog82
    Join Date: Mar 2012 | Location: Maine, USA | Posts: 321
    My prints came back today.

    The exposure was pretty much spot on; some of the shadows in one portion of the print were slightly darker, but not a big deal at all. The colors looked good with the exception of the skin tones; they were slightly warmer in the print. I believe that is because Mpix prints at D50 and I'm on D65. I'm going to order the same prints and have them color corrected to see what the actual difference is.

    I've read that an update to JPEG is coming out to support 10-bit color depth, but it didn't indicate when. :| How did you test your color depth? I tried installing the BenQ software and it kept crashing.

    I'm using DisplayPort. DVI-D works but doesn't perform as well at higher resolutions. I'm also running mine on sRGB as that is what most of the world is using.
