So guys, I have a question about saturation, as the title of the post might have hinted. When I roughly calibrated my laptop monitor (meaning I used one of those http://epaperpress.com/monitorcal/ websites to do it), everything became more de-saturated. That was because I set the option called Digital Brightness in my settings to zero. If it matters, I have NVIDIA hardware. No problem so far.
Today I went down to the lake near my city to take some pictures, and, since they looked de-saturated to me, I added some saturation so they would look the way I like on my screen. Later in the day I showed them to my mom on an LG plasma TV (I put the images on a pen drive, plugged it into my Xbox, and the Xbox displayed them on the TV). They looked horrendously over-saturated. The rooftops were tinted a very unrealistic red, for example, and the greens were just glowing. I understand this was most likely because of the saturation settings on the TV.
But my question is: when I print those pictures, will they look more like what's displayed on my monitor or more like what I saw on the TV? Would the calibration of the lab's printers be closer to my computer screen or to the TV?
(If it matters, the TV has a contrast ratio of 1,000,000:1. From what I've read, that's a standard spec for judging image quality on a TV... and the "Image style" setting was on "standard".)
thank you for your time and patience,
Andy
ps: if you want me to post one of the images, no problem, just ask =)
[edited to add the end of a sentence..XD]