Monitor calibration: should I?

I also own a BenQ (PD3200U, bought new and also factory calibrated) and found myself asking the same questions. I also read about different color calibration devices giving different results. When I looked up the price of the calibration device used by BenQ, it was in the thousands of dollars, if I remember correctly. As for brightness, I have kept the slider in the middle since the monitor was new.

So, I leave it as it is. Many will disagree with me.
I don't know what you were looking at that costs thousands of dollars. I have a BenQ SW320 and I calibrate it about four times a year with BenQ's free Palette Master Ultimate software and an X-Rite i1 Display Pro, which the software recognizes. X-Rite colorimeters are now owned and produced by Calibrite. A Calibrite Display Pro HL costs $379.00 Canadian on Amazon at the moment.
I too use my i1 Display Pro with PMU on my SW270C

In regard to the OP talking of a calibration device used by BenQ costing $1,000s without showing where or what he read......only two concl
My apologies, I was confusing you, the OP, with @Ab S
I never said that.
Yes, no doubt they use more sophisticated devices than the average end user would be using.

Could he have read about the devices to create paper ICC profiles?
Not sure what you mean.
Again, in regard to what Ab S said about the screen calibration device costing $1,000s: clearly they do not. But the kit used in the manufacturing plants to calibrate the screens could well be very expensive.

Or Ab S is confusing screen calibration with paper profiling calibration, where such spectrophotometers are up to 10x the price of a screen calibrator.
 
I used to own an X-Rite i1Display Studio calibration tool. However, I struggled to keep up to date with profiling my monitor as per the reminders popping up. Also, I got different profiling results between that device and the Datacolor Spyder I owned before. I read about how colorimeters themselves might “drift” over time, and I also read about people who had different experiences using the open-source DisplayCAL software. Last but not least, I now have a BenQ PD3220U, and although I bought it used, it is factory calibrated. I’m also not so sure anymore about the need to regularly calibrate. I mean, if that is so important, then unless I’m missing something, factory calibration should be less important, right?
My personal experience with mostly Apple monitors over the years, and from following these sorts of discussions in this forum (and others), is that the use of monitor calibration devices/software is, for most users, at best overkill and, all too often, more trouble than it's worth. Unless you're doing critical commercial work (e.g., product photography) or critical printing (e.g., large expensive prints at a commercial lab instead of with your personal printer), I'd suggest not bothering unless and until you're seeing problems yourself (or getting concerned feedback from others). View your posted images on multiple devices and see for yourself whether there are noticeable problems relative to your working monitor. If not, stop sweating the issue. Viewing your prints (if that's your thing) provides important validation as well, but bear in mind that the printer's calibration and color settings are also very much a factor here! Only resort to mucking with your working monitor's calibration if there's good reason to do so and the added complexity is justified.
So, all in all, I sold my X-Rite device, and I am hesitant to go back to monitor calibration. But lately I find myself wondering whether the colors I see on my screen are actually accurate. I went to the Photography Show in London a week ago, attended a few talks, and was amazed by the colors and white balance in some of the images.
You were "amazed" by the white balance? And the "amazing" colors you saw had virtually nothing to do with the accuracy of the calibration of the monitors used to generate the images. My guess is that most of those images weren't very truthful to the color of the photographed scene as directly viewed by the photographer.
So I find myself thinking: if color is getting more important to me, should I revisit the whole subject?
The fact that you're asking that question indicates to me that you're not really there yet in understanding the role of monitor calibration vs. things like white balance, image lightness, contrast, saturation, etc., which are primarily managed in image post-processing.
Admittedly it is also my lack of understanding of how colorspaces and monitor brightness come into the equation.
Yep. Color space consideration is really quite different from monitor calibration. Getting the right working-monitor brightness for your output needs doesn't begin and end with calibration software. Use your eyes and compare with other display devices and prints.
Plus, the fact that I already have limited time to edit my images. I’d rather spend the time that would be lost calibrating my monitor, to actually editing photos.
Totally agreed that your limited time is better applied to other parts of the image creation process.
But maybe people can provide some more information so that I can make a more informed decision? I guess that’s what I’m after with this topic.
Just my 2 cents. YMMV
 
I agree 100%. Monitor calibration is a lot like wanting more megapixels: useful for some but overkill for most. Ask yourself, "do I really need it?" I once desired more and more megapixels in my cameras. When I reached 42mp I realized that 20mp is all I really needed. Before someone argues with me, I do understand why some photographers can make use of high-resolution cameras, whether for printing huge or for cropping extensively to extend lens reach at the long end, which is what I find useful sometimes. For a final image I rarely need more than 8mp.

--
Tom
 
I would say that utmost monitor accuracy only matters ...

... if you're printing. Then you can soft proof. That matters because you don't want to have to guess about how your print is going to look.

... if you're selling your work.

I don't sell my work, and I almost never print. My monitor is set up according to my experienced eye as a photographer. Anyone else who sees my photos is using a different display device anyway, so I don't worry too much about what they see.
 
In the future (or is it the present?), nearly all pictures will be taken, and viewed, on phones.

I really dislike those tiny screens. Maybe because, due to presbyopia, I can't hold the screen 6" (15cm) from my eye anymore. And I don't have any 7 diopter reading glasses.
 
I don't sell my work, and I almost never print. My monitor is set up according to my experienced eye as a photographer. Anyone else who sees my photos is using a different display device anyway, so I don't worry too much about what they see.
Hmm, but if you export your images from your RAW converter, you have to choose a color space, and if the converter software is any good, it will tag the exported image with that color space. And maybe it will even embed the color space profile (IIRC). Then, if the application on the receiving end is color-aware, I do believe that the result should match what you see. Though I guess, if their monitor is not calibrated, things could still be off.
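As a minimal sketch of that tagging/embedding step: this is what embedding and reading back an ICC profile looks like with Python's Pillow library (assuming a Pillow build with the ImageCms/littlecms module; the tiny synthetic image stands in for a real export):

```python
# Sketch: embed a color-space (ICC) profile in an "exported" image and
# verify that a color-aware reader on the receiving end can find it.
# Assumes Pillow is installed with ImageCms (littlecms) support.
import io

from PIL import Image, ImageCms

# Build an sRGB profile and serialize it to raw ICC bytes.
srgb = ImageCms.createProfile("sRGB")
icc_bytes = ImageCms.ImageCmsProfile(srgb).tobytes()

# "Export" a small synthetic image with the profile embedded.
im = Image.new("RGB", (8, 8), (200, 120, 40))
buf = io.BytesIO()
im.save(buf, format="JPEG", icc_profile=icc_bytes)

# On the receiving end, the embedded profile is available to the reader.
buf.seek(0)
received = Image.open(buf)
embedded = received.info.get("icc_profile")  # None if the file is untagged
print("profile embedded:", embedded is not None)
```

Note that the profile only survives if the saving application passes it along; an untagged file leaves the receiving application guessing (most assume sRGB).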
 
Hmm, but if you export your images from your RAW converter, you have to choose a color space, and if the converter software is any good, it will tag the exported image with that color space. And maybe it will even embed the color space profile (IIRC). Then, if the application on the receiving end is color-aware, I do believe that the result should match what you see.
You can look at an image file on various devices and see that's not true. I'm looking at the same photos on my computer and my phone, and there are significant differences.
Though I guess, if their monitor is not calibrated, things could still be off.
It's not just a matter of calibration (though it's pretty certain that most devices are not calibrated), but also of the display technology, the software involved, and to some extent the age of the device.
 
Hmm, but if you export your images from your RAW converter, you have to choose a color space, and if the converter software is any good, it will tag the exported image with that color space. And maybe it will even embed the color space profile (IIRC). Then, if the application on the receiving end is color-aware, I do believe that the result should match what you see.
You can look at an image file on various devices and see that's not true. I'm looking at the same photos on my computer and my phone, and there are significant differences.
That is not an indication that what I say is not true. Though it might influence the usefulness of calibrating your own monitor.

Still, I'm not sure that's the case either. Music studios have monitor speakers that need to be super accurate precisely because people have different headphones, speakers, and so on. Within all that chaos you need a standard, so that the music will sound good on the entire variety of electronics and speakers out there.
Though I guess, if their monitor is not calibrated, things could still be off.
It's not just a matter of calibration (though it's pretty certain that most devices are not calibrated), but also of the display technology, the software involved, and to some extent the age of the device.
Well, regarding the software, that's why I said, "if the application is color-aware". Also, how does the display technology affect things in a way that calibration cannot compensate for? (unless I'm mistaken about what calibration actually does)
 
It's not worth my time to try to convince you of anything. Think what you will; it's all fine.
 
Ok, err… up to you. Not sure why I deserved a response like that though.
 
There's no malice intended, just a realization that I have little enthusiasm to keep discussing this. I briefly explained my position, which has been formulated according to my experiences and goals. Yours might very well be different, and there's nothing wrong with that.

If what you prefer are scientific answers to the questions you're asking, it might be beneficial for you to ask in the Photographic Science and Technology forum. Those guys are eager for such discussions.
 
Again in regard to what Ab S said about the screen calibration device costing $1000's and clearly they do not.......but the kit used in the manufacturing plants to calibrate the screens likely could be very expensive.
I am sorry if I was being confusing. The BenQ PD3200U has been factory calibrated, and the calibration report specifies that the measurement device used was a Konica Minolta CA-310. As far as I could see, this device is only available pre-owned; prices vary between around $1,000 and $7,500. See e.g. https://www.ebay.com/itm/1446821851...1066911271&itmmeta=01JPSVDAKVANHAN9F20CCJ55QY
 
I used to calibrate my monitors periodically, until I bought a 16-inch MBP with M1 Max in 2021, after which I stopped calibrating. I did not think that anything I could accomplish could compare with the high-end equipment Apple has.

Then, a few months ago, I replaced my extended monitor with a 40-inch Dell 4025, which was reputed to have good calibration out of the box.

Yesterday I was "discussing" my settings etc. with my new best AI friend "Claude" and changed a few things. I have both the Mac and the Dell set to Display P3; I updated the Calibrite software to the newest version and yesterday calibrated both with the X-Rite i1. Now they are both set to identical brightness and color space, and calibrated with 400-plus patches.

I have to say that to my old eyes, the colors on both look identical and the test patch results were very good. Time will tell how this works out.

Best
 
IMHO, you can successfully go either way. I'm a bit OCD and calibrate periodically with a Calibrite unit. Is my color more fabulous? No. Is my confidence higher? Maybe. The fact is, I know what neutral grey looks like and I can get a monitor very close by eye. Same for brightness: I like 100 nits in my medium-dim room, but I'd set it the same way by eye. If the cost of the puck doesn't bother you, go for it. If the money is better spent on lenses, film or food, go for it. I see almost no shift over time and just do the cal a couple of times a year. There is no right answer, and in spite of the advertising, you'll still have occasional doubts about color management!
 
One benefit I got out of having a monitor calibrator is that it actually tells you how good a monitor is (relative to another), as well as calibrating it.

For instance, I had an old BenQ upstairs, and since I thought they were quite a good brand, I thought I'd bring it down and use it for photo editing.

When I calibrated it, it said that it could only display about 50-odd percent of sRGB. My Samsungs can display about 95% of sRGB, and my MacBook Air can display 100% of sRGB.

So that told me that the BenQ was probably a lot worse for editing photos on than the Samsungs or my MacBook.

Also it doesn't take long, only about 5 minutes at most.
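As a rough illustration of where a "percent of sRGB" number comes from: gamut coverage can be approximated by comparing the area of a display's primary triangle with sRGB's in xy chromaticity. This is a deliberate simplification (real calibration software measures coverage more carefully, typically intersecting gamuts in a perceptual space), and the narrow-gamut panel primaries below are hypothetical:

```python
# Crude sketch of a "percent of sRGB" figure: ratio of the panel's
# primary-triangle area to sRGB's in CIE xy chromaticity coordinates.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# sRGB primaries (R, G, B) in xy chromaticity, per IEC 61966-2-1.
SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Hypothetical measured primaries of a narrow-gamut panel, assumed to
# lie inside the sRGB triangle so that the area ratio ~= coverage.
panel = [(0.600, 0.340), (0.320, 0.540), (0.160, 0.080)]

coverage = triangle_area(*panel) / triangle_area(*SRGB)
print(f"approx. sRGB coverage: {coverage:.0%}")  # approx. sRGB coverage: 72%
```

The area ratio only equals coverage when the panel's gamut sits entirely inside sRGB; a wide-gamut panel's triangle can exceed 100% of sRGB's area while still missing some sRGB colors, which is why calibration software reports the intersection instead.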
 
Wow. What model BenQ was that? I have the SW321C and it gets close to 100% of Adobe RGB.
 
GW 2480
 
Sounds good, in particular that you don't see much of a shift over time!

Which monitor do you have and is it factory calibrated?
 
