New Build for 2023 Q4: Considerations and Thoughts?

What is running slow for you? I am very happy with the performance of LrC.
Also, is there any possibility of transferring over my OS and programs, without drivers, or is that just a mess that never ends well?
I generate Standard and Smart Previews, and when I scroll through my images in Grid mode, I still get gray thumbnails that take a bit of time to load.

Is there an ideal graphics card? As I said, I have an Nvidia 1060 6GB. Is this sufficient? I'm not interested in an expensive card with negligible benefits.
I'd need to toss a lower-end GPU (I have a 1080) into my system to check for sure, but I don't think it uses the GPU for culling. To test it as is, I just scrolled through a 5,000+ image catalog at 10 thumbnails per row as fast as I could, and they were all there instantly. Scrolling through the images on the main monitor is instant as well.

This is on a Ryzen 3900X, also using a PCIe 3 NVMe drive.
 
Here is some information from Adobe:

https://helpx.adobe.com/lightroom-classic/kb/lightroom-gpu-faq.html

How does Lightroom Classic use the graphics processor?

When configured (Preferences > Performance), Lightroom Classic can use a compatible graphics processor (also called a graphics card, video card, or GPU) to speed up tasks of displaying and adjusting images in the Develop module, the Library module's Grid view, Loupe view, and Filmstrip. Features like Select Subject, Select Sky, and Enhance Details are also accelerated by the GPU. Using a compatible GPU can provide significant speed improvements on high-resolution displays, such as 4K and 5K monitors.
 
Why switch? Incremental gains in new CPUs have become minimal compared to the heyday of Moore's Law. Buying a new machine when your old one is adequate is, IMHO, needless consumption that contributes to our global woes.
LOL. Yeah, OK. And by whose definition of "adequate"?

Anyway, with today's builds it's not "simply" about the CPU but also the new tech and features the supporting motherboard provides.

When I do a new build, I focus on the motherboard as much as the CPU, since I get more from it than from the CPU alone.
 
A state-of-the-art PC with the latest and greatest Intel or AMD components, DDR5, PCIe 5 M.2 storage, etc. will test faster on every metric than the existing machine.

How much of that increased speed will be perceived for the stated use case is likely to be significantly less. One can only type, read, and make decisions about image processing at a fixed rate, regardless of the speed inherent to the silicon. Core count means far less than IPC for all things Adobe, except Premiere.

The most perceptible speed increases will be in tasks that involve unattended CPU and GPU processing over periods of time long enough for humans to notice. That excludes many software tests of computer components that generate impressive bar graphs of fractional speed differences. The most predictable improvement would be in video rendering, particularly if using CUDA-dependent software or newer codecs like AV1. Large batch renders in LR should be more efficient, but I doubt that is something many use on a daily basis. It is difficult to see how perceived speed would increase in PS, unless moving very large PSD files from mechanical storage to solid state (sadly, perceptible I/O in PS does not always scale with the speed of the storage media, in my experience).

Newer GPUs and CPUs should have an advantage for "AI" empowered software of the Topaz ilk, but that software has to be designed to run on lesser silicon.

Given the current silicon, the most cost-effective investment would be in a wide-gamut, high-resolution monitor.

The psychological benefits of retail therapy cannot be overstated when deciding to invest in new computer goodies.
 
Is there an ideal graphics card? As I said, I have an Nvidia 1060 6GB. Is this sufficient?
I don't think the GPU impacts previews and scrolling photos. I have a 900K-photo library on a NAS, and D850 photos seem to res in very fast, almost instantly. When I scroll in Grid mode, it also generates all the thumbnails quickly.
 
Using a compatible GPU can provide significant speed improvements on high-resolution displays, such as 4K and 5K monitors.
Thanks, that last bit looks to be the ticket.

So I tested this with my i5 Surface Pro 6 for a low-end spec and stressed things a lot, running it both with just the internal monitor and with two additional monitors hooked up via a dock. With the integrated GPU and the extra monitors it's definitely dragging a bit, though I think a GTX 1060 should be able to handle that without an issue.

The big thing, though, is resolution.

This is how it went:

Surface Pro 6, internal monitor only - minor lag.

Surface Pro 6, internal monitor with 2 additional displays hooked up - very noticeable lag, sometimes over a second.

Surface Pro 6 on a 1920x1200 display - jerkiness, in that it would jump to the new selection rather than scroll through, but not really any lag.

With the 3900X/GTX 1080 on a 4K display it was similar, just a little less jerky than the Surface on the 1920x1200 display.

And at all other resolutions the GTX 1080 scrolled through them smoothly.

So reid thaler, you might want to do what I used to do with my 4K laptop: if you have a 4K display, cull at 1080p and then edit at 4K. Beyond the reduced lag, preview generation time is greatly reduced.
 
Interesting. I do have a Dell 5K monitor, but I can only get it to run at 4K. I will try a lower res and see.

I just tried it, and I have trouble getting the side panels smaller, even holding the Alt/Opt key.

--
Thanks!
Reid
Photography Education and Lightroom Instructor, San Francisco Bay Area
www.lumiograph.com
Kodak Brownie
Argus 126
Quaker Oats Container Pinhole Camera
 
 
Which Dell monitor do you have?

Do you have the monitor "drivers" installed for it? (The quotation marks are because what the "drivers" typically install is an .inf that describes the monitor to Windows, plus an .icm for color management.)

The GTX 1060 should have no trouble running a monitor at 5k and 60Hz. What refresh rate are you using?
 
Why switch? Incremental gains in new CPUs have become minimal compared to the heyday of Moore's Law. Buying a new machine when your old one is adequate is, IMHO, needless consumption that contributes to our global woes.
LOL. Yeah, OK. And by whose definition of "adequate"?
In this case, the OP's definition:
Thinking of building a new PC in a year. Current rig runs OK so I don't know how much I'll really gain running Lightroom Classic.
Anyway, with today's builds it's not "simply" about the CPU but also the new tech and features the supporting motherboard provides.
The biggest gains are usually had by switching to SSD drives and, depending on the software, adding or upgrading a GPU. And sometimes by adding memory. You don't generally need to build an entire new system from scratch to do those - that's one of the big benefits of going with a desktop machine compared to a laptop.
 
I have not tried LrC on my new machine. I have used LR since v1, but since last year I use DxO PL5 and have my own DAM system. I decided to test out LR with my new system. First I checked to make sure my GPU was set up in Preferences (it is). I run two Dell 27" 4K Ultra IPS monitors. I will say the catalog is great with my new system. Even using two windows is very fast, zero lag. It's great having a system with a modern GPU.
 

Newer GPUs and CPUs should have an advantage for "AI" empowered software of the Topaz ilk, but that software has to be designed to run on lesser silicon.
My previous computer, a 6+ year old Dell XPS i7 with 32 GB of memory and a very weak GPU, was barely adequate to run DxO PL5 with my R5 RAW images. The change in performance with my new XPS is beyond amazing. This comes from a person who has owned computers for over 37 years, works in tech, and has built many systems over the years. I have not been a fast upgrader for a while; my new machine is my third home machine in the last 13+ years. I don't change a lot. In fact, until I started using DxO, my last two machines worked fine with LR and PS.

DxO, Topaz, and Luminar Neo are the reasons I upgraded my machine. I am very happy with the upgrade.
 
The biggest gains are usually had by switching to SSD drives and, depending on the software, adding or upgrading a GPU. And sometimes by adding memory. You don't generally need to build an entire new system from scratch to do those - that's one of the big benefits of going with a desktop machine compared to a laptop.
For the record, I've always built my desktop systems. The first PC I ever owned was a custom build (1998). I've never owned a pre-built. My current system was built in 2019, and I plan on doing a new build sometime in 2023. I usually do a new build every 4-5 years. Some stuff I keep - cases, drives - but the CPU, GPU, motherboard, and this time RAM, will be replaced. I will most likely replace the NVMe (SSD) drives as well. That's me.

Anyway, SSDs have been around for at least 15 years (I got my first one in 2010), so they should be standard in today's systems, especially since prices have made them very affordable.
 
Currently running a 9700K, Asus mobo, 32 GB RAM, 1060 6 GB GPU, NVMe drive, Corsair HX750i 750-watt PSU, and Noctua NH-D15 Premium CPU cooler, built in the fall of 2018.

Thinking of building a new PC in a year. Current rig runs OK so I don't know how much I'll really gain running Lightroom Classic.
Well, none of us know what kind of performance Meteor Lake will bring until it's out and real tests have been run. But... I can tell you that for me personally, my jump from an 8600K to a 12700K was far from "wow". It's faster, yes, but incremental at best. Don't expect to be amazed. Gone are the days when a five-year gap in PC tech blew your mind. Even some gains that look huge on a benchmark don't really translate to a faster end-user experience, especially for tasks that aren't just "click export, then wait for it to finish".

Honestly, the biggest gains for me came with software that uses the GPU. Topaz Gigapixel got a huge boost from the GTX 1080 to RTX 3070 Ti upgrade.
Gipper,

Thanks for the reality check; good to know that things are more incremental than amazing. Basically, my system works (but it fills a temp folder at \AppData\Local\Temp every couple of days, even with Storage Sense turned on).

I've used Lightroom Classic since 2006 and do know that it has driven more than one computer upgrade when the Adjustment Brush was lagging. I guess I was hoping that the new computer would enable a single button in Lightroom Classic that would download, flag, rate, keyword, put in collections, optimize, and print : )
Well, don't let me discourage you from buying new toys, lol. If you've got money burning a hole in your pocket, there are worse things to spend it on.

Honestly, when was the last time you reinstalled Windows on your PC? If it's been more than a couple of years, try a fresh install. Sometimes starting with a clean slate takes care of a bit of lag. I firmly believe that no matter how clean you keep your machine, the accumulated "Windows junk" eventually degrades performance. It's just a question of how much.
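On the temp folder that keeps filling up: if Storage Sense isn't keeping pace, a small scheduled script can do the same cleanup. Below is a minimal sketch in Python; the seven-day cutoff and the fallback path are my own assumptions, and it only deletes loose files in the user temp folder, skipping anything still in use.

```python
# Minimal sketch: delete files in the user's temp folder older than N days.
# The 7-day default and the fallback path are assumptions; adjust to taste.
import os
import time
from pathlib import Path

def clear_old_temp(days: int = 7) -> int:
    temp_dir = Path(os.environ.get("TEMP", str(Path.home() / "AppData/Local/Temp")))
    cutoff = time.time() - days * 86400
    removed = 0
    for item in temp_dir.iterdir():
        try:
            if item.is_file() and item.stat().st_mtime < cutoff:
                item.unlink()
                removed += 1
        except OSError:
            # Files still held open by running programs are skipped.
            pass
    return removed

if __name__ == "__main__":
    print(f"Removed {clear_old_temp(7)} old temp files")
```

Dropped into Task Scheduler to run weekly, it stays out of the way; it never touches subfolders, so it errs on the safe side.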
 
The GTX 1060 should have no trouble running a monitor at 5k and 60Hz. What refresh rate are you using?
60 Hz.

I'm using two cables, and just recently installed the drivers for the Dell UP2715K monitor again. Windows shows the 5K option, but when I select it, the text just gets stretched.
 
Nice monitor.

I tried a different monitor (the UP3218K, 8K) a few months ago, which also required dual cables. Based on my experience, if you can choose 5K (5120 x 2880), then the monitor is properly recognized by the PC.

(I didn't keep the UP3218K because it showed serious image retention issues, which weren't helped much by the utility that was built-in to reduce the effect. I hope that the UP2715K doesn't have the same defect.)

Looking at the manual, I wonder if the monitor's aspect ratio setting is correct? It should be 16:9. That would be an obvious way that text could be "stretched". If it was displaying 3840 x 2160, that's still 16:9. What resolution is actually in use when you select 5K?

I'd suggest checking with Dell support, but based on my experience of a few months ago, it might be a waste of time. (They had no clue about drivers for the UP3218K, and were of no help when I had trouble getting the dual-cable connection recognized. I had to figure both out on my own.)

Good luck.
 
(I didn't keep the UP3218K because it showed serious image retention issues, which weren't helped much by the utility that was built-in to reduce the effect.
Thank you for curing my bad case of UP3218K envy. :-)
 
The UP3218K came to market 5 years ago. It requires two DisplayPort connections, due to the bandwidth limit of the DisplayPort of its day. It's a true 10-bit monitor, although if you choose that over 8 bits, its maximum refresh rate drops to 48 Hz.

The main complaint I saw in reviews was that the monitor's face is glossy and highly reflective. I was prepared to work around that.

It has no HDR, even the limited sort (low peak luminance) commonly available with current monitors.

Conventional wisdom is that IPS monitors don't suffer from burn in (or image retention), but this one did, at least for me. The monitor has a built-in utility to ameliorate image retention, but running it for a half hour only slightly reduced the retention I was getting.

Maybe it would have been OK if only used for photo editing. The retention I was getting was from DP Review, in part. (I guess I spend too much time here. :-| )

I'm not sure whether any of the latest interfaces (DP 2.0, HDMI 2.1) would allow 7680 x 4320 at 60 Hz with 10-bit color and full resolution for the chroma information (equivalent to 4:4:4 on a TV). Even a high-end graphics card (e.g., an RTX 3090 Ti) only supports DP 1.4a. (I don't know the capabilities of its HDMI 2.1 port.) That may partially explain why the UP3218K hasn't been updated to a model that could be run with a single cable.

IMHO, the UP3218K may have been ahead of its time.
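As a rough sanity check on that bandwidth question, here's a back-of-the-envelope sketch. The effective link rates in the dictionary are approximate figures I'm assuming from memory, not quotes from the specs, and the blanking allowance is a guess:

```python
# Back-of-the-envelope link bandwidth check for the two Dells discussed above.
# The effective (post-encoding) link rates are approximate assumptions,
# not figures taken from the DisplayPort/HDMI specifications.

def required_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.1):
    """Uncompressed pixel data rate in Gbit/s with a rough blanking allowance."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

modes = {
    "5K / 60 Hz / 8-bit (UP2715K)": required_gbps(5120, 2880, 60, 24),
    "8K / 60 Hz / 10-bit (UP3218K)": required_gbps(7680, 4320, 60, 30),
}
links = {
    "DP 1.2 (HBR2)": 17.3,    # approx. usable Gbit/s on a single cable
    "DP 1.4 (HBR3)": 25.9,
    "HDMI 2.1 (FRL)": 42.7,
    "DP 2.0 (UHBR20)": 77.4,
}

for mode, need in modes.items():
    print(f"{mode}: ~{need:.0f} Gbit/s uncompressed")
    for link, cap in links.items():
        verdict = "fits" if cap >= need else "too little without DSC or a second cable"
        print(f"  {link} (~{cap} Gbit/s): {verdict}")
```

On those rough numbers it's clear why both monitors shipped as dual-cable designs, and why only a DP 2.0-class link (or DSC) would carry 8K/60 at 10-bit over a single cable.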
 
For what it may be worth....

This is my second Dell UP2715K monitor. The first died. I thought I'd try the Dell UP2718Q, which I realized I would need to return as soon as I turned it on. I decided to buy the BenQ SW271 and wrote the following review on Amazon.

https://www.amazon.com/gp/customer-reviews/R1360KWM57O01W?ref=pf_ov_at_pdctrvw_srp

2.0 out of 5 stars

Dell Quality Control and Feature Corners Cut, But Kept the Premium Price

Reviewed in the United States on December 16, 2019

A little background: I started producing photographs in a darkroom in 1974. Since 2012, I have taught photography and digital image processing at colleges and universities, camera stores, and other community venues. I produce prints on two large-format Epson printers. I calibrate my monitors, and create my own printer profiles. Suffice it to say I have experience in the field.

Unfortunately, I had to buy a new monitor because my fairly recently acquired Dell UP2715K, which retailed for around $1800, just passed its three-year warranty period and has power issues causing it not to turn on. Dell does not repair their out-of-warranty monitors, nor do they make parts available. Take note…

Having had good experiences with the Dell UP2715K, I naturally sought out Dell's replacement. The box shipped by Dell had previously been opened: it was ripped in two places, the tape appeared to have been opened, and the box showed other signs of distress. Again, for such an expensive item geared toward imaging professionals, you would think Dell would pay attention to what they ship.

Despite the packaging flaws, the contents seemed to be fine at first, but that was not to be the case. As soon as I started the computer with the new monitor attached, I noticed an obvious blue spot on the startup screen, which also reappeared on the desktop. Additionally, I noticed a cluster of dead pixels, and another few dead pixels above that, which was obviously disappointing. Again, for an expensive product, it should be flawless.

I should reiterate that I was quite happy and impressed with my previous Dell UP2715K. It was stunning. It had a bezel-less, anti-reflective glass screen, Harman Kardon speakers, an SD memory card reader, and a 5K (not just 4K) resolution screen. Images looked great and the design was beautiful. Sadly, Dell replaced the glass screen with a plastic one, eliminated the speakers, eliminated the card reader, eliminated the bezel-less design, and reduced the resolution to 4K. The only thing they kept was the premium price.

With the flaws, it was clear that this monitor was going to be returned, but instead of just replacing it, I ordered a BenQ SW271 to compare, having repeatedly read great reviews of it on the internet. The result: the BenQ has better image quality with noticeably better detail in the shadows and highlights, looks more neutral, and has more features, including an SD memory card reader, a "hotkey" USB device with buttons for switching color space modes, a hood to protect from extraneous light, and a written report with its factory calibration results. It also costs 25% less than the Dell, retailing for $1099. The BenQ is a better monitor at a lower price.

To make the comparison fair, both monitors were calibrated with an X-Rite i1Display Pro monitor calibrator with the same settings applied. Results were viewed in Lightroom while both monitors were connected, to observe the differences simultaneously. Wondering if there was a difference caused by viewing images on the secondary screen, I switched the primary and secondary monitors repeatedly, and the results were consistent. I found that the BenQ displayed more detail in the shadows and highlights, and the colors looked more neutral. In actuality, neither was truly neutral (and one could argue that my testing methodology was either flawed or not scientific, but it was consistent): I white balanced on a photo of the nose of one of our tuxedo (black and white) cats so it was absolutely neutral, photographed both screens with my iPhone 7, then read them with the white balance tool in Lightroom. The Dell was cooler, with a bias toward green; the BenQ leaned toward red, but the BenQ looked better and more neutral. If you have experience postprocessing your images, you will know that almost nothing, especially people, looks better with the green color cast the Dell exhibited.

The BenQ was the clear winner. With better image quality, better detail in the highlights and shadows, a more pleasing display, more features, a lower price, and no screen flaws or damaged packaging on arrival, there is simply no reason to even consider the Dell.
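(For anyone who wants to put a number on that kind of color-cast comparison rather than judging by eye, below is a minimal sketch of the sort of check I'd run; it is not part of the review above. It assumes the Pillow library is installed, and the file names are placeholders for crops of the two screen photos.)

```python
# Minimal sketch: average R/G/B of a crop from each screen photo.
# Equal channel averages mean a neutral patch; the differences show the cast.
# Assumes Pillow is installed; the file names below are placeholders.
from PIL import Image

def color_cast(path: str):
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return r, g, b

for name in ("dell_patch.jpg", "benq_patch.jpg"):
    r, g, b = color_cast(name)
    print(f"{name}: R={r:.1f} G={g:.1f} B={b:.1f} (G-R={g - r:+.1f}, B-R={b - r:+.1f})")
```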
 
Yes, Dell sometimes strikes out on their UltraSharp monitors. The one you mentioned did not have great reviews. I have been very happy with my Dell 4K IPS monitors. I would love BenQ 4K monitors, but at 3x the cost, and needing two monitors, that is outside my budget. Photography is a hobby for me; tech is my work. For you, as an LR instructor, I can see the need for great monitors.

What did you decide on for an upgraded system?
 
