Built new fast computer for Lightroom

Just in case you're still adamant that a graphics card has no use in Lightroom or Photoshop, you might find this worth reading. I'm sure it won't change your mind...I mean, why let facts get in the way of a good opinion, right?

https://www.pugetsystems.com/labs/articles/Photoshop-CC-2018-NVIDIA-GeForce-GPU-Performance-1139/
Thanks! "For most users, you will be better off in the long run spending that extra money on more RAM, storage, or a higher-end CPU".

My only disagreement is about the "higher-end CPU"; but then again, Puget sells them.

Spend that extra money on something else entirely.
Put into context, they say don't spend the money on a more powerful card than their recommendation of a 1060 or 1070. They don't say to skip the GPU altogether, as you imply. Geez, you have a weird way of distorting facts to suit your stance.

The full quote that you cherry-picked a portion of:

"Overall for Photoshop, we recommend using either a GTX 1060/1070 video card even if you have the budget for a more powerful card. For most users, you will be better off in the long run spending that extra money on more RAM, storage, or a higher-end CPU rather than a more powerful video card."
They are trying to sell you a graphics card as well Gipper... Look at their own data. The improvement over built-in GPU is significant only in very infrequent operations, and insignificant in the common operations when you are spending 10X that time looking at the image and pondering your next adjustment.
I disagree. It seems we approach image processing differently. Besides the operations in that test, I can tell you the following:
  • I use a 4K monitor plus a pair of 2K monitors (3 total). If I turn off the GPU acceleration in ACR, zooming into images becomes slower and panning around is choppier. With GPU on, zooming is instant and panning is smooth as silk.
  • 4K monitors slow Photoshop's response time compared to lower-res monitors: 4x the number of pixels to push. The higher the resolution goes, the more the GPU matters for common operations.
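For what it's worth, the pixel math is easy to check. A quick Python sketch (I'm assuming the "2K" panels are 2560x1440, which the bullets above don't actually specify):

```python
# Rough pixel arithmetic for the monitor/camera claims above.
# Assumption: "2K" means 2560x1440 panels.
monitors = {
    "1920x1200": (1920, 1200),
    "2560x1440": (2560, 1440),
    "4K UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in monitors.items()}

# 4K pushes exactly 4x the pixels of 1080p, and 3.6x those of 1920x1200.
ratio_vs_1080p = pixels["4K UHD"] / (1920 * 1080)
ratio_vs_1200p = pixels["4K UHD"] / pixels["1920x1200"]

# Triple-monitor setup: one 4K plus two 2560x1440 panels.
triple_total = pixels["4K UHD"] + 2 * pixels["2560x1440"]

# Camera files scale the same way: 36MP is 1.8x the pixels of 20MP.
mp_ratio = 36 / 20

print(pixels["4K UHD"], round(ratio_vs_1200p, 1), triple_total, mp_ratio)
# → 8294400 3.6 15667200 1.8
```

So the triple-monitor rig is driving roughly 15.7 million pixels, almost seven times a single 1920x1200 screen, which is why the integrated GPU starts to struggle.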
I found a noticeable slowdown in response time on my previous computer (the system with the same CPU you have) when I went from a 1920x1200 screen to a 4K monitor. My old system didn't have a discrete GPU when I first built it; it used the integrated graphics. Running a single 1920x1200 monitor, PS felt lightning quick. But I added another screen, then a 4K screen, and it wasn't as 'snappy' as it used to be with the dual monitors. Adding a GTX 960 brought most of that speed back, and let me run all three monitors (I couldn't do that before). Going to higher-MP cameras slowed processing down too. Working with 36MP doesn't flow as fast as 20MP.

Just the nature of the beast if you prefer that kind of resolution. Is it enough of a slowdown to really impact workflow? Not always, but often it can. If I'm crunching through lots of images, say from my kid's soccer or basketball games, it's noticeable. Many things are batch-adjusted, but when I'm zooming to check sharpness and cropping most of them, those fractions of a second of delay become irritating when you're working fast.

Oftentimes I'm not sitting there 'pondering my next adjustment' for 30 seconds; I'm wanting to crank out a couple hundred images quickly with a methodical workflow. When doing this, the speed makes the work more enjoyable and saves time. I'm sure that sounds trivial to you.

--
My site:
http://www.gipperich-photography.com
> They are trying to sell you a graphics card as well Gipper... Look at their own data. The improvement over built-in GPU is significant only in very infrequent operations, and insignificant in the common operations when you are spending 10X that time looking at the image and pondering your next adjustment.
> I disagree. It seems we approach image processing differently.
An important factor to consider.
>   • I use a 4K monitor plus a pair of 2K monitors (3 total).
I use two 2560 x 1440 monitors and shoot 20MP Raw images, process in Lightroom, do occasional panos and frequent HDRs. i7-3770, 16 GB RAM, SSD + HDD, no external graphics card (Dell Opti 9010 mini, bought June 2012; no plans or need to upgrade anything).
>   • If I turn off the GPU acceleration in ACR, zooming into images becomes slower and panning around is choppier. With GPU on, zooming is instant and panning is smooth as silk.
I would not call my zooming and panning instant, but it is less than 1 sec.
> Running a single 1920x1200 monitor, PS felt lightning quick. But I added another screen, then a 4K screen, and it wasn't as 'snappy' as it used to be with the dual monitors. Adding a GTX 960 brought most of that speed back, and let me run all three monitors (I couldn't do that before).
Still fast enough for me with two 2560 x 1440 monitors, but granted, you are pushing more pixels, and maybe I am more accepting of visible (but less than 1 sec) delays.
> Going to higher-MP cameras slowed processing down too. Working with 36MP doesn't flow as fast as 20MP.
I'm sure you're right on that. I have no plans to go to a higher-res camera.

My "old" 3770 chip has the integrated Intel HD Graphics 4000 (1150 MHz, 16 execution units). Current Intel i7 processors have between 48 and 72 integrated graphics execution units, so that might easily allow me to upgrade to multiple 4K monitors and a 36MP camera without an external graphics card, with the same or better responsiveness than I am seeing now (just speculating -- no need to spend the money or time checking that out). This 6-yr-old computer may be all I need for the future. But hopefully my photo skills will keep improving.
> Oftentimes I'm not sitting there 'pondering my next adjustment' for 30 seconds; I'm wanting to crank out a couple hundred images quickly with a methodical workflow.
I very seldom need to crank out 200+ images. I'm lucky to shoot a dozen good ones in an outing! But I am sifting and sorting through a few hundred at a time, with 30-40 previews on the screen at once, and that seems to go quite fast enough that the computer is never the limiting factor.

I'm sure you're as happy with your rig as I am with mine.
I think it is much simpler. There is so much invested in the traditional layouts that OEMs are reluctant to adapt. RAM chips are too close to coolers in many cases, M.2 slots too close to CPUs, etc.
There are just lots of trade-offs involved here. If a motherboard is to be designed for fast overclocked RAM, then that RAM needs to be very close to the CPU so the connecting traces are short. I have a motherboard designed for up to 4400MHz RAM, for which you would also probably need cooling fins on the DIMMs. And if you buy the right cooler, it has clearance for those DIMMs (which my very large air cooler does). So I still say it's just a matter of buying the right components that all fit together. My motherboard (ASRock Taichi) has three M.2 slots, all of which fit just fine.