GPU USELESS For Lightroom Classic!

Started 5 months ago | Discussions
Batdude Veteran Member • Posts: 4,987
GPU USELESS For Lightroom Classic!
2

Received my new Ryzen 3900X system today and just started messing around with it.

I grabbed my 16MP Fuji X-T1, filled the SD card with 1,912 RAW photos, and imported them into Lightroom. Let me just say that this CPU is a LOT faster than my previous i7-6700.

It only took 1 minute and 40 seconds to import them all. Translation: I'm very happy.

With the GPU enabled, the import actually took six seconds longer, at 1:46.
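For anyone who wants to check my math, here's a rough back-of-the-envelope in Python with the numbers above (just a sketch, nothing scientific):

# Import-time numbers from my test above: 1,912 Fuji X-T1 raws
files = 1912
gpu_off_seconds = 100   # 1:40
gpu_on_seconds = 106    # 1:46

print(f"GPU off: {files / gpu_off_seconds:.1f} photos/sec")   # ~19.1
print(f"GPU on:  {files / gpu_on_seconds:.1f} photos/sec")    # ~18.0
print(f"GPU on is {(gpu_on_seconds - gpu_off_seconds) / gpu_off_seconds:.1%} slower")   # 6.0%

Either way, the difference is six seconds over almost two thousand photos.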

I just want to say I'm so glad I didn't go crazy and obsess over buying a really expensive GPU. I'm happy with the used $180 GTX 1070 I got; if I had bought a high-end card, it would have been money thrown in the trash. I personally don't do anything in LR that I feel is so advanced that a "high end" GPU is required. Maybe there are certain functions others use that I'm not aware of, but personally I just don't bloody see it.

The good:  You will save a TON of money by ignoring the GPU.

I know, they are sooooooooooooooo nice and cool looking aren't they?

The bad: Adobe looks really bad simply because its software is pure garbage at GPU usage. I feel bad for anyone who bought a very expensive GPU for this application expecting it to kick A**; it must hurt to see the GPU do practically nothing.

I purchased my GPU used, so I'm cool with that, but if I had paid over $200 for a new one, forget it: I would unplug it, return it, and get something cheaper until Adobe upgrades Lightroom to take full advantage of those massive, expensive graphics cards.

Anyway, it's midnight here and I have to get some rest, so I'll catch up on this thread tomorrow.

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
Robert Zanatta Senior Member • Posts: 2,013
Re: GPU USELESS For Lightroom Classic!
1

GPU will help with some develop module functions - faster transitions with some sliders and brushes.

Robert Zanatta's gear list:
Canon EOS 5D Mark IV
OP Batdude Veteran Member • Posts: 4,987
Re: GPU USELESS For Lightroom Classic!

Robert Zanatta wrote:

GPU will help with some develop module functions - faster transitions with some sliders and brushes.

And today’s modern CPUs can’t handle that?

From everything I've noticed and learned, the good GPUs that Adobe "recommends" cost around $800 and up. To me that's an insane amount of money for the kind of low performance you get with LR. That was my whole point 😃

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
Simon Garrett Veteran Member • Posts: 6,905
Re: GPU USELESS For Lightroom Classic!

Batdude wrote:

Received my new Ryzen 3900X system today and just started messing around with it.

I grabbed my 16MP Fuji XT1 and filled the SD card with 1,912 RAW photos and imported to Lightroom. Let me just say that this CPU is a LOT faster than my previous i7-6700.

The 3900X is not hugely faster than the i7-6700 in single-core speed, but it has 12 cores and 24 threads against the i7-6700's 4 cores and 8 threads, so it has much greater all-core throughput.

I've made the same upgrade (in my case i7-6700K, slightly over-clocked) to 3900X, and here are the CPU-ID benchmarks: first figure is single-core speed, second is all-core:

i7-6700: 474, 2377

i7-6700K: 483, 2555

3900X: 532, 8201

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores.  Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.
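To put rough numbers on that, here's a quick sketch in Python using the CPU-ID scores above (just the ratios; Lightroom won't see the full all-core gain, since it doesn't use all the cores):

# CPU-ID scores from above: (single-core, all-core)
scores = {
    "i7-6700":  (474, 2377),
    "i7-6700K": (483, 2555),
    "3900X":    (532, 8201),
}
base_single, base_all = scores["i7-6700"]
for cpu, (single, all_core) in scores.items():
    print(f"{cpu}: {single / base_single:.2f}x single-core, {all_core / base_all:.2f}x all-core vs i7-6700")
# The 3900X works out to roughly 1.12x single-core but about 3.45x all-core.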

It only took 1 minute and 40 seconds to import all. Translation: I'm very happy

With the GPU enabled the import time is actually six seconds slower, at 1:46.

I just want to say I'm so glad I didn't go crazy and obsess over buying a really expensive GPU. I'm happy with the used $180 GTX 1070 I got; if I had bought a high-end card, it would have been money thrown in the trash. I personally don't do anything in LR that I feel is so advanced that a "high end" GPU is required. Maybe there are certain functions others use that I'm not aware of, but personally I just don't bloody see it.

The good: You will save a TON of money by ignoring the GPU.

I know, they are sooooooooooooooo nice and cool looking aren't they?

The bad: Adobe looks really bad simply because its software is pure garbage at GPU usage. I feel bad for anyone who bought a very expensive GPU for this application expecting it to kick A**; it must hurt to see the GPU do practically nothing.

I purchased my GPU used, so I'm cool with that, but if I had paid over $200 for a new one, forget it: I would unplug it, return it, and get something cheaper until Adobe upgrades Lightroom to take full advantage of those massive, expensive graphics cards.

Anyway, I'm sorry that the time here is midnight and I have to go rest so I'll catch up here tomorrow

In your second post you put:

From everything I've noticed and learned, the good GPUs that Adobe "recommends" cost around $800 and up.

That's not what they say on Adobe's system-requirements page (https://helpx.adobe.com/uk/lightroom-cc/system-requirements.html). A link from there to another Adobe page suggests a card with a minimum Passmark score of 2,000, which is pretty low. Your 1070 scores 6,206.
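Put another way (a trivial sketch in Python with the Passmark figures above):

# Passmark figures mentioned above
adobe_suggested_minimum = 2000
gtx_1070_score = 6206
print(f"The GTX 1070 clears Adobe's suggested minimum by about {gtx_1070_score / adobe_suggested_minimum:.1f}x")   # ~3.1x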

I quite agree with you: a powerful graphics card is wasted on Lightroom.

Better, I think, to spend money on a fast M.2 NVMe SSD for C drive, raw cache, catalog and previews. That can all be the same drive if it's big enough.


Simon

Simon Garrett's gear list:
Nikon D800
DeathArrow Senior Member • Posts: 3,387
Re: GPU USELESS For Lightroom Classic!
4

If you want GPU support, look at other software like DXO Photolab 4.

DeathArrow's gear list:
Sony RX100 VA Nikon D300 Nikon D610 Nikon D750 Nikon AF Nikkor 50mm f/1.4D +6 more
Robert Zanatta Senior Member • Posts: 2,013
Re: GPU USELESS For Lightroom Classic!
1

Batdude wrote:

Robert Zanatta wrote:

GPU will help with some develop module functions - faster transitions with some sliders and brushes.

And today’s modern CPUs can’t handle that?

Not with the same performance.

Robert Zanatta's gear list:
Canon EOS 5D Mark IV
edispics Veteran Member • Posts: 4,404
Confirming Puget test results
1

In my last reply to your previous post I cited Puget's testing; here is an excerpt:

Overall, we didn't see much of a difference between the various GPUs we tested, or even the test using just Intel integrated graphics and GPU acceleration disabled entirely. NVIDIA is definitely a hair faster than AMD (which oddly was slower than having no GPU acceleration at all), but the performance between each NVIDIA GPU is close enough to be within the margin of error. In fact, Lightroom Classic tends to have a larger margin of error than our other benchmarks, and anything within ~5% we would consider to be effectively the same.

We could go into our results in more detail, but what we are taking from this is that for what we are testing, the GPU has almost no impact. As we mentioned earlier in this post, we do hope to include a number of other tests that should be a better indicator of GPU performance, but this simply reinforces that your GPU is a very low priority relative to your CPU, RAM, and storage.

Your results confirm Puget's take on LR and GPU use.
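As a rough check of the OP's numbers against that ~5% figure (a little Python sketch, assuming the 1:40 / 1:46 import times from the first post):

# Puget's ~5% margin of error vs. the OP's GPU on/off import times
gpu_off, gpu_on, margin = 100, 106, 0.05   # seconds, seconds, fraction

diff = abs(gpu_on - gpu_off) / gpu_off
print(f"GPU on vs. off difference: {diff:.1%}")   # 6.0%
print("within the margin of error" if diff <= margin else "just outside the margin of error")

So the six-second penalty is right around the noise floor Puget describes.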

edispics's gear list:
Panasonic ZS100 Nikon D7000 Nikon D7200 Nikon Z6 Nikon D70 +25 more
edispics Veteran Member • Posts: 4,404
Re: GPU USELESS For Lightroom Classic!
3

Robert Zanatta wrote:

Batdude wrote:

Robert Zanatta wrote:

GPU will help with some develop module functions - faster transitions with some sliders and brushes.

And today’s modern CPUs can’t handle that?

Not with the same performance.

These benchmarks show where a GPU can make some difference; note the first column, which uses only the onboard graphics on the Intel CPU. But overall Puget concludes that the GPU is not where you should put your money if you're only considering LR.

https://www.pugetsystems.com/pic_disp.php?id=63773

edispics's gear list:
Panasonic ZS100 Nikon D7000 Nikon D7200 Nikon Z6 Nikon D70 +25 more
CAcreeks Forum Pro • Posts: 16,079
Re: GPU USELESS For Lightroom Classic!
1

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

DeathArrow Senior Member • Posts: 3,387
Re: GPU USELESS For Lightroom Classic!
2

CAcreeks wrote:

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

In another thread from yesterday, someone said his GTX 1060 shows quite high usage with the new DeepPRIME noise reduction in PhotoLab 4. It seems that the more powerful the GPU, the faster PhotoLab finishes the export.

DeathArrow's gear list:
Sony RX100 VA Nikon D300 Nikon D610 Nikon D750 Nikon AF Nikkor 50mm f/1.4D +6 more
OP Batdude Veteran Member • Posts: 4,987
Re: GPU USELESS For Lightroom Classic!

Simon Garrett wrote:

Batdude wrote:

Received my new Ryzen 3900X system today and just started messing around with it.

I grabbed my 16MP Fuji XT1 and filled the SD card with 1,912 RAW photos and imported to Lightroom. Let me just say that this CPU is a LOT faster than my previous i7-6700.

The 3900X is not hugely faster than the i7-6700 in single-core speed, but it has 12 cores and 24 threads against the i7-6700's 4 cores and 8 threads, so it has much greater all-core throughput.

I've made the same upgrade (in my case i7-6700K, slightly over-clocked) to 3900X, and here are the CPU-ID benchmarks: first figure is single-core speed, second is all-core:

i7-6700: 474, 2377

i7-6700K: 483, 2555

3900X: 532, 8201

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

It only took 1 minute and 40 seconds to import all. Translation: I'm very happy

With the GPU enabled the import time is actually six seconds slower, at 1:46.

I just want to say I'm so glad I didn't go crazy and obsess over buying a really expensive GPU. I'm happy with the used $180 GTX 1070 I got; if I had bought a high-end card, it would have been money thrown in the trash. I personally don't do anything in LR that I feel is so advanced that a "high end" GPU is required. Maybe there are certain functions others use that I'm not aware of, but personally I just don't bloody see it.

The good: You will save a TON of money by ignoring the GPU.

I know, they are sooooooooooooooo nice and cool looking aren't they?

The bad: Adobe looks really bad simply because its software is pure garbage at GPU usage. I feel bad for anyone who bought a very expensive GPU for this application expecting it to kick A**; it must hurt to see the GPU do practically nothing.

I purchased my GPU used, so I'm cool with that, but if I had paid over $200 for a new one, forget it: I would unplug it, return it, and get something cheaper until Adobe upgrades Lightroom to take full advantage of those massive, expensive graphics cards.

Anyway, I'm sorry that the time here is midnight and I have to go rest so I'll catch up here tomorrow

In your second post you put:

From everything I've noticed and learned, the good GPUs that Adobe "recommends" cost around $800 and up.

That's not what they say at https://helpx.adobe.com/uk/lightroom-cc/system-requirements.html . A link there suggests (on another Adobe page) using a card with a minimum Passmark score of 2,000, which is pretty low. Your 1070 scores 6,206.

I quite agree with you: a powerful graphics card is wasted on Lightroom.

Better, I think, to spend money on a fast M.2 NVMe SSD for C drive, raw cache, catalog and previews. That can all be the same drive if it's big enough.

I agree. I had some gift cards and credit card reward points accumulated, and I used them to get two fast 1TB M.2 SSDs. Pretty much everything doubled in this new system, including the RAM capacity and speed.

For what I do, and for what my kids might end up doing in the near future, this computer is going to last many years, and I don't see myself spending one more cent on it. The rest is up to the software manufacturer, which has to do its part and give the software a boost in speed.

Also, the day someone comes out with something that actually beats and replaces this Adobe software, I'll be one of the first to jump ship 😁 Meanwhile, unfortunately, Lightroom's workflow is what works for me.

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
OP Batdude Veteran Member • Posts: 4,987
Re: GPU USELESS For Lightroom Classic!

CAcreeks wrote:

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

Hahaha, yeah, I'm not sure I'll ever use that piece of software, but I'd want at least a GTX 1080 Ti for it.

One thing I'm seeing is that right now is THE worst time to buy a GPU. With the release of AMD's new cards, competition is going to kick in, and I expect to see really good deals by Thanksgiving 2021. Anyone who buys a GPU right now is getting totally ripped off. That's only my humble opinion.

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
MOD Austinian Forum Pro • Posts: 10,757
Re: GPU USELESS For Lightroom Classic!
3

DeathArrow wrote:

CAcreeks wrote:

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

Yes. i7-7800X; haven't checked DeepPRIME CPU usage as it now seems almost irrelevant.

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

Topaz AI Sharpen can certainly use a fast GPU; pure CPU performance on my PC was dismal.

In another thread from yesterday, someone said his GTX 1060 shows quite high usage with the new DeepPRIME noise reduction in PhotoLab 4. It seems that the more powerful the GPU, the faster PhotoLab finishes the export.

I've only had PhotoLab 4 for a day, so all I can say so far WRT performance is that it just processed a 61Mp raw file with DeepPRIME at defaults in 25 seconds, using a GTX 1080Ti.

Much faster than the PRIME in PhotoLab 2, and superior results. This was indeed a worthy upgrade.

Austinian's gear list:
Sony a7R III Sony a7R IV Samyang 14mm F2.8 ED AS IF UMC Sony FE 50mm F2.8 Macro Sony FE 24-105mm F4 +2 more
johnnyandedgar Contributing Member • Posts: 514
Re: GPU USELESS For Lightroom Classic!

"from everything that I’ve noticed and learned is that the good GPUs that Adobe “recommends” costs around $800 and up"

Where did you come up with this? It would appear you didn't understand or seek clarification on the information you were provided with.

Very sad considering all the effort forum members gave to "help" you.

johnnyandedgar

johnnyandedgar Contributing Member • Posts: 514
Re: GPU USELESS For Lightroom Classic!

"Better, I think, to spend money on a fast M.2 NVMe SSD for C drive, raw cache, catalog and previews. That can all be the same drive if it's big enough."

Yes, yes, yes. ESPECIALLY a top-tier PCIe 4.0 NVMe drive.

johnnyandedgar

OP Batdude Veteran Member • Posts: 4,987
Re: GPU USELESS For Lightroom Classic!

Austinian wrote:

DeathArrow wrote:

CAcreeks wrote:

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

Yes. i7-7800X; haven't checked DeepPRIME CPU usage as it now seems almost irrelevant.

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

Topaz AI Sharpen can certainly use a fast GPU; pure CPU performance on my PC was dismal.

In another thread from yesterday, someone said his GTX 1060 shows quite high usage with the new DeepPRIME noise reduction in PhotoLab 4. It seems that the more powerful the GPU, the faster PhotoLab finishes the export.

I've only had PhotoLab 4 for a day, so all I can say so far WRT performance is that it just processed a 61Mp raw file with DeepPRIME at defaults in 25 seconds, using a GTX 1080Ti.

Much faster than the PRIME in PhotoLab 2, and superior results. This was indeed a worthy upgrade.

What do you mean by "it just processed a 61MP raw file"? What does that mean?

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
CAcreeks Forum Pro • Posts: 16,079
Re: GPU USELESS For Lightroom Classic!

johnnyandedgar wrote:

"from everything that I’ve noticed and learned is that the good GPUs that Adobe “recommends” costs around $800 and up"

I suspect you have not priced GPUs lately. The top Puget performer costs $1,300, and the second-place performer costs $750. See the link below.

Where did you come up with this? It would appear you didn't understand or seek clarification on the information you were provided with.

https://www.pugetsystems.com/recommended/Recommended-Systems-for-Adobe-Lightroom-Classic-141/Hardware-Recommendations

OP Batdude Veteran Member • Posts: 4,987
Re: GPU USELESS For Lightroom Classic!

johnnyandedgar wrote:

"Better, I think, to spend money on a fast M.2 NVMe SSD for C drive, raw cache, catalog and previews. That can all be the same drive if it's big enough."

Yes, yes, yes. ESPECIALLY a top-tier PCIe 4.0 NVMe drive.

johnnyandedgar

No, no, no. PCIe 4.0 is simply too new at the moment, and it overheats, which makes the system throttle itself, so what's the point? Am I wrong?

Batdude's gear list:
Fujifilm X10 Nikon D4 Fujifilm X-E1 Fujifilm X-T1 Nikon AF Nikkor 50mm f/1.8D +8 more
johnnyandedgar Contributing Member • Posts: 514
Re: GPU USELESS For Lightroom Classic!

IMO you totally blew it by not buying an NVMe SSD, either PCIe 3.0 or 4.0.

johnnyandedgar

MOD Austinian Forum Pro • Posts: 10,757
Re: GPU USELESS For Lightroom Classic!

Batdude wrote:

Austinian wrote:

DeathArrow wrote:

CAcreeks wrote:

Simon Garrett wrote:

Processor clock speeds have not risen greatly since the i7-6700, but current processors have many more cores. Lightroom does not seem to use anything like the 24 processors on my machine, but uses around half of them when exporting or building previews.

Many moons ago, Austinian posted a screenshot showing how DxO Photolab was using all of his many CPU cores during Prime noise reduction. (6 cores each with 2 threads? I dunno)

Yes. i7-7800X; haven't checked DeepPRIME CPU usage as it now seems almost irrelevant.

https://www.dpreview.com/forums/post/62618848

I'm amazed by in-camera noise reduction because it is so much quicker than (though not as good as) noise reduction in PC software.

As poster DeathArrow says in a well-liked post, DxO is an alternative. But if you're invested in Lightroom catalog etc, DxO is probably not a viable option.

The good: You will save a TON of money by ignoring the GPU.

Maybe someday Batdude will buy Topaz AI and finally make use of his $180 GPU.

Topaz AI Sharpen can certainly use a fast GPU; pure CPU performance on my PC was dismal.

In another thread from yesterday, someone said his GTX 1060 shows quite high usage with the new DeepPRIME noise reduction in PhotoLab 4. It seems that the more powerful the GPU, the faster PhotoLab finishes the export.

I've only had PhotoLab 4 for a day, so all I can say so far WRT performance is that it just processed a 61Mp raw file with DeepPRIME at defaults in 25 seconds, using a GTX 1080Ti.

Much faster than the PRIME in PhotoLab 2, and superior results. This was indeed a worthy upgrade.

What do you mean by "it just processed a 61MP raw file"? What does that mean?

PL 4 created a 61Mp JPEG from a 61Mp raw file.

Austinian's gear list:
Sony a7R III Sony a7R IV Samyang 14mm F2.8 ED AS IF UMC Sony FE 50mm F2.8 Macro Sony FE 24-105mm F4 +2 more