Lightroom AI Benchmarks?

Lightroom AI performance has become a frequent topic here, so I thought I'd check in to see if anyone has run across a good benchmark for various video cards, especially the different NVIDIA generations.

Because this is becoming a big bottleneck for me, I picked up a 3060 Ti as a stopgap last year when AI tools were getting more common. While it's better than my GTX 1080, it still takes a long time when I apply the AI tools to a whole shoot.

NVIDIA should be announcing the Super models of the RTX 40 series on January 8th, which has me interested. They claim the new tensor cores are twice as fast, and I might be able to get 50% more of them, which on paper would be game-changing. BUT this is Lightroom, which has a long history of being an outlier from where you'd expect the performance difference between any two chips to land.
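
For anyone who wants the napkin math behind that "game-changing on paper" line, here's a quick sketch. The 152 tensor cores for the 3060 Ti come from the spec sheet; the 224 for the 4070 Super is the rumored count, and the 2x per-core figure is NVIDIA's own claim, so treat the result as a ceiling, not a prediction:

```python
# Back-of-the-envelope "on paper" tensor throughput, not a benchmark.
# Core counts: 152 (3060 Ti, spec sheet), 224 (4070 Super, rumored).
# The 2x per-core factor is NVIDIA's claim for 4th-gen tensor cores.

cards = {
    "RTX 3060 Ti":    (152, 1.0),  # 3rd-gen tensor cores (baseline)
    "RTX 4070 Super": (224, 2.0),  # 4th-gen, claimed 2x per core
}

base_cores, base_speed = cards["RTX 3060 Ti"]
baseline = base_cores * base_speed

for name, (cores, speed) in cards.items():
    print(f"{name}: {cores * speed / baseline:.1f}x the 3060 Ti on paper")

# RTX 3060 Ti: 1.0x the 3060 Ti on paper
# RTX 4070 Super: 2.9x the 3060 Ti on paper  (~1.5x cores * 2x per core)
```

Given Lightroom's track record, I'd expect real denoise times to land well short of that ~3x ceiling.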
 
IMHO all those 40 series 192-bit-bus cards are overpriced. I personally would wait for their next-generation 50 series and get something else in the meantime. But it's your money, so…
 
It's less about the price and more about the processing time being an issue.

The small bus limits their total RAM, but it may not be a performance bottleneck in these use cases. And even if the 4070 Super only gets most of the way to a 4070 Ti, making it not a great deal overall, it's still ~50% more tensor cores than my 3060 Ti, each of which can be significantly faster. So for this specific use it could be a much bigger improvement than the overall raw compute numbers would suggest.
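
To put rough numbers on the bus point: a 192-bit bus is six 32-bit channels, and with the usual 2 GB GDDR6X modules that caps the card at 12 GB. The sketch below assumes one module per channel and the published 21 Gbps data rate for the 4070/4070 Ti class, so it's illustrative rather than authoritative:

```python
# Why bus width caps VRAM capacity and bandwidth (rough illustration).
# Assumes one 2 GB module per 32-bit channel and a 21 Gbps data rate,
# the published figure for the 4070 / 4070 Ti class of cards.

def memory_config(bus_width_bits, gbps_per_pin=21.0, gb_per_module=2):
    channels = bus_width_bits // 32                    # 32-bit channels
    capacity_gb = channels * gb_per_module             # one module each
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8  # Gbit/s -> GB/s
    return channels, capacity_gb, bandwidth_gbs

for bus in (192, 256):
    ch, cap, bw = memory_config(bus)
    print(f"{bus}-bit: {ch} channels, {cap} GB, {bw:.0f} GB/s")

# 192-bit: 6 channels, 12 GB, 504 GB/s  (4070 / 4070 Ti class)
# 256-bit: 8 channels, 16 GB, 672 GB/s  (at the same data rate)
```

Whether 504 GB/s actually limits AI NR is the open question; my guess is it doesn't, and tensor throughput dominates.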
 
What I recommend is writing to this guy, because he's one of the only YouTubers who actually tests GPUs for Lightroom. I'm sure he'll eventually get hold of the Super series cards, so ask him to make a video showing the actual times to denoise some photos.

What this guy always says is that "you don't need a high-end GPU for Lightroom Classic." There must be a reason for that, and he's 100% correct.

 
Thanks. It looks like he's mostly just using PugetBench for the Adobe stuff.

And yup, that's right: before AI NR, a high-end GPU didn't help much over a mid-range one in the Develop module. But now it's a different story, specifically for AI NR.
 
The 4070, 4070 Ti, and 4080 variants will likely have at most a 10% to 20% performance improvement.
May I ask: performance improvements in what?
For users, the improvements show up as response time/latency. I upgraded from a 3080 Ti to a 4090 and experienced a big improvement.
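
Same napkin math as earlier in the thread for that jump, using spec-sheet tensor core counts (320 on the 3080 Ti, 512 on the 4090) and NVIDIA's claimed 2x per-core factor. It's a ceiling, not a measurement, but it shows why the upgrade feels big:

```python
# On-paper tensor throughput for a 3080 Ti -> 4090 upgrade.
# Core counts from spec sheets; 2x per-core is NVIDIA's claim.
old = 320 * 1.0  # RTX 3080 Ti, 3rd-gen tensor cores
new = 512 * 2.0  # RTX 4090, 4th-gen tensor cores
print(f"{new / old:.1f}x on paper")  # 3.2x
```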
 
