Graphics cards . . .

The clue is in the name "AI" Denoise. For that you need ray-tracing cores and Tensor cores. RTX cards have those; again, the clue is in the "RT". GTX cards don't have that hardware.
Ray tracing cores? I don't know for sure, but I doubt that.

I believe that people have done well with the GTX 16X0 series, which are Turing cards (like the RTX 20X0 series), but without the hardware raytracing features.

(I've never owned one of the GTX 16X0 cards, personally. I went straight from a GTX 970 to an RTX 3000 series card.)
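(Side note: if you have Python with a CUDA build of PyTorch handy, you can at least check what your card reports. NVIDIA introduced Tensor cores at compute capability 7.0, but the GTX 16x0 cards muddy the water: they are Turing and report 7.5 without actually having Tensor cores. So treat this as a rough heuristic sketch, not gospel:)

```python
# Rough Tensor-core check via PyTorch (assumes a CUDA build of PyTorch).
# NVIDIA introduced Tensor cores at compute capability 7.0 (Volta), but the
# GTX 16x0 cards are Turing and report 7.5 while lacking Tensor cores, so
# this is a heuristic, not a guarantee.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{name}: compute capability {major}.{minor}")
    if (major, minor) >= (7, 0) and "GTX 16" not in name:
        print("Probably has Tensor cores")
    else:
        print("Probably no Tensor cores")
else:
    print("No CUDA device visible")
```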
I posted this three weeks ago regarding upgrading from a GTX 1660 Super to an RTX 3070 Ti:

Today my new nVidia RTX 3070 Ti card arrived. Before installing it, I did a carefully timed test on an ISO 10,000 image with lots of noise. The rig is an i7-8700K, 64GB RAM and Win 10 Pro. I had the noise reduction setting at 59. The first test was with the GTX 1660S still in the computer; the second is with the new RTX 3070 Ti:

1660S - Expected time 1 min, Actual time 1:10

3070 Ti - Expected time 10 sec, Actual time 11 sec

The exact same image was used for the comparison.

The 3070 Ti times are very similar to what I'm getting on the same image on my 16" MBP (64GB, M1 Max) that I principally use for travel.
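(If anyone else wants to run the same comparison, the stopwatch side is trivial to script; the denoise step itself isn't, since Lightroom can't be driven from Python. So `run_denoise` below is a hypothetical stand-in for whatever command-line denoiser you can invoke:)

```python
# Minimal stopwatch sketch for this kind of test. run_denoise() is a
# hypothetical placeholder -- Lightroom's Denoise can't be scripted from
# Python, so substitute any command-line denoiser you have.
import time

def run_denoise(path: str) -> None:
    # Placeholder: swap in a real tool, e.g.
    # subprocess.run(["some_denoiser", path], check=True)
    time.sleep(0.1)

start = time.perf_counter()
run_denoise("test_iso10000.dng")  # hypothetical test file
print(f"Denoise took {time.perf_counter() - start:.1f} s")
```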
Interesting. A more than 6X performance increase.

nVidia's specs for their past few generations of GPUs are here.

The 4070 Ti has 12GB of VRAM, 7680 CUDA cores, and a 192-bit memory bus. The GTX 1660S has 6GB of VRAM, 1408 CUDA cores, and the same bus width.

(CUDA cores aren't associated with raytracing, as far as I know; they're the general-purpose GPU-compute units.)

Going from a GTX 1660S to an RTX 2060 would have been the closest comparison between an RTX card and a similar card that lacks hardware raytracing. Even the lowest-end version of the 2060 has more CUDA cores than the 1660S.
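(If you'd rather read those numbers off your own card than off a spec sheet, a CUDA build of PyTorch will report most of them; a quick sketch:)

```python
# Read basic GPU specs straight from the card (assumes a CUDA build of PyTorch).
import torch

assert torch.cuda.is_available(), "needs an NVIDIA GPU with a CUDA build of PyTorch"
props = torch.cuda.get_device_properties(0)
print(f"Name:       {props.name}")
print(f"VRAM:       {props.total_memory / 2**30:.1f} GiB")
print(f"SM count:   {props.multi_processor_count}")  # CUDA cores = SMs x cores/SM (varies by architecture)
print(f"Capability: {props.major}.{props.minor}")
```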
It's not the RT cores being used but the Tensor AI cores.

This matters more when comparing the various RTX cards, as NVIDIA publishes performance metrics for each... though keep in mind that TFLOPS doesn't always equal TFLOPS when moving between architectures.
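(For a back-of-the-envelope sense of scale: the headline figure is theoretical FP32 throughput, 2 FLOPs per CUDA core per clock. The boost clocks below are NVIDIA's published reference values, quoted from memory, so treat them as approximate:)

```python
# Theoretical FP32 throughput: 2 FLOPs per CUDA core per clock.
# Boost clocks are NVIDIA's published reference values (approximate).
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    return 2 * cuda_cores * boost_ghz / 1000

print(f"GTX 1660 Super: {tflops(1408, 1.785):.1f} TFLOPS")  # ~5.0
print(f"RTX 3070 Ti:    {tflops(6144, 1.770):.1f} TFLOPS")  # ~21.7
```

That ratio is only about 4.3x, noticeably less than the 6x+ observed above, which fits the point that the Tensor cores, not raw FP32 throughput, are doing the heavy lifting in Denoise.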
 
I am running a RTX 2070 super in my XPS workstation. It's FAST. No waiting around for stuff with my system.
How many seconds does your card take to run Denoise? What size files and what PC system do you have?

Thanks
I have a Dell XPS 8940.

Core i7-10700, 128GB of RAM, RTX 2070 Super, 4 SSDs, 1 HDD, etc.

Using Luminar Neo and AI denoise on a file from my X-S1, it just does it: I select denoise, and it's done. What software are you using, and why is it taking 30-40 seconds a file, like I'm reading about above?
 
I'm using good old lame Adobe Lightroom Classic.
--
Fronterra Photography Tours
One Lens, No Problem
The Point and Shoot Pro
The People of the Red and White stand with the people of the yellow and blue!
 
Have you tried to do AI noise reduction on a raw file of any sort in Lightroom or ACR?
No. I don't pay subs for my photo software. I pay a single price, so I use Affinity and Luminar Neo. For just-for-fun photos I use Polarr.
 
OK. Never mind. Then you are not addressing what the OP wrote about.

--
Kind regards
Kaj
http://www.pbase.com/kaj_e
WSSA member #13
It's about time we started to take photography seriously and treat it as a hobby. - Elliott Erwitt
 
OK, I was giving my experience with my system. Maybe the OP should try some modern software instead of old stuff like Lightroom...
 
🤣
 
The Point and Shoot Pro ain't wrong, though. I've tried leaving Adobe Lightroom for something else, because when it comes down to speed and performance, Adobe sucks at it. This Denoise feature is a perfect example. It's not our hardware (up to a point), it's their software. Unfortunately, LR has the best workflow for my needs 😭
--
Kind regards
Kaj
http://www.pbase.com/kaj_e
WSSA member #13
It's about time we started to take photography seriously and treat it as a hobby. - Elliott Erwitt
 
This post keeps getting updated with times for a specific 60MP sample file on different systems:

https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

[image: table of Denoise AI times by GPU and CPU]
--
My Flickr Birds
 
As a photographer who does this professionally, IMHO these numbers are unacceptable; even 18 seconds per photo is just too darn long. The software is simply too slow, even with the more expensive GPUs and CPUs. I sure hope Adobe does something about this.

Also, are AMD GPUs completely useless or something? I don't see any on that list. Do AMD's latest 6000 or 7000 series cards not work?
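(To put that in work terms, here's the arithmetic for a hypothetical full-day shoot:)

```python
# What 18 s/photo means for a working pro: a hypothetical 500-photo shoot.
photos = 500
seconds_per_photo = 18
hours = photos * seconds_per_photo / 3600
print(f"{hours:.1f} hours spent just on Denoise")  # 2.5 hours
```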
 
The lack of AMD results is likely more a reflection of Adobe users gravitating towards nVidia cards due to the perception of better driver support. It would be interesting to see some times from recent AMD cards for comparison.

--
My Flickr Birds
 
Is the file being used a fair test? It's certainly an unusual experiment: how fast is a wide assortment of machines, running different flavours and setups of an OS, with different graphics cards?

Perhaps it's not a sufficiently robust description of affairs.
 
Also, are AMD GPUs completely useless or something? I don't see any on that list. Do AMD's latest 6000 or 7000 series cards not work?
I think you may have answered your own question back in the early days of this thread. :-)

https://www.dpreview.com/forums/post/67026235

Performance of the new denoise is targeted to make use of hardware features found in NVIDIA GPUs (and Apple SoCs).
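(That's consistent with how most ML runtimes pick hardware: CUDA on NVIDIA, the Metal path on Apple silicon, CPU as the slow fallback. A generic sketch of the selection logic, in PyTorch terms rather than anything Adobe has published:)

```python
# Generic accelerator selection, the way many ML tools do it (PyTorch terms;
# an illustration, not Adobe's actual code).
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")   # NVIDIA: CUDA / Tensor cores
elif torch.backends.mps.is_available():
    device = torch.device("mps")    # Apple silicon: Metal Performance Shaders
else:
    device = torch.device("cpu")    # fallback, slow for AI denoise
print(f"Would run on: {device}")
```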
 
Is the file being used a fair test? It's certainly an unusual experiment: how fast is a wide assortment of machines, running different flavours and setups of an OS, with different graphics cards?

Perhaps it's not a sufficiently robust description of affairs.
The GPU is the main reason for these times. My system is the last one on this list. The CPU is over 10 years old, yet it does OK with a 2060 GPU on a 60MP file, and it's not far behind a newer-generation 3060 GPU paired with a much faster CPU. My OM-1 20MP files take 16s with my old system.
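(Those numbers line up with Denoise time scaling roughly linearly with pixel count; a quick sanity check, where the 60MP time is a hypothetical figure for an older GPU, not one from the table:)

```python
# If Denoise time scales roughly with megapixels, a 20 MP file should take
# about a third as long as a 60 MP file on the same GPU.
t_60mp = 48.0  # seconds -- hypothetical time for the 60 MP file on an older GPU
t_20mp = t_60mp * 20 / 60
print(f"Predicted 20 MP time: {t_20mp:.0f} s")  # ~16 s, matching the OM-1 report
```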

--
My Flickr Birds
 
Someone recently claimed that their 2060 Super does it instantaneously with Luminar. So the problem is not the GPUs but the software. And yes, per the response from Shood, it would be interesting to see what the new AMD GPUs can do with Lightroom. I find it really surprising no one is using one hahaha :-)
 
