I was wrong - ACR AI denoise is quite nice

chrishurley · Senior Member · Messages: 1,248 · Reaction score: 537 · Location: US
So I've been giving ACR's AI denoise another shot based on advice in another thread, and I'm pretty pleased with the results.

I upgraded my GPU from a 1660 Ti to a 3060, and the denoise time for a 20 MP Olympus file went from about a minute (it always estimated 15-20 seconds but never took that little) to around 10 seconds. I will probably go ahead and upgrade the rest of my machine while I'm at it, because I'm using a scavenged-parts PC with an older CPU, SSD, etc. On a machine with no compatible GPU, it usually takes 20-30 minutes!

I've rolled my eyes a little when people call AI denoise a game-changer for Micro Four Thirds (even after using it some), but it really is, as long as your machine can handle it. At 1-30 minutes per photo it is of limited value, but 10 seconds to run on a keeper isn't bad at all.

Of course it works just as well for formats other than m43. I also shoot full frame, and there is often plenty of opportunity to denoise those files too.

A lot of the time the basic denoising in ACR is fine if I don't want to wait, but for keepers, AI denoise can do a little better.
 
I think each ACR update has brought small improvements. It is a bit slower than DxO PureRAW 4, but for low-volume shooters like me that's not really an issue. And although many consider it a bit of a gimmick :-) the Lens Blur feature, with a bit of fine tweaking, can give surprisingly decent results.

--
Jim Stirling:
“It is one thing to show a man that he is in error, and another to put him in possession of truth.” Locke
Feel free to tinker with any photos I post
 
I don't denoise a lot, but I like the results from ACR AI. My M1 MacBook Pro takes 25-30 s, which is manageable for my use but still a little annoying. Unfortunately, there's no option to upgrade components if I ever need to :(
 
It really is a game-changer. High ISO is usable on any modern camera, but the image looks heaps better after going through Adobe's AI Denoise. That, plus the computational modes in many M43 cameras, evens the playing field a bit.
 
ACR AI denoise is quite good, but I have an older laptop, which takes about 1 minute to denoise each E-M5 II ORF. That's why I use DxO PureRAW: I run all my files through it, with either DeepPRIME or DeepPRIME XD, and then process the DNG files in ACR. The output is in the "good enough" category even up to ISO 6400, making it much more difficult to justify "upgrading" to full frame.
 
It must still be pretty capable if it runs it in 1 minute. My "older laptops" take more like 30 minutes, but they have no GPU compute capability to help with the machine-learning part.
 
A big advantage of ACR AI noise reduction is that you don't have to round-trip to a different program, like DxO, where the I/O can make the process seem onerous even with NVMe drives and an uber CPU. For really noisy images it's worth the wait for DxO to do its thing; for average noise, Adobe does a good enough job.

In my experience with a range of GPUs, from an AMD 580 to current Nvidia cards, CPU IPC seems more important for just about all "AI" processing. I've noticed a bigger difference from a faster CPU than from a faster GPU, which suggests there is no point investing beyond a mid-tier GPU for Adobe's tools (and Topaz) if you don't have an upper-tier CPU. Unless you also play games.

Those so inclined might find it informative to monitor CPU and GPU usage while "AI" image processors are crunching the data. Generally the CPU seems all-in while the GPU is barely stimulated.
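For anyone curious who has an NVIDIA card, here's a minimal sketch of that kind of monitoring in Python. It assumes the `nvidia-smi` CLI is on the PATH; the helper names are my own, not from any particular tool. Run it in a terminal, kick off a denoise batch, and watch whether the numbers spike:

```python
import subprocess
import time

def parse_utilization(csv_line):
    """Parse one line of nvidia-smi CSV output, e.g. "37, 52",
    into (gpu_core_percent, gpu_memory_percent)."""
    core, mem = (int(field.strip()) for field in csv_line.split(","))
    return core, mem

def watch_gpu(samples=10, interval=1.0):
    """Print GPU core and memory utilization once per interval.
    NVIDIA-only: shells out to the nvidia-smi CLI."""
    query = ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.memory",
             "--format=csv,noheader,nounits"]
    for _ in range(samples):
        line = subprocess.check_output(query, text=True).splitlines()[0]
        core, mem = parse_utilization(line)
        print(f"GPU core {core:3d}%   GPU memory {mem:3d}%")
        time.sleep(interval)
```

One caveat: Task Manager's default GPU graphs can under-report compute workloads (its headline chart tends to show the 3D engine), so raw utilization figures like these can be more telling than the graphs.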

I haven't seen any big difference between similar tiers of Nvidia and AMD GPUs in any of the image-processing programs I've used (Adobe, Topaz, On1), so there's that too. Puget Systems has data showing much the same for Adobe's apps.
 
I used to use Topaz, but then I needed the more expensive Adobe subscription plan that had Photoshop, so using Denoise saved me $10 a month.

I also use a laptop with integrated graphics, and M43 files usually take 15 seconds or less for me. I definitely think M43 is in a good spot to benefit from denoising, since it has good enough bones to make it worthwhile (versus a cell phone).

However, I have been using it less because handheld high-res (HHHR) is actually even more effective!
 
I haven't seen any big difference between similar tiers of nVidia and AMD GPUs in all image processing programs I've used...
I saw a pretty big boost going from the GTX 1660 to the RTX 3060 with an 8th-gen i5 in ACR. It could be that any RTX card with tensor cores would have worked similarly. Task Manager in Windows doesn't show the CPU maxed out during denoise, though I'm sure a faster one wouldn't hurt; the AI denoise in ACR is clearly running mostly on the GPU when it can. I can't remember exactly what it did on my 13th-gen i5, but it wasn't even remotely close to the slower CPU with either video card.

Unfortunately, I couldn't find any concrete benchmarks for this specific task, but the 3060 was said to work well for it, and it does. ACR went from "limited acceleration" to "full acceleration".

I gather the Apple CPU situation is totally different, since it has compute resources built in that can be leveraged for this kind of machine-learning task.
 
I think I used Adobe Camera Raw (ACR?) as a plugin with PSE (v15) in the past. I haven't reinstalled or used PSE since getting my new computer (it's been a while).

Does ACR work only with the subscription products, or can it be used with any other products?

Thanks.
 
However, I have been using it less because HHHR is actually even more effective!
While HHHR certainly has a strong denoising effect on its own, there's still good reason to run HHHR raws through ACR/LR AI Denoise, because Adobe's AI algorithm tends to clean up some of the pixel-level artifacts that can appear in HHHR images. It's not a magic bullet, but you will sometimes see less ugliness in the motion artifacts and cleaner edges.
 
Different camera, but I do know what you mean!

I quite liked this, processed as well as I could using "ordinary" denoising in LR, but it wasn't quite what I'd seen one rainy lunchtime in a Roman side street!



...

- but I liked this SO much more. It allowed for some more sharpening, then removed the bad noise, leaving behind the small stuff that's meant to be there.



Glad I finally swapped my 11-year-old Dell for something capable of running LR's AI features.

Peter
 

Attachments: 4437571.jpg, 4437572.jpg
In my experience with a range of GPUs from an AMD 580 to current nVidia: CPU IPC seems more important for just about all "AI" processing...
I upgraded my CPU, memory, and everything else with a new machine today. ACR denoise takes basically the same time as on my old machine, and the CPU continues to do very little while it runs. Upgrading from the 1660 Ti to the RTX 3060 made a significant improvement.

At least for Adobe's AI denoise, the GPU is the first-order thing to change, assuming the CPU and the rest of the system are at least plausible.
 
For those interested in upgrading their GPU, here is a good site for comparing NVIDIA offerings:

GPU compare
 
If you use it in conjunction with the raw Enhance utility it really is fantastic. I've not used DxO for a long time since this feature came into Camera Raw.

Adobe refers to Enhance as re-rendering the raw file, but from eyeballing it on my images, it produces smoother but cleaner edges and kills off the artefacts I sometimes see in my ORF files (depending on the content of the image). It allows for crisper but cleaner output, and used in conjunction with AI Denoise (I set mine to about 30, less than the default) it looks really quite natural.

I personally think Adobe have done a great job there, especially given the level of competition among denoising and raw-enhancement tools available now.
 
Does ACR work only with the subscription products, or can it be used with any other products?
Yeah, it's an Adobe-only product; apparently it does still work as part of PSE, according to Google.

Generally, though, it's embedded as a raw plugin within Photoshop, or exists within Lightroom and Lightroom Classic; those are the only programs that make use of ACR.
 
Denoise is quite speedy on my M2 Mac Mini, and I use it often. It is very effective indeed; in fact, my greatest tendency is to overdo it. Recently I had some biggish (70 x 50 cm) prints made, and the noise had been removed too efficiently, so that viewed close up the images have that artificial, plasticky look. Slight over-sharpening compounds the problem. (When exporting a file from Lightroom that is going to be printed somewhere, I now enter 'none' in the sharpening box.)

It's a reminder that in a print noise is usually much less obvious than pixel-peeping leads you to expect, and it's best to leave a bit in the image.
 
Denoise is quite speedy on my M2 Mac Mini,
Apple has built machine-learning specialty cores into their newer processors, which can be leveraged for this sort of thing. PCs have generally gone the GPU route for that capability, but it is handy having it in the CPU when you don't need everything else a GPU would do.

Apple was ahead of the curve in including these things in a CPU. Previously this specialty compute capability was found in the GPU, but it looks like all the major processor makers are starting to incorporate some of that functionality as well.

It reminds me of the old days that many here are old enough to remember: the "math coprocessor" chips you used to be able to add to a computer.

I'd like to see it show up in a USB brick that could be used across computers. Seeing how idle my CPU is when I'm running denoise makes me think this is a workload you would just shove over the bus into the compute cores and then wait for them to hand back the result. USB might be plenty fast for that.
 
As an insect macro photographer, I find the Enhance feature in Lightroom has one very big benefit over Topaz Denoise AI. The Topaz product has an annoying habit of seeing compound eyes as noise and smoothing them out, so I then have to paint them back; Enhance sees them as legitimate detail and preserves it as such.

I've no experience of any other AI noise reduction software, so I don't know how common this issue is.
 
ACR? Please define.

thanks!
 
