High CPU with Camera Raw Denoise?

Redcrown
Asking for help to confirm or deny an issue I see with the "new" Adobe Camera Raw Denoise AI on Windows 11. Please test and report whether you see the same.

1. Open a "virgin" raw file in ACR while monitoring the CPU utilization.
2. Tick the Denoise option, see the "Enhance" window for a few seconds, then...
3. Don't do anything, just sit and watch the CPU percentage.

On my Win11 system I see the CPU ramp up to 95%+ and hold there for 20 to 30 seconds, yet nothing appears to be happening. If I click "Done" while the CPU is high, the ACR window closes immediately, the CPU drops to idle, and I get a properly denoised file. However, if I reopen that file later in ACR, I see the same high CPU for the same 20-30 seconds.
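
If you'd rather log the spike than eyeball Task Manager, here's a minimal sketch that samples overall CPU utilization once a second (assuming Python with the psutil package installed; run it in a second window while you do the steps above):

    import time
    import psutil  # pip install psutil

    # Sample total CPU utilization once a second for 60 seconds, so the
    # post-"Enhance" spike (and how long it lasts) shows up in the log.
    for _ in range(60):
        pct = psutil.cpu_percent(interval=1)  # blocks ~1 s, returns percent
        print(f"{time.strftime('%H:%M:%S')}  CPU {pct:5.1f}%")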

On my high-powered desktop, with a liquid-cooled CPU and many fans, this is no problem. But on any battery-powered laptop it will certainly drain the battery and cause overheating.
 
How long does it say the Denoise will take? Do you have a GPU it can use?
 
The "new" Denoise AI in ACR Version 17 no longer show a time estimate like the old DNG creating version did. A progress window appears simply saying "Enhancing..." and lasts about 5 seconds before the CPU begins the 40 second crunching.

I have an Nvidia RTX 3070 8GB GPU. During the 40 second high CPU session, the GPU floats erratically between 3% and 14%.
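
For the GPU side, the same logging idea works with nvidia-smi, which ships with the Nvidia driver (a sketch, assuming it is on the PATH):

    import subprocess
    import time

    # Poll GPU utilization once a second; utilization.gpu is the percent
    # of time the GPU was busy over the driver's last sample window.
    for _ in range(60):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True)
        print(f"{time.strftime('%H:%M:%S')}  GPU {out.stdout.strip()}%")
        time.sleep(1)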
 
Camera Raw Denoise responds to GPU horsepower but whips the CPU much harder than the GPU.

If you watch Topaz and On1 hardware utilization, it's a similar situation: high CPU utilization, barely stressed GPU.

Perhaps that explains why the Puget Systems Photoshop benchmark scores show a much bigger impact from the CPU than from the GPU.

For image processing, as opposed to gaming, I guess you should put more money into the CPU than the GPU.

I wish there were more objective tests of CPU/GPU performance in different image processing applications, with databases of scores. I guess the target audience is too small compared to gamers.
 
Camera Raw Denoise responds to GPU horsepower but whips the CPU much harder than the GPU.
That's unfortunate, as the current LrC Denoise really takes advantage of my GPU: 10 seconds for a full denoise. I'd hate to take a step back if the Camera Raw Denoise makes it into production in LrC.
 
Asking for help to confirm or deny an issue I see with the "new" Adobe Camera Raw Denoise AI on Windows 11. Please test and report whether you see the same.
Found this article that perhaps could be of assistance. Perhaps!

https://community.adobe.com/t5/ligh...i-denoise-gpu-vs-cpu-time-issues/m-p/14285170

--
Major Jack
"You are welcome to retouch any photograph I post in these forums without prior consent from me". Have fun, and play as you wish.
 
Thanks for the replies, but I think many are missing the point. On my Win11 system, ACR runs the CPU to 100% and the GPU to 15% long AFTER the Denoise task is complete and the system is apparently doing nothing. Proof of that: when I interrupt the process by simply clicking Done, I still get a properly denoised image.

That can't be right, so I'm trying to determine whether it's unique to my system or widespread. Somebody please run the test.

P.S. I've posted the same in Adobe's own Camera Raw forum but have had no response. There's been no response there to many other questions as well.
 
Forgot to mention earlier: in testing this issue I tried turning off GPU processing in ACR. It makes no difference; I get the same CPU and GPU percentages for the same times. The GPU settings in ACR's Performance preferences appear to be non-functional.
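
One way to check whether ACR is really touching the GPU, independent of the preference toggle, is to list the processes currently holding a compute context on the card. A sketch, again assuming nvidia-smi is available; note that if Denoise runs through DirectML rather than CUDA it may not appear here, so an empty list is inconclusive:

    import subprocess

    # List processes that currently hold a CUDA/compute context on the GPU.
    # If Photoshop/ACR appears here during Denoise, the GPU is engaged even
    # when the utilization percentage looks low.
    out = subprocess.run(
        ["nvidia-smi", "--query-compute-apps=pid,process_name,used_memory",
         "--format=csv"],
        capture_output=True, text=True)
    print(out.stdout)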
 
Forgot to mention earlier: in testing this issue I tried turning off GPU processing in ACR. It makes no difference; I get the same CPU and GPU percentages for the same times. The GPU settings in ACR's Performance preferences appear to be non-functional.
Have you tried this by setting it to always use the GPU?
 
That sounds like a configuration-specific Adobe bug to me. Denoise is supposed to be almost all GPU-based. The CPU shouldn't work hard at all during Denoise; I have never seen the "CPU hammering" claimed by mboag. The CPU tends to take a break and stay idle, cool, and low power.

I just ran the same test on my M1 Pro MacBook Pro. During the full course of the Denoise run, everything was as expected and as documented by many:
  • CPU performance-core usage under 25%
  • GPU usage at or near 100%
  • Laptop fans not even coming on (though that may be due to Apple Silicon thermal efficiency)
When I say you might be seeing a "bug", I mean a bug possibly related to the fact that you have Technology Preview turned on. Do you see the same results if you Denoise the old way, with Technology Preview off? (I got the same time and results either way.)

If the CPU problem goes away after you turn off the Tech Preview setting (and restart the host app) and try again, then I'd propose there is a bug in the new path on your configuration, and you should report it to Adobe so they can fix it before the new way makes it out of Technology Preview. Unusual CPU usage during Denoise doesn't happen on other configurations (like mine).
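
To make that on/off comparison less subjective, one could log the samples during each run and compare the peaks. A rough sketch, assuming Python with psutil:

    import statistics
    import psutil  # pip install psutil

    def sample_cpu(seconds=45):
        """Collect one CPU sample per second; return (average, peak)."""
        samples = [psutil.cpu_percent(interval=1) for _ in range(seconds)]
        return statistics.mean(samples), max(samples)

    # Run once with Technology Preview on and once with it off (restarting
    # the host app in between), then compare the two peak values.
    avg, peak = sample_cpu()
    print(f"avg {avg:.1f}%  peak {peak:.1f}%")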
 
That sounds like a configuration-specific Adobe bug to me. Denoise is supposed to be almost all GPU-based.
This is what one of my laptops is showing. This is from LrC using Denoise AI.

Looking at the new NVIDIA app, it tells me power, voltage, clock speeds and so forth. For this function the 4070 never uses more than 26 W, about one sixth of its maximum. So something odd is happening.
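
Power draw is also queryable from the command line, which makes it easy to log alongside the test (a sketch assuming nvidia-smi on the PATH):

    import subprocess
    import time

    # Log reported GPU power draw against the board limit once a second;
    # a card loafing at ~26 W of its limit shows up immediately.
    for _ in range(30):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw,power.limit",
             "--format=csv,noheader"],
            capture_output=True, text=True)
        print(time.strftime("%H:%M:%S"), out.stdout.strip())
        time.sleep(1)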

[screenshots of the utilization readouts]
 
This is what one of my laptops is showing. This is from LrC using Denoise AI.
This is from ACR via PS Beta.

Same image, but this now uses 35 W.

[screenshots of the utilization readouts]

With these settings:

[settings screenshot]
 
The "new" Denoise AI in ACR Version 17 no longer show a time estimate like the old DNG creating version did. A progress window appears simply saying "Enhancing..." and lasts about 5 seconds before the CPU begins the 40 second crunching.

I have an Nvidia RTX 3070 8GB GPU. During the 40 second high CPU session, the GPU floats erratically between 3% and 14%.
That shouldn't happen. The program should be using the GPU with very little CPU usage. Is there a setting in the software to use the GPU?
 
Camera Raw Denoise responds to GPU horsepower but whips the CPU much harder than the GPU.

If you watch Topaz and On1 hardware utilization, it's a similar situation: high CPU utilization, barely stressed GPU.
I use DxO PhotoLab and Topaz Photo AI. Both of those use the GPU heavily and the CPU hardly at all.
 
The "new" Denoise AI in ACR Version 17 no longer show a time estimate like the old DNG creating version did. A progress window appears simply saying "Enhancing..." and lasts about 5 seconds before the CPU begins the 40 second crunching.

I have an Nvidia RTX 3070 8GB GPU. During the 40 second high CPU session, the GPU floats erratically between 3% and 14%.
That shouldn't happen. The program should be using the GPU with very little CPU usage. Is there a setting in the software to use the GPU?
It’s in the Preferences section.
 
With these settings:

[settings screenshot]
Those aren’t the right settings. Those are the Technology Preview settings from Photoshop. I am referring to the Technology Preview settings in Camera Raw’s preferences, not Photoshop’s.

It is odd that your GPU utilization is low in the other screenshots. The CPU utilization shown is low, as expected, so that part is normal.
 
Those aren’t the right settings. Those are the Technology Preview settings from Photoshop. I am referring to the Technology Preview settings in Camera Raw’s preferences, not Photoshop’s.
Are they not one and the same thing?
 
I just ran this test on my newly assembled Win 11 machine: Asus Dark Hero motherboard, i7-14700K, 64GB RAM and an Nvidia RTX 3070 Ti video card.

My Task Manager results were nearly identical in PS 2025 (opened initially in ACR) and in Lightroom Classic using Detail -> Enhance -> Denoise.

CPU < 10%, GPU 90%+. Denoise processing time on a Nikon Z 9 45MP file shows 9 seconds. There is no real indication of when it has finished, but watching Task Manager I can tell it's done; the GPU graph stays high for about that long, perhaps a little less.

BTW, I upgraded from an i7-8700K system using the same 3070 Ti video card. All PS and LrC functions are far faster with the i7-14700K. I'm really happy about that.

--
Regards,
Nikon Z 9, Z 8, 14mm-800mm + Z 1.4TC.
Computer Win 11 Pro, i7-14700K, 64GB, RTX 3070 Ti. Travel machine: 2021 MacBook Pro M1 Max 64GB. All Adobe apps.
FAA Remote Pilot Certificate, ATP ASMEL
Mizzou PJ '66 - Amateur Radio K6KT
www.kenseals.com
 
Are they not one and the same thing?
No, absolutely not. It should already have been clear from my other reply, but I will make it even clearer.

There are Technology Preview options in Photoshop and in Camera Raw, and they are different.

I am looking in Photoshop 26.1 under Preferences/Technology Previews; the list is what was posted above and does not affect Camera Raw:
  • Enable Preserve Details 2.0 Upscale
  • Enable Content-Aware Tracing Tool
  • Precise color management for HDR display
  • Precise previews for 16-bit documents
Now I am looking at the Technology Previews section in Camera Raw 17's preferences, which is a separate menu item (on the Mac; I am not sure about Windows), or you can reach it from the gear icon inside Camera Raw itself. There is only one setting:
  • New AI Features and Settings Panel
This is the setting that controls whether Denoise is in a dialog box (the currently normal way) or directly in the Details panel (the new, unproven way undergoing public user testing).
 
There are Technology Preview options in Photoshop and in Camera Raw, and they are different.
The clarity was not so transparent, I'm afraid.

I think this is the area you are describing. However, clicking 'Learn more' there takes you to the same place as this link:

https://helpx.adobe.com/uk/camera-raw/using/enhance.html#ai-enhancements

The tick box was already selected when I viewed it.

ACR was launched from Photoshop Beta, 26.2.0 R.

[screenshot of the Camera Raw Technology Previews setting]

 
