Mini update from the OP.
I updated my GPU driver, with interesting results. I had upgraded the entire Win11 OS to version 23H2 just 2 weeks ago, and I noted I got a new GPU driver in that process (dated 8/14/24). So I thought I was OK. But I checked, and Nvidia said the current driver was actually dated 5/14/25, just 2 weeks old.
The driver update was a minor PITA. Turns out Nvidia has killed "GeForce Experience" and replaced it with a new app simply called "Nvidia App". So I had to download/install that first. The driver update then went well, and even uninstalled all traces of the old "GeForce Experience". However, it tries to force some gaming junk to be included, and you have to "opt out" a few times.
With the new GPU driver, I repeated some tests. I loaded 25 "naked" raw files into ACR, then used the filmstrip to AI Denoise them one at a time. The GPU memory went to 5.4 GB and stayed there. Denoise speed remained normal at 8 seconds per image. Then I did it again, this time selecting all files and running Denoise in batch. Same result, normal 8 second speed, no memory bloat.
So, that sounds good, but time will tell. In my real world workflow I Denoise the raw images one at a time, bring one into Photoshop, and hammer away. I do lots of other "AI" stuff in Photoshop. Some removes, auto selections, Gen Fill, etc. Then I go back to ACR, grab the next image, and repeat. So, I don't know yet if all those round trips to Photoshop are causing problems that eventually degrade Denoise.