Playing the "Used Game" and Consolidating to one Mac

I had a 13" 2020 M1 which I used for travel; however, I'm not working. It's just a hobby. I found the 13" a tad small, so I traded it in for a new 15" MacBook Air M3 with 24GB RAM and 512GB. That bit of extra size made a big difference. I'm used to my desktop, which has a 27" 5K screen.

I tell people to go a little bigger if they can manage it. Our age makes a difference as well.
The 15" Air is a fine choice for travel. But for my needs it doesn't have the processing grunt that a Max chip with more GPU cores and Ram gives me.

We are spoilt for choice, it's just a question of making the right choice (which may change over time too).
Hi Ray, what video files do you work with?
I’m stills only. But I bulk process a lot of images, often 500 plus per shoot day. Plus AI NR quite a bit. So that’s where I appreciate extra GPU cores and more ram. It speeds things up a lot.
Or you could get it done in 1/2 the time with DxO, no costly extra GPU cores required. My $1000 M4 MacBook Air running DeepPRIME will be as fast as a $4000 Mac Studio Ultra running Denoise. See my other posts in conversation with Zeee for more about this.

--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
http://jacquescornell.photography
http://happening.photos
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds. If I was a pro I'd be maxing out on my gear no matter what I used. These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search, and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DXO export time to LrC Denoise time? Or are you comparing how long it takes to view a DeepPRIME preview of what it will look like after you export?
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first. Plan to batch, set it up, walk away then come back and go full bore editing. I prefer to see the full Denoised file as I develop. That was a game changer to me but it is a personal preference. You prefer your method.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL, I did that at the end of an editing session because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.

Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.


--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
Matt K. The NEW Way to Batch Noise Reduction in Lightroom

Yeah, I know all that. What he's saying is in accord with what I'm saying: when you sync Denoise across a batch of photos, you have to wait for Denoise to process before you can move on to doing anything else. In the video, it's 2 minutes for 4 photos. Now, extrapolate that to 2,000 photos. Yah, no thanks. Not in the middle of my workflow.
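For readers who want the arithmetic behind that extrapolation, here is a minimal sketch using only the figures quoted from the video (2 minutes for 4 photos); it is illustrative, not a benchmark, since real times depend on hardware and file size.

```python
# Back-of-the-envelope extrapolation using only the figures quoted above
# (2 minutes for 4 photos in the video). Real times depend on hardware and
# megapixel count, so treat this as illustrative, not a benchmark.

seconds_per_photo = (2 * 60) / 4              # 30 s per photo in the video
batch_size = 2000                             # an event-sized batch
total_hours = batch_size * seconds_per_photo / 3600

print(f"{seconds_per_photo:.0f} s/photo -> {total_hours:.1f} h for {batch_size} photos")
# 30 s/photo -> 16.7 h for 2000 photos
```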
I repeat. If you are doing this in the middle you are not following the suggested order.


I have tweaked this to shorten the list but Denoise is always first. It was easy to adapt to it.

[attached screenshot of the suggested order]





Anyway, we're now way OT, so I won't pursue this conversation any further here. If you want to discuss, you can either DM me or start a new conversation.
--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds.
8 seconds for how many megapixels? My M1 Max Studio with the 32-GPU-core upgrade and my lowly M4 MBA both do DP3 on a 40MP RAW in under 4 seconds.
If I was a pro I'd be maxing out on my gear no matter what I used.
I'm running a business to make a living, so ROI is a major factor in my buying decisions. The ROI on DxO is through the roof.

I don't know where folks get the idea that pro photographers are made of money and can buy all the newest and greatest toys. Do you have any idea what the average income of a pro photog in the U.S. is?

From ShotKit.com:

"...according to Forbes, the highest paying salaries for photographers are in the District of Columbia, reaching an average of US$82,840. The lowest income is registered in North Carolina with US$33,630."

Salary.com: $89,632

U.S. Bureau of Labor Statistics: $42,520

Even the Salary.com figure is barely scraping by for a family of four trying to own a home in a major metropolitan area. Not a lot of extra for that pair of a9IIIs, maxed out MacBook Pro, Mac Studio Ultra, f2.8 zooms, f1.4 primes, thousands of dollars of lighting equipment, and a reliable vehicle, never mind health insurance that's about to double in cost so Elon Musk can get a tax break. Don't get me started. Long story short, most photogs have to be smart to do the most with the least. Over 25 years, I've become an expert in this.
These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search, and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DXO export time to LrC Denoise time? Or are you comparing how long it takes to view a DeepPRIME preview of what it will look like after you export?
I'm comparing how long it takes DxO to export a finished DNG or JPEG that has DeepPRIME applied as compared with how long it takes LRC to complete its Denoise processing.
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first.
Applying Denoise to all 2000 images rather than culling down to 800 selects and then Denoising those would be a colossal waste of time.
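A rough sketch of the cost of skipping the cull, assuming the 8-seconds-per-image figure quoted earlier in this thread; substitute your own measured per-image time, since the point is the proportion rather than the exact number.

```python
# Rough cost of denoising everything versus culling first, assuming the
# 8-seconds-per-image figure mentioned earlier in the thread. Substitute
# your own measured per-image time; the point is the proportion, not the
# exact number.

seconds_per_image = 8        # assumed Denoise time per image
downloaded = 2000            # images off the cards
selects = 800                # images kept after culling

extra_hours = (downloaded - selects) * seconds_per_image / 3600
print(f"Denoising the rejects too costs about {extra_hours:.1f} extra hours")
# about 2.7 extra hours
```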
Plan to batch, set it up, walk away then come back and go full bore editing.
Still gotta cull first.
I prefer to see the full Denoised file as I develop.
I don't need to.
That was a game changer to me but it is a personal preference. You prefer your method.
DxO's default basic NR for full-screen preview is instant and sufficient for culling and adjusting. If I really want to check DeepPRIME denoising on a few images, the Loupe tool is fast and effective. I generally don't need to, though, because I know from experience what the end result is going to look like.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL, I did that at the end of an editing session because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
I'm not telling you what to do. I'm describing the advantages of DxO.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.
That's what I'm providing to offset the FUD that's often posted about DxO by folks who haven't done a careful and thorough comparison.
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
http://jacquescornell.photography
http://happening.photos
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds.
8 seconds for how many megapixels? My Studio Max and M4 MBA both do DP3 on a 40MP RAW in under 4 seconds.
I don't know. This is what a pro stated at the Lightroom Queen site. So DP3 takes 4 seconds and LrC 8. I don't see a huge issue.
If I was a pro I'd be maxing out on my gear no matter what I used.
I'm running a business to make a living, so ROI is a major factor in my buying decisions. The ROI on DxO is through the roof.
Glad it works for you. ROI is a critical part of any business investment.

These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search, and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DXO export time to LrC Denoise time? Or are you comparing how long it takes to view a DeepPRIME preview of what it will look like after you export?
I'm comparing how long it takes DxO to export a finished DNG or JPEG that has DeepPRIME applied as compared with how long it takes LRC to complete its Denoise processing.
40 seconds to 80 is significant. 4 to 8 is not. For me anyway.
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first.
Applying Denoise to all 2000 images rather than culling down to 800 selects and then Denoising those would be a colossal waste of time.
Plan to batch, set it up, walk away then come back and go full bore editing.
Still gotta cull first.
I like to take a break after culling and give my eyes a rest. I don't consider that part of file development, but we both have our personal ways of doing things.
I prefer to see the full Denoised file as I develop.
I don't need to.
I need to. It's critical for my wildlife work. In your case, doing mass edits, that isn't necessary for you. Even as a pro I would want to develop wildlife shots by seeing the full denoised file. We both have our requirements.
That was a game changer to me but it is a personal preference. You prefer your method.
DxO's default basic NR for full-screen preview is instant and sufficient for culling and adjusting. If I really want to check DeepPRIME denoising on a few images, the Loupe tool is fast and effective. I generally don't need to, though, because I know from experience what the end result is going to look like.
The last LrC update improved import times significantly. Using Embedded & Sidecar at import allows instant culling while the previews build.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL, I did that at the end of an editing session because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
I'm not telling you what to do. I'm describing the advantages of DxO.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.
That's what I'm providing to offset the FUD that's often posted about DxO by folks who haven't done a careful and thorough comparison.
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
Over and out.
Sure.

--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
Matt K. The NEW Way to Batch Noise Reduction in Lightroom

Yeah, I know all that. What he's saying is in accord with what I'm saying: when you sync Denoise across a batch of photos, you have to wait for Denoise to process before you can move on to doing anything else. In the video, it's 2 minutes for 4 photos. Now, extrapolate that to 2,000 photos. Yah, no thanks. Not in the middle of my workflow.
I repeat. If you are doing this in the middle you are not following the suggested order.
I HAVE TO CULL FIRST. WHAT PART OF THAT ARE YOU NOT UNDERSTANDING?

I have tweaked this to shorten the list but Denoise is always first.
NO! CULLING IS FIRST. JEEZ. ADOBE HASN'T GOT A FING CLUE ABOUT AN EVENT SHOOTER'S WORKFLOW. ARE YOU DELIBERATELY IGNORING WHAT I'M SAYING SO YOU CAN "WIN"? I'M DONE TALKING WITH YOU. YOU'RE CLEARLY NOT LISTENING AND ARE COMMITTED TO PUSHING AN AGENDA. I'VE PROVIDED THE INFORMATION FOR OTHERS. NOW, YOU'RE ON MY IGNORE LIST.
It was easy to adapt to it.

[attached screenshot of the suggested order]
Anyway, we're now way OT, so I won't pursue this conversation any further here. If you want to discuss, you can either DM me or start a new conversation.
QUIT SPREADING FUD ABOUT DXO!

--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds.
8 seconds for how many megapixels? My Studio Max and M4 MBA both do DP3 on a 40MP RAW in under 4 seconds.
I don't know.
Well, pixel count impacts processing time in a linear fashion, so the "8 seconds" figure is meaningless unless you know the pixel count.
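One way to make timings from different cameras comparable is to normalize them to seconds per megapixel, assuming the roughly linear scaling described above; this minimal sketch just reuses numbers already quoted in the thread.

```python
# Timings from different cameras only become comparable once you normalize
# by pixel count, assuming the roughly linear scaling described above.
# The sample numbers below are the ones quoted in this thread, not new
# benchmarks.

def seconds_per_megapixel(seconds: float, megapixels: float) -> float:
    return seconds / megapixels

print(seconds_per_megapixel(4, 40))   # 0.10 s/MP: DP3 on a 40MP RAW, as quoted
print(seconds_per_megapixel(8, 40))   # 0.20 s/MP: the "8 seconds" claim, if it was a 40MP file
print(seconds_per_megapixel(8, 24))   # 0.33 s/MP: the same claim, if it was a 24MP file
```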
This is what a pro stated at the Lightroom Queen site.
Oh, well, if some random "pro" on a Lightroom site says Lightroom is the ultimate, it must be so.
So DP3 takes 4 seconds and LrC 8. I don't see a huge issue.
You don't see the difference between 4 and 8? And yet you're suggesting the OP spend thousands on maxed-out hardware for a 50% speed gain???
If I was a pro I'd be maxing out on my gear no matter what I used.
I'm running a business to make a living, so ROI is a major factor in my buying decisions. The ROI on DxO is through the roof.
Glad it works for you. ROI is a critical part of any business investment.

These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search, and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DXO export time to LrC Denoise time? Or are you comparing how long it takes to view a DeepPRIME preview of what it will look like after you export?
I'm comparing how long it takes DxO to export a finished DNG or JPEG that has DeepPRIME applied as compared with how long it takes LRC to complete its Denoise processing.
40 seconds to 80 is significant. 4 to 8 is not.
How about 4 hours vs. 8 hours?
For me anyway.
For you. Not everybody is you. Some of us need better solutions.
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first.
Applying Denoise to all 2000 images rather than culling down to 800 selects and then Denoising those would be a colossal waste of time.
Plan to batch, set it up, walk away then come back and go full bore editing.
Still gotta cull first.
I like to take a break after culling and give my eyes a rest. I don't consider that part of file development, but we both have our personal ways of doing things.
You don't work on deadlines.
I prefer to see the full Denoised file as I develop.
I don't need to.
I need to. It's critical for my wildlife work. In your case, doing mass edits, that isn't necessary for you. Even as a pro I would want to develop wildlife shots by seeing the full denoised file. We both have our requirements.
Again, I'm not telling you what to do, but you're insisting that what's good enough for you is good enough for everyone else. This hubris is sadly common among amateurs in these forums.
That was a game changer to me but it is a personal preference. You prefer your method.
DxO's default basic NR for full-screen preview is instant and sufficient for culling and adjusting. If I really want to check DeepPRIME denoising on a few images, the Loupe tool is fast and effective. I generally don't need to, though, because I know from experience what the end result is going to look like.
The last LrC update improved import times significantly. Using Embedded & Sidecar at import allows instant culling while the previews build.
So? What does that have to do with DxO and noise reduction? Also, then you have to wait for a 1:1 preview to build when you want to zoom in to check focus, which is hugely disruptive when culling thousands of images and why I build 1:1 previews on import.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL, I did that at the end of an editing session because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
I'm not telling you what to do. I'm describing the advantages of DxO.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.
That's what I'm providing to offset the FUD that's often posted about DxO by folks who haven't done a careful and thorough comparison.
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
Over and out.
Sure.
You just aren't listening. You think everyone needs what you need. I got news for you: pros need more.

Bye.

--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
http://jacquescornell.photography
http://happening.photos
 
Matt K. The NEW Way to Batch Noise Reduction in Lightroom

Yeah, I know all that. What he's saying is in accord with what I'm saying: when you sync Denoise across a batch of photos, you have to wait for Denoise to process before you can move on to doing anything else. In the video, it's 2 minutes for 4 photos. Now, extrapolate that to 2,000 photos. Yah, no thanks. Not in the middle of my workflow.
I repeat. If you are doing this in the middle you are not following the suggested order.
I HAVE TO CULL FIRST. WHAT PART OF THAT ARE YOU NOT UNDERSTANDING?
Everyone typically has to cull first. I commented about that in the other post. I don't consider culling part of file development. It's about sorting images. I like to give my eyes a break before I start to develop.

I have tweaked this to shorten the list but Denoise is always first.
NO! CULLING IS FIRST. JEEZ. ADOBE HASN'T GOT A FING CLUE ABOUT AN EVENT SHOOTER'S WORKFLOW. ARE YOU DELIBERATELY IGNORING WHAT I'M SAYING SO YOU CAN "WIN"? I'M DONE TALKING WITH YOU. YOU'RE CLEARLY NOT LISTENING AND ARE COMMITTED TO PUSHING AN AGENDA. I'VE PROVIDED THE INFORMATION FOR OTHERS. NOW, YOU'RE ON MY IGNORE LIST.
No need to yell. :-) Perfect. I like being on your list.
It was easy to adapt to it.

[attached screenshot of the suggested order]
Anyway, we're now way OT, so I won't pursue this conversation any further here. If you want to discuss, you can either DM me or start a new conversation.
QUIT SPREADING FUD ABOUT DXO!


--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
Matt K. The NEW Way to Batch Noise Reduction in Lightroom

Yeah, I know all that. What he's saying is in accord with what I'm saying: when you sync Denoise across a batch of photos, you have to wait for Denoise to process before you can move on to doing anything else. In the video, it's 2 minutes for 4 photos. Now, extrapolate that to 2,000 photos. Yah, no thanks. Not in the middle of my workflow.
I repeat. If you are doing this in the middle you are not following the suggested order.
I HAVE TO CULL FIRST. WHAT PART OF THAT ARE YOU NOT UNDERSTANDING?
Everyone typically has to cull first. I commented about that in the other post. I don't consider culling part of file development. It's about sorting images. I like to give my eyes a break before I start to develop.

I have tweaked this to shorten the list but Denoise is always first.
NO! CULLING IS FIRST. JEEZ. ADOBE HASN'T GOT A FING CLUE ABOUT AN EVENT SHOOTER'S WORKFLOW. ARE YOU DELIBERATELY IGNORING WHAT I'M SAYING SO YOU CAN "WIN"? I'M DONE TALKING WITH YOU. YOU'RE CLEARLY NOT LISTENING AND ARE COMMITTED TO PUSHING AN AGENDA. I'VE PROVIDED THE INFORMATION FOR OTHERS. NOW, YOU'RE ON MY IGNORE LIST.
No need to yell. :-) Perfect. I like being on your list.
You're being deliberately obtuse and spreading misleading "information".
It was easy to adapt to it.

[attached screenshot of the suggested order]
Anyway, we're now way OT, so I won't pursue this conversation any further here. If you want to discuss, you can either DM me or start a new conversation.
QUIT SPREADING FUD ABOUT DXO!


--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds.
8 seconds for how many megapixels? My Studio Max and M4 MBA both do DP3 on a 40MP RAW in under 4 seconds.
I don't know. This is what a pro stated at the Lightroom Queen site. So DP3 takes 4 seconds and LrC 8. I don't see a huge issue.
Then you are blind. 2x is huge when we're talking about thousands of images and hours of processing onsite for same-day delivery.
If I was a pro I'd be maxing out on my gear no matter what I used.
I'm running a business to make a living, so ROI is a major factor in my buying decisions. The ROI on DxO is through the roof.
Glad it works for you. ROI is a critical part of any business investment.

These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search, and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DXO export time to LrC Denoise time? Or are you comparing how long it takes to view a DeepPRIME preview of what it will look like after you export?
I'm comparing how long it takes DxO to export a finished DNG or JPEG that has DeepPRIME applied as compared with how long it takes LRC to complete its Denoise processing.
40 seconds to 80 is significant. 4 to 8 is not.
How about 4 hours vs. 8 hours?
For me anyway.
For you. Not everybody is you. Some of us need better solutions.
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first.
Applying Denoise to all 2000 images rather than culling down to 800 selects and then Denoising those would be a colossal waste of time.
Plan to batch, set it up, walk away then come back and go full bore editing.
Still gotta cull first.
I like to take a break after culling and give my eyes a rest. I don't consider that part of file development, but we both have our personal ways of doing things.
You don't work on deadlines.
I prefer to see the full Denoised file as I develop.
I don't need to.
I need to. It's critical for my wildlife work. In your case, doing mass edits, that isn't necessary for you. Even as a pro I would want to develop wildlife shots by seeing the full denoised file. We both have our requirements.
Again, I'm not telling you what to do, but you're insisting that what's good enough for you is good enough for everyone else. This hubris is sadly common among amateurs in these forums.
Neither am I. I don't care about what you do.
That was a game changer to me but it is a personal preference. You prefer your method.
DxO's default basic NR for full-screen preview is instant and sufficient for culling and adjusting. If I really want to check DeepPRIME denoising on a few images, the Loupe tool is fast and effective. I generally don't need to, though, because I know from experience what the end result is going to look like.
The last LrC update improved import times significantly. Using Embedded & Sidecar at import allows instant culling while the previews build.
So? What does that have to do with DxO and noise reduction? Also, then you have to wait for a 1:1 preview to build when you want to zoom in to check focus, which is hugely disruptive when culling thousands of images and why I build 1:1 previews on import.
Then you're building the wrong previews if you want speed while culling. Library and Develop previews are different. Library previews are stored in the Lightroom folder. Develop previews are only one size and are cached in Go > Library > Caches. The size of the cache can be controlled in Preferences > Performance.

When you open a file in the Develop module, LrC quickly builds a preview of that file and the ones before and after it. It has nothing to do with the previews that are applied at import.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL, I did that at the end of an editing session because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
I'm not telling you what to do. I'm describing the advantages of DxO.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.
That's what I'm providing to offset the FUD that's often posted about DxO by folks who haven't done a careful and thorough comparison.
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
Over and out.
Sure.
You just aren't listening. You think everyone needs what you need. I got news for you: pros need more.
I'd say you should as well.


--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
Matt K. The NEW Way to Batch Noise Reduction in Lightroom

Yeah, I know all that. What he's saying is in accord with what I'm saying: when you sync Denoise across a batch of photos, you have to wait for Denoise to process before you can move on to doing anything else. In the video, it's 2 minutes for 4 photos. Now, extrapolate that to 2,000 photos. Yah, no thanks. Not in the middle of my workflow.
I repeat. If you are doing this in the middle you are not following the suggested order.
I HAVE TO CULL FIRST. WHAT PART OF THAT ARE YOU NOT UNDERSTANDING?
Everyone typically has to cull first. I commented about that in the other post. I don't consider culling part of file development. It's about sorting images. I like to give my eyes a break before I start to develop.

I have tweaked this to shorten the list but Denoise is always first.
NO! CULLING IS FIRST. JEEZ. ADOBE HASN'T GOT A FING CLUE ABOUT AN EVENT SHOOTER'S WORKFLOW. ARE YOU DELIBERATELY IGNORING WHAT I'M SAYING SO YOU CAN "WIN"? I'M DONE TALKING WITH YOU. YOU'RE CLEARLY NOT LISTENING AND ARE COMMITTED TO PUSHING AN AGENDA. I'VE PROVIDED THE INFORMATION FOR OTHERS. NOW, YOU'RE ON MY IGNORE LIST.
No need to yell. :-) Perfect. I like being on your list.
You're being deliberately obtuse and spreading misleading "information".
What misinformation am I spreading? Culling is not in the Adobe list because it isn't part of file development. I also explained that Library and Develop previews are different and are stored in different locations. I can prove that to you if you would like.
It was easy to adapt to it.

[attached screenshot of the suggested order]
Anyway, we're now way OT, so I won't pursue this conversation any further here. If you want to discuss, you can either DM me or start a new conversation.
QUIT SPREADING FUD ABOUT DXO!


--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles up in space don’t believe in science. Neil deGrasse Tyson
 
Question about your MacBook Air with M4 processor: How long does it take to run DeepPrime PureRaw4 on a 20MB file? I shoot Olympus, use an old Lenovo laptop, and it takes around 20 minutes per image with PureRaw4.
 
Question about your MacBook Air with M4 processor: How long does it take to run DeepPrime PureRaw4 on a 20MB file? I shoot Olympus, use an old Lenovo laptop, and it takes around 20 minutes per image with PureRaw4.
There are two current variants of DeepPRIME, but "DeepPrime PureRaw4" is not one of them. PureRAW is an app. DeepPRIME is a noise-reduction tool available through PureRAW and PhotoLab. Current versions of these apps offer DeepPRIME 3 and DeepPRIME XD2s. The latter extracts more fine detail and takes about 2-3 times longer than DeepPRIME 3.

My MacBook Air with basic M4 processor, 24GB RAM and 512GB SSD takes 3.4 seconds to process a 40MP file with DeepPRIME 3, and it takes 12.2 seconds with DeepPRIME XD2s, so I'd expect a 20MP file to take about 2 seconds and 6 seconds, respectively. Even a 5-year-old entry-level MacBook Air with M1 processor would take only about twice this long. The Neural Engines on Apple's M1 processors really work wonders with DeepPRIME, giving modest Apple Silicon Macs fantastic bang for the buck.
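For anyone who wants to rerun that estimate for a different sensor size, here is a minimal sketch that assumes processing time scales roughly linearly with megapixels, using the 40MP timings above as the measured inputs.

```python
# Scaling the measured 40MP timings down to a 20MP file, assuming processing
# time grows roughly linearly with pixel count (the same assumption as in the
# post above). The 3.4 s and 12.2 s inputs are the measured figures quoted.

def estimate_seconds(measured_s: float, measured_mp: float, target_mp: float) -> float:
    return measured_s * target_mp / measured_mp

print(round(estimate_seconds(3.4, 40, 20), 1))    # ~1.7 s, DeepPRIME 3
print(round(estimate_seconds(12.2, 40, 20), 1))   # ~6.1 s, DeepPRIME XD2s
```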

--
Event professional for 20+ years, travel & landscape enthusiast for 30+, stills-only.
http://jacquescornell.photography
http://happening.photos
 
Sorry about jumbling the nomenclature. I have PureRAW4 (not the whole PhotoLab suite), and use DeepPRIME XD2s to process some images -- primarily bird photos shot at high ISO or cropped significantly. I do the rest of the processing in Lightroom Classic. The processing times you're talking about on the MacBook would be a game-changer. Tnx.
 
I had a 13" 2020 M1 which I used for travel; however, I'm not working. It's just a hobby. I found the 13" a tad small, so I traded it in for a new 15" MacBook Air M3 with 24GB RAM and 512GB. That bit of extra size made a big difference. I'm used to my desktop, which has a 27" 5K screen.

I tell people to go a little bigger if they can manage it. Our age makes a difference as well.
The 15" Air is a fine choice for travel. But for my needs it doesn't have the processing grunt that a Max chip with more GPU cores and Ram gives me.

We are spoilt for choice, it's just a question of making the right choice (which may change over time too).
Hi Ray, what video files do you work with?
I’m stills only. But I bulk process a lot of images, often 500 plus per shoot day. Plus AI NR quite a bit. So that’s where I appreciate extra GPU cores and more ram. It speeds things up a lot.
Or you could get it done in 1/2 the time with DxO, no costly extra GPU cores required. My $1000 M4 MacBook Air running DeepPRIME will be as fast as a $4000 Mac Studio Ultra running Denoise. See my other posts in conversation with Zeee for more about this.
Well yes and no and it depends:

1. Talking hardware only we need to look at performance in the round, across a bunch of tasks. So, it's not just the neural engine. Ram and GPU cores etc interplay here too.

ArtIsRight on YT is a brilliant resource for us photographers to evaluate and extrapolate these things.

2. It also depends on what software(s) you are using and your workflow.

3. We do similar events oriented work yet we have VERY different workflows. I'll touch on that in other replies.
 
FWIW, when running DxO's DeepPRIME noise reduction, my new 13" M4 MacBook Air is as fast as my M1 Max Mac Studio and 14" M1 Max MacBook Pro (both with 32 GPU cores). But, when running Adobe's Denoise, it takes twice as long as the M1 Maxes. DxO leans on Apple's Neural Engine, which got a major power-up in the M4, whereas Adobe relies on GPU cores.
Just curious. Did DXO actually fix the issue they also experienced with the NR, or are they still using a workaround? I looked that up a few months ago but couldn't find an answer.

[attached screenshot]
I have never seen any magenta cast, and I apply DeepPRIME of various flavors to almost all my images. I may have skipped Ventura. Been using Sequoia for a while now, not yet on Tahoe.
That is from 2023, and it was discussed on these forums and other sources. You don't see it because DXO found a temporary fix in 2023. I was just curious whether they actually solved it or are still using the fix.
I don't even know what that means. As far as I can tell, the "temporary fix" has solved the problem. I have had no issue with DxO on Apple Silicon, and the Neural Engine gives a rocket boost to noise reduction processing.
There is a workaround and an actual fix to an issue. They are different to me.
Can you define the difference?
My roof is leaking, so I put a tarp over it. My neighbour replaced his shingles. Since other companies have had issues with the NE, I'm curious to know whether they were able to solve it. Even that link said temporary. Is it still temporary? I'm not asking a difficult question.
DXO is not the only company that experienced NE issues with AI algorithms.
But apparently the only one to have worked out a practical solution.
It's been "temporary" for a long time. I just keep using it and getting better results faster than I can with Adobe. If people hadn't written about the "issue", I never would have guessed it existed.
Thanks. For other readers: Adobe had the NE enabled for about 3 months and then disabled it. Apparently they did not like the quality in the deep shadows, something I never noticed, and that included others. On the Adobe forums people wanted the choice to enable/disable it. I'm not a computer engineer. Perhaps it is more difficult to make a workaround for noise than it is for a colour drift/shift.

From some other conversations: PL does not actually apply Denoise until you export a file. This is why it looks instant, but you have to view it in that window. LrC applies it when you click on it. Denoise is applied to the entire file in real time, and you can tweak it as many times as you like; that is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Whatever your workflow there's a "walk off and have a coffee" moment when batch processing stuff like our event photography. Better hardware simply makes us drink our coffee faster. Older hardware might involve having a second cup :)
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
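(Not necessarily how this is actually done, but as a rough sketch of the idea, assuming the selects sit in one folder: split the files in two and copy one half to the external SSD. The paths and extensions below are placeholders, and DxO's .dop settings sidecars would need to travel with the RAWs as well.)

```python
# Rough sketch only: split a folder of adjusted RAWs in two so one half can be
# copied to an external SSD and exported on a second machine.
# SOURCE and SSD are hypothetical paths; swap in your own folder layout.
from pathlib import Path
import shutil

SOURCE = Path("/Volumes/Work/Event2025/selects")    # placeholder source folder
SSD = Path("/Volumes/PortableSSD/selects_half")     # placeholder external SSD

# Collect the RAW files (DxO's .dop settings sidecars would need copying too).
raws = sorted(p for p in SOURCE.iterdir() if p.suffix.lower() in {".arw", ".dng"})
second_half = raws[len(raws) // 2:]                 # the first half stays local

SSD.mkdir(parents=True, exist_ok=True)
for raw in second_half:
    shutil.copy2(raw, SSD / raw.name)               # copy2 preserves timestamps

print(f"{len(raws) - len(second_half)} files stay local, {len(second_half)} copied to SSD")
```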
I do all my culling AND editing in LRc, so we differ there. Creating an import preset which includes auto adjust with Adobe Color profile means I don't have to do "all my adjustments"..... it's automated. I MAY have to fine tune and fettle where needed.
I did a quick search and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
And on my M3 Max with 40 GPU cores and 64GB Ram it's even faster for DeepPRIME3, which I do use sometimes.......
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
So you're making your selects in LRc same as me. Which means you are importing to LRc same as me. Which also means rendering previews. Here, GPU cores and Ram are your friend.

Your workflow obviously works for you, but the idea of using LRc to DAM, import, cull then send to external software to PP fills me with horror personally :) No right or wrong answer here. I do it ALL within LRc
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL I did the same thing at the end of an editing session, because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
Like I said, there's a "have a coffee" moment somewhere :). One such moment is LRc import and render previews. Again, GPU cores and Ram help here.
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
Yes, in isolation. Although I'm using LRc for processing I do also own and use PureRaw5 and use it where needed (standalone and LR plugin) .... but less so these days..... more of that in a bit
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
Yes, if looking at NR in isolation without looking at import/render/export times. Which are the other batch processing bottlenecks.
I've been using LRC and PL side-by-side for more than a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
Been with LRc since V1, got all my RAW files since then too (the non-culled ones obviously).

--
Follow: https://www.instagram.com/ray_burnimage/
 
Some other considerations. PL does not actually apply its denoising until you export a file, which is why it looks instant, but you have to view the result in that preview window. LrC applies Denoise when you click on it: it is applied to the entire file, and you can then tweak it as many times as you like, which is instant.
Except that after applying it to a large batch, I have to wait a long time for the processing to complete before I can move on to making other adjustments to the images. With Adobe, I wait in the middle of my workflow. With DxO, I do all my adjustments and then wait for the final export, during which I can simply walk off to make some coffee or have dinner. Since I often process hundreds of event photos at a time, DxO's workflow is far superior.
Not if you have the gear. Studio Max does Denoise AI in 8 seconds.
8 seconds for how many megapixels? My M1 Max Studio with the 32-GPU-core upgrade and my lowly M4 MBA both do DP3 on a 40MP RAW in under 4 seconds.
My metrics for 68MB files from the Sony A1 are:

PureRAW 3: 2 seconds. Adobe Denoise: 10 seconds. But this is only one bottleneck in the workflow, as already mentioned.
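(For a rough sense of scale, here is that arithmetic applied to a 500-image batch; the per-file times are the ones quoted above and will differ on other hardware.)

```python
# Scaling the per-file times quoted above to a full batch. Numbers are the
# poster's own and purely illustrative; other hardware will differ.
per_file_secs = {"PureRAW 3": 2.0, "Adobe Denoise": 10.0}
batch = 500   # a typical shoot-day batch mentioned earlier in the thread

for tool, secs in per_file_secs.items():
    print(f"{tool}: ~{batch * secs / 60:.0f} min for {batch} images")
# PureRAW 3: ~17 min, Adobe Denoise: ~83 min on these figures.
```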
If I was a pro I'd be maxing out on my gear no matter what I used.
I'm running a business to make a living, so ROI is a major factor in my buying decisions. The ROI on DxO is through the roof.
Same here, and that's the whole point of this thread, which, to remind everyone, is about playing the used/refurbished game to maximise bang for buck.
I don't know where folks get the idea that pro photographers are made of money and can buy all the newest and greatest toys. Do you have any idea what the average income of a pro photog in the U.S. is?

From ShotKit.com:

"...according to Forbes, the highest paying salaries for photographers are in the District of Columbia, reaching an average of US$82,840. The lowest income is registered in North Carolina with US$33,630."

Salary.com: $89,632

U.S. Bureau of Labor Statistics: $42,520

Even the Salary.com figure is barely scraping by for a family of four trying to own a home in a major metropolitan area. Not a lot of extra for that pair of a9IIIs, maxed out MacBook Pro, Mac Studio Ultra, f2.8 zooms, f1.4 primes, thousands of dollars of lighting equipment, and a reliable vehicle, never mind health insurance that's about to double in cost so Elon Musk can get a tax break. Don't get me started. Long story short, most photogs have to be smart to do the most with the least. Over 25 years, I've become an expert in this.
These conversations are pretty regular in the forums.
Plus, with DxO's workflow, if I'm really in a hurry, it's easy to first do all my adjustments, then put half the batch on an SSD for processing on my MacBook to cut my export time in half.
I did a quick search and there are threads here and in other places about how PL exports take a lot of time. Again, that would be hardware dependent.
On my M1 Max Mac Studio with 32 GPU cores, DeepPRIME 3 takes half the time of Denoise.
Are you comparing DxO export time to LrC Denoise time? Or are you comparing how long it takes DeepPRIME's preview to show you what the file will look like after you export?
I'm comparing how long it takes DxO to export a finished DNG or JPEG that has DeepPRIME applied as compared with how long it takes LRC to complete its Denoise processing.
Apples and oranges......
LrC exports are instant, even for my slowest Macs. Something to remember. With LrC you wait at the beginning of your editing process. I don't see a big difference.

Adobe recommends doing Denoise first.
After I've downloaded 2,000 images, I want to make my selects in LRC, send to PL, make adjustments, start the export, and walk away. What I don't want to do is spend an hour making my selects and then have to stop for an hour or two while I wait for Denoise and then come back to make my adjustments. That would be a major PiTA.
Adobe recommends applying Denoise AI first.
Applying Denoise to all 2000 images rather than culling down to 800 selects and then Denoising those would be a colossal waste of time.
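(A quick illustration of why culling first wins, using the 2,000/800 split above and borrowing the 10-seconds-per-image Denoise figure quoted earlier; both numbers are illustrative only.)

```python
# Cull-first arithmetic: only the selects get the expensive NR pass.
# 2,000 shot / 800 selects come from the post above; 10 s/image borrows the
# Denoise figure quoted earlier and is illustrative only.
shot, selects = 2000, 800
secs_per_image = 10.0

wasted_hours = (shot - selects) * secs_per_image / 3600
print(f"Denoising before culling spends ~{wasted_hours:.1f} hours on images that get deleted anyway")
```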
Agreed! I cull then select images that need Denoise..... which may be some/none/all depending on the ISO used/needed
Plan to batch, set it up, walk away then come back and go full bore editing.
Still gotta cull first.
I prefer to see the full Denoised file as I develop.
I don't need to.
Same
That was a game changer to me but it is a personal preference. You prefer your method.
DxO's default basic NR for full-screen preview is instant and sufficient for culling and adjusting. If I really want to check DeepPRIME denoising on a few images, the Loupe tool is fast and effective. I generally don't need to, though, because I know from experience what the end result is going to look like.
From experience, I never need to consider denoise as part of the culling workflow and decision process.
You can select as many files as you like, batch Denoise and go for a coffee. When I used PL I did the same thing at the end of an editing session, because I had to wait for the exports.
I'd much rather wait half as long at the end than twice as long in the middle.
That is a personal preference, which is fine. Everyone edits differently.
I'm not telling you what to do. I'm describing the advantages of DxO.
Describing the advantages for your workflow, without considering any of the disadvantages. Such as:

1. Adobe NR is now non-destructive.

2. Adobe NR does not create a separate file.

3. Adobe NR does not increase file size. To me, that's a biggie, as I NEVER delete keeper RAW files. Yes, storage is cheap, but my 50MP Sony A1 RAW files are typically about 68MB, increasing to about 170MB after PureRAW 3.

Is PureRAW 3 worth a 150% increase in file size across tens of thousands of images JUST for noise reduction? Yikes!
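(A back-of-the-envelope sketch of that storage hit; the per-file sizes come from the post above, while the 20,000-image archive is just a placeholder.)

```python
# Back-of-the-envelope storage cost of keeping denoised DNGs for every keeper.
# The 68 MB -> ~170 MB per-file growth comes from the post above; the
# 20,000-image archive is a placeholder, so plug in your own keeper count.
original_mb, denoised_mb = 68, 170
keepers = 20_000

extra_tb = keepers * (denoised_mb - original_mb) / 1_000_000   # MB to TB (decimal)
print(f"~{extra_tb:.1f} TB of extra storage for {keepers:,} denoised keepers")
```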
You can talk all you want about theoretical difficulties of using the Neural Engine for noise reduction and cast shade about "temporary" fixes, but DxO does a better job on noise reduction in half the time. That's my bottom line.
And I will when this comes up again. Our conversations are of little significance to each other. I think it's a good idea that other readers have all the information so they get the product that works best for them.
That's what I'm providing to offset the FUD that's often posted about DxO by folks who haven't done a careful and thorough comparison.
I use PureRAW 3 as my only external plugin, plus Adobe NR now that it's non-destructive.

The latter is RAPIDLY usurping the former in my batch processing when dealing with 500-1000 images. No contest.
Oh, and I should mention that on my M4 MacBook Air, Denoise takes not 2x but 4x longer than DP3, which makes a huge difference when I'm processing onsite at a multi-day event. My MBA is actually just as fast as my Studio and my M1 Pro MBP on DP3. Which means that I can get the same performance running DxO on my portable $1000 MBA as I'd get with a desk-bound $4000 Mac Studio Ultra running Denoise.
I've been using LRC and PL side-by-side for a decade. LRC is my DAM, and I do all my RAW processing with PL. For my work, the only advantage of LRC for RAW processing is the ability to work on Sony's mRAW and sRAW files, something I'm pushing DxO hard to bring to PL so I can shoot 26MP mRAWs on my 61MP bodies.
I've been using LRc since 1.0, nearly 20 years ago. The "auto this and that" tools from the last 2 or 3 years are SENSATIONAL for large-volume batch processing. The idea of using anything different is incomprehensible to me.

However, there's no right and wrong.



--
Follow: https://www.instagram.com/ray_burnimage/
 
I repeat. If you are doing this in the middle you are not following the suggested order.


I have tweaked this to shorten the list but Denoise is always first. It was easy to adapt to.

[Screenshot: Adobe's suggested order for applying AI edits, with Denoise first]
The thing here is that this is the SUGGESTED order for AI EDITS, as opposed to an end-to-end workflow including non-AI editing :)

Here's the LRc workflow I'm using for events, where I'm shooting say 1000 images and culling down to say 100 final images. I burst shoot to get the best moment, avoid closed eyes etc. Hence the aggressive culling needed due to similar duplicates.

Please note, I've recently started to amend this workflow to move away from PureRAW 3 for NR and its 150% uplift in file size, in favour of Adobe non-destructive NR and zero uplift in file size:

Firstly, my import preset is applied:
  1. Copy to new location and add to catalog
  2. Render 1:1 previews (set in preferences to delete after 30 days) plus smart previews
  3. Adobe Color profile and Auto applied for exposure adjustment.
This is my "have a coffee moment". Having an M3 Max with 40 GPU cores and 64GB Ram has vastly reduced my caffeine intake ;-)

Culling Run 1

Some folk swear by using FRV or PM for culling, but I do it within LRc after the above import routine:

1. I simply use the <> arrows and P on the keyboard. On my first sweep I "pick" the obvious deletes.... you know, OOF, eyes closed etc.

2. Toggle > Flagged > Select All > Delete.

Culling Run 2

1. Apply stars using <> and 1, 2 or none on the keyboard

2. 1 = maybe, 2 = best of sequence, and no rating means it is a reject from this sequence

3. Toggle > Un-Rated > Select All > Delete (see the sketch at the end of this post for this culling logic)

Crop, Straighten, Adjust

I now crop, straighten and fine-tune exposure on 2* images if and when needed.

Final Cull

Revisit edited 2* images vs 1* and aggressively cull down to 1 keeper per sequence.

Noise Reduction
  1. Use the same <> keys and now 3 to select images that need Adobe non-destructive NR (if any)
  2. Apply NR to the first of those images, then auto-sync to the rest
Have a coffee.

Export

1. Finally, mark all other keepers as 3* and export... job done.
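(For readers who think in code: LrC isn't scripted from Python, but the culling passes above boil down to simple filtering on a reject flag and a star rating. A rough, hypothetical sketch:)

```python
# Illustration only: LrC isn't driven from Python, but the culling passes above
# reduce to simple filtering on a reject flag and a star rating.
from dataclasses import dataclass

@dataclass
class Shot:
    name: str
    flagged_reject: bool = False   # pass 1: P-flagged obvious deletes (OOF, closed eyes)
    stars: int = 0                 # pass 2: 1 = maybe, 2 = best of sequence, 0 = reject

shots = [Shot("A1_0001", flagged_reject=True),
         Shot("A1_0002", stars=1),
         Shot("A1_0003", stars=2),
         Shot("A1_0004")]

after_pass1 = [s for s in shots if not s.flagged_reject]   # Toggle > Flagged > Delete
after_pass2 = [s for s in after_pass1 if s.stars > 0]      # Toggle > Un-Rated > Delete
keepers = [s for s in after_pass2 if s.stars >= 2]         # final cull: best of sequence only

print([s.name for s in keepers])   # -> ['A1_0003']
```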





--
Follow: https://www.instagram.com/ray_burnimage/
 
For other readers: 1:1 previews are great if you want to zoom over 100% in the Library module. As noted, they take longer to build. There is all sorts of info out there about what to use for a person's requirements. Some suggest Standard previews or Embedded & Sidecar.

Over time, Embedded & Sidecar and Minimal previews will eventually build into Standard previews. Embedded & Sidecar was designed for immediate culling. I used to import as 1:1, but these days I use Minimal because I don't use the Library for anything, as I pre-cull before importing. Adobe improved import speeds with the last (or previous) update: 500 files are imported instantly with Minimal. I have not tested Standard. Embedded & Sidecar should be close to instant as well.

Develop files are cached differently. While Library previews are cached in the catalog's previews folder, on a Mac the Develop previews are cached in the user Library. They are only one size (large, like 1:1) and build instantly as you open them. Adobe suggests increasing the Develop cache size in Preferences > Performance; I'm at 50GB. You can purge that cache there, or drag the cache folder in the user Library to the trash and a new folder will be built automatically. I do that every month or so.
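(If you'd rather check the cache size before purging, here is a small hedged sketch; the cache path below is a placeholder, so copy the real location shown in Preferences > Performance, since it varies between setups.)

```python
# Hedged sketch: report the Develop/Camera Raw cache size before deciding to
# purge it. CACHE is a placeholder; copy the actual location shown in LrC's
# Preferences > Performance, since it varies between setups.
from pathlib import Path

CACHE = Path.home() / "Library" / "Caches" / "YourCameraRawCache"   # placeholder path

def folder_size_gb(folder: Path) -> float:
    """Sum of all file sizes under the folder, in GB."""
    return sum(f.stat().st_size for f in folder.rglob("*") if f.is_file()) / 1e9

if CACHE.exists():
    print(f"Develop cache is currently ~{folder_size_gb(CACHE):.1f} GB")
else:
    print("Cache folder not found; check the path in Preferences > Performance")
```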

Tim Grey states to use 1:1 if you want to zoom past 100% in the Library module. I thought it was if you want to zoom up to 100%. I may import at 1:1 next time and test that out myself. I'm not going to go back to 1:1; just curious.


--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to orbiting satellites hundreds of miles in space don’t believe in science. Neil deGrasse Tyson
 
For other readers: 1:1 previews are great if you want to zoom over 100% in the Library module. As noted, they take longer to build. There is all sorts of info out there about what to use for a person's requirements. Some suggest Standard previews or Embedded & Sidecar.
Yep 1:1 previews are great if you want to zoom in, which I do want to do as part of my culling process. It's also why I have my LRc preferences set to delete them after 30 days, because I'm done with my bulk editing by then.

Other flavours are quicker, but a machine with lots of Ram and GPU cores somewhat negates this.
Over time, Embedded & Sidecar and Minimal previews will eventually build into Standard previews. Embedded & Sidecar was designed for immediate culling. I used to import as 1:1, but these days I use Minimal because I don't use the Library for anything, as I pre-cull before importing. Adobe improved import speeds with the last (or previous) update: 500 files are imported instantly with Minimal. I have not tested Standard. Embedded & Sidecar should be close to instant as well.

Develop files are cached differently. While Library previews are cached in the catalog's previews folder, on a Mac the Develop previews are cached in the user Library. They are only one size (large, like 1:1) and build instantly as you open them. Adobe suggests increasing the Develop cache size in Preferences > Performance; I'm at 50GB. You can purge that cache there, or drag the cache folder in the user Library to the trash and a new folder will be built automatically. I do that every month or so.
I have an internal 2TB SSD and set my Develop cache size to 150GB. I purge it now and again.

My libraries are on internal too.
Tim Grey states to use 1:1 if you want to zoom past 100% in the Library module. I thought it was if you want to zoom up to 100%. I may import at 1:1 next time and test that out myself. I'm not going to go back to 1:1; just curious.

https://asktimgrey.com/2024/06/03/which-previews-to-build/
This is where this thread has gone OT, yet it's still interesting.

If you want to speed up import and preview rendering then GPU cores and ram are your friend.

Yes, you can render smaller previews, but they become 1:1 when you zoom in anyway (simply moving the bottleneck elsewhere :)

This thread has become a great debate even though it's gone OT.

For folks who batch process tons of images such as event and wedding photographers:

An M4 chip's Neural Engine performance with PureRAW 3 should NOT be taken as a slam-dunk argument in isolation.

GPU cores, Ram and indeed internal SSD space/speed all come into play and interact with each other when the rest of the process is taken into account: import/render/cull/export etc. NR is just one small cog in that machine.....

--
Follow: https://www.instagram.com/ray_burnimage/
 