Noise Removal - total time

A friend with an Intel i5 Mac Mini (16 GB) asked me to try LrC 12.3 Denoise AI on my M2 Pro 16" MacBook Pro (12/19, 32 GB). He gave me one of his high-ISO Canon R5 45 MP raw files to check, as he is thinking about updating to a new computer. On his computer it takes 12 minutes; on mine it takes 34 seconds (roughly a 21x speed-up). Both runs used the default 50% setting.

Topaz Denoise AI 3.6.2 using default settings took 9 seconds.

I also tried Topaz Photo AI 1.3.1. I only used Remove Noise. I turned off Sharpen, Recover Faces, and Enhance Resolution. It took 12 seconds.

The times above are for just processing and saving the file.
I think most people who have LrC will, from now on, not bother with DxO or Topaz for processing raw files. Most won't care that for a particular photo one tool may do slightly better and for another photo not. LrC Denoise AI is built in and included at no additional cost. It seems like PureRaw really has no reason to exist anymore. Topaz Denoise AI can also handle non-raw files, which LrC Denoise AI and PureRaw cannot, so it is still useful there. Adobe says they are working on making Denoise AI work with non-raw files, though.

For people who do not use LrC, DxO and Topaz can still sell to them.
Pretty much where I am. I won't need DxO moving forward; although LrC Denoise AI isn't "perfect", it's more than good enough for my needs.

You can read this thread in the DxO forums, with heaps of discussion comparing DxO and Adobe Denoise AI (https://feedback.dxo.com/t/lightroo...does-it-compare-to-photolabs-pureraw/32822/56)

The element I'm less sure about is what would be the most suitable machine for this system. Is it the same as for the competition, or subtly different?
 
All the AI NR tools make heavy use of the GPU, but the new LR Denoise AI tool is apparently much slower than the DxO and Topaz AI NR tools. With all of them, though, a powerful modern GPU provides a dramatic improvement in performance.

Other factors like CPU and disk speeds are comparatively unimportant. Of course, there are also differences between Apple silicon and PC hardware.
 
Thanks, Nigel.

The typical difference I am seeing between LR and Topaz AI is a factor of around 15 on my laptop.

There are a few reasons why DxO isn't a tool we use, speed vs what was DeNoise AI being one of them.

I had pondered whether the ideal machine for one of these tools is ideal for all of them.
 
The task is heavily reliant on the GPU, and is optimised for it.

Mac user

The current Adobe Denoise AI doesn't leverage the M1/M2 ANE (Apple Neural Engine); once they resolve that, there should be some interesting performance gains (source: https://community.adobe.com/t5/lightroom-classic-discussions/denoise-ai-in-12-3/m-p/13739400)

In the interim, the timing has been "perfect": my 2018 Apple laptop is about to become "vintage" and out of support (definition of vintage: https://support.apple.com/en-us/HT2...considered vintage when,less than 7 years ago.)

I have decided to purchase a new Apple device with the most GPU cores while trying to keep the price down. This will hopefully give me the "best" experience possible, and buy me some headroom for the future.

Recently I purchased a MacBook M1 Max with a 32-core GPU for $3,500 AUD (new). That seemed the best value from a price vs performance perspective.

Output

I am finding threads like these quite informative, but for most "ordinary folks" the differences are subtle; all the results are much better than the original raw file, and they are produced by a single magic click of a button.

As always, it's never going to be perfect for every single image, but if it's mostly good most of the time, that's great. We do need a bit of a reality check, though: you can't expect clean, detailed results from incredibly high ISO (relative to the camera). The basics still apply; always "try" to shoot at the lowest ISO possible.

Processing time

The additional processing time for LrC's denoise compared to its peers isn't much of a deal for me (I am an enthusiast); I just run denoise and do other "life" things while it happens. It's just another step that hopefully can be eradicated in future releases.
 
There are a few reasons why DxO isn't a tool we use, speed vs what was DeNoise AI being one of them.
I find that DeNoise AI is about the same speed as DeepPRIME, but XD takes about twice as long on my machine. The relative speeds probably depend on the graphics card, though, as XD seems to be optimised for the more recent cards. With low to middling ISO images, image quality is similar, but at really high ISOs (or in very low light) XD is much more effective than DeNoise or DeepPRIME; it also risks artefacts, though, unless you turn down the noise model. That's why I reserve its use for the noisiest images.
I had pondered whether the ideal machine for one of these tools is ideal for all of them.
It all does seem to depend on the GPU. One that's faster with one of these will also be faster with the others, but of course LR will probably be the slowest in each case.
 
The current Adobe Denoise AI doesn't leverage the M1/M2 ANE (Apple Neural Engine); once they resolve that, there should be some interesting performance gains (source: https://community.adobe.com/t5/lightroom-classic-discussions/denoise-ai-in-12-3/m-p/13739400)
Is that link correct Raymond? I couldn't get it to open.
I had been thinking about whether we may see some of this included in a camera, or whether some of the challenges are energy related. That is, if we had some ASIC-like device do this in camera, what would the energy deficit be? Also, at what stage might we have some mild application of what Adobe have created turned on all the time (i.e. when can we process it fast enough)?

Also, is a route available for, say, a 4/3-like camera to offload this to the cloud, providing some of the benefit to the user without investing in a full-frame setup?

Outside of my work my 'need' for these tools is much reduced - nice to have, though.

I really need to spend a bit of money on a personal machine. I can't really afford vast amounts, and it would be Windows based (or at least not Apple). Perhaps an Nvidia 3060, or even something a bit lower down the scale, would be a route for me, likely as a second-hand machine.

One concern with those machines (vs what you describe from Apple) is the huge amount of energy they require. With our fuel costs in the UK, I don't really want a machine that's using a kilowatt. A machine that I can afford, that offers speed improvements, and that saves some energy over my ageing 7-year-old Xeon Dell laptop would be great. I could, of course, be wanting a bit too much.

We have so much choice for processing images now - really good news.
 
Is that link correct Raymond? I couldn't get it to open.
Just got to remove the “)” at the end ;-(

One concern with those machines (vs what you describe from Apple) is the huge amount of energy they require. With our fuel costs in the UK, I don't really want a machine that's using a kilowatt. A machine that I can afford, that offers speed improvements, and that saves some energy over my ageing 7-year-old Xeon Dell laptop would be great. I could, of course, be wanting a bit too much.
Yes, electricity prices are high and will be going higher.

On my end, I am constrained: I need to work on a laptop, as I don't have a dedicated desk and am typically moving around the house.

I can't really comment much about PC power consumption, but I do know there have been comparisons made between Apple and PC.

Here - https://www.anandtech.com/show/17024/apple-m1-max-performance-review/3

TBH I didn't look into it, as it wasn't an option.

But I am aware that Apple silicon performs the same on battery as plugged in; this is a departure from what I have been accustomed to (i.e. plugged in to a power source for maximum performance, with performance significantly degraded on battery).
We have so much choice for processing images now - really good news.
That is good. Great for the consumer.
 
Noise comes from the nature of light. You cannot escape the noise as soon as you deal with light...
Noise does not come from the nature of light. Noise comes when not enough light hits the sensor.
Noise comes from the whole electronic chain, light or no light.
Sure, there is a contribution from electronics (it is indeed important in cases of extremely low light). But the most significant contribution (when we do see an image created by light) is known as photon noise. Photon noise scales as the square root of the number of photons that participated in creating the image: more photons (n) means more noise (sqrt(n)), but the signal-to-noise ratio is proportional to n/sqrt(n) = sqrt(n). Thus the signal-to-noise ratio increases with light intensity, even though the noise power also increases with intensity.
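To make that sqrt(n) scaling concrete, here is a minimal simulation sketch (an illustration of the statistics only, not code from any of the tools discussed). Photon arrivals are modelled as Poisson-distributed counts, the standard model for photon shot noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Photon shot noise: a Poisson-distributed count with mean n has
# standard deviation sqrt(n), so the per-pixel SNR is n/sqrt(n) = sqrt(n).
for n in [10, 100, 1_000, 10_000]:
    counts = rng.poisson(lam=n, size=1_000_000)  # a million "pixels"
    snr = counts.mean() / counts.std()
    print(f"mean photons {n:>6}: measured SNR {snr:7.1f}, sqrt(n) {np.sqrt(n):7.1f}")
```

Quadrupling the light doubles the SNR, which is why the "shoot at the lowest ISO possible" advice earlier in the thread holds no matter which denoiser you use.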
You (and anyone who is interested in these things) should appreciate this discussion:

https://www.dpreview.com/forums/thread/4710831
 
Thanks, Sybersitizen.
 
You (and anyone who is interested in these things) should appreciate this discussion:
'You should appreciate this' - what on earth is wrong with you? Is this an order, demand, law? You interaction has been really poor fella. I don't believe within the spirit or direction of this website.

You should/shall appreciate this - I use these words simply because it's possibly how you function.


Back being human. What is confined here (it can be found elsewhere) is relavant to my discussion. What you chose to remove from my text is and was also relavant.

 
Sure, there is a contribution from electronics (it is indeed important in cases of extremely low light).
It's all from electronics.
But the most significant contribution (when we do see an image created by light) is known as photon noise.
That's noise created in the electrical domain. Photons do not have noise per se. As an electromagnetic engineer, I find this terminology unpleasant; it leads to such misunderstanding.
Serguei Palto is right, and any authoritative source will agree.
With two electromagnetic doctorates and a chief engineer in the field (pardon the pun) do I count?
Ok!

I am talking with a "chief engineer...".

To make sure that you are indeed not a bot, please answer a simple question I often ask my PhD students. The question is very simple, and it is as follows.

The first picture below is just a signal from a generator. Surely, as "a chief engineer...", you know what waveform is shown below.

Fig.1

The next step: we add the noise, Fig.2. BTW, it can be just the light intensity with a DC cut-off (I hope you understand what I am talking about).

So, we have a problem: to recover the original signal shown in Fig.1 from the noisy signal in Fig.2.

Fig.2

The next step is the denoising result in Fig.3. And my question to you is as follows.

What kind of filter is applied to the noisy signal shown in Fig.2 in order to get the result shown in Fig.3?

Please confirm your education, in order to make further conversation possible.

Fig.3
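(The figures are missing from this copy of the post, so the intended answer can't be recovered. As a generic illustration of the kind of signal denoising being described, here is a minimal sketch under an assumed answer, a simple moving-average low-pass filter; the signal and noise levels are made up for the example.)

```python
import numpy as np

rng = np.random.default_rng(1)

# A clean "generator" signal standing in for Fig.1: a 5 Hz sine wave.
t = np.linspace(0.0, 1.0, 1000)
clean = np.sin(2 * np.pi * 5 * t)

# Add broadband noise, standing in for Fig.2.
noisy = clean + rng.normal(scale=0.5, size=t.size)

# One plausible denoising filter: a 51-sample moving average, i.e. a
# crude low-pass filter (its output standing in for Fig.3).
kernel = np.ones(51) / 51
denoised = np.convolve(noisy, kernel, mode="same")

print("RMS error, noisy:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMS error, filtered:", np.sqrt(np.mean((denoised - clean) ** 2)))
```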





I'll ask the question again. Which element of light, or indeed of any form of electromagnetism, is unwanted? Whatever that content is could then be noise.

If light doesn't interact with an electric field then where are we?
Photon noise is a form of uncertainty that exists in the light itself, independent of any 'electrical domain'. The uncertainty manifests as randomness in the photon stream. 'Strong' or 'bright' photon streams inherently have a relatively high signal to noise ratio; 'weak' or 'dim' photon streams inherently have a much lower one.
High signal-to-noise ratio? Let us assume that we are deciding to work in this particular, not entirely realistic, world of balls of energy. Each of these packets has the same energy. Where is the noise? I want 5 packets; I get 5 packets. I don't get anything else. So nothing is lost or unwanted.

If we look to Max Planck, he would describe this puzzle as an energy related to wavelength, which brings another puzzle. That aside, if this said light has one wavelength, what part of this wavelength of corpuscular light is unwanted?

I'm happy to have all of it.

The element you removed is not, I would argue, a different topic entirely; hence I wrote it.
I'm not sure that is such a good description of electromagnetics.

What this tells us is that if we have a physical detector of some size (say a rectangle), and we pass light through it over some timeframe, we get that distribution.

However, everything underneath the function shown (it's a good function, no problem there) is stuff I want. Hence, no noise.

If this tells us that it's tricky to measure (now we have introduced something funky) some parameter of whatever this function describes (the number of sheep within our farm boundary follows a Poisson distribution) in discrete time slices, then I agree. But that's not what you said.

If I take a single moment in time and count my sheep, then I have the number within that boundary. No unwanted data, no noise. If we let the clock move on a bit, that number may change. Run it long enough, measuring at random intervals (more funky stuff - that's going to be tricky), and guess what: Poisson is back.

This tells us that there is a temporal variation – noise – that "accompanies" the photons at the input of an imaging system
(I removed the comments about noise reduction, which is an entirely different discussion.)
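That temporal variation is easy to demonstrate with another minimal sketch (again an illustration, not from the thread): even a perfectly steady source gives a different photon count on each of many identical exposures, and the variance of those counts equals their mean, the Poisson signature.

```python
import numpy as np

rng = np.random.default_rng(2)

# A steady source averaging 1000 photons per exposure,
# counted over 100,000 identical exposures.
counts = rng.poisson(lam=1000, size=100_000)

print(counts[:8])                   # successive exposures all differ
print(counts.mean(), counts.var())  # variance ~ mean: Poisson statistics
```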
 
The obnoxiousness is gaining momentum, when it had already reached a position of terribleness.

I am talking with a "chief engineer...". < That doesn't entirely make sense.

To make sure that you are indeed not a bot, please answer a simple question I often ask my PhD students.

However, your logic is peculiar. You wish me to answer some questions you set your PhD students (we have no clue why, what they study, or whether it's related).

But if we entertain your pantomime, at best does this not confirm that I could actually be one of your students?

System engineering exam = not quite ready just yet.

You appear to be trying to fight a battle that cannot be won, and it is very misplaced.
 
Thus, you are the bot :)
 
Please cease, Sergei; it's not good for the soul. I've never heard of the Universe being described as a bot, but if it cheers you up, then help yourself.

Offending people, I think, should be avoided.

Take care.
 
Recently I purchased a MacBook M1 Max with a 32-core GPU for $3,500 AUD (new). That seemed the best value from a price vs performance perspective.

This is the sort of thing I'm looking at. If I can get something a little less costly, great, but I think it would be a big improvement on what I have.

 
Sure. It was not the case you mentioned. You wanted to know... you got the answer.
 
Noise comes from nature of light. You can not escape the noise as soon as you deal with light...
Noise does not come from a nature of light. Noise comes when not enough light hits the sensor.
Noise come from the whole electronic chain, light or no light.
Sure, there is a contribution from electronics (it is indeed important in cases of extremely low light).
It's all from electronics.
But the most significant contribution (in case we do see the image created by light) is known as photon noise.
That's noise created in the electrical domain. Photons do not have noise per say. As an electromagnetic engineer this terminology is unpleasant and leads to such misunderstanding.
Serguei Palto is right, and any authoritative source will agree.
With two electromagnetic doctorates and a chief engineer in the field (pardon the pun) do I count?
Ok!

I am talking with a "chief engineer...".

To make sure that you indeed not the bot, please answer a simple question I ask often my PhD students. The question is very simple and it is as follows.

The first picture below is just a signal from a generator. Sure, as " a chief engineer...", you should know what is the waveform shown below.

Fig.1
Fig.1

The next step we add the noise, Fig.2. BTW, it can be just the light intensity with a DC cut off (hope you understand what I am talking about).

So, we have a problem to recover the original signal shown in Fig.1 from noisy signal in Fig.2.

Fig.2.
Fig.2.

The next step is denoising result in Fig.3. And my question to You is as follows.

What kind of a filter is applyed to the noisy signal shown in Fig.2 in order to get the result shown in Figure 3.

Please confirm your education, in order to make possible the further conversation.

Fig.3.
Fig.3.
I'll ask the question again. Which element of light, or indeed any form of electromagnetism is unwanted? Whichever is that content could then be noise.

If light doesn't interact with an electric field then where are we?
Photon noise is a form of uncertainty that exists in the light itself, independent of any 'electrical domain'. The uncertainty manifests as randomness in the photon stream. 'Strong' or 'bright' photon streams inherently have a relatively high signal to noise ratio; 'weak' or 'dim' photon streams inherently have a much lower one.
High signal to noise ratio? Let us assume that we are deciding to work in this particular none entirely realistic world of balls of energy. Each of these packets has the same energy. Where is the noise? I want 5 packets. I get 5 packets. I don't get anything else. So nothing lost or unwanted.

If we look to Max Planck. He would describe this puzzle as an energy related to wavelength. Which brings another puzzle. That aside if this said light has one wavelength what part of this wavelength of corpuscular light is unwanted?

I'm happy to have all of it.

The element you removed I don't agree is different topic entirely hence I wrote it.
I'm not sure that is such a good description of electromagnetics.

What this tells us is that if we have a physical detector of some size (say a rectangle), and we pass light through it over some timeframe, we get that distribution.

However, everything underneath the function shown (it's a good function, no problem there) is stuff I want. Hence, no noise.

If this tells us that it's tricky to measure (now we have introduced something funky) some parameter of whatever this function describes (the number of sheep within our farm boundary is indeed a Poisson function) in discrete time slices, then I agree. But that's not what you said.

If I take a single moment in time and count my sheep, then I have the number within that boundary. No unwanted data, no noise. If we let the clock move on a bit, that number may change. Run it long enough and measure at random intervals (more funky stuff - that's going to be tricky), and guess what: Poisson is back.

This tells us that there is a temporal variation – noise – that “accompanies” the photons at the input of an imaging system.
(I removed the comments about noise reduction, which is an entirely different discussion.)
The obnoxiousness is gaining momentum, when it had already reached a position of terribleness.

I am talking with a "chief engineer...". < That doesn't entirely make sense.

To make sure that you are indeed not a bot, please answer a simple question I often ask my PhD students. The question is very simple, and it is as follows.

However, your logic is peculiar. You wish me to answer some questions you set your PhD students (we have no clue why, what they study, or whether it's related).

But if we entertain your pantomime, at best does this not confirm that I could actually be one of your students?

System engineering exam = not quite ready just yet.

You appear to be trying to fight a battle that cannot be won and is very misplaced.
Thus, you are the bot :)
Please cease, Sergei; it's not good for the soul. I've never heard of the Universe being described as a bot, but if it cheers you up then help yourself.

Offending people, I think, should be avoided.

Take care.
Sure. It was not the case you mentioned. You wanted to know... You got the answer.
Please cease, Sergey - that's what I asked for!
 
All the AI NR tools make heavy use of the GPU, but the new LR Denoise AI tool is apparently much slower than the DxO and Topaz AI NR tools. But with all of them, a powerful modern GPU provides a dramatic improvement in performance.

Other factors like CPU and disk speeds are comparatively unimportant. Of course, there are also differences between Apple silicon and PC hardware.
Thanks Nigel.

The typical difference I am seeing between LR and Topaz AI is a factor of around 15 on my laptop.

There are a few reasons why DxO isn't a tool we use, speed versus what was DeNoise AI being one of them.
I find that DeNoise AI is about the same speed as DeepPRIME, but XD takes about twice as long on my machine. The relative speeds probably depend on the graphics card, though, as XD seems to be optimised for the more recent cards. With low to middling ISO images, image quality is similar, but with really high ISOs (or very low light), XD is much more effective than DeNoise or DeepPRIME, though it also risks artefacts unless you turn down the noise model. That's why I reserve its use for the noisiest images.
I had pondered whether the machine that's ideal for one of these systems isn't ideal for all of them.
It all does seem to depend on the GPU. One that's faster with one of these will also be faster with the others, but of course LR will probably be the slowest in each case.
I've seen some refurb Dells with an A3000 in them and others with a 3090. Not sure which would be the better direction.
 
Eric Chan, one of the longtime Adobe software engineers, wrote this article about the new Denoise AI:

Denoise demystified

https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified

Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.

But then Ian Lyons wrote:

https://community.adobe.com/t5/lightroom-classic-discussions/denoise-ai-in-12-3/m-p/13739400

I need to correct myself as the information provided re Denoise using the Neural Engine is not correct. Currently, the Apple Neural Engine found in M1/M2 silicon Macs is not used by Denoise. It's expected that performance will be even better when Denoise does use the Apple Neural Engine.

There is no change to above comments regarding the use of Tensor cores in NVidia cards. That is, Tensor cores are used by Denoise.

Apparently, an issue on Apple's side means that the Neural Engine is not used by Adobe Denoise. When the issue is addressed, Adobe Denoise will be ready to take advantage of the Neural Engine.


And then Rikk Flohr confirmed it:

@Ian Lyons information is correct regarding the Neural Engine on Apple devices

Rikk Flohr - Customer Advocacy: Adobe Photography Products


I wonder if there will be a Ventura update, such as 13.3.2, that will fix the problem.

This raises the question of whether DXO and Topaz are actually able to use the Neural Engine or not. If they are, then it seems like this is more an Adobe problem than an Apple problem, doesn't it? Maybe an LrC 12.3.1 is needed?

If this problem is fixed then I will rerun the 45mp Canon R5 file above. Maybe the time will be much faster than 34 seconds when using the Neural Engine.
 
This raises the question of whether DXO and Topaz are actually able to use the Neural Engine or not.
DxO certainly does:

DxO DeepPrime leverages the native core ML software and the M1/M2 hardware GPU and uses Apple neural engine acceleration.

If they are, then it seems like this is more an Adobe problem than an Apple problem, doesn't it?
It seems so.
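(For what it's worth, "using the Neural Engine" on Apple silicon is an opt-in at the Core ML level: the application chooses which compute units Core ML may schedule a model on. A minimal sketch with the coremltools Python package; the model file name and input name are hypothetical, since none of these vendors' actual code is public:)

import coremltools as ct

# Hypothetical model file standing in for whatever denoising network an app
# ships; coremltools and Core ML are real, but this file name is made up.
MODEL_PATH = "denoise_model.mlpackage"

# CPU_AND_NE asks Core ML to schedule work on the CPU and the Apple Neural
# Engine (coremltools 6+, macOS 13+); ALL lets it choose CPU, GPU, or NE.
model = ct.models.MLModel(MODEL_PATH, compute_units=ct.ComputeUnit.CPU_AND_NE)

# Input/output names depend on the model; "image" here is illustrative.
# result = model.predict({"image": some_input_array})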
 
