Usually, AI technology makes use of the CUDA cores on NVIDIA cards; AMD has ROCm. I don't know whether the two technologies are compatible or which one is better. Going from the minimal words from Adobe's Eric Chan... For all these Denoise/Enhance speed functions we've been talking about, what should I be looking at if I were to choose between the RTX 3070 and the AMD RX 6800? Personally, with everything going up in cost, one of the first things I look at is electricity/power consumption. Southern California Edison keeps jacking up prices and it's only going to get worse, so I would like to stay below 300 watts, hahaha!
No, but seriously, which one would you pick and why? I have no clue: I don't understand what ray tracing and CUDA cores do, what the bit rate means, which ones have more RAM, bla bla bla. All this stuff is really confusing to me, I admit. What I would like to know is very simple: which one is the fastest and costs less?
Thanks.
https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified
I don't know the AMD products, but the RTX 3070 has tensor cores. All my attempts at using ACR AI Noise Reduction come in at sub-10-second runs.
Peter
There are dedicated AI cores in NVIDIA (RTX) and Apple Silicon chips that speed it up a lot. Think of it like a hardware-accelerated video encoder: the encoder in Intel iGPUs lets my Surface Pro 6 export a video at only half the speed of an RTX 3060 Ti, even though its general-purpose GPU power is next to nothing compared to that card.
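To make the "dedicated AI cores" point concrete: on NVIDIA cards, tensor cores arrived with the Volta generation (CUDA compute capability 7.0), so any RTX card qualifies while older GTX cards don't. A minimal sketch of that check, assuming you already know the card's compute capability (frameworks like PyTorch can report it via `torch.cuda.get_device_capability()`):

```python
# Hypothetical helper: does an NVIDIA GPU generation include tensor cores?
# Tensor cores were introduced with Volta, compute capability 7.0.
def has_tensor_cores(major: int, minor: int) -> bool:
    """True if a GPU with this CUDA compute capability has tensor cores."""
    return (major, minor) >= (7, 0)

# RTX 3070 (Ampere) reports compute capability 8.6;
# GTX 1080 (Pascal) reports 6.1, so no tensor cores.
print(has_tensor_cores(8, 6))  # True
print(has_tensor_cores(6, 1))  # False
```

This is only a generation check, not a speed estimate; how much the AI cores help depends on whether the application (like ACR's Denoise) actually targets them.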
The one interesting thing about that blurb PMB posted is that it doesn't mention specific support for the AI cores in Intel's Arc GPUs. Those are in some ways the most powerful for their price, but this is where they run into issues: developers don't optimize for them, and on the other end, Intel's own drivers aren't always that great.
Back to the OP: I can say an RTX 3060 Ti is enough for it. I'd wait a little, since the 4060 Ti is likely coming out in the next few weeks, so it'll either be a better buy or push 3060 Ti prices lower.
The 3060, especially refurbished, is getting to be a decent deal, and the added VRAM could help long-term usability, though I can't say how much of a performance hit it would mean for Lightroom.

