Graphics cards . . .

I was writing a reply which sadly got lost.

Not sure if the link I shared has been viewed, but it covers differences not just between desktops and laptops but also games consoles and even an Nvidia Shield.

I had wondered if a laptop might be audibly quieter, so that if I manage a half day working from home and try a bit of sneaky editing, it won't deafen others when I'm on a Teams call.

It was mentioned to me today that a laptop does come with a screen, so if I move away from one I'd need a separate monitor.

I did some more reading around other groups but still couldn't see any obvious direction as to what is best for LR. It isn't just the new NR tool, which seems much slower than Topaz; the enlarging is pretty slow too. Maybe the difference between a good machine and one that happens to be more ideal for these Adobe tools is small. Lack of data is the problem.
 
I was writing a reply which sadly got lost.
I deleted several posts with political commentary from several members, including my original warning post. When a post is deleted, its replies go with it automatically.
Not sure if the link I shared has been viewed, but it covers differences not just between desktops and laptops but also games consoles and even an Nvidia Shield.

I had wondered if a laptop might be audibly quieter, so that if I manage a half day working from home and try a bit of sneaky editing, it won't deafen others when I'm on a Teams call.
A desktop can IMO be quieter for the same performance because it has room for a large, efficient CPU heat sink. Laptops are space-constrained in their cooling systems.
It was mentioned to me today that a laptop does come with a screen, so if I move away from one I'd need a separate monitor.
Yes.
I did some more reading around other groups but still couldn't see any obvious direction as to what is best for LR. It isn't just the new NR tool, which seems much slower than Topaz; the enlarging is pretty slow too. Maybe the difference between a good machine and one that happens to be more ideal for these Adobe tools is small. Lack of data is the problem.
I will refrain from further comment on that, but others are welcome to continue as long as we stay somewhat on topic.
 
I deleted several posts with political commentary from several members, including my original warning post. When a post is deleted, its replies go with it automatically.
Ah no worries. I think I rewrote it okay.
A desktop can IMO be quieter for the same performance because it has room for a large, efficient CPU heat sink. Laptops are space-constrained in their cooling systems.
Those are some good points. I do have a Dell tower for simulation in my work office, and come to think of it, that one is quiet. I should double-check, though; my office is noisy, with people shouting at me 😁
I will refrain from further comment on that, but others are welcome to continue as long as we stay somewhat on topic.
Understood. In the workstation Dell machines they fit the A5000 (I think). Is that relevant to us (image tools)?

Looking at prices, I think I'm in the 3060/3070 bracket anyway, so I assume they're each relatively similar?

Anything special to avoid or look out for? (For a full-size machine, that is, rather than a laptop.)

I have to sleep briefly and then catch a flight, so I beg forgiveness in advance if I don't reply super quickly.
 
Anything special to avoid or look out for? (For a full-size machine, that is, rather than a laptop.)
We've had many, many threads dealing with effective configurations of desktop PCs here; I'd suggest looking through previous threads with relevant titles.
I have to sleep briefly and then catch a flight, so I beg forgiveness in advance if I don't reply super quickly.
No hurry, no worries. They say that the posts here will be permanently archived even after DPR shuts down, whenever that is.
 
The Nvidia 2060 strikes a good balance of cost and performance; currently around £175 used on eBay. I upgraded to a 12GB 3060 for use with Stable Diffusion, but for Photoshop, LR and Capture One the 6GB 2060 performs the same as the 3060 in real-world perception.
How many seconds does that 2060 take to Denoise one photo in LR Classic?

Thanks.
 
For all these denoise/enhance functions and the speed stuff we've been talking about, what should I be looking at if I were to choose between the RTX 3070 and the AMD RX 6800? Personally, with everything going up in cost, one of the first things I look at is electricity/power consumption. Southern California Edison keeps jacking up prices and it's going to get worse, so I would like to stay below 300 watts hahaha! :-)

No but seriously, which one would you pick and why? I have no clue; I don't understand what ray tracing and CUDA cores do, what the bit rate means, which ones have more RAM, bla bla bla. All this stuff is really confusing to me, I admit. What I would like to know is very simple: which one is fastest and costs less?

Thanks.
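To make the electricity worry concrete, here's a rough back-of-the-envelope sketch in Python. Every number in it (tariff, wattages, hours per day) is an illustrative assumption, not a measurement, so plug in your own figures:

```python
# Back-of-the-envelope electricity cost for a photo-editing PC.
# Every number below is an illustrative assumption, not a measurement.

PRICE_PER_KWH = 0.30          # assumed tariff in $/kWh
IDLE_W, LOAD_W = 75, 300      # assumed idle draw and full-load draw
IDLE_H, LOAD_H = 3.0, 0.5     # assumed hours per day in each state

daily_kwh = (IDLE_W * IDLE_H + LOAD_W * LOAD_H) / 1000
annual_cost = daily_kwh * 365 * PRICE_PER_KWH
print(f"{daily_kwh:.3f} kWh/day, about ${annual_cost:.0f}/year")
# -> 0.375 kWh/day, about $41/year with these assumptions
```

On those assumptions, even a 300W card costs tens of dollars a year, not hundreds; the load hours dominate, so batch-processing habits matter more than the card's peak rating.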
From the minimal words from Adobe's Eric Chan...

https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified

[screenshot from the linked Adobe blog post]

I don't know the AMD products, but the RTX 3070 has tensor cores. All my attempts at using ACR AI Noise Reduction come in at sub-10-second runs.

Peter
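If anyone wants to check whether their own NVIDIA card has tensor cores, here's a minimal sketch, assuming a Python environment with PyTorch and CUDA installed. Cards with compute capability 7.0 or higher (Volta/Turing onwards, so all RTX cards) have them:

```python
# Minimal sketch, assuming PyTorch with CUDA support is installed.
# NVIDIA GPUs with compute capability 7.0 or higher (Volta/Turing/RTX
# onwards) carry tensor cores, which the AI denoisers lean on heavily.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{name}: compute capability {major}.{minor}, "
          f"tensor cores: {'yes' if major >= 7 else 'no'}")
else:
    print("No CUDA-capable GPU visible to PyTorch")
```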
That's awesome, Peter; that's pretty darn fast.

I am really, really curious as to what an AMD GPU does for this function. Does anyone here use one of the more recent, faster AMD GPUs? Something such as the RX 6800, more or less?
 
There are ARM-based Windows machines.
Actually, that's a good idea. I'd forgotten about the ARM Dev Kit 2023 I used to have; it froze permanently in the BIOS when I was trying to change the default boot drive. Its power usage was very low. It was never meant for consumers, but it was basically a Surface Pro X without the keyboard and display.
I think I'll bail on this thread. It appears that no one will ever supply the data you requested. (No shock there.)
Ephemeris, what you might want is an ARM-based Windows PC like the Surface Pro X. There should be reviews available to tell you if it's suitable.
We do have a Surface Pro, but I think it has an i7 and an Iris in it. It's not much cop even for basic EM modelling, but it is splendid to take to customers to show presentations on, pull a mind map together in Miro, and so on.

I didn't know there was an ARM version, but I assume that's Qualcomm's 8cx series. Doesn't that mean it has Adreno graphics? Do you really think that will process images through LR AI NR quickly?
 
Well, indeed, I suspect that is what drives many people, even if they want to claim they are green; it's more about not wanting to spend their earned money.
Maybe I'm profligate, but in my view, the electric bill for my PC usage is a minor fraction of my total living expenses.

They'll pry my corded mouse out of my cold, dead fingers. ;-)
Well, my bill is manageable for my laptop, but if a machine starts using a 500W mean during processing, then that fraction is a lot bigger and not insignificant.
 
With regard to the AI noiseware that Adobe's just released.

I've come to the conclusion that my graphics card is definitely due for renewal.

I've currently got an NVIDIA GeForce GT 1030, which came installed in the computer when I bought it.

It's served me well, albeit a little slowly in some circumstances, but now that this Adobe business is happening, I've concluded that it's finally got to go.

The question, therefore, is what do I replace it with?

I'll admit to knowing almost nothing about graphics cards but I do know that I'm not about to lay out hundreds on a fancy one if a more moderate one will do a good enough job.

Is there a sensibly priced middling card available, (a hundred pounds or so) armed with enough RAM to get the job done?

I don't need the task completed instantly. I don't know if the file size is relevant, although I suspect that it is. I'm using Z7 files which come out at about 60MB or more, so if it takes a minute for each one, I'll be happy.

My question, therefore, is: can anyone suggest/recommend a card that'll do what I want, please . . . ?

"It's good to be . . . . . . . . . Me!"
I did read in the news that the 3000 series is likely to see a drop in price due to some of the 4000 series getting a price drop.

I have just been searching for the article but can't find it. Hopefully another will have seen something similar.
 
Well, my bill is manageable for my laptop, but if a machine starts using a 500W mean during processing, then that fraction is a lot bigger and not insignificant.
Even with a pretty beefy desktop (top-of-the-line CPU and GPU), I'd expect the system to spend most of the time significantly below the maximum power consumption though.

It's during exporting images or constantly adjusting parameters that the system may be maxed out, and in practice that's not the default while editing. Or at least in my personal usage, I tweak something, consider briefly what to do next and then continue, while the system is essentially idle and drops to fairly minimal power consumption. And even while tweaking something specific, I'll often pause briefly to consider if I'm happy with the result so far.
 
I absolutely understand, Mika. I have noticed in this thread that people's idle power is pretty high; one was more than most of our laptops use flat out.

It's the usage during a batch of AI NR and its exports that's the bit I'm really discussing, but on idle power (let's maybe include some emails as idle) it looks like some machines use a lot more.

Data appears to be scarce aside from the government document I referenced on typical usage.

Keeping the number of drives down to, say, one, and sticking with 32GB instead of 64GB of RAM are all potential areas that save power, but I assume the majority is the CPU and GPU.

The question started with replacing a laptop with something suitably fast for the NR tools we use. I don't use Adobe's because the laptop is too slow. That question looked at laptops and non-laptops, but it was noticed that where a laptop may have a 5A max Type-C supply, the tower had a 750W PSU (I don't know how much less than that it actually gives out, but let's assume 700W).
 
For what it's worth, my 1st-generation Ryzen and 1050 Ti desktop's idle power consumption (browser and a few applications like GIMP, Darktable etc. open but doing nothing) is about 70-75 watts (my UPS reports power consumption in 5W increments). In principle I'd expect newer CPUs to be slightly more power-efficient while idle. From what I've read about Nvidia's 4070 GPU, it seems to consume slightly more power while idling vs. my current GPU, but the difference is fairly small (a few watts rather than dozens of watts).

Edit: Slightly ironically, I've been letting this very same computer run for a day and a half doing some machine learning on Commodore 64-based songs and synthesizing more similar music, so I guess that's worth a week or two of regular power consumption, with still fairly awful results, even if the style is kind of becoming recognizable. Oh well.
 
Hey Mika. That's helpful.

The 4070 is probably a bit expensive for me at the moment, they look to be around £600 and upwards.

They read well of course.

I looked at these from Dell, which have the 4070, one with an 18" screen. The one on the left has a Type-C charger; the one on the right has one of those chargers specific to Dell. The one on the right weighs over 4kg.

Not sure if this means it can keep running a high workload for longer, being bigger, with more fans and the like.



[screenshot: two Dell laptops with RTX 4070]
 
We do have a Surface Pro, but I think it has an i7 and an Iris in it. It's not much cop even for basic EM modelling, but it is splendid to take to customers to show presentations on, pull a mind map together in Miro, and so on.

I didn't know there was an ARM version, but I assume that's Qualcomm's 8cx series. Doesn't that mean it has Adreno graphics? Do you really think that will process images through LR AI NR quickly?
No, not at all; I was only thinking of power consumption at that point. I had the ARM Dev Kit version of the Surface Pro X; it had fairly decent performance with native code, but running under x64 emulation seemed to take a big hit.

For high performance in a laptop, I'd probably look for something like the big Alienware you show in your post, but I'm not very familiar with powerful laptops. Someone else here is likely to know more about them than I.
 
I looked at these from Dell, which have the 4070, one with an 18" screen. The one on the left has a Type-C charger; the one on the right has one of those chargers specific to Dell. The one on the right weighs over 4kg.

Not sure if this means it can keep running a high workload for longer, being bigger, with more fans and the like.

[screenshot: two Dell laptops with RTX 4070]
I am also looking at a new laptop. I noticed to my disappointment that the new Dell XPS 17" has limited the power for the GPUs to 60W, which is well below the RTX 4070's 200W TDP. That does not look very good. Maybe that is typical for a non-professional laptop. I have an old Dell Precision 7730 (with a Quadro P3200), which needs to be replaced; the AI denoise in ACR on my 45MP Z7 files takes almost 2 min.

I am currently leaning towards a new Precision 7780. Still need to check if Dell has limited the power of the GPUs.

--
Kind regards
Kaj
http://www.pbase.com/kaj_e
WSSA member #13
It's about time we started to take photography seriously and treat it as a hobby. - Elliott Erwitt
 
200W is the TDP of a desktop RTX 4070. The mobile 4070 is spec'd at 115W.

I have no clue about the performance hit in limiting the mobile 4070 to 60W.
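If you have the machine in hand, nvidia-smi will report the vendor-set limit directly. A minimal sketch, assuming the nvidia-smi utility is installed and on the PATH; the sample output line is illustrative, not measured:

```python
# Sketch: query the vendor-set GPU power limits via nvidia-smi
# (assumes the nvidia-smi utility is installed and on the PATH).
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,power.draw,power.limit,power.max_limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)
print(result.stdout.strip())
# Illustrative output:
# NVIDIA GeForce RTX 4070 Laptop GPU, 18.50 W, 60.00 W, 115.00 W
```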
 
I am also looking at a new laptop. I noticed to my disappointment that the new Dell XPS 17" has limited the power for the GPUs to 60W, which is well below the RTX 4070's 200W TDP. That does not look very good. Maybe that is typical for a non-professional laptop. I have an old Dell Precision 7730 (with a Quadro P3200), which needs to be replaced; the AI denoise in ACR on my 45MP Z7 files takes almost 2 min.

I am currently leaning towards a new Precision 7780. Still need to check if Dell has limited the power of the GPUs.



All very helpful, Kaj. My laptop is a 6800: Xeon 8-core, 32GB RAM and a GFX card (I forget which one), plus a 512GB SSD which was fitted from new.

It's a big heavy machine but has been fantastic.
 
I don't know the AMD products, but the RTX 3070 has tensor cores. All my attempts at using ACR AI Noise Reduction come in at sub-10-second runs.

Peter
Usually, AI technology makes use of the CUDA cores on NVIDIA cards. AMD has ROCm. I do not know whether the two technologies are compatible, or which one is better.
There are dedicated AI cores in NVIDIA (RTX) and Apple Silicon chips that speed it up a lot. Think of it like a hardware-accelerated video encoder, which on Intel iGPUs lets my Surface Pro 6 export a video at only half the speed of an RTX 3060 Ti, even though its general-purpose GPU power is next to nothing by comparison.
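As an illustration of how that accelerator-picking works (this is not Adobe's actual code), ML runtimes such as ONNX Runtime expose "execution providers" and fall back down a preference list, which is roughly how an app lands on tensor cores, a plain GPU, or the CPU. This assumes the onnxruntime package is installed; "model.onnx" is a hypothetical model file:

```python
# Illustration only, not Adobe's implementation: ONNX Runtime picks the
# best available "execution provider" from a preference list.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available providers:", available)

# Preference order: CUDA (NVIDIA) first, DirectML (any Windows GPU)
# second, plain CPU as the fallback.
preferred = ["CUDAExecutionProvider", "DmlExecutionProvider",
             "CPUExecutionProvider"]
session = ort.InferenceSession(
    "model.onnx",  # hypothetical model file
    providers=[p for p in preferred if p in available])
print("Session ended up on:", session.get_providers())
```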
Currently Lightroom Denoise AI cannot use the Apple Neural Engine because of an Apple problem. See this whole post that explains it:

https://www.dpreview.com/forums/post/67022937

You will also note that DXO tells people using Ventura (Monterey is okay) to use the slower GPU rather than the Neural Engine because of an Apple problem. I do not know if it is the same as or different from the problem Adobe discovered.
The one interesting thing about that blurb PMB wrote is that it doesn't mention specific support for the AI cores in Intel's Arc GPUs. Those are in some ways the most powerful for their price, but this is where they run into issues, with developers not optimizing for them and, on the other end, Intel's own drivers not always being that great.

Back to the OP: I can say an RTX 3060 Ti is enough for it. I'd wait a little bit, since the 4060 Ti is likely coming out in the next few weeks, so it'll either be a better buy or push 3060 Ti prices lower.

The 3060, especially refurbished, is getting to be a decent deal, and the added VRAM could help long-term usability, though I can't say how much of a performance difference it'll make for Lightroom.
 
My XPS 17 is powered via the USB-C port, not a dedicated power supply input. Even at higher-than-spec USB power delivery, it can have trouble keeping up with heavy use, leading to battery drain. The Alienware might have a separate dedicated power supply input, which might keep up when externally powered. I'd wonder about run time on battery alone with the most capable CPUs and GPUs.
 
I'd wonder about run time on battery alone with the most capable CPUs and GPUs.
The 100Wh capacity limit the FAA has imposed on Li-ion batteries for flight pretty much guarantees a short battery runtime for these powerful laptops.
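A quick sketch of what that limit means in practice; the draw figures are illustrative guesses for a big laptop, not measurements:

```python
# Back-of-the-envelope runtime on a just-under-100Wh battery.
# The draw figures are illustrative guesses for a large laptop.
BATTERY_WH = 99.5  # typical just-under-the-limit pack

for label, draw_w in [("light editing", 35), ("GPU denoise batch", 150)]:
    minutes = BATTERY_WH / draw_w * 60
    print(f"{label} at {draw_w}W: about {minutes:.0f} minutes")
# light editing at 35W: about 171 minutes
# GPU denoise batch at 150W: about 40 minutes
```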
 
