Best video card for still photo AI in the $300 to $500 range

Wayne Larmon

I know that this has been discussed, but searching DPReview isn't giving me the exact answer I need.

I just started the process of buying a new computer that will be built by a local system integrator. I want to find out what is the best video card/GPU for photo AI. Right now, the fastest computer I have that is capable of running the most recent version of Photoshop is my laptop, which only has motherboard (integrated) video. Adobe AI denoise takes 4-5 minutes to denoise a 20 megapixel Canon 6D image. I can't even try any Topaz programs because they require a real GPU.

I tried researching video cards on Google, but it's confusing because most sites that test GPUs are oriented towards gaming. Ditto for Amazon user reviews of video cards. Right now I'm only interested in still photo processing, and maybe looking at doing local LLM (or other) training. I haven't done any video processing, but I'll probably need to do simple "delete the dull stuff" and "join video snippets into a single video" type editing of smartphone videos.

I don't do any gaming and have no interest in complicated video editing. Nor in high end CAD.

After doing an evening's research I am looking at

ASUS ProArt GeForce RTX™ 4060 Ti 16GB OC Edition $499.99

mainly because it has 16GB of VRAM.

I really don't want to go above $500.

TIA.

Wayne
 
Get the 8GB card. Those apps will not make use of 16GB of VRAM.
 
For the money, nice choice if you're going to need the VRAM.
I'm still fuzzy on which parameters are important for different tasks. I *think* I remember reading that for training LLMs or adding a style to Stable Diffusion the amount of VRAM is critical. Sort of like 16 gigs is entry level. But I don't know how important the amount of VRAM is for things like AI denoising and generative fill operations.

I think that fast AI denoising is most important to me right now. I want to be able to use one of my DSLRs or mirrorless cameras to shoot in ambient light without flash (like I can do now with my Pixel 8 Pro) by cranking the ISO high enough to get a usable shutter speed, and then denoising the whole shoot as a single batch operation. Any other AI operations would probably be on a single-image basis (generative fill, etc.).

My Pixel 8 Pro is great as far as it goes, but I now realize that its image quality can only go so far. I can't uprez 12 megapixel JPEGs, even using ACR AI uprezzing. But I know I can uprez ILC raw files and get meaningful improvements. (I wouldn't be denoising high ISO images that I want to uprez; I'd only uprez low ISO images.)

For AI denoising, I don't know if I need a $500 16 GB card, or if a $300 card (with less VRAM) would do. Right now, training LLMs and Stable Diffusion are a low priority.

Wayne
 
You can see the denoise times in LrC or ACR for a 60MP file at this link (scroll down the page to the table):

https://www.fredmiranda.com/forum/topic/1804640/17

For your 20 MP Canon 6D files, expect something like 1/3 to 1/2 of the times listed for the 60MP file in the table.
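As a rough sanity check, here is the proportional-scaling arithmetic behind that estimate, as a minimal Python sketch. It assumes denoise time grows roughly linearly with pixel count; the 30-second benchmark figure is a placeholder, not a number taken from the table.

```python
# Rough back-of-the-envelope estimate, assuming AI denoise time scales
# roughly linearly with pixel count (an assumption, not a benchmark).
def estimate_denoise_seconds(benchmark_seconds: float,
                             benchmark_mp: float,
                             target_mp: float) -> float:
    """Scale a benchmarked denoise time to a different resolution."""
    return benchmark_seconds * (target_mp / benchmark_mp)

# Example: if a 60 MP file takes 30 s on some card in the linked table,
# a 20 MP Canon 6D file should land near 10 s on the same card.
print(estimate_denoise_seconds(benchmark_seconds=30, benchmark_mp=60, target_mp=20))
# -> 10.0
```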

To future-proof the choice, pick the best card that fits your budget.

I think the 4060 Ti you have posted is a great choice.

--
Kind regards
Kaj
http://www.pbase.com/kaj_e
WSSA member #13
It's about time we started to take photography seriously and treat it as a hobby. - Elliott Erwitt
 
For photo editing only, I would save some money, or put the savings into a better motherboard, CPU, RAM, monitor, etc.

8GB would be plenty for photo editing; perhaps something like this that has a similar clock speed.

 
I think that fast AI denoising is most important to me right now.
As others have said, a 4070 is the baseline. It's roughly 30% ahead of the rest of the field at just about everything. Start scheming for Black Friday or redefine "fast".
 
Training LLMs requires a huge amount of computational power when there are a lot of parameters. I take an existing model and add data (like user manuals) to it and it takes a bit of time while also using a lot of the GPU.

For photo processing, Adobe and Topaz "recommend" a GPU with 8GB of VRAM if you plan to use all of their features that take advantage of a GPU. They provide little to no information on exactly how much video memory something like AI denoise uses, probably because there are so many variables such as the amount of data in a photo, the ISO of the photo, etc.

A 3xxx-series or newer GPU with 8GB of VRAM (or more) should provide you with some future-proofing. You could purchase a used RTX 3060 for much less than your budget and see a really big increase in AI denoise speed, as using a CPU for this is much slower.

Anyway, that's why I commented that the 4060 Ti you listed looked good as it has pretty good processing power while also having a lot of VRAM.
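If you want to confirm what GPU and how much VRAM a build actually exposes once the card is installed, a minimal check looks like the sketch below. It assumes PyTorch with CUDA support is installed; any app's own GPU info panel will tell you the same thing.

```python
# Quick way to confirm what GPU and how much VRAM your software will
# actually see, assuming PyTorch with CUDA support is installed.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}")
    print(f"VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    print("No CUDA-capable GPU detected; AI features will fall back to the CPU.")
```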
 
I don't think objective tests bear out any benefit, for any purpose, of 16GB vs 8GB of VRAM in the 4060 Ti; bigger is not always better. The reason is that the memory bandwidth of that GPU is too low to effectively use the extra RAM, at least in gaming. I would bet the same holds for Photoshop and video rendering.
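For context, the bandwidth arithmetic behind that claim, as a small sketch. The clock and bus-width figures are quoted from memory, so verify them against NVIDIA's product pages.

```python
# Memory bandwidth = effective memory clock (Gbps per pin) * bus width (bits) / 8.
# Published specs (worth double-checking against NVIDIA's pages):
#   RTX 4060 Ti:   18 Gbps on a 128-bit bus
#   RTX 3060 12GB: 15 Gbps on a 192-bit bus
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(18, 128))  # RTX 4060 Ti   -> 288.0 GB/s
print(bandwidth_gb_s(15, 192))  # RTX 3060 12GB -> 360.0 GB/s
```

Note that by this math the cheaper 3060 12GB actually has more memory bandwidth than the 4060 Ti, which is why the extra VRAM on the 4060 Ti doesn't translate into proportionate speed.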

If you search for Puget Systems' own test results, you can read for yourself the differences, or lack thereof, between results over a wide range of GPUs. PS is still lightly threaded and responds more to CPU than GPU horsepower. The only process that effectively uses the GPU seems to be the noise reducer in ACR. User perception of Photoshop throughput is affected by I/O and internet speeds.

The Puget LR script is still in beta. You can sort through the results uploaded by users. Scores seem all over the place with similar gear. I don't use LR, so I don't care.

In my experience Topaz, e.g. Sharpen, is the most GPU-intensive image processing program. My experience with serial/stepwise upgrades is that Topaz, while requiring at least an entry-level GPU, responds more to a CPU than a GPU upgrade. For reasons I cannot explain, on my two comparable boxes, one with an nVidia 3060 and one with an AMD 6600, Topaz Sharpen runs markedly faster on the AMD. In any event, if you watch the CPU/GPU usage graphs in Task Manager, it seems the GPU kind of lopes along compared to how hard the CPU is driven; it does on my desktops.

If you render video in a CUDA-dependent program, there is reported scaling with nVidia horsepower.
 
The

ASUS Dual NVIDIA GeForce RTX 3060 V2 OC Edition 12GB GDDR6 (~$300 new; ~$223 used)

is looking good right now. It should speed up AI denoising enough to make batch processing the keepers practical. I'm not a pro, so I'm not going to be doing a lot of image processing.

I think that 12 GB should be enough to let me dip my toes into LLM-type training. I've read that it is usually cheaper to rent time in the cloud than to splurge on a high-end consumer GPU if you are doing heavier training. (Again, I have no experience training LLMs; I just want to get my feet wet.)
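For a feel of what actually fits in 12 GB, a common rule of thumb I've seen (an assumption on my part, not a measured figure) is roughly 16 bytes per parameter for full fine-tuning with mixed precision and the Adam optimizer, before counting activations:

```python
# Very rough rule-of-thumb VRAM estimate for full fine-tuning, assuming
# mixed precision with the Adam optimizer (~16 bytes per parameter for
# weights, gradients, and optimizer state, before activations).
# A hedged sketch, not a guarantee.
def training_vram_gb(params_billions: float, bytes_per_param: int = 16) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

print(f"{training_vram_gb(0.5):.1f} GB")  # ~7.5 GB: a 0.5B model fits in 12 GB
print(f"{training_vram_gb(1.0):.1f} GB")  # ~14.9 GB: a 1B model already doesn't
```

By that math, a 12 GB card caps out below a 1B-parameter model for full fine-tuning; parameter-efficient methods like LoRA or 4-bit quantization are how people squeeze larger models onto cards like this.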

Thanks to all that have responded!

Wayne
 
Training LLMs requires a huge amount of computational power when there are a lot of parameters. I take an existing model and add data (like user manuals) to it and it takes a bit of time while also using a lot of the GPU.
Yeah, it's important to understand that the computational requirements for training an AI neural net are much greater than the requirements for using the net once it's been developed. Don't let the requirements for the former dictate what you need for the latter.
 
The 3060 12GB was on my radar as well as the 4060; for me it was a toss-up between the two.

I think you will be happy with either; spend the money saved on a better CPU, RAM, etc.
 
Why would a photographer who doesn't play games buy Nvidia cards when AMD is cheaper?

(Why would anyone, actually?)
 
Why would a photographer who doesn't play games buy Nvidia cards when AMD is cheaper?

(Why would anyone, actually?)
Because I also wrote:
Right now I'm only interested in still photo processing. And maybe looking at doing *local LLM (or other) training*.
(Emphasis added.) AFAIK, AI (sic) training mostly runs on Nvidia, not on AMD.

However, because of reported problems with new Intel processors, I'll be getting an AMD processor in my new machine.

Based mostly on the advice from this thread, I've settled on the ASUS Dual NVIDIA GeForce RTX 3060 V2 OC Edition 12GB.

Wayne
 
Why would a photographer who doesn't play games buy Nvidia cards when AMD is cheaper?

(Why would anyone, actually?)
Maybe said photographer wishes to use software that requires nVidia proprietary extensions (like CUDA).

Or the photog has run afoul of software whose authors had issues with AMD drivers. (I believe Affinity had that, a few years ago.)

Or the photog feels that buying nVidia is safer.

I'm not any one of those, but my current cards are nVidia. I had a 6900 XT, and was impressed that the AMD control panel at the time had a better interface than anything nVidia has produced so far.

My primary CPU at the moment is an AMD Threadripper.
 
