Desktop PC processing large numbers of images

Ephemeris
West Yorkshire, UK
Hi folks.

We often find ourselves processing a large number of images. The source is some Canon EOS R5 cameras. We usually shoot JPEG + RAW and often just process the JPEGs.

Images are often at ISO 6400 or higher.

They are then processed in Topaz Denoise AI (which handles both JPEG and RAW).

It is extremely time-consuming to batch-process 400 to 600 images per day.

Is there a way to process over a cluster?

If we were to build a new PC to speed this job up what should we be looking at building? (It must run Windows or a variant of Linux)

Many thanks.
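Since each image is independent, a batch like this is embarrassingly parallel, so one way to cut the wall-clock time is simply to split the file list across workers (or machines). A minimal sketch in Python, assuming the denoiser can be driven from the command line; the `denoise` command here is hypothetical, not Topaz's actual interface:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def chunk(files, n):
    """Split a list of files into n roughly equal chunks (round-robin)."""
    return [files[i::n] for i in range(n)]

def process_chunk(files, command="denoise"):
    """Run the (hypothetical) command-line denoiser on one chunk of files."""
    for f in files:
        subprocess.run([command, str(f)], check=True)

def process_batch(folder, workers=4):
    """Denoise every JPEG in a folder, spreading the work across workers."""
    files = sorted(Path(folder).glob("*.jpg"))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for c in chunk(files, workers):
            pool.submit(process_chunk, c)

if __name__ == "__main__":
    process_batch("images", workers=4)
```

The same split works across machines: give each box its own chunk of a shared folder and run them in parallel overnight, which gets most of the benefit of a cluster without cluster software.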
 
You may get better advice if you post what sort of PC hardware you're using now.

And: why are you shooting RAWs, if you're not using them? Just curious.
 
Thanks for your help.

I do use the RAW images and they all get stored and committed for any future use. However, if the JPEG will suffice then I normally opt to use them. If not it's back to the RAW.

I'm on my laptop today; its spec is:

Intel i7-4810, 3.2 GHz (Xeon family)

32 GB RAM

1 TB HDD

Integrated graphics: Intel HD 4600

Discrete graphics: NVIDIA Quadro 4100M

We have some tower PCs used for simulations, a virtual machine, and a cluster built by NVIDIA for simulations. Both simulation systems sit on a Linux platform; the laptops, towers and VM are Windows.
 
That's a fairly old laptop (2014?). It should be possible to get a significant boost using a more recent CPU and GPU. (Topaz products benefit from GPU acceleration more than Adobe Photoshop and Lightroom, apparently. I think that there are diminishing returns for GPUs better than an NVIDIA RTX 3060, though.)

Topaz software is written for Windows and macOS only. There is an old thread with PC benchmarks for Topaz Sharpen AI; newer versions of Sharpen give longer timings (but may give superior results).

Clusters are far outside my experience. I'm an amateur, and derive no income from photography or computing.

The obvious thing to try would be a desktop with a better CPU and GPU. Maybe you'd get a factor of 2 or better in performance over the laptop.

It'd be better if someone with more technical expertise could advise you.
 
Hey Bob, thanks for taking the time to reply.

The laptop is from 2017 and it's one of our best for simulations at the moment. We have others that are almost identical but with a slightly newer GFX card.

With a decent desktop I could use it remotely (it could be rack-mounted, but it would be handy to have a powerful machine with a screen in the office).

The GFX card you mention doesn't look silly expensive. Maybe we could try that in one of our current machines and see what we find.

Really appreciated, Bob.
 
Follow the link provided in the previous post. Download the RAW images, process them as described, and time how long the processing takes. Post your results; I'll do the same and we can compare the difference in processing time and hardware.
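If it helps, the timing itself can be scripted rather than done with a stopwatch. A small Python sketch, where `run_denoiser` is a stand-in for whatever actually processes one image (not a real Topaz API):

```python
import time

def time_batch(files, run_denoiser):
    """Run the given per-image function over all files and return
    (total seconds, seconds per image)."""
    start = time.perf_counter()
    for f in files:
        run_denoiser(f)
    total = time.perf_counter() - start
    return total, total / len(files)

# Stand-in workload: pretend each "image" takes ~10 ms to process.
total, per_image = time_batch(["a.jpg", "b.jpg"], lambda f: time.sleep(0.01))
```

Seconds per image is the number worth comparing across machines, since batch sizes will differ.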
 
If that's the machine you're using, you've got multiple weaknesses. The simplest upgrade would be to replace the HDD with a SATA SSD.

But that would just be the proverbial lipstick on a pig. It's a 4th-gen CPU when we've had desktop 12th-gen for quite a while and might get 13th-gen soon. Your RAM is slow. Your Quadro is, I think, four generations out of date and soon to be five.

A new desktop would be the smarter investment.
 
Thanks for your reply.

The machine currently has a fast Samsung SATA SSD.

We do have (well, one of my guys has) a Microsoft Surface 4 with an i7-1185G7 and 32 GB RAM, mainly used for modifying vehicle harnesses on the fly. That won't really run the simulation software so well, and trying to use Topaz on it is, if anything, slower, as it backs off when it gets hot.

So, going back to the original question: what should I look at when building a desktop?
 
Do you have a budget?

A high-end desktop might use an Intel i9-12900K CPU, a Z690 motherboard, 32 GB (2 x 16 GB) of DDR5 RAM, and an RTX 3070 Ti graphics card. For example:

PCPartPicker

Just under $2k with no case, power supply, storage, etc.

Please don't take this as a recipe; it's just an example.

I can't guess how much improvement you'd get over one of your better desktops, but it ought to be much better than the laptop you've mentioned.
 

https://www.pugetsystems.com/blog/2021/02/24/Consultant-s-Corner---Why-I-Turned-Down-a-Sale-2073/

This doesn't directly answer your question, but it basically points out that the GPU will matter most.

I'd aim for a system with a good mid-range CPU; an i7 K-series should be fine. A reasonable amount of RAM, NVMe drives, a big, beefy, high-quality PSU, a good case that breathes well, and the best GPU you can find and afford.
 
Hi Bob.

Thank you, that is really helpful and kind of you.

A budget is an interesting topic, because with a suitable GFX card it may also double up as a standalone simulation machine (mainly 3D EM tools and some multiphysics from Ansys).

The GFX card you chose - I looked at a few others and then looked at the details of the 3090. Wow, what a price. Does it really give a significant performance improvement?
 
The benefits of an RTX 3090 vs., say, an RTX 3060 for your simulation software are likely hard to quantify unless you can find specific benchmarks for it online.

The original purpose of high-end graphics cards was playing demanding PC video games; there are plenty of comparison benchmarks available for those, which may or may not be helpful in interpreting their usefulness for you.

For high-end gaming, the more expensive cards can certainly give substantial improvements if their requirements for more power and cooling are met.
 
The 3090 is basically the workstation A6000 with less RAM and overclocked. Plus it draws more power.

If he can find benchmarks of the A6000, the 3090 might be faster, provided the lower VRAM isn't an issue.
 
I just finished building a PC for my cousin based on the spec of a $2,500 Dell PC. My total cost was $1,775. The only difference between the two is that mine runs Win 10 Pro vs. the Win 11 installed in the Dell. The one I built is ready for an Intel i9 processor and Win 11 updates.

The attached photo doesn't show the $75 Corsair mid-tower case (model 4000D, included in the total cost).

The Samsung 980 Pro 1 TB SSD will be the boot drive.

[attached photo of the build]

--
https://flickr.com/photos/10121023@N08/albums
 
https://www.pugetsystems.com/blog/2021/02/24/Consultant-s-Corner---Why-I-Turned-Down-a-Sale-2073/

Interesting. The RTX 2080 has 2944 CUDA cores; the 12 GB 3080, 8960. I'd forgotten that the difference was that large.

RTX 3070 Ti: 6144 cores. RTX 3060 Ti: 4864.

And the RTX 3090 Ti: 10752.
 
It appears that the main thing you get with a 3090 over, say, a 3080 is twice the VRAM. I doubt that would help Topaz applications, but I don't know that for certain.

If the most significant criterion is CUDA cores, the 3090 doesn't have a lot more than a 3080 (10496 vs. 8960 for the 12 GB 3080).
 
I don't normally pay attention to workstation cards; too rich for my blood. At Newegg, the A6000 is a bit less than $5k US, compared to $2k for a mass-market (?) RTX 3090 Ti.

I wondered how a 300 W card could get by with a single 8-pin auxiliary power connector. The answer is that it's not a PCIe connector, but rather an EPS 12V one (as used to supply power to a high-powered CPU).
 
Thanks, Austinian.
 
