Snap Happy
Senior Member
I have a pretty powerful laptop, but was curious to see the effect of an eGPU, especially since the real-world benefits of GPUs for Capture One/Photoshop are difficult to predict.
The laptop is:
- Dell Precision 7540
- Windows 10 Pro 64
- i9-9980HK
- 128GB RAM at 2666MHz
- M.2 NVMe drives (C:\ = Dell Class 50, i.e. 'fast'; D:\ - where the image files are - 2x Samsung 970 Evo in RAID 1 mirror)
- Nvidia Quadro T2000 (Nvidia Studio driver)
- Thunderbolt 3
The eGPU is an Nvidia RTX 3070 Ti in a Razer Core X enclosure. The specific model is the Gigabyte Vision OC 8GB, so a mildly overclocked version, but one marketed more at content creation than gaming.
I picked the 3070 Ti for no more scientific a reason than that it's what Puget Systems spec as the base standard in their Photoshop workstations, and the price was only super-ridiculous, not mega-ridiculous like a 3080/3090.
Getting the eGPU working
I had read that the big challenge here would be getting drivers for the eGPU installed. There are many reports on the web of having to remove existing drivers and so on, but I found that whatever I did, the drivers installed fine for the T2000 while Windows stubbornly continued to treat the eGPU as a 'standard display adapter'.
I tried to manually install drivers for the eGPU, but could not locate the driver files already installed for the T2000. Then I realised that the folder the Studio driver installer unzips the files to is removed once installation completes.
So I ran the installer again, cancelled it once it had extracted the driver files to the temporary folder, and copied that folder to another location. I then did a manual driver update for the eGPU and pointed it at this folder. Voila!
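If you want to script that 'rescue the extracted folder' step rather than copy it by hand, something like the rough Python sketch below would do it. The source path is an assumption based on the installer's usual default extraction location - check where your installer actually unpacks before you cancel it, and adjust both paths to suit.

    # Sketch only: copy the NVIDIA installer's extracted driver files somewhere safe
    # before the installer (or a reboot) cleans them up. Both paths are assumptions.
    import shutil
    from pathlib import Path

    src = Path(r"C:\NVIDIA\DisplayDriver")   # assumed default extraction folder
    dst = Path(r"C:\DriverBackup\NVIDIA")    # anywhere outside the temporary location

    shutil.copytree(src, dst, dirs_exist_ok=True)
    print(f"Copied {src} -> {dst}")
    print("Point Device Manager's 'Update driver' > 'Browse my computer' at this folder.")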
The 3070 Ti was not available to select in the driver setup, so I picked Quadro RTX 3070*. Previous searches for the correct driver on the Nvidia site for each of my GPUs had led me to exactly the same driver file, so I was pretty confident this would work, and it did!
(*If you want to correct the eGPU name in Device Manager, you can add a FriendlyName value in the Registry, which is why it is named correctly in the screenshot below.)
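For completeness, a hedged sketch of that Registry tweak in Python is below. The device instance path is a placeholder - copy the real one from Device Manager (Details tab > 'Device instance path') - and keys under Enum are protected, so expect to need an elevated prompt and possibly a permissions change on the key. The usual 'edit the Registry at your own risk' caveat applies.

    # Sketch only: write a FriendlyName value on the eGPU's device-instance key so
    # Device Manager shows the real product name. DEVICE_KEY below is a placeholder.
    import winreg

    DEVICE_KEY = r"SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_XXXX&..."  # replace with your card's instance path
    NEW_NAME = "NVIDIA GeForce RTX 3070 Ti"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DEVICE_KEY, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "FriendlyName", 0, winreg.REG_SZ, NEW_NAME)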
Benchmarking
After carefully assigning programs to the eGPU in the Windows graphics settings and the Nvidia Control Panel's 3D settings, I ran the Puget PS benchmark again, this time with the monitor plugged into the eGPU.
But... the Puget PS score actually got worse, dropping to 760, with the GPU rating falling from 85 to 77. This is despite peaks of 70% utilisation on the eGPU.
I suspect this is because Photoshop can only use one GPU, and was therefore throwing everything at the 'more powerful' eGPU, but then any benefit may have been 'lost in translation' - i.e. swallowed by the overhead of shuttling data between the CPU and the eGPU over Thunderbolt.
Quite a disappointment, though, given that this GPU offers 6x the CUDA cores and about 4x the memory bandwidth of the internal Quadro.
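For context, here is a quick back-of-envelope comparison using the headline specs as I understand them (Quadro T2000 mobile: 1024 CUDA cores, roughly 128 GB/s; RTX 3070 Ti: 6144 cores, roughly 608 GB/s) - treat the figures as approximate:

    # Rough spec comparison; the figures are published headline numbers, not my measurements.
    t2000 = {"cuda_cores": 1024, "mem_bandwidth_gb_s": 128}
    rtx_3070_ti = {"cuda_cores": 6144, "mem_bandwidth_gb_s": 608}

    print(f"CUDA cores:       {rtx_3070_ti['cuda_cores'] / t2000['cuda_cores']:.1f}x")                  # ~6x
    print(f"Memory bandwidth: {rtx_3070_ti['mem_bandwidth_gb_s'] / t2000['mem_bandwidth_gb_s']:.1f}x")  # ~4.8x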
Next up, I turned on 'Hardware-accelerated GPU scheduling' in the Windows Graphics settings. At this point, Windows moved about ten apps/processes to the eGPU, mostly background things like Dropbox and Jabra Direct.
This had the subtle effect of making the machine seem subjectively smoother when switching between windows and apps in general use. Or maybe I'd just had a beer by then.
Capture One performance
This is what I was most interested in, as I had read that C1 can make extensive use of as many as 4 GPUs from the same chipset vendor. There's no easy way to benchmark C1 and compare with others, so the performance assessment that follows is subjective, but the GPU utilisation is not.
I use a Tangent Wave Elements Kb analogue controller with C1, and without the eGPU, when adjusting (for example) structure on a 100MP RAW image, it is quite easy to twiddle a knob a bit too fast, wait a fraction of a second for the screen to catch up, and only then realise you have 'over-cooked' it and have to dial it back.
With the eGPU there is less lag between making a small adjustment and the screen catching up, which makes subtle adjustments easier and more intuitive.
While doing this, this is what I see in terms of utilisation:

[Screenshot: approximately 60% utilisation of the eGPU (GPU 2) by Capture One]
Capture One is clearly making good use of both GPUs, which I am very happy about. In this instance the T2000 was dedicated to C1, while the eGPU was running C1 plus some of the apps Windows had moved to it as described earlier (and those account for only a few per cent at most). The CPU is unstressed, so the system seems nicely balanced.
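If you want to watch what each GPU is doing outside of Task Manager while twiddling knobs, nvidia-smi can be polled from a small script. A rough sketch, assuming nvidia-smi is on the PATH (it is installed with the driver, typically in C:\Windows\System32):

    # Sketch only: poll both Nvidia GPUs once a second and print their utilisation.
    import subprocess
    import time

    QUERY = ["nvidia-smi",
             "--query-gpu=index,name,utilization.gpu,memory.used",
             "--format=csv,noheader"]

    while True:
        print(subprocess.run(QUERY, capture_output=True, text=True).stdout.strip())
        time.sleep(1)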
Hardware Review
Not much to say about the Razer Core X that hasn't been said already. Well built, easy to use and looks neat. Large enough (physically and PSU watts) to take the beefiest of GPUs. I got it on offer direct from Razer at £180, so very good value.
The only slight negative is the included Thunderbolt cable, which may be a bit short depending on your desk layout. If you need to place your laptop with its Thunderbolt ports on the opposite side from where you want to put the eGPU, you'll need a new cable. I replaced it with a 1m CalDigit Thunderbolt 4 cable for £30.
The Gigabyte GPU seems to deliver the goods, and in this application, where it is stressed only intermittently, it's quiet, with the single Razer Core X fan providing enough cooling that the GPU's 3 fans only occasionally need to spin up.
The Gigabyte Aorus software allows you to monitor the card, but the associated RGB Fusion 2.0 software just didn't work - I could not get any LEDs on the card to light at all, which seems to be a common problem judging by reports online. I don't really care much; I suppose it would have been nice to see the thing light up and change colour in response to the load on it. In the end I uninstalled all the Gigabyte software.
Value for money? Well, it sits in the middle of the high end of Nvidia's range, it's a GPU, and it's late 2021... so value for money, not so much. That said, if you are patient and shop around, you can find something actually in stock for a price that is 10% less ridiculous than most are asking.
Conclusions
If you pair an eGPU with (say) a Dell XPS 13, you would have a pretty portable/powerful solution that would be difficult to match in a single laptop, even at today's GPU prices.
Although it won't be portable and powerful at the same time ;-)
That might be good value for some use-cases and it will certainly make me think differently when I come to replace the bulky and expensive Dell Precision.
My observations:
- Windows eGPU setup not necessarily as complicated as you may have read
- Capture One exploits multiple GPUs exactly as promised
- For Photoshop, if you already have a good dedicated laptop GPU, an eGPU may not improve things
The camera is not your tool. The light is.
Tim
timtuckerphotography.com