Capture One on Dell XPS 15 - performance question
TL;DR - Can I trust a performance evaluation (based on actually running CO) of an off-the-shelf XPS 15, or are tweaks needed to make it perform to its fullest potential?
I’m considering purchasing a Dell XPS 15 (9560) for use with Capture One, working with 24 MP X-Trans raw files, and am concerned about performance.
I’ve done some testing, which left me a little disappointed, and I’d like to ask the collective whether my results match those of others who use CO with the XPS 15 to process raws.
My test: I went to Costco, installed CO 10.2 from a USB stick, and imported 10 raw files to see how responsive the software was.
On the first run, CO took a few minutes to optimize for the GPU, though it crashed before finishing that task. I launched CO again, at which point the Preferences dialog reported that acceleration was being used. I then imported my raw files (with a preview size of 3840) and moved through the images – sometimes full-screen, sometimes not. I also played around with sliders such as sharpening, shadow recovery, etc.
Overall, the performance was OK, but not stellar. For example, the same operations are noticeably faster on my work-provided late-2013 MBP when using the built-in Retina display (2.3 GHz i7, 16 GB, GeForce GT 750M w/2 GB). (Please, this is not an attempt to start a flame war – I use both and just want the best interactive performance.) Connecting a 4K display to the Mac results in abysmal performance, so clearly the XPS 15 handles 4K much better than a 2013 MBP, but I’m a little surprised that the MBP feels faster working at 2880x1800 than the XPS 15 does at 3840x2160. Yes, I know you can reduce the resolution on the Dell, but that results in a fuzzy screen and I don’t want to go there.
So, my question is: was my test flawed in some way? Do new GPU drivers need to be installed on the Dell before you get full performance? I’m aware of the Dell’s temperature-related throttling issues, and perhaps that is what I was seeing. My other thought is that a 2 GB GPU just doesn’t have the capacity to handle 4K image generation, while it is enough for 2880x1800.
I don’t mind going to 4K on the 15” screen, but I also don’t see a strong need for it, and if it causes this level of performance problem, I will probably go with a different system. Any thoughts?