GPU USELESS For Lightroom Classic!

Started 3 months ago | Discussions thread
OP Batdude Veteran Member • Posts: 4,952
Re: Never mind

Simon Garrett wrote:

Batdude wrote:

Batdude wrote:

I feel like something is wrong. Maybe my PC hasn't been set up properly. I don't know what it is, but I have a gut feeling that something is not right.

I did a really quick test, something I have done hundreds of times with the i7-6700K I had. After exporting all 1,900 JPEG photos to a folder, I tried copying and pasting those 1,900 photos to a different folder (on the same M.2 drive, folder to folder) to start separating stuff for my customers. I actually tried both the M.2 SSD C: drive and my second M.2 SSD that I use for storage.

Found some interesting stuff: it looks like XMP was disabled in the BIOS and the RAM speed was set to 2100 MHz. I enabled it and the 3600 MHz speed kicked in. Big difference, and copying/transferring is much faster now, so that's fantastic.
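As a side note on why XMP matters so much here: DDR4 peak bandwidth scales directly with the transfer rate. A rough back-of-the-envelope sketch (my own illustration, assuming a dual-channel configuration as is typical on B550 boards; the 2100 and 3600 figures are the speeds reported above):

```python
# Rough illustration (assumptions mine, not from the post):
# DDR4 peak bandwidth = transfer rate (MT/s) x 8 bytes/transfer x channels.
# Assumes a dual-channel configuration, as on typical B550 boards.

def ddr4_peak_bandwidth_gbs(mt_per_s, channels=2):
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

before = ddr4_peak_bandwidth_gbs(2100)  # XMP disabled, as reported
after = ddr4_peak_bandwidth_gbs(3600)   # XMP enabled

print(f"2100 MT/s: {before:.1f} GB/s")  # 33.6 GB/s
print(f"3600 MT/s: {after:.1f} GB/s")   # 57.6 GB/s
print(f"ratio: {after / before:.2f}x")  # ~1.71x
```

So on paper the XMP profile gives roughly 1.7x the memory bandwidth, which matches the "big difference" observed.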

For some reason it feels like my i7-6700K was faster. I recall that every time I did this with the Intel, transferring the photos was almost immediate and you wouldn't even see the copy-progress window. With my new Ryzen 3900X system it takes a full five seconds. I feel like something is REALLY wrong here.
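One way to tell whether the drive or something else is the bottleneck is to time a raw folder-to-folder copy outside of Explorer/Lightroom. A minimal sketch (the paths and file sizes below are throwaway placeholders, not the actual 1,900 JPEGs; in practice you would point `src`/`dst` at folders on the M.2 drives):

```python
import os
import shutil
import tempfile
import time

# Sketch: time a folder-to-folder copy and report throughput.
def time_copy(src_dir, dst_dir):
    """Copy every regular file from src_dir to dst_dir; return (bytes, seconds)."""
    t0 = time.perf_counter()
    total = 0
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        if os.path.isfile(src):
            shutil.copy2(src, os.path.join(dst_dir, name))
            total += os.path.getsize(src)
    return total, time.perf_counter() - t0

# Demo with dummy data (stand-in for the real photo folders):
src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()
for i in range(10):
    with open(os.path.join(src, f"img_{i}.jpg"), "wb") as f:
        f.write(os.urandom(1024 * 1024))  # 1 MB dummy "photo"

total, dt = time_copy(src, dst)
print(f"copied {total / 1e6:.0f} MB in {dt:.2f} s ({total / 1e6 / dt:.0f} MB/s)")
```

If the measured MB/s is far below what the M.2 SSD is rated for, the drive or its driver/firmware is suspect; if it is fine, the slowdown is elsewhere.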

The PC shop where I had it built is closed tomorrow, so I'll have to wait until Monday to take it back, but in the meantime can you think of anything that could be wrong? I just cannot believe that this Ryzen CPU is behaving slower than the i7-6700K; every single component is much newer and double the speed. Perhaps I'm not having a "Lightroom" problem but something else instead?

Any Windows/AMD experts out there?

I also went ahead and exported the 1,900 images all over again, and this time it took 11:40 compared to the 15 minutes I was getting before enabling XMP. This is more like it.
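For what it's worth, the improvement from 15:00 to 11:40 works out like this (just arithmetic on the two times quoted above):

```python
# Quick check of the export-time improvement reported above.
before_s = 15 * 60        # 15:00 before enabling XMP
after_s = 11 * 60 + 40    # 11:40 after enabling XMP

reduction = (before_s - after_s) / before_s  # fraction of time saved
speedup = before_s / after_s                 # throughput ratio

print(f"time saved: {reduction:.1%}")  # 22.2%
print(f"speedup:    {speedup:.2f}x")   # 1.29x
```

A 22% cut in export time from one BIOS toggle is substantial, which is why XMP is usually the first thing to check on a new Ryzen build.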

What motherboard are you using?

Hi Simon.

Gigabyte B550 Aorus Pro AC.

FYI I also had a Gigabyte in my previous intel system and had no issues.

I have an Asus board (X570-F Gaming), and apart from turning on XMP (called DOCP on Asus boards), I reduced VDDCR by 0.1 V.

You manually reduced the voltage? What did that do?

Googling, I found bloggers saying that, compared to Intel processors, AMD processors tend to throttle back clock speeds more aggressively to limit temperature and power. Asus (and perhaps other motherboard makers) set VDDCR rather high by default, which reduces the risk of blue screens but increases power consumption (and temperature), and thus means multi-core speed may be reduced as the chip throttles back more. On Asus motherboards, the VDDCR setting allows "Auto", "Manual", or "Offset". If you choose the latter, you then choose "+" or "-" and the offset value (e.g. "-" and 0.1 V). It can probably go lower, but I haven't tried. It's completely stable at an offset of -0.1 V on mine, and multi-core benchmarks are a few percent higher. Marginal, but why not, if it's stable?

To be honest, you're speaking a language here that I'm not familiar with; I've NEVER seen or messed around with this sort of stuff.

You might find other motherboard settings with safe but non-optimal defaults.

I kind of agree with that. It looks to me like this system is really not as "plug and play" as my previous one, and more has to be done to it. Do all or most new or higher-end CPUs and motherboards require you to set all these things manually yourself?

For sure I will take this puppy back to the shop first thing Monday, because I'm sure they know way more about this than me. I think I have a few weeks to decide whether to return the motherboard and CPU, but I don't know whether I'd run into the same situation with a higher-end Intel CPU.
