Need Graphics Card Recommendations for Lightroom CC

reid thaler

I have an Asus P877-vlk motherboard. My current card is a Gigabyte GeForce 8500 GT (GV-NX85T512HP) with 512 MB of video RAM. Will a new card speed things up?

The board supports PCI-Express 3.0 x16. I'd like to stay with NVIDIA, something fanless to keep things quiet. Likely 2 GB of GPU RAM.

The system has an i7 and 16 GB of RAM.

How much difference is there between PCI-Express 3.0 and 2.0 if I'm not a gamer? Would I see a difference? Is 2.0 fine?

Other thoughts and suggestions appreciated. Recommended cards?

--
Thanks!
Reid
www.lumiograph.com
Kodak Brownie
Argus 126
Quaker Oats Container Pinhole Camera
 
Hello Reid:

I use an Nvidia GTX 750 (non-Ti); it works exceptionally well with LR 6.

These cards have a fan but are very quiet (as in, I do not hear it).

I do not know if you will see a speed difference between PCIe 3.0 and 2.0. Why not use the quickest data path possible?

The GTX 750 and 750 Ti are rated for a 300-watt PSU.

Best Regards,

Guido
 
I am in the process of building two new desktop PCs based on the i7-5820K. This week I purchased two ASUS GTX960 DC2 OC STRIX 2GB DDR5 cards.


I know you said you would prefer a fanless card, and this one is a twin-fan unit, but have a look at its cooling method. It has a built-in direct cooling system, and the fans are temperature controlled. They only switch on when the card temperature rises, so unless you load it really heavily the fans do not run and it is silent.

I picked this card as it provides three DisplayPort connections at a great price, and the cooling system finalised my choice. I do not play games, so this card suits my needs well.

Reviews here,



Both of these links should take you to the "Final Thoughts" pages, but both reviews are worth a full read.

I hope this helps.
 
Can I run DDR5 as long as I have PCIe 3.0?

Thanks for the feedback. I'll take a card with a fan if it's quiet. Would I see much difference between Nvidia's 700 and 900 series?

I have a legacy CRT monitor I use occasionally as a 2nd monitor (the main one is an NEC PA271W).

I'd like a VGA port, but I guess I could use my old card. Just wondering if that'll be too much for my Seasonic 460 W power supply.

Thanks!
 
I have an Asus P877-vlk motherboard. My current card is a Gigabyte GeForce 8500 GT (GV-NX85T512HP) with 512 MB of video RAM. Will a new card speed things up?

The board supports PCI-Express 3.0 x16. I'd like to stay with NVIDIA, something fanless to keep things quiet. Likely 2 GB of GPU RAM.

The system has an i7 and 16 GB of RAM.

How much difference is there between PCI-Express 3.0 and 2.0 if I'm not a gamer? Would I see a difference? Is 2.0 fine?

Other thoughts and suggestions appreciated. Recommended cards?
Don't worry about the difference between PCIe 2.0 and 3.0 for your purposes. If you were running something like a GTX 980 (a card selling for over $800 discounted), then it might be a little faster with a PCIe 3.0 slot, as it can handle very high bandwidths that way.

But, for the types of cards in your apparent price range, it really doesn't make any difference, as PCIe 2.0 x16 is plenty fast enough to support the bandwidth needed for most cards.
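To put rough numbers on that, here's a quick back-of-the-envelope calculation in Python. The per-lane transfer rates and encoding overheads come straight from the PCIe 2.0 and 3.0 specs; this is just a sketch of the theoretical ceilings, not anything a photo app would ever saturate:

# Theoretical one-direction bandwidth of a x16 slot.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficient).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).
lanes = 16
pcie2_gbs = lanes * 5e9 * (8 / 10) / 8 / 1e9     # bits/s -> GB/s
pcie3_gbs = lanes * 8e9 * (128 / 130) / 8 / 1e9
print(f"PCIe 2.0 x16: {pcie2_gbs:.1f} GB/s")     # ~8.0 GB/s
print(f"PCIe 3.0 x16: {pcie3_gbs:.2f} GB/s")     # ~15.75 GB/s

So 3.0 roughly doubles the ceiling, but a mid-range card doing Lightroom work comes nowhere near 8 GB/s of sustained transfers in the first place.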

Of course, all newer cards are going to be PCIe 3.0 anyway, and you'll want to get a newer card if spending money on one, so you get one that supports the latest DirectX, OpenGL and OpenCL standards.

Until we see benchmarks of LR CC using different card models doing different tasks, there is no way to know how much difference one card will make versus another.

With Photoshop CC, what users found was that the newer Smart Sharpen filter was significantly faster with some of the newer card models compared to older models within the Nvidia lineup.

For example, the GTX 750 Ti versus the older GTX 650 Ti, as reported in this thread by Thomas Niemann:

http://www.dpreview.com/forums/post/53284238

The newer GTX 750, GTX 750 Ti, GTX 960, GTX 970 and GTX 980 models are the only cards in the Nvidia lineup based on the newer Maxwell architecture, which offers better OpenCL performance than the earlier generation cards they replaced.

Those cards also have the best performance per watt of any cards made (as Nvidia went to great lengths to reduce power consumption and heat generated with the new Maxwell design).

Will more features take advantage of a faster card in Lightroom CC? Again, until someone benchmarks different features using different card models, there is no way to know.

But, I'd probably buy a faster card for "future proofing" if budget permits.

On the lower end of the scale, I'd probably go with this Asus GTX 750 (model# GTX750-DCSL-2GD5) if you really want passive cooling in a card.

http://www.amazon.com/Asus-GTX750-DCSL-2GD5-ASUS-Graphics-Cards/dp/B00RL2SKXY

But, even the GTX 750 and GTX 750 Ti cards with typical fans have a reputation for being very quiet. This Asus "STRIX" GTX 750 Ti is probably the quietest you'll find:

http://www.amazon.com/Asus-STRIX-GTX750TI-OC-2GD5-ASUS-Graphics-Cards/dp/B00M9ZZ1Z8

Basically, its fan won't even need to run unless under very heavy load, and even then it's likely to be very quiet, from reviews I've seen.

Or, if budget permits and you want something a lot faster, look at something like this GTX 960 STRIX model instead:

http://www.amazon.com/Asus-STRIX-GTX960-DC2OC-2GD5-ASUS-Graphics-Cards/dp/B00S9SGMZM

The GTX 960 tests more than twice as fast on some of the OpenCL benchmarks compared to the GTX 750 Ti. See the Luxmark tests on this page for one example of that:

http://benchmarkreviews.com/24899/asus-geforce-gtx-960-strix-video-card-review/12/

Note how the GTX 750 Ti was more than 3 times as fast on that Luxmark test compared to the previous generation GTX 650 Ti?

The better OpenCL performance we're seeing with the newer Maxwell generation cards is probably why I've seen users report much better processing times using the Smart Sharpen Filter in Photoshop CC when comparing times between a GTX 650 Ti and 750 Ti (since Adobe is using OpenCL for GPU accelerated features now).
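If you're curious what OpenCL devices an app like Photoshop CC can actually see on your system, something like this will list them (a minimal sketch; it assumes the pyopencl Python package and a vendor OpenCL driver are installed):

import pyopencl as cl  # pip install pyopencl

# Enumerate every OpenCL platform/device the installed drivers expose.
for platform in cl.get_platforms():
    for dev in platform.get_devices():
        print(dev.name)
        print("  OpenCL version:", dev.version)
        print("  Compute units: ", dev.max_compute_units)
        print("  Global memory:  %.1f GB" % (dev.global_mem_size / 1024**3))

On a Maxwell card you'd see the GPU listed with its OpenCL version and memory; a card too old for current drivers simply won't show up at all.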

But, the Smart Sharpen Filter was the only filter in Photoshop CC that would take advantage of a card as fast as the 750 Ti. Other GPU accelerated features really didn't need a card as fast.

Lightroom CC?

Your guess is as good as mine, as I haven't seen anyone benchmarking different card models with any of its features yet (and I don't know which features would benefit from a fast card, either).

In the past, we've tended to see rapidly diminishing returns once you get to a certain speed card (where you could buy a card twice as fast and see no further improvement).

Some of that started to change with filters like Smart Sharpen in PS CC though (where users did notice a bigger difference with a faster card).

Of course, as time passes, I'd expect to see more and more software that can take better advantage of a faster GPU.

We're also seeing higher and higher resolution displays, and trying to push 4K displays at faster frame rates is going to need a good card too.

Anyway, given your system, if budget permitted, I might consider a GTX 960 for it, as it's the newest card in the Nvidia lineup. It also has an HDMI 2.0 port (whereas the GTX 750 and GTX 750 Ti models only support HDMI 1.4).

That means you could drive a 4K display at 60 frames per second via the HDMI port in the GTX 960 (something you can't do via the HDMI port from cards like the GTX 750 or GTX 750 Ti).
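The arithmetic behind that limit is simple enough to check. The standard 4K timings use a 297 MHz pixel clock at 30 Hz and 594 MHz at 60 Hz, TMDS adds a 10/8 encoding overhead, and HDMI 1.4 tops out at 10.2 Gbit/s while HDMI 2.0 allows 18 Gbit/s (those figures come from the HDMI specs; the sketch just does the multiplication):

# 4K (3840x2160) 8-bit RGB over HDMI, including the TMDS 8b/10b overhead.
for hz, pixel_clock_mhz in ((30, 297), (60, 594)):
    gbps = pixel_clock_mhz * 1e6 * 24 * 10 / 8 / 1e9
    print(f"4K @ {hz} Hz needs {gbps:.2f} Gbit/s on the link")
# 4K @ 30 Hz needs  8.91 Gbit/s -> fits HDMI 1.4 (10.2 Gbit/s max)
# 4K @ 60 Hz needs 17.82 Gbit/s -> needs HDMI 2.0 (18 Gbit/s max)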

The GTX 960 is also the *only* card made now that has both HDMI 2.0 and H.265 (4K standard) decoding, where the decoding of H.265 video is performed in hardware. Of course, that's not as big of a concern with a faster CPU. But it's interesting that Nvidia included that feature with their new GTX 960 model (other cards don't have H.265 decoding).

Of course, we're only seeing a handful of computer displays with HDMI 2.0 inputs so far. That feature is more common on some of the brand new 40" and larger TV sets.

Also, with a 4K computer display, you could still drive it at faster frame rates via a DisplayPort (you wouldn't need to use HDMI if both the card and display had a DisplayPort).

But, as time passes, I expect we'll see more and more monitors, TVs, etc. adopting the HDMI 2.0 standard with higher 4K resolutions.

So, from a future proofing stand point, if budget permits, I think I might consider going with a GTX 960 instead of a GTX 750 or 750 Ti.

That way, you'd be getting a card that tests more than twice as fast as a GTX 750 Ti, plus one that supports the newer HDMI 2.0 standard, for a price that's not twice as high. I see the Asus GTX 960 STRIX model (their quieter design, where the fans don't even run unless under heavy load, and are very quiet when they do, since higher RPMs are not needed to cool a GTX 960) listed for $188.99 on Amazon right now, which is only $29 more than the Asus GTX 750 Ti STRIX model.

Asus GTX 750 Ti Strix ($159.99 right this minute)

http://www.amazon.com/Asus-STRIX-GTX750TI-OC-2GD5-ASUS-Graphics-Cards/dp/B00M9ZZ1Z8

Asus GTX 960 Strix ($188.99 right this minute)

http://www.amazon.com/Asus-STRIX-GTX960-DC2OC-2GD5-ASUS-Graphics-Cards/dp/B00S9SGMZM

As for your Seasonic 460 W PSU, most of the models I've seen from them have +12V rails rated at something like 38 amps, and they include dual 6+2 pin PCIe connectors, too.

That GTX 960 only needs a single 6-pin PCIe connector to work: it draws 75 Watts from the PCIe slot and the rest from a single 75 Watt 6-pin PCIe connector, not the dual 6-pin or 8-pin connectors that higher power draw cards need. The latest Maxwell design is very power efficient compared to any other cards on the market.

Even under "torture" tests, the most I've seen it reported to draw was 144 watts.. But, you'd be hard pressed to get it pulling that much, as they were probably overclocking it significantly for that torture test.

See it in a chart on this page:

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-960,4038-8.html

In fact, Nvidia specs only show a 120 Watt TDP for it, with a 400 Watt PSU recommended. See their specs for it:

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications

Chances are, your entire system fully loaded (including that card being stressed with benchmark tests) is only going to pull around 200 Watts total. The newer Core i7 4790K and similar CPUs have a TDP of 84 Watts under full load, the GTX 960 has a rated TDP of 120 Watts (and I doubt you'd get it much over 100 Watts without using Asus software to overclock it significantly), and the motherboard, fans and drives use very little more.

IOW, your PSU shouldn't get much over 50% load with your entire system being stress tested, including that card, your CPU, motherboard, drives and fans. For that matter, I suspect you'd have to try really, really hard to get your entire system pulling much over 200 watts total (it's hard to get a CPU and GPU at 100% load at the same time).
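If you want to sanity-check that yourself, the arithmetic is just adding up worst-case TDPs. The CPU and GPU figures below are the published TDPs I mentioned; the 40 W for motherboard, drives and fans is my own rough guess:

# Rough worst-case system draw vs. PSU capacity.
cpu_tdp = 84      # W, e.g. Core i7 4790K at full load
gpu_tdp = 120     # W, GTX 960 rated TDP
misc = 40         # W, motherboard + fans + drives (rough estimate)
psu = 460         # W, Seasonic 460

total = cpu_tdp + gpu_tdp + misc
print(f"Worst case: {total} W = {100 * total / psu:.0f}% of the PSU")
# Worst case: 244 W = 53% of the PSU

And even that 244 W assumes the CPU and GPU are both pegged at 100% at the same moment, which real workloads almost never manage.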

Anyway, there is no way to know if Lightroom CC will benefit much from a faster card until we start seeing some benchmarks of its features from users testing it with different video card models.

So, there is no guarantee that a faster card is going to help anything. But, if it were my money and I were buying a new card, I'd buy the GTX 750 with 2GB of memory at a minimum.

But, I'd go ahead and make the jump to the GTX 960 (Nvidia's newest video card model, with features like HDMI 2.0 and H.265 decoding that are not available in the GTX 750 or GTX 750 Ti) if budget permitted. That would give me a card that tests more than twice as fast as a GTX 750 Ti for very little price difference, and I expect to see more and more software taking advantage of a faster video card as time passes.

BTW, there is a new 4GB model of the Asus GTX 960 STRIX coming out very soon. It was announced last month and was supposed to be available this month:


But, I don't see it listed on popular vendor sites yet.

Of course, most of what we do now doesn't need more than 2GB (with even 1GB working fine for most purposes). But, in the future, I suspect we'll see more and more apps that are actually running on the GPU versus the CPU. So, more memory wouldn't hurt anything for future proofing.

---
JimC
------
 
Jim,

Thanks for the very, very extensive and detailed response. The 960 looks like the way to go. A couple of further questions. I wonder how much of a performance difference I'll see, as I can't say Lightroom lags significantly. My system is slightly overclocked via Asus software, maxing out around 4100 MHz. So how much would I be gaining with a new graphics card? Is it really worth it? I can't imagine it will be comparable to switching from an HDD to an SSD.

Can I keep my current graphics card installed so I can run my legacy CRT from its VGA port? Is the system still taxed if the 2nd monitor is usually turned off?


Thanks!
Reid
www.lumiograph.com
Kodak Brownie
Argus 126
Quaker Oats Container Pinhole Camera
 
Hi Reid:

Now, with all of these opinions, do some research on the cards to validate them before you make a final decision.

Have Fun

Best Regards,

Guido
 
Hi Reid:

Now, with all of these opinions, do some research on the cards to validate them before you make a final decision.

Have Fun

Best Regards,

Guido
Looks like the 960 would be the best bet, but I'm still wondering how much difference it's going to make working in Lightroom.
--
Thanks!

Reid
www.lumiograph.com

Kodak Brownie
Argus 126
Quaker Oats Container Pinhole Camera
 
[snip]
Will more features take advantage of a faster card in Lightroom CC? Again, until someone benchmarks different features using different card models, there is no way to know.

But, I'd probably buy a faster card for "future proofing" if budget permits.
I found some articles and videos showing performance differences with and without GPU acceleration enabled, and from what I'm gathering, you probably won't see much or any difference between what you have now, and a faster card.

What type of displays are you using? The biggest difference you might see is with a higher resolution display.

Adobe recommends 1GB of vRAM for Lightroom 6/CC, with 2GB recommended for higher resolution 4K displays.

So, with only 512MB on your 8500 GT, you're below the recommended requirements for a video card, even without a higher resolution display.
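For a sense of scale, the raw framebuffer itself is tiny; it's Lightroom's working textures and preview tiles that eat video memory, which is presumably why Adobe's recommendation is so much higher than the bare framebuffer math suggests. A quick sketch, nothing more:

# Memory for one 32-bit (4 bytes/pixel) framebuffer at common resolutions.
for name, w, h in (("1920x1080", 1920, 1080),
                   ("2560x1440 (NEC PA271W)", 2560, 1440),
                   ("3840x2160 (4K)", 3840, 2160)):
    mb = w * h * 4 / 1024**2
    print(f"{name}: {mb:.0f} MB per buffer")
# ~8 MB, ~14 MB and ~32 MB respectively -- the working textures are
# what push the recommendation up to 1-2 GB.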

If GPU acceleration is working, it's probably because your 8500 GT "squeaked" by, since it supports OpenGL 3.3, DirectX 10 and Shader Model 4.0 with the latest driver, from what I can find.

IOW, it looks like LR 6 and CC will not use a card that doesn't have OpenGL 3.3 support. But if GPU acceleration shows up as enabled with your card, then they may not be enforcing the 1GB vRAM minimum (it appears to be a suggested minimum memory size, not a mandatory requirement).

Basically, only the Develop module uses any kind of GPU acceleration, and that appears to be limited to rendering times when adjusting sliders and using brushes; it uses more optimized OpenGL calls for the rendering, rather than any kind of OpenCL code like Photoshop CC uses for some of its filters.

IOW, Photoshop is using both OpenCL and OpenGL accelerated features, where you can see *much* faster speeds with a card with better OpenCL performance using the latest Smart Sharpen filter.

But, Lightroom is only making use of OpenGL (at least in the current LR6/CC release that was just launched).

Here's someone using a Core i7 5960X CPU (8 cores supporting 16 threads), overclocked to 4.1 GHz from what I can tell by the charts included. Note the comments about where he's seeing any differences, immediately following the performance charts included in press material by Adobe:

http://www.tested.com/tech/photography/522430-living-photography-adobe-lightroom-6-review/

He was shocked at how much better things like a brush he likes worked with GPU acceleration. But he wasn't really impressed with much else (of course, he does have a *very* fast system).

Here's someone showing the main differences you'll see with the new GPU accelerated features in LR 6/CC on video:


The comments I've seen indicate zero difference using something like Intel Iris Graphics with a Macbook Pro, compared to using a card that's many times as fast.

But, even HD 4600 integrated graphics, like you'll find with the latest Core i5 4xxx and Core i7 4xxx CPUs, tests faster than your Nvidia 8500 GT, and the Iris Pro is even faster.

If you're using a 3rd generation CPU (Core i5 3xxx or Core i7 3xxx model) with HD 4000 integrated graphics, then your 8500 GT is needed (as only HD 4400 or later Intel video chipsets are supported for GPU accelerated features in LR 6/CC).

But, if you're using a 4th generation CPU with HD 4600 graphics, then I'd yank your video card and use the integrated graphics instead (HD 4600 integrated graphics tests faster than the 8500 GT), since it looks like your motherboard has HDMI, DVI and DisplayPort outputs.

Anyway, it looks like for this release of Lightroom, the ability to leverage a GPU is very minimal.

So, unless you're using other software that takes better advantage of a GPU, then you may want to hold off before spending any money on a new graphics card for Lightroom only purposes.

Of course, as time passes, we'll see more and more software taking better advantage of a faster GPU.

Lightroom 6/CC is just not doing that yet (except for the develop module, where even a lower end card with OpenGL 3.3 support is probably "good enough").

Now, if you're also using Photoshop CC, the latest Smart Sharpen filter takes advantage of a faster card. Ditto for many third party plugins and filters. But, for Lightroom only use, you may want to hold off a while and see if Adobe starts to take better advantage of a GPU later in future updates.

--
JimC
------
 
Jim,

I'm still feeling a bit confused.

Thanks again for your posts. I looked at the YouTube video, and it looks like the benefit of the graphics card is pretty minimal, especially if all it's going to give me is some improvement on the Develop sliders and filter tools.

I'm fine with spending good money on a graphics card, even for future proofing, but I'm wondering if I'm going to see much benefit right now. I'm also wondering if a used graphics card from eBay would give me the same bang I can get now, at a fraction of the cost of the 960.

I feel like I'm in the dark a little bit, as I don't have a good sense as to how much improvement I would see over my GeForce 8500 card, or anything else.
 
I'm fine with spending good money on a graphics card, even for future proofing, but I'm wondering if I'm going to see much benefit right now. I'm also wondering if a used graphics card from eBay would give me the same bang I can get now, at a fraction of the cost of the 960.
Same "bang"?

For gaming perhaps. But, not for GPU acceleration using OpenCL.

Older generation Nvidia cards are *terrible* in areas like OpenCL performance compared to the newer Maxwell generation cards. That's probably why I saw one member upgrade from a GTX 650 Ti to a Maxwell based GTX 750 Ti, just to get much better performance with tasks like the Smart Sharpen filter in Photoshop CC.

Nvidia OpenCL performance has always lagged *way* behind AMD in the past. But, beginning with the Maxwell based cards (GTX 750, 750 Ti, 960, 970, 980), Nvidia has focused on improving OpenCL performance (dramatically in comparison to older generations), probably in anticipation of more new applications that will take advantage of it.

Heck, in one OpenCL benchmark from a review article, an Nvidia GTX 750 Ti had two thirds the performance of a Kepler based Nvidia Titan card (costing a couple of grand). As shown in a different OpenCL benchmark I linked to in my previous post to this thread, a GTX 750 Ti also tested more than 3 times as fast as a last generation GTX 650 Ti.
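For anyone wondering what "OpenCL performance" actually measures: benchmarks like Luxmark time kernels, little programs that run in parallel on the GPU, over large arrays. Here's a minimal example of the pattern (assumes the pyopencl and numpy packages are installed; real benchmarks run far heavier kernels than this vector add):

import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # pick an available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel runs once per element, in parallel, on the GPU.
prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)

A Maxwell card chews through kernels like that far faster per watt than Kepler did, which is exactly what shows up in those benchmark charts.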

If you're going to spend money on another card for future proofing, buy a new model that supports the latest standards (OpenGL, OpenCL, DirectX, 4K video encoding, etc.), with dramatically better OpenCL performance, not an older generation card.

To sound like a broken record: in the Nvidia lineup, the newer Maxwell generation card models (GTX 750, GTX 750 Ti, GTX 960, 970, 980) are more than 3 times as fast for OpenCL use as the last generation Kepler based cards they replaced, from what I've seen in benchmarks. Note that only the GTX 750 and 750 Ti in the Nvidia 7xx series are Maxwell (the other 7xx series cards are last generation Kepler based cards).

So, IMO, it would be a really bad idea to buy something like a GTX 650 Ti or similar Kepler based card on eBay just to save a few bucks, when a newer Maxwell based GTX 750 Ti is several times as fast for OpenCL use, draws a lot less power, is quieter, and supports newer standards (including H.265 4K encoding, if you ever move into video editing later).

The GTX 960 even supports HDMI 2.0 and H.265 decoding (not just encoding like the other Maxwell based cards support).

So, personally, if I were buying a new card, I'd go with a GTX 750 with 2GB of memory at a minimum, and go with a GTX 960 if budget permits, as discussed in my earlier post here:

http://www.dpreview.com/forums/post/55701832

Just because Lightroom 6/CC doesn't make much use of the GPU yet doesn't mean other apps won't in the near future.

Heck, even the Smart Sharpen filter in Photoshop CC takes advantage of a faster card. See the test results I linked to in that post from someone here who upgraded from a GTX 650 Ti to a 750 Ti and posted benchmarks using that Photoshop filter with both cards.

Some third party filters can also take advantage of a faster GPU now, as can a number of other image editing products (for example, I'm using Corel AfterShot Pro and it uses OpenCL for GPU Acceleration, as do a number of other apps like DxO Optics Pro and more). Many video editors can make use of either OpenCL or CUDA acceleration, too.

Adobe just didn't do much in the way of GPU accelerated features with Lightroom CC/6 (yet anyway).

But, because the latest Maxwell generation cards from Nvidia are dramatically faster than the earlier generation Nvidia cards for OpenCL tasks, I would not spend any money on an older generation card, since I'd expect to see more and more apps taking advantage of faster cards with better OpenCL performance as time passes.

Lightroom 6/CC is just not one of them (it doesn't even use OpenCL from what I can tell, and only uses OpenGL to speed up rendering times in the Develop module when using the available sliders and brushes).

In any event, I doubt you'd see any difference with Lightroom by using a faster card than what you already have unless you buy a newer 4K display (and of course, you'd want to make sure any card you buy supports a 4K display at 60Hz vs 30Hz, meaning only newer card models).

I would certainly not spend any money on an older generation card for "future proofing" (the GTX 6xx series, and the GTX 7xx cards other than the 750/750 Ti, are Kepler rather than Maxwell and have very poor OpenCL performance in comparison), given how much better the latest Maxwell cards are at GPU accelerated features using OpenCL, with support for newer features like H.265 4K video encoding that you do not get with the previous generation cards.

--
JimC
------
 
In any event, I doubt you'd see any difference with Lightroom by using a faster card than what you already have unless you buy a newer 4K display (and of course, you'd want to make sure any card you buy supports a 4K display at 60Hz vs 30Hz, meaning only newer card models).

JimC
------
So it sounds like there would be no benefit from a new card in Lightroom, and since I'm not doing much in Photoshop, is there any reason or benefit to buying a new card? Everything will be faster and cheaper tomorrow.
 
In any event, I doubt you'd see any difference with Lightroom by using a faster card than what you already have unless you buy a newer 4K display (and of course, you'd want to make sure any card you buy supports a 4K display at 60Hz vs 30Hz, meaning only newer card models).

JimC
------
So it sounds like there would be no benefit from a new card in Lightroom, and since I'm not doing much in Photoshop, is there any reason or benefit to buying a new card? Everything will be faster and cheaper tomorrow.
A quote of the last sentence in an earlier post to this thread:

"But, for Lightroom only use, you may want to hold off a while and see if Adobe starts to take better advantage of a GPU later in future updates."


IOW, if you don't need a new card for anything else you're doing, then you may want to wait until you do need one.

As you put it, "Everything will be faster and cheaper tomorrow".

So, when you are doing something that would benefit from a new card (and/or Adobe updates Lightroom to take better advantage of a GPU in future updates), see what's available then (as you may find a card with faster performance, newer features, etc. for the same or less money as a card would cost you right now).

---
JimC
------
 
Jim and others,

Since it's been about a year since Lightroom introduced GPU support, I thought it would be a good time to revisit the issue. In the last year I upgraded my DSLR, so I now have quite a few more pixels to push around, and I thought I'd see what the current thinking and experience is regarding Lightroom and upgrading my video card, and whether that will make any difference.

If you look at the beginning of this thread, you'll see my current GPU. In recent articles, I've read that improving the GPU will only make a difference if I'm pushing a 4K monitor.

Just thought I'd check with you to see your current thinking, experience, and recommendations. After all these years, I've just kind of accepted that Lightroom runs slowly, but I don't want to give up on it entirely.
 
