Does Adobe Lightroom (Not) Read The GPU??

Started 7 months ago | Questions thread
Batdude • Veteran Member • Posts: 6,999

I have that dumb question above. Why am I asking? Well, I was just looking at one of Puget Systems' test bench charts. It covers somewhat older GPUs, and all the way at the right are the RTX 3080 and 3090, the most expensive, highest-end GPUs on the list. Those cards have tons more CUDA cores and Tensor cores and this and that.

So my dumb question is: is Adobe Lightroom NOT able to make full use of the GPU it's given? Why is Denoise so damn slow? Yes, a Denoise test isn't on that list, but the idea is pretty much the same, because if you look at the chart, almost all of those GPUs, the older ones and the newer, much more expensive ones, finish each task in pretty much the same number of seconds. There is hardly any difference, and my gut feeling is that if they ran Denoise, all those GPUs would perform almost the same as well.
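
For what it's worth, one way to sanity-check this yourself is to watch GPU utilization while Denoise is running. Below is a minimal sketch (my own, not from Puget's article) that polls nvidia-smi once a second; it assumes an NVIDIA card, Python installed, and nvidia-smi on the PATH. If the card sits near idle the whole time Denoise runs, the bottleneck is somewhere other than the GPU.

```python
# Minimal sketch: poll nvidia-smi while Lightroom's Denoise runs,
# to see whether the GPU is actually being exercised.
# Assumes an NVIDIA card and nvidia-smi available on the PATH.
import subprocess
import time

def sample_gpu_utilization(duration_s=30.0, interval_s=1.0):
    """Record (GPU utilization %, memory used MiB) once per interval."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        # First line is GPU 0; values look like "87, 5432"
        util, mem = (int(x) for x in out.splitlines()[0].split(", "))
        samples.append((util, mem))
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    # Kick off Denoise in Lightroom first, then run this script.
    data = sample_gpu_utilization(duration_s=30)
    peak = max(u for u, _ in data)
    avg = sum(u for u, _ in data) / len(data)
    print(f"peak GPU utilization: {peak}%  average: {avg:.1f}%")
```

If the average stays low while peak barely spikes, the GPU isn't being fed work; if utilization is pegged near 100%, then the card really is the limit and a faster one should help.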

So, when someone tells me that I "need" Tensor cores and this and that, this Puget Systems bench test is showing me that isn't true.

https://www.pugetsystems.com/labs/articles/Adobe-Lightroom-Classic---NVIDIA-GeForce-RTX-3080-3090-Performance-1893/

Batdude's gear list: Fujifilm X10, Nikon D4, Fujifilm X-E1, Fujifilm X-T1, Fujifilm GFX 50S +12 more
ANSWER:
This question has not been answered yet.