Are You Going For AMD Or Nvidia?

Batdude

Veteran Member · Messages: 7,274 · Solutions: 9 · Reaction score: 5,267 · Location: US
Which GPU do you see yourself going for within the next few months and why?



I see that AMD continues to improve its GPUs at a lower price, and it also keeps innovating and improving its software. Please correct me if I’m wrong.

When it comes down to mostly photography applications, where do you see the best improvement/performance boost per dollar between these two competitors? I also hear that Intel might be jumping into the GPU business?



What is your expert opinion on this?
 
The 4070 is indeed very good. But I might actually end up with AMD, maybe the new 7800 XT. I don't like Nvidia's price-to-performance at all compared to AMD. AMD will also provide AI capability. I think :-)
What specifically is "AI capability"? Both vendors claim it for their new generation, without describing what it is. My fuzzy impression is that AI is heuristic rules determined by huge amounts of very basic correlation calculations on large data sets -- so rather simple software, but replicated on a massive scale.

Everything seems to claim AI "seasoning" today, and consumers have no idea what is meant.
NVIDIA dedicates a chunk of the chip to AI: Tensor cores that are totally separate from the traditional GPU shader cores. You can get great performance for dedicated workflows this way, just at the cost of a larger chip, or fewer rendering cores or other parts of the chip.

I forget offhand exactly how the AMD solution is built in, though they have done some things for AI.

NVIDIA's advantage is that they're on their 4th generation of AI cores and have been making big leaps each time.
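If you want to see which generation you actually have, here's a minimal Python sketch (assuming a CUDA-enabled PyTorch build is installed; AMD cards would need the ROCm build instead) that reports the card name and compute capability -- on NVIDIA, 8.9 is the Ada/RTX 40 generation with the 4th-gen Tensor cores:

```python
# Minimal sketch: report the CUDA GPU and its compute capability.
# Assumes a CUDA-enabled PyTorch build; AMD cards would need the ROCm build instead.
import torch

if torch.cuda.is_available():
    idx = torch.cuda.current_device()
    name = torch.cuda.get_device_name(idx)
    major, minor = torch.cuda.get_device_capability(idx)
    print(f"GPU: {name}, compute capability {major}.{minor}")
    # Rough mapping for NVIDIA consumer cards:
    #   7.5 = Turing (RTX 20), 8.6 = Ampere (RTX 30), 8.9 = Ada (RTX 40)
else:
    print("No CUDA device visible to PyTorch.")
```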
To this day I have not seen anything "impressive" from Nvidia, at least with the PP software I'm using. They're totally okay, but nothing mind-blowing for the price they are asking for their GPUs.
 
It's going to vary by workload and app. If everything you use works fine with AMD, it's a great option.

But, for example, with video apps AMD's media encoder isn't as good as NVIDIA's. Even Intel has an advantage there, though compatibility is an even bigger issue with their GPUs than with AMD's.
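To make the encoder point concrete, here's a rough sketch (Python calling ffmpeg; the input/output file names are made-up placeholders, and your ffmpeg build has to include the encoder you pick) that runs the same export through each vendor's H.264 hardware encoder -- h264_nvenc for NVIDIA, h264_amf for AMD, h264_qsv for Intel Quick Sync:

```python
# Sketch: run the same H.264 export through a vendor's hardware encoder via ffmpeg.
# "input.mov" / output names are placeholder paths; ffmpeg must be built with the
# chosen encoder (h264_nvenc = NVIDIA, h264_amf = AMD, h264_qsv = Intel Quick Sync).
import subprocess

def hw_encode(src: str, dst: str, encoder: str = "h264_nvenc") -> None:
    cmd = [
        "ffmpeg", "-y",          # overwrite output without prompting
        "-i", src,               # source clip
        "-c:v", encoder,         # hardware video encoder to test
        "-b:v", "20M",           # fixed bitrate so runs are comparable
        "-c:a", "copy",          # leave audio untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

hw_encode("input.mov", "out_nvenc.mp4", "h264_nvenc")
# hw_encode("input.mov", "out_amf.mp4", "h264_amf")    # AMD
# hw_encode("input.mov", "out_qsv.mp4", "h264_qsv")    # Intel
```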
 
Everything seems to claim AI "seasoning" today, and consumers have no idea what is meant.
Exactly. To be honest, I don't even know either. Maybe it's just a marketing thing. The software (Adobe Lightroom) doesn't do much with even the most expensive GPUs. So yeah, I'm not going to spend more than $500 on a GPU, and I'm just waiting for the right moment, for the right product at the right price. I am in no rush.
I would wait for Intel's new models. I am guessing less than 500 bucks for the top model. The A750 right now is a great deal; if you have a 12th or 13th gen Intel chip, it's even better.
 
That's what I'm doing: waiting. I would not mind buying a different brand of GPU as long as it works with Adobe Lightroom to boost speed, but if Adobe continues not taking full advantage of any of these, then it will be pointless and I might as well just buy something like a used $300 GPU.

The other option is to simply dump Adobe Lightroom, switch to a much better PP application, and learn how to use it from scratch. That would not be easy for me to do, but I am getting tired of Adobe's slow performance compared to other software.
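One quick way to see whether Lightroom (or anything else) is actually touching the card is to poll utilization while an export or Denoise job runs. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH (AMD users would use their vendor's tool instead):

```python
# Sketch: poll GPU utilization once a second while an export/Denoise job runs.
# Assumes an NVIDIA card and nvidia-smi on the PATH; prints near-zero numbers if
# the application is doing the work on the CPU instead.
import subprocess
import time

def gpu_utilization() -> str:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

for _ in range(30):                 # watch for ~30 seconds
    print(gpu_utilization())        # e.g. "87 %, 4123 MiB"
    time.sleep(1)
```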
--
Fronterra Photography Tours
One Lens, No Problem
The Point and Shoot Pro
The People of the Red and White stand with the people of the yellow and blue!
 
The best thing I have done was ditch Adobe. I use alternatives that are all pay-once-and-done, with no subscription models.
 
Yeah, I am still considering doing that as well. I feel that the software I'm using is not fully utilizing the hardware I have, including my CPU.

I do have a stupid question that I just thought about: does Puget Systems have benchmark test results for other post-processing software, such as Capture One?
I am not sure; that's something a quick visit to their website would confirm. I use the Affinity suite, Luminar Neo, Polarr, Resolve, and VideoProc Vlogger for my software. It's all fast and snappy on both my systems.

 
I have never used that software; I will have to look into it, thanks.
 
Imaging software will continue to make greater use of GPU power in the future, and so will other AI applications. And fast slider response makes PP more enjoyable and efficient.

The RTX 40-series is a significantly new processor architecture compared to the 30-series, so it is likely to be supported for a while. It also has much better performance per watt.

The RTX 4070 seems like the sweet spot for me. Still in short supply at $600, but I can wait a few months...
The 4070 is indeed very good. But I might actually end up with AMD, maybe the new 7800 XT. I don't like Nvidia's price-to-performance at all compared to AMD.
Batman, I looked at your profile and saw over $10K of current photo gear, and as much again that you previously owned. Why do you care about a difference of $200 to get the optimal graphics card? Maybe Nvidia charges a bit more because there is much greater demand for their cards from knowledgeable customers.
 
I care about $200 because I don’t like to get ripped off, and I like options 😃
 
The Nvidia cards are priced that way because crazed gamers want every FPS they can get; they think having 130 fps will make them play better than 128 fps would. Simple. The top-level cards are really NOT needed for photo/video production for most people. My system right now is on par with M2 Pro-level performance for video/photo production, and it's a three-year-old machine.

Having a 4090 or two in the box for HUGE projects is great, but lower-end cards will chew through most work without issue as well.
 
I swapped from an RX 580 to an RTX 4060. Adobe's Denoise feature being written for Nvidia's Tensor cores was the deciding factor.
 
