Mac not recommended for new Topaz AI models

bakubo

Senior Member
There is a big thread on the Topaz website right now that is mostly about the new Topaz licensing model, but embedded in it is stuff about how Apple Macs will not be supported well from now on because of limitations in the Neural Engine and GPU. This likely explains the problem Adobe has using the Neural Engine for denoise also.

https://www.dpreview.com/forums/post/68435541

[Screenshots of the Topaz forum posts attached]
 

Interesting observations and I have no way of refuting them. Future local AI models may require more computer power, and undoubtedly Apple would rise to the challenge to stay relevant. Based on current setups and requirements, the M4 architecture is more than adequate.
 
Things could move right past those of us who use Macs. One more instance of Apple not paying attention to AI. I really hope that they are on this.
 
That's not encouraging. It's not that it didn't work. When Adobe enabled it, my MacBook Air M1 speed improved by 75%. Adobe said they were not happy with the deep shadow performance. Some people on the Adobe forums said they never really noticed an issue and were hoping Adobe could make it a feature users could enable/disable. LrC 15 is around the corner. Well, we'll see.
 
Interesting observations and I have no way of refuting them. Future local AI models may require more computer power, and undoubtedly Apple would rise to the challenge to stay relevant. Based on current setups and requirements, the M4 architecture is more than adequate.
From what I gather, the M4 may be more than adequate until 9/15. Then on 9/16, when the new models are released, it will no longer be more than adequate. They can probably still be run, but maybe so slowly that most people won't want to do it. Particularly the new video models, but maybe the new photo models too.
 
That's not encouraging. It's not that it didn't work. When Adobe enabled it, my MacBook Air M1 speed improved by 75%. Adobe said they were not happy with the deep shadow performance. Some people on the Adobe forums said they never really noticed an issue and were hoping Adobe could make it a feature users could enable/disable. LrC 15 is around the corner. Well, we'll see.
See my post from almost a year ago:

Apple 4-bit quantization?
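
For anyone wondering what 4-bit quantization actually does to image data, here is a minimal numpy sketch of my own (purely illustrative, not Apple's or Adobe's actual Neural Engine pipeline) showing how squeezing values into 16 levels wipes out exactly the kind of subtle tonal separation that deep shadows depend on:

```python
import numpy as np

# Simulated "deep shadow" pixel values: small numbers packed close together.
shadows = np.array([0.010, 0.012, 0.015, 0.021, 0.034], dtype=np.float32)

def quantize(x, bits, max_val=1.0):
    """Uniformly quantize x to 2**bits levels over [0, max_val]."""
    levels = 2 ** bits - 1
    return np.round(x / max_val * levels) / levels * max_val

for bits in (8, 4):
    q = quantize(shadows, bits)
    print(f"{bits}-bit:", q, "| max error:", np.abs(q - shadows).max())

# At 8 bits there are 256 levels, so the five tones stay mostly distinct.
# At 4 bits there are only 16 levels, and these five tones collapse to
# just two values, so the separation in the darkest part of the image is gone.
```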

 
There is a big thread on the Topaz website right now that is mostly about the new Topaz licensing model, but embedded in it is stuff about how Apple Macs will not be supported well from now on because of limitations in the Neural Engine and GPU. This likely explains the problem Adobe has using the Neural Engine for denoise also.

https://www.dpreview.com/forums/post/68435541

Another Topaz post:



[Screenshot of the Topaz post attached]
 
That's not encouraging. It's not that it didn't work. When Adobe enabled it, my MacBook Air M1 speed improved by 75%. Adobe said they were not happy with the deep shadow performance. Some people on the Adobe forums said they never really noticed an issue and were hoping Adobe could make it a feature users could enable/disable. LrC 15 is around the corner. Well, we'll see.
See my post from almost a year ago:

Apple 4-bit quantization?

https://www.dpreview.com/forums/post/67987251
I'm not going to switch to a PC, nor upgrade the Mac mini I got a few months ago. I don't plan on using Photo AI, as Adobe offers pretty much everything I need.

All the AI tools in LrC are fast except for Denoise AI. That is also system dependent. I can't speak for others, but if I was earning and needed the speed I'd have two choices: get a PC, or a Mac Studio with 64GB and lots of cores. I've read of some clocking Denoise at 8 seconds.

That is just Denoise AI. Obviously I can't speak for everyone, and AI will just continue to grow. While this sounds a tad disappointing, I can live with it. It will be interesting to see where this goes.
 
Interesting observations and I have no way of refuting them. Future local AI models may require more computer power, and undoubtedly Apple would rise to the challenge to stay relevant. Based on current setups and requirements, the M4 architecture is more than adequate.
From what I gather, the M4 may be more than adequate until 9/15. Then on 9/16, when the new models are released, it will no longer be more than adequate. They can probably still be run, but maybe so slowly that most people won't want to do it. Particularly the new video models, but maybe the new photo models too.
While the premise might be accurate, I'm hard-pressed to understand why Topaz, or any company for that matter, would market software that can be run on only a minimal number of machines. Not a great marketing move, IMHO. There aren't many folks who will spend $3k+ on a video card alone simply to run photo-editing software.

While I don't doubt what he is saying about the computing requirements of local gen AI, the implication is that until common hardware catches up to the needs of AI models, cloud-based computing will be necessary. FWIW, an M4 system blows the doors off the commonly used 3xxx and 4xxx NVIDIA cards for local gen AI, in my experience.
 
Interesting observations and I have no way of refuting them. Future local AI models may require more computer power, and undoubtedly Apple would rise to the challenge to stay relevant. Based on current setups and requirements, the M4 architecture is more than adequate.
From what I gather, the M4 may be more than adequate until 9/15. Then on 9/16, when the new models are released, it will no longer be more than adequate. They can probably still be run, but maybe so slowly that most people won't want to do it. Particularly the new video models, but maybe the new photo models too.
While the premise might be accurate, I'm hard-pressed to understand why Topaz, or any company for that matter, would market software that can be run on only a minimal number of machines. Not a great marketing move, IMHO. There aren't many folks who will spend $3k+ on a video card alone simply to run photo-editing software.

While I don't doubt what he is saying about the computing requirements of local gen AI, the implication is that until common hardware catches up to the needs of AI models, cloud-based computing will be necessary. FWIW, an M4 system blows the doors off the commonly used 3xxx and 4xxx NVIDIA cards for local gen AI, in my experience.
Good points. My older MacBook Air M1 (16GB) was faster than my 2019 Intel iMac (64GB, 8GB) at everything except Adobe Denoise AI. My new MacBook Air M3 (24GB) smokes it.

Perhaps Adobe will find a different route than depending on the NE. They have made some significant improvements since the release. I still don't mind waiting for quality results.
 
I believe Topaz may be referring to shifting the entire AI library to the local computer. Topaz has made some very bad decisions and boxed itself into being a niche software vendor.

Local AI mining of any ilk requires at least one, usually more than one, NVIDIA 4090/5090 to render "efficiently", recognizing how ludicrous it is to use the word "efficiency" at all in this context. "Tolerable speed" should be substituted for "efficiency."

Electricity guzzling AI data farms have racks and racks of what are essentially 4090/5090 GPUs with even more memory and higher power envelopes to sift through and correlate massive amounts of data. One of the greatest threats of AI is the ecological harm caused simply by the electricity requirements.

The xx90 consumer versions of these GPUs are notorious for sucking power and melting cables. You don't want that in your MacBook or Mac mini. No one has come up, yet, with equally powerful AI silicon that does not require massive amounts of electricity or that can break the NVIDIA CUDA chokehold.
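
For anyone on a PC who wants to see that power draw for themselves, nvidia-smi (which ships with the NVIDIA driver) can report it live. A rough Python sketch of querying it, assuming the driver and its bundled nvidia-smi tool are installed; it obviously will not run on a Mac:

```python
import subprocess

# Query each GPU's name, current power draw, and board power limit (watts).
# nvidia-smi ships with the NVIDIA driver, so this only works on a machine
# with an NVIDIA card.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,power.draw,power.limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    name, draw, limit = (field.strip() for field in line.split(","))
    print(f"{name}: drawing {draw} of a {limit} cap")
```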

Back down, Apple fanboys, nobody is dissing the beloved Apple. Apple is not in the commercial AI space and has no need to be; they just need to have something to market that is competitive with Copilot and just as useless on a local machine.
 
I believe Topaz may be referring to shifting the entire AI library to the local computer. Topaz has made some very bad decisions and boxed itself into being a niche software vendor.
That's possible, though unwise. More likely, it may be an "option" for those who don't want to pay for tokens/credits for online processing.
Local AI mining of any ilk requires at least one, usually more than one, NVIDIA 4090/5090 to render "efficiently", recognizing how ludicrous it is to use the word "efficiency" at all in this context. "Tolerable speed" should be substituted for "efficiency."
Yes, at a minimum, though many other components are required. The RTX Ada or RTX Pro cards are better choices than a 4090/5090. My friends at Puget Systems have a good white paper on the topic.
Electricity guzzling AI data farms have racks and racks of what are essentially 4090/5090 GPUs with even more memory and higher power envelopes to sift through and correlate massive amounts of data. One of the greatest threats of AI is the ecological harm caused simply by the electricity requirements.
No, they don't. They are racks of servers with cutting-edge pro cards like those I mentioned. Yes, they consume vast amounts of electricity and generate insane levels of heat.
The xx90 consumer versions of these GPUs are notorious for sucking power and melting cables. You don't want that in your MacBook or Mac mini. No one has come up, yet, with equally powerful AI silicon that does not require massive amounts of electricity or that can break the NVIDIA CUDA chokehold.
The second part is accurate.
Back down, Apple fanboys, nobody is dissing the beloved Apple. Apple is not in the commercial AI space and has no need to be; they just need to have something to market that is competitive with Copilot and just as useless on a local machine.
Not a fanboy, just being realistic. There is always a constant ebb and flow between local processing and central servers. I've been in computing for over 40 years and experienced the movement from massive central machines with remote terminals to standalone PCs and the present compromise.

Consider the music or video content industry as an example. It transitioned from records/tapes to CDs to the iPod to streaming. Likewise, a similar migration occurred with movies/films. The AI industry is still evolving, and the processing power/equipment currently required is enormous. Will there be refinements in the language/hardware to make individual devices self-sufficient, or will we be beholden to a distributed system? Who knows. To quote Yogi Berra, "It's tough to make predictions, especially about the future." My best advice is this: enjoy more and worry less. We're in amazing times.
 
It seems the trend with many software producers is to keep increasing the amount of "AI" in their products. I'm skeptical that modern sharpening and de-noise software is actually using AI versus just using the term as a buzzword to describe the latest incarnation of their software.

Some algorithms benefit from specialized hardware and can run faster on it than on a general-purpose CPU.
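
To make that concrete, Apple's coremltools package lets you load the same Core ML model restricted to the CPU only or allowed onto the GPU/Neural Engine, so you can time the difference yourself. A rough sketch, assuming coremltools 6 or newer; the model path, input name, and input shape below are placeholders for whatever converted model you actually have:

```python
import time
import numpy as np
import coremltools as ct

# Placeholder path and input name; substitute your own converted Core ML model.
MODEL_PATH = "Model.mlpackage"
sample = {"input": np.random.rand(1, 3, 512, 512).astype(np.float32)}

for units in (ct.ComputeUnit.CPU_ONLY, ct.ComputeUnit.ALL):
    # CPU_ONLY forces the plain CPU path; ALL lets Core ML use the GPU
    # and Neural Engine where it can.
    model = ct.models.MLModel(MODEL_PATH, compute_units=units)
    model.predict(sample)  # warm-up run
    start = time.perf_counter()
    for _ in range(10):
        model.predict(sample)
    avg = (time.perf_counter() - start) / 10
    print(f"{units.name}: {avg * 1000:.1f} ms per prediction")
```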

BUT, then there is this:

AI replacing photographers

Looking at AI renderings leaves me with a plastic feeling. Everything is too perfect. Perfect model, perfect lighting, perfect composition... And none of it real.

I expect there will be a need in the future to mark every type of image or movie with the sources of its generation. So many of the images we see in the news and in advertising today have been manipulated.
 
It seems the trend with many software producers is to keep increasing the amount of "AI" in their products. I'm skeptical that modern sharpening and de-noise software is actually using AI versus just using the term as a buzzword to describe the latest incarnation of their software.
There have been quite a few conversations about AI. Until it actually becomes self-aware, is it not all just advanced algorithms? Those who can collect the most data and train most quickly win the race. And it is a race. We call it AI because it's just easier to type.
Some algorithms benefit from specialized hardware and can run faster on it than on a general-purpose CPU.

BUT, then there is this:

AI replacing photographers

Looking at AI renderings leaves me with a plastic feeling. Everything is too perfect. Perfect model, perfect lighting, perfect composition... And none of it real.
Not if done tastefully. When shooting birds at ISO 20,000 I don’t remember seeing noise (grain) when looking at them sans camera. All those decades of spectacular National Geographic photos have been touched up.
I expect there will be a need in the future to mark every type of image or movie with the sources of its generation. So many of the images we see in the news and in advertising today have been manipulated.
Not sure about all news. Some of the established agencies still vet their images, whether or not we agree with or like how they cover things, and they will get called out if they slip. That will likely change over time, and, as you said, some form of verification will be needed. When it reaches the point where you can't tell, that will surely change things.
 
I believe Topaz may be referring to shifting the entire AI library to the local computer. Topaz has made some very bad decisions and boxed itself into being a niche software vendor.

Local AI mining of any ilk requires at least one, usually more than one, NVIDIA 4090/5090 to render "efficiently", recognizing how ludicrous it is to use the word "efficiency" at all in this context. "Tolerable speed" should be substituted for "efficiency."

Electricity guzzling AI data farms have racks and racks of what are essentially 4090/5090 GPUs with even more memory and higher power envelopes to sift through and correlate massive amounts of data. One of the greatest threats of AI is the ecological harm caused simply by the electricity requirements.

The xx90 consumer versions of these GPUs are notorious for sucking power and melting cables. You don't want that in your MacBook or Mac mini. No one has come up, yet, with equally powerful AI silicon that does not require massive amounts of electricity or that can break the NVIDIA CUDA chokehold.

Back down, Apple fanboys, nobody is dissing the beloved Apple. Apple is not in the commercial AI space and has no need to be; they just need to have something to market that is competitive with Copilot and just as useless on a local machine.
Ok I’m breathing again…. Whew that was close.
 
This didn't/doesn't just affect Topaz. I was looking for more information and came across this at the link below. I had read that DxO found workarounds but was never sure if they solved the actual problem.



[Screenshot from the linked thread attached]

https://forums.macrumors.com/threads/adobe-and-dxo-have-found-bugs-in-neural-engine.2389495/

--
Funny how millions of people on an internet platform where they can communicate instantaneously with people on the other side of the world using incredibly powerful handheld computers linked to satellites orbiting hundreds of miles in space don't believe in science. Neil deGrasse Tyson
 
It seems the trend with many software producers is to keep increasing the amount of "AI" in their products. I'm skeptical that modern sharpening and de-noise software is actually using AI versus just using the term as a buzzword to describe the latest incarnation of their software.
There have been quite a few conversations about AI. Until it actually becomes self-aware, is it not all just advanced algorithms? Those who can collect the most data and train most quickly win the race. And it is a race. We call it AI because it's just easier to type.
You mention "who can collect the most data" as an attribute of AI. I see this as "who has the most data to copy". Much of what is called "AI" is a match/copy/paste/blend algorithm. It basically "steals" content and produces a legally different version.
Some algorithms benefit from specialized hardware and can run faster on it than on a general-purpose CPU.

BUT, then there is this:

AI replacing photographers

Looking at AI renderings leaves me with a plastic feeling. Everything is too perfect. Perfect model, perfect lighting, perfect composition... And none of it real.
Not if done tastefully. When shooting birds at ISO 20,000 I don’t remember seeing noise (grain) when looking at them sans camera. All those decades of spectacular National Geographic photos have been touched up.
I'm guessing you didn't watch the video. It has nothing to do with "touch up".
 
It seems the trend with many software producers is to keep increasing the amount of "AI" in their products. I'm skeptical that modern sharpening and de-noise software is actually using AI versus just using the term as a buzzword to describe the latest incarnation of their software.
There have been quite a few conversations about AI. Until it actually becomes self-aware, is it not all just advanced algorithms? Those who can collect the most data and train most quickly win the race. And it is a race. We call it AI because it's just easier to type.
You mention "who can collect the most data" as an attribute of AI. I see this as "who has the most data to copy". Much of what is called "AI" is a match/copy/paste/blend algorithm. It basically "steals" content and produces a legally different version.
That would also apply to the big five (Google, etc.) that everyone uses.

Some algorithms benefit from specialized hardware and can run faster on it than on a general-purpose CPU.
BUT, then there is this:

AI replacing photographers

Looking at AI renderings leaves me with a plastic feeling. Everything is too perfect. Perfect model, perfect lighting, perfect composition... And none of it real.
Not if done tastefully. When shooting birds at ISO 20,000 I don’t remember seeing noise (grain) when looking at them sans camera. All those decades of spectacular National Geographic photos have been touched up.
I'm guessing you didn't watch the video. It has nothing to do with "touch up".
I've seen similar videos. Sorry for missing your point.
 
This didn't/doesn't just affect Topaz. I was looking for more information and came across this at the link below. I had read that DxO found workarounds but was never sure if they solved the actual problem.


https://forums.macrumors.com/threads/adobe-and-dxo-have-found-bugs-in-neural-engine.2389495/
Two different issues entirely. The one you bring to the fore is a bug. The issue the OP was advancing is the computational GPU cores needed to process the AI models. I have to say that the Topaz AI components are already floundering on Mac. If I try to perform some local gen upsizing, it takes a minimum of 3 minutes and as long as 13, depending on the AI model chosen. As long as Topaz maintains cloud-based options, it will remain viable for Mac owners.
 
I have stopped using heavy AI editing on my photos, as I have noticed that on some pictures it actually changed the appearance of the animal/bird I was post-processing. On the first one I thought I was mistaken, but a second edited picture confirmed it wasn't me.
 