Will future M1 speeds increase rapidly?

mangurian

I bought into the M1 and started to transition to Mac (got a MB Pro). Speed is my most important consideration. I am good with computers, so owning PCs never bothered me. Battery life is of little importance to me since I am usually home. As the M1 develops, maybe I will be back. I was also lucky my (12th-gen i7) Dell hadn't sold. My daughter is Mac all the way, so she gets the MB. She already has a MB and an iMac. Maybe I will be back in an M1 generation or two.

https://www.techspot.com/review/2425-intel-core-i9-12900hk/
 
It's hard to say. Apple doesn't hint at their future products as much as AMD or Intel, where we know some details a few generations out, or at the very least their potential. You can look at Apple's progress with their previous chips and see the gains they get, but there's always the question of whether this will just be an incremental upgrade or a bigger change. They'll also be moving to a new process, which will help with how hard they can push everything at a given power level, and they're all about efficiency with Apple Silicon.
 
I need to add to what I wrote. I don't want to mislead folks. The MB I got was not an M1; it was an Intel model. As I said, I was just starting to transition.
 
They have to make small steps forward even if they have the technology ready.

It's a business plan to have a new product each year…

If they throw the best they have at you now, next year they will have nothing to offer…
 
They have to make small steps forward even if they have the technology ready.

It's a business plan to have a new product each year…

If they throw the best they have at you now, next year they will have nothing to offer…
.....but why not a version at, or at least close to, the speed of AMD and Intel? The benchmarks look bad for photo and video.
 
.....but why not a version at, or at least close to, the speed of AMD and Intel? The benchmarks look bad for photo and video.
Those boastful comparisons to the 12900 and the 3090 are looking pretty dishonest today.
 
That's an M1 Pro... take the same benchmark with an M1 Ultra and we will see a different result.
Ultra is for business or rich folks only. The system STARTING price is $3,999.

Though the desktop starts at $1,999, getting one with the M1 Ultra chip requires paying at least $3,999. You can pre-order them now via Apple's website, and they'll begin shipping March 18.

When I chose the best M1 Ultra and 2TB of SSD, Apple came back with: $6664.00

Funny. That's without a display or even cables. Here's the breakdown. It's not for the typical dpreviewer.

Mac Studio


Hardware
  • Apple M1 Ultra with 20-core CPU, 64-core GPU, 32-core Neural Engine
  • 128GB unified memory
  • 2TB SSD storage
  • Front: Two Thunderbolt 4 ports, one SDXC card slot
  • Back: Four Thunderbolt 4 ports, two USB-A ports, one HDMI port, one 10Gb Ethernet port, one 3.5 mm headphone jack
 


When I chose the best M1 Ultra and 2TB of SSD, Apple came back with: $6664.00

Funny. That's without a display or even cables. Here's the breakdown. It's not for the typical dpreviewer.

Mac Studio


Hardware
  • Apple M1 Ultra with 20-core CPU, 64-core GPU, 32-core Neural Engine
  • 128GB unified memory
  • 2TB SSD storage
I think your price also included Final Cut Pro and Logic Pro; otherwise it would be $6,199.

You'd reduce it another $1,000 by dropping the GPU from 64 cores to 48. That upgrade seems to be particularly low value, whereas the doubling of everything else that you get with the Ultra can be very productive.

The memory upgrade is also a lot ($800), and a limited number of people would use it in the foreseeable future.

The SSD upgrade to 2TB is a bit pricey ($400 for 1 extra TB), but that's been routine for all of the VARs.

So the system price is $4,399. Still not at all cheap, but if you're the one paying rather than the company, it's a better-value spec.
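For anyone following the arithmetic in the post above, here's the price math laid out. All dollar figures are this thread's estimates of Apple's 2022 configurator prices, not official or current pricing:

```python
# Rough arithmetic for the Mac Studio config discussed above.
# All dollar figures are the thread's estimates, not official Apple pricing.
quoted_total = 6664            # as configured, reportedly including bundled software
hardware_only = 6199           # the same config without the bundled apps
gpu_downgrade_savings = 1000   # 64-core GPU -> 48-core GPU
ram_downgrade_savings = 800    # dropping the 128GB memory upgrade

trimmed = hardware_only - gpu_downgrade_savings - ram_downgrade_savings
print(trimmed)  # 4399, matching the "system price" quoted above
```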
 
No added software was included. I pasted the actual description as given by Apple.
 
.....but why not a version at, or at least close to, the speed of AMD and Intel? The benchmarks look bad for photo and video.
The M1's single-core CPU benchmark speeds were well ahead of the speeds of any mobile Intel or AMD CPU when the M1 came out.

An Intel Core i9-12900K can hit 2260 on Geekbench 5 single-core tests, in desktops that can handle a TDP of 125 to 241 watts. Its score would be 0 in a laptop, where its power demands would rapidly drain the battery while baking the laptop from the inside.

The M1 in a MacBook Air "only" hits 1705. It does so on – I think – about 10 watts, in a thin and light laptop that has no cooling fan.

How about if we look for a mobile Intel CPU with a TDP of 45 watts or less?

https://ark.intel.com/content/www/u...1_Filter-Family=122139&2_MarketSegment=Mobile

Say, an Intel® Core™ i7-10870H: 8 cores, 45-watt TDP. That one has single-core scores somewhere in the range of 1335, at best. Its multi-core scores trail slightly behind those of the M1 MacBook Air. Its recommended customer price is $417 for the chip alone.
 
That's an M1 Pro... take the same benchmark with an M1 Ultra and we will see a different result.
Ultra is for business or rich folks only. The system STARTING price is $3,999.
The GeForce RTX 3090 is also for business or rich folks only.

https://www.tomshardware.com/news/geforce-rtx-3090-ti-candian-pricing

"the RTX still averages $2,126 [USD] on third-party marketplaces like eBay" … and … a Canadian retailer's prices for RTX 3090 Ti based graphics cards correspond to "$3,680 [USD] and $4,143 [USD], respectively."
 
.....but why not a version at, or at least close to, the speed of AMD and Intel? The benchmarks look bad for photo and video.
The M1's single-core CPU benchmark speeds were well ahead of the speeds of any mobile Intel or AMD CPU when the M1 came out.

An Intel Core i9-12900K can hit 2260 on Geekbench 5 single-core tests, in desktops that can handle a TDP of 125 to 241 watts. Its score would be 0 in a laptop, where its power demands would rapidly drain the battery while baking the laptop from the inside.

The M1 in a MacBook Air "only" hits 1705. It does so on – I think – about 10 watts, in a thin and light laptop that has no cooling fan.

How about if we look for a mobile Intel CPU with a TDP of 45 watts or less?

https://ark.intel.com/content/www/u...1_Filter-Family=122139&2_MarketSegment=Mobile

Say, an Intel® Core™ i7-10870H: 8 cores, 45-watt TDP. That one has single-core scores somewhere in the range of 1335, at best. Its multi-core scores trail slightly behind those of the M1 MacBook Air. Its recommended customer price is $417 for the chip alone.
The i9-12900H has a single-core score of around 1700-1850, depending on the laptop.

https://browser.geekbench.com/v5/cpu/search?utf8=✓&q=12900H

It's also a 45w chip. https://www.intel.com/content/www/u...-24m-cache-up-to-5-00-ghz/specifications.html

Here's a good video to check out detailing it against the M1 Pro. And you can also see the huge strides they've made over that i7-10870H.



Though Apple's performance per watt is still really impressive. Intel is using 24-33W for a single thread, depending on whether it's the i7 or the i9, while Apple is only using 7W.
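To put some rough numbers on that, here's a points-per-watt sketch using the figures quoted in this thread. The scores and per-thread wattages are the posters' estimates, not measured lab data, so treat the ratios as ballpark only:

```python
# Single-thread Geekbench 5 points per watt, using figures quoted in the thread.
# Scores and wattages are rough forum estimates, not measured lab data.
chips = {
    "Intel i9-12900H": {"gb5_single": 1850, "watts": 33},
    "Apple M1":        {"gb5_single": 1705, "watts": 7},
}

for name, c in chips.items():
    ratio = c["gb5_single"] / c["watts"]
    print(f"{name}: {ratio:.0f} points per watt")
```

On these numbers the M1 comes out roughly four times more efficient per single-threaded Geekbench point, even though its raw score is slightly lower.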
 
How about if we look for a mobile Intel CPU with a TDP of 45 watts or less?
Why would one choose to limit themselves so, for a desktop on either OS?

That's the argument used to make excuses for inferior performance. It took me 20% longer, but I saved a kWh all day!

And if the goal is to measure energy use/cost per unit of work, then add in the rest of the system and the monitor and record how long it takes to do common jobs.

I've seen enough of geek bench, and cinebench. Let's do some PS/LR/PP jobs.

Let's see idle power, full power, and, since single core keeps coming up, the system consumption in that state. Max Tech has been doing some excellent videos since the embargo ended, though they're coming slowly.

Another benchmarker unfortunately showed the Studio's M1 Max to be marginally slower than the ones in the MBP, and even than some with the M1 Pro, in the PS/LR realm.
 
So, which one has the specs Apple quoted? The Ultra?
I didn't watch the keynote, but if you go to the Mac Studio page and drill down to the CPU page, the graphs are labeled as to which are for the Max and which are for the Ultra.

You have to go back out of that page and look at the footnotes to see which GPUs Apple was testing against (this varies by graph). The benchmarks are "selected" ones and I am sure Apple likes benchmarks that are favorable to them, just as much as PC vendors and discrete GPU vendors do.

If what you are interested in is playing extreme high-end 3D games, I think you would be better off with a Wintel PC with a high-end, expensive graphics card … or with the latest generation of one of the Microsoft or Sony gaming consoles. This is as much a question of software availability and tuning, as of whether the M1 Max can hold its own with mid-range discrete GPUs, or whether the M1 Ultra can challenge high-end ones.
 
My favorite tech show is Leo Laporte - The Tech Guy. He provided an interesting data point this weekend. He usually recommends Mac over PC (he says only tech-savvy people or those who must run Windows-only software should use PCs). He ran a photogrammetry program on his wife's Mac with an M1 Ultra. It took 20 hours. He then ran the same job on a PC with a Ryzen CPU and a last-gen (3080) Nvidia GPU, and it took only 7 hours. He says it is probably because the Mac does not get much oomph from its GPU.
 
My favorite tech show is Leo Laporte - The Tech Guy. He provided an interesting data point this weekend. He usually recommends Mac over PC (he says only tech-savvy people or those who must run Windows-only software should use PCs). He ran a photogrammetry program on his wife's Mac with an M1 Ultra. It took 20 hours. He then ran the same job on a PC with a Ryzen CPU and a last-gen (3080) Nvidia GPU, and it took only 7 hours. He says it is probably because the Mac does not get much oomph from its GPU.
They're very different architectures, so that doesn't surprise me. One of Apple's big focuses is devoting a lot of die space to very specific use cases, like the media engine. But in many brute-force GPU compute tasks, the M1 family's GPUs have fallen short of Apple's claims. So basically it's not a bad GPU in any way, but if you have a specific task your workflow depends on, research how each architecture handles it.
 
My favorite tech show is Leo Laporte - The Tech Guy. He provided an interesting data point this weekend. He usually recommends Mac over PC (he says only tech-savvy people or those who must run Windows-only software should use PCs). He ran a photogrammetry program on his wife's Mac with an M1 Ultra. It took 20 hours. He then ran the same job on a PC with a Ryzen CPU and a last-gen (3080) Nvidia GPU, and it took only 7 hours. He says it is probably because the Mac does not get much oomph from its GPU.
That begs for a citation to understand the test.

Esoteric software like this is unlikely to be recompiled, so this could be running in emulation. And as the other responder noted, the M1 optimizes for common use cases very successfully, but for general compute is inferior to the bigger iron out there.
 
My favorite tech show is Leo Laporte - The Tech Guy. He provided an interesting data point this weekend. He usually recommends Mac over PC (he says only tech-savvy people or those who must run Windows-only software should use PCs). He ran a photogrammetry program on his wife's Mac with an M1 Ultra. It took 20 hours. He then ran the same job on a PC with a Ryzen CPU and a last-gen (3080) Nvidia GPU, and it took only 7 hours. He says it is probably because the Mac does not get much oomph from its GPU.
I suspect the photogrammetry program is using ray tracing, which the 3080 has hardware acceleration for. It's fairly well known that Apple Silicon does not yet have dedicated hardware acceleration for 3D ray tracing.

Got to pick the appropriate benchmark for your particular workflow.
 
