Future of the Intel/Windows platform

foot

This post isn't to start a win vs mac vs linux discussion.

It's to remind everyone that healthy platforms evolve. That's true of the current CPU generations, and it's great for us, the consumers: higher performance, lower power, and less expensive options.

Currently Apple has done an amazing job with their new M1x systems. So has AMD. (Heck, even Qualcomm has joined in with their new Snapdragon processors, although I'm not sure where they are going with it... it is supported by Linux now!)

This is great for Intel users, since Intel is busily building their new processors, which should apply competitive pressure back to Apple.

This is a constant "cat and mouse" game, each doing their competitive best to deliver better and better computer choices for us, the consumers.

True, for me personally, I'm moving toward Linux. But even there I expect to use my current Win10 laptop for many years, maybe 5+ years, before I need a new one.

(For Linux, I've already been using the Raspberry Pi for some DIY network programs, and I'm refurbishing an older laptop to run Linux.)

Heck, I had to buy my current laptop ASAP due to my older one crashing at the start of COVID. I couldn't even shop in retail stores, since they were closed to foot traffic. Yet even as a desperation buy I got an incredible deal: an AMD Ryzen 7 4800H with an NVIDIA GeForce GTX 1660 Ti GPU and 16 GB of RAM (which I can upgrade when I want). Great price too.

It's so over-spec'd that I run it at 50% CPU and use the built-in AMD GPU rather than the NVIDIA one. (If you do a lot of video/photo editing, you might choose to "crank it up" for extra performance.)

People should not jump ship and switch computer platforms any more than they should jump ship every other month because of some perceived technical advantage of Nikon vs Canon vs Sony vs Pentax vs m43... they are all very good.
 
I use different operating systems for different purposes.
  • Windows for photo/video editing
  • Pop!_OS Linux for general purpose computing
  • TrueNAS for file servers
  • Ubuntu for application servers
Until recently, I had never thought about including Apple/macOS, but with the performance offered by Apple silicon, and the difficulty obtaining NVIDIA graphics cards, I might shift from Windows to macOS when I upgrade my desktop and laptop in a year or two. Most of the photo/video editing software I use is licensed for installation on either Windows or macOS, so the migration cost isn't too much.

I would like to use Linux for photo/video editing, but I have experienced ongoing inconsistency with how well fractional scaling and color management are handled. Linux is not currently the best solution for my photo/video editing workflow.

I expect to keep using multiple operating systems and am open to changing how I use each as my computer needs evolve.
 
Currently Apple has done an amazing job with their new M1x systems.
Have they? Cutting RAM and storage, plus all forms of connectivity, then claiming it's "cheap", sounds like a con, not a great job.

You might want to wonder why the next "pro" Mac is supposedly going to use a Xeon chip and not an M1 of some form.

Apple does have many who apologize for all its failings. They do get away with shipping broken products and blaming the end user. They are good at that.

Let's not forget Apple is the company that basically banned NVIDIA GPUs from their systems. They didn't stop selling them; they made it almost impossible for users to install their own NVIDIA cards.

Or just ask yourself why Apple didn't even bother to include TB4 on their M1 machines after fawning over TB for years.
 
Until recently, I had never thought about including Apple/macOS, but with the performance offered by Apple silicon, and the difficulty obtaining NVIDIA graphics cards, I might shift from Windows to macOS when I upgrade my desktop and laptop in a year or two. ... snip ....
I sense a change coming with Windows:
  • Windows 11 can run Android apps.
  • Android runs on ARM-based architecture (and others).
  • Apple M1 runs on ARM-based architecture and is fast, yet is said to use one tenth the electrical power.
  • Some states have banned the sale of AC-power-hungry gaming computers.
Maybe Win11 can be compiled to run on ARM-based architecture. I wouldn't be surprised if Windows ends up running on ARM-based architecture and joins the smartphone Android tsunami. Windows 11 is already trending toward that.

Sky
 
Currently Apple has done an amazing job with their new M1x systems.
Have they? Cutting RAM and storage, plus all forms of connectivity, then claiming it's "cheap", sounds like a con, not a great job. ... snip ....
Exactly this. I have talked to many professional video editors who say the M1 is no good for video editing; it will do quick edits OK, but once you start pushing it, it falls flat. Testing has proven this as well. It's because of the RAM: 16 GB for everything is not enough for heavy lifting.

The Windows/Linux debate is all personal preference at this point. The software I use is not available on Linux, and I always find distros janky to use.
 
Have they? Cutting RAM and storage, plus all forms of connectivity, then claiming it's "cheap", sounds like a con, not a great job. ... snip ....

Exactly this. I have talked to many professional video editors who say the M1 is no good for video editing; it will do quick edits OK, but once you start pushing it, it falls flat. Testing has proven this as well. It's because of the RAM: 16 GB for everything is not enough for heavy lifting. ... snip ....
I think using current M1 systems for heavy lifting, such as serious video editing, is a mistake. My understanding is that the M1 was not intended for very demanding workloads, but future releases, such as the M1X, would be better suited to handle professional video editing tasks.

I don't know how much Apple marketing is to blame, but I saw several YouTube reviewers making claims about M1-based laptops that seemed beyond belief. I don't think the graphics in the current chips perform much better than an NVIDIA 1050 Ti.

I do understand that the architecture is capable of performing very well with much lower amounts of memory than a typical Windows-based system, but unless I can buy Apple silicon that supports at least 32 GB I won't consider it for serious work.

I tend to use Linux for general purpose computing and use Windows systems with an appliance mindset. I have a Windows desktop that is set up for photo/video editing, and I don't use it for other purposes.
 
Until recently, I had never thought about including Apple/macOS, but with the performance offered by Apple silicon, and the difficulty obtaining NVIDIA graphics cards, I might shift from Windows to macOS when I upgrade my desktop and laptop in a year or two. ... snip ....
I sense a change coming with Windows: ... snip .... I wouldn't be surprised if Windows ends up running on ARM-based architecture and joins the smartphone Android tsunami. Windows 11 is already trending toward that.
Windows has run on Arm for years.

Android is basically a Linux distribution, just like Chrome OS, and both run on a wide range of platforms. I think the majority of Chromebooks are Intel, especially the more powerful ones.

Windows running Android is about squeezing Apple. The more tied you are to your Android/Windows system, with its 90% market share, the less chance you'll move to Apple.

Apple tries the same sort of thing to tie its users in.
 
It's the same with cameras: there is no real progress anymore, which is good, because the hardware we have now is amazing.

Let me give you an example. In 2007 I was still using a Pentium 4 computer and a Canon 400D. Decent specs on both, but the computer was already struggling to run both programs and video games. The 3 GHz processor from 2004 aged so badly in just 3 years that it ran at 100% all the time.

Fast forward to 2012, 5 years later. Computer hardware had evolved so much that processing power had roughly quadrupled at the same price per component.

Intel's second-gen Core i CPUs from 2011/2012 were so good they can run the latest programs even today. Basically, if you take a high-end computer of the era and swap the GPU, you have a modern PC once again in 2021.

This was not possible before the Core i generation.

The point is, computers age far more slowly these days. I still have a first-gen i5 laptop and it runs fine to this day; I didn't even need to install an SSD.

The same principle applies to cameras. A midrange DSLR from 2011/2012 can be considered a modern, up-to-date camera if you ignore some advancements, like video.

So, when people say they have to upgrade their computers and cameras, they usually end up with computers that perform the same as their old setups.

Sure, professionals benefit from having 16-thread CPUs and 12 GB of GDDR5 GPU memory, but I'm talking about the average consumer here, like me.

Most people would benefit more from learning how to manage their existing PCs rather than buying a new one every 3-4 years.

My main computer today runs on a 5-year-old dual-core i3-6100 CPU. According to "tech experts" it is obsolete and worthless, but in actual use it's pretty amazing.

The only way I found to choke its performance was to run two 4K video files at the same time, something an i7-6700 could probably handle.

The multitasking capabilities of these CPUs are already fantastic, even more so when we're discussing 8/16 threads.

I'm just saying, how many people need to run two 4K movies at the same time, or other crazy multitasking like 50 Chrome tabs, in order to use the capabilities of even a midrange PC made any time in the last few years?

As for cameras, my 2012 Sony RX100 works just as well as my RX100 M7; the only reason I bought the M7 was for the zoom.

I think the time has come to get over the hardware wars and demand better software instead, because that's an area of decline.

To put it in plain words, Windows 10 and 11 are just bad. Linux has evolved, but it's still user-unfriendly.

Chrome uses so much RAM and CPU power for basic activities that I simply don't understand how this is possible.

PC games are less optimized now than ever before.

Not to mention the way the web has evolved over the years; nowadays browsing feels like visiting a bazaar where 10 different vendors try to sell you stuff you don't need. It's ads and trackers everywhere; you can't do anything without several bots tracking your every move, along with Microsoft gathering and selling your user data.

In conclusion, 2011-2021 is the decade where we enjoyed access to the best hardware ever made, coupled with the worst software and internet services.

So thanks, software engineers, for making web pages load slower in 2021 than they did 15 years ago, while hardware requirements have increased 10-fold. The quest to make state-of-the-art hardware run like a potato-powered light bulb continues.
 
I don't know how much Apple marketing is to blame, but I saw several YouTube reviewers making claims about M1-based laptops that seemed beyond belief. I don't think the graphics in the current chips perform much better than an NVIDIA 1050 Ti.

I do understand that the architecture is capable of performing very well with much lower amounts of memory than a typical Windows-based system, but unless I can buy Apple silicon that supports at least 32 GB I won't consider it for serious work. ... snip ....
The unbelievable claims were because of a couple of different things. 1. Apple marketing leaves TONS of information out and just puts out the "good stuff." 2. Most YouTubers that review these systems drank the Kool-Aid long before the M1 was a thing and take everything the fruit says as gospel. 3. If you watch actual unbiased videos regarding the M1, you will see that they all show the M1 is not made for serious work. The issue lies in its architecture: 16 GB of RAM, no matter how it's utilized, is not sufficient for heavy graphics work like batch photo, video, or CAD/graphics creation. My video card itself has 12 GB and my system has 128 GB. My notebook has 64 GB, with 8 GB being utilized for video processing. It does not matter that the chips are this or that; not having enough RAM is still a bottleneck.

It's great as a basic computing device where you are surfing the net or doing basic work like Word documents or editing iPhone videos. Beyond that, a desktop or notebook with a dedicated video card is going to run circles around it. Even most Apple-centric sites, channels, etc. have moved back to Intel Macs to do their work, since the M1 can't do it in a timely manner, or at all.

--
One Lens, No Problem
The Point and Shoot Pro
 
Well Windows 11 works perfectly fine on my Raspberry Pi 4
 
Windows has run on Arm for years.
...Could you give some examples...
Surface Pro X, HP Envy x2, Acer Swift 7, and others. Google "Windows ARM PC".
Windows was designed from the ground up to be a portable OS with a "hardware abstraction layer" to do the work of mating the kernel to the platform.

The biggest issue with Windows and ARM is the performance of x86/x64 application emulation, combined with the availability of drivers for oddball peripherals. In the Windows world, backward compatibility with a huge catalogue of weird hardware and software is absolutely essential for market acceptance.
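For the curious, here's a minimal sketch (not from anyone in this thread, just an illustration) of how a program can ask Windows whether it's running natively or through that translation layer, and what the host CPU really is. It assumes the IsWow64Process2 API and a recent SDK (Windows 10 version 1709 or later); 32-bit x86 binaries on ARM64 show up as WOW64 here, while x64-on-ARM64 emulation in Windows 11 is surfaced differently and may not.

/* Minimal sketch, not production code: ask Windows whether this process
 * is running natively or under WOW64 translation, and what the host CPU
 * architecture is. Assumes IsWow64Process2 (kernel32, Windows 10 1709+). */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    USHORT processMachine = IMAGE_FILE_MACHINE_UNKNOWN;
    USHORT nativeMachine  = IMAGE_FILE_MACHINE_UNKNOWN;

    if (!IsWow64Process2(GetCurrentProcess(), &processMachine, &nativeMachine)) {
        fprintf(stderr, "IsWow64Process2 failed: %lu\n", GetLastError());
        return 1;
    }

    /* nativeMachine reports the real CPU the OS is running on. */
    if (nativeMachine == IMAGE_FILE_MACHINE_ARM64)
        puts("Host OS: Windows on ARM64.");
    else if (nativeMachine == IMAGE_FILE_MACHINE_AMD64)
        puts("Host OS: 64-bit x64 Windows.");
    else
        printf("Host machine type: 0x%04x\n", nativeMachine);

    /* processMachine is IMAGE_FILE_MACHINE_UNKNOWN for a native process;
     * any other value means the image is being translated through WOW64
     * (for example, a 32-bit x86 binary on a 64-bit host). */
    if (processMachine == IMAGE_FILE_MACHINE_UNKNOWN)
        puts("This process is running natively.");
    else
        printf("This process is emulated (image machine 0x%04x).\n", processMachine);

    return 0;
}

The check itself is trivial; as noted above, it's the emulation overhead and driver availability that actually decide whether the experience is acceptable.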

It seems to me that there are a lot of real gearhead Windows users, and they make a lot of noise about performance and incredibly niche cases, which seems to prevent alternative hardware platforms from making any inroads.
 
I think Microsoft themselves should do things to help push the demise of 32-bit software. I honestly don’t think that making 64-bit hardware a requirement for Win 11 will do much if Windows OS support for running 32-bit software remains in place as it has been.

An announced full cut-off of 32-bit a la Apple isn’t something I could ever see happening for Windows. What could be done, however, is to just start making life harder in general for ongoing use of 32-bit. I’m thinking of things like making the 32-bit “subsystem” a Windows feature that isn’t installed by default.

Overall, I just get a vibe that there’s a disconnect between the hardware and software side for dropping 32-bit. 32-bit hardware is still around of course, but it’s very much in the rear view mirror at this point. 32-bit software, on the other hand, seems to continue to be a completely viable option because of how easy and universal the support for it remains within Windows. I just think this is something Microsoft should start to gently “tighten the screws on” to some degree.
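For what it's worth, the 32-bit/64-bit distinction the OS cares about is recorded in each executable's PE header, so "tightening the screws" would mostly mean changing what Windows does when it sees the older machine type. Here's a rough, hypothetical checker, sketch only, assuming a well-formed PE file and doing minimal error handling:

/* Hypothetical helper: report whether a Windows executable is a 32-bit
 * x86, x64, or ARM64 image by reading the Machine field of its PE file
 * header. 32-bit images are the ones that need the WOW64 layer on
 * 64-bit Windows. */
#include <stdio.h>
#include <windows.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s program.exe\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) {
        perror("fopen");
        return 1;
    }

    IMAGE_DOS_HEADER dos;
    DWORD signature = 0;
    IMAGE_FILE_HEADER coff;

    /* DOS stub -> e_lfanew -> "PE\0\0" signature -> COFF file header. */
    if (fread(&dos, sizeof dos, 1, f) != 1 || dos.e_magic != IMAGE_DOS_SIGNATURE ||
        fseek(f, dos.e_lfanew, SEEK_SET) != 0 ||
        fread(&signature, sizeof signature, 1, f) != 1 || signature != IMAGE_NT_SIGNATURE ||
        fread(&coff, sizeof coff, 1, f) != 1) {
        fprintf(stderr, "%s does not look like a PE executable\n", argv[1]);
        fclose(f);
        return 1;
    }
    fclose(f);

    switch (coff.Machine) {
    case IMAGE_FILE_MACHINE_I386:  puts("32-bit x86 image (runs via WOW64 on 64-bit Windows)"); break;
    case IMAGE_FILE_MACHINE_AMD64: puts("64-bit x64 image"); break;
    case IMAGE_FILE_MACHINE_ARM64: puts("64-bit ARM64 image"); break;
    default: printf("other machine type: 0x%04x\n", coff.Machine); break;
    }
    return 0;
}

This is essentially the same information that dumpbin /headers prints for an executable.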
 
I think Microsoft themselves should do things to help push the demise of 32-bit software. ... snip ....
They are moving closer to it. No 32-bit apps/software will be available in the store moving forward. So, I think they are giving people these next 5 years to move forward or be left unsupported. It's time.
 
Windows was designed from the ground up to be a portable OS with a "hardware abstraction layer" to do the work of mating the kernel to the platform.
Yes, I remember way back when WinNT was developed simultaneously on several platforms. And back then the switch from 16-bit to 32-bit was a huge deal... lol

"Initially, it supported several instruction set architectures, including IA-32, MIPS, and DEC Alpha; support for PowerPC, Itanium, x64, and ARM were added later."

https://en.wikipedia.org/wiki/Windows_NT

It seems to me that there are a lot of real gearhead Windows users, and they make a lot of noise about performance and incredibly niche cases, which seems to prevent alternative hardware platforms from making any inroads.
In the past, Microsoft dropped support for the Logitech C270 webcam... which I use (inexpensive, the audio is very acceptable, and the quality is good enough for OBS and Zoom). No need to use my Samson G-Track Pro microphone. I bought 3 of them and use them on various computers.

So I appreciate that others complained enough that Microsoft brought support back for the C270, which I find very useful.

For Zooming I like to use OBS as a virtual camera, so I can share my desktop without the other side having to enable desktop sharing on their end; not everyone wants to muck around with settings... many just want to use the defaults.

It also works pretty well if I want to make a personal vlog and include the desktop.
 
