From Andrei Frumusanu’s “Apple Announces The Apple Silicon M1: Ditching x86 – What to Expect, Based on A14” posted Tuesday:
Apple claims the M1 to be the fastest CPU in the world. Given our data on the A14, beating all of Intel’s designs, and just falling short of AMD’s newest Zen3 chips – a higher clocked Firestorm above 3GHz, the 50% larger L2 cache, and an unleashed TDP, we can certainly believe Apple and the M1 to be able to achieve that claim.
This moment has been brewing for years now, and the new Apple Silicon is both shocking, but also very much expected. In the coming weeks we’ll be trying to get our hands on the new hardware and verify Apple’s claims.
Intel has stagnated itself out of the market, and has lost a major customer today. AMD has shown lots of progress lately, however it’ll be incredibly hard to catch up to Apple’s power efficiency. If Apple’s performance trajectory continues at this pace, the x86 performance crown might never be regained.
My take: What AnandTech says, developers tend to believe.
Most cloud data centers dedicate some servers to specific kinds of workloads. Power consumption is a big cost factor, both for the energy to run the computers and for the energy to cool them.
We must be patient to see how and when this plays out. Remember, pre-2000, Sun’s SPARC chips ran most internet workloads. The shift to Lintel (Linux on Intel) was fast and furious.
Windows on Arm may happen much sooner than everyone expects (especially if Apple helps fund it), and Microsoft doesn’t have much allegiance to Intel anymore. A lot of Macs might be sold to Windows users, and I think Apple would be perfectly fine with that.
You know MSFT is going to port Office to the M1, and it is pursuing an ARM-based version of Windows. Unless Intel develops an ARM processor that can run Windows, I think the company is toast. Even then, it will have competitors (AMD, anyone?).
If Apple were also to extend the same designs and 5nm process to its own memory and other support chips, it could cut motherboard size in half. Imagine half-width servers, with two units fitting in one rack space while using less than one unit’s power. This could reduce response time and latency for AI, machine learning, AR, and general web hosting and data retrieval.
So there are a lot of potential advantages for Apple’s own internal use AND a new potential revenue stream (or support headache) if Apple were to sell such machines for external use. The big question is what server OS they would run and how much hacking would occur, especially on poorly prepared and configured systems.
So many things to consider.
Frankly, only Apple could pull off a wholesale move away from x86, by virtue of both its iPhone expertise and its system expertise (as opposed to just hardware or software expertise). The latest Stratechery is a very good read: https://stratechery.com/2020/apples-shifting-differentiation/ Note how the advantage shifts back and forth between hardware-led and software-led integration.
Where does this leave the PC market for the rest of the industry now that Apple is designing and sourcing its own chips?
Windows’ major market is corporate use, and corporations move at a glacial pace, especially when a change entails large capital expenditures.
I think the major changes in the PC market will be incremental, coming through further adoption of MacBooks by corporate road warriors and BYOC work environments.
Now the creative market, that’s another story. Creatives always have, and always will, demand more performance. They will adopt M1 systems much faster than the overall corporate world.
Your average consumers will probably hang on to their Intel-powered Macs a little longer before making the switch.
I’m certain there are significant benefits to Apple’s strategy. Can any of our readers provide a bit of insight on the issue?
Back in the early 80s, there were a handful of competing instruction set architectures. The 68000 was one; that’s what Apple used for the Mac. Intel had its 8086 architecture; that’s what the IBM PC, and later Windows, ran on. There were several other candidates, but as an over-simplification, non-Windows systems went with the 68000 (including a lot of embedded devices), and PCs went with the 8086. I was working at Siemens Research in the mid 80s, on a collaborative project with Intel using a very different approach (a capability machine derived from the Intel 432, if you know your computing history). Intel lost interest because of the massive success of the x86 line in personal computers, and then low-end servers. Too bad; the BiiN approach would have been a real step forward for -secure- computing.
Clearly, Apple has vast amounts of information about how its products are used, which parts are stressed most, and where the bottlenecks are, and it can use this to inform its chip design.
I think the huge value of the M1, and of future M-series chips, is in added circuitry that speeds up the Mac as a whole, as opposed to just speeding up the CPU.
A simple example is the inclusion of dedicated circuitry to encode/decode video. This is not unique to Apple, but it is an example of how dedicated circuitry can offload a task from the main processor and perform it more efficiently.
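To make the offload idea concrete, here is a minimal Swift sketch, assuming macOS and the VideoToolbox framework, of how an application can ask for the dedicated hardware encoder rather than a software path; the codec and resolution are just illustrative choices.

```swift
import VideoToolbox

// Sketch: request the dedicated hardware HEVC encoder block instead of a
// software (CPU) encoder. Resolution and codec are illustrative.
var session: VTCompressionSession?

// Require hardware acceleration; session creation fails if none is available.
let encoderSpec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920,
    height: 1080,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: encoderSpec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,   // a real encoder supplies a callback (or uses the output-handler variant)
    refcon: nil,
    compressionSessionOut: &session
)

if status == noErr, session != nil {
    print("Frames will be compressed by the dedicated media engine, not the CPU cores")
} else {
    print("No hardware encoder available; a real app would fall back to software encoding")
}
```

The point is simply that the heavy lifting moves off the general-purpose cores and onto fixed-function silicon.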
Apple says this chip has over 16 billion transistors. This is an astounding number. My guess is that Apple has identified tasks, such as video encode/decode, that can be performed better by dedicated circuitry. With so many transistors available, they could, in principle, find ways to perform operations in many fewer clock cycles. In the old days, silicon was expensive, so you would try to be clever about coding algorithms that reused the same registers, ports, etc. Now you can use silicon to speed things up.
Suppose there are four possible outcomes of a problem you solve over and over. Instead of evaluating them sequentially, you could use more silicon to evaluate all four in parallel and pick the winner at the end, getting up to a 4x speedup.
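In software terms, the same trade looks roughly like the Swift sketch below: spend more compute evaluating all four candidates at once and keep the winner, instead of testing them one after another. The candidateCost function and the four candidates are made-up stand-ins, not anything Apple has described.

```swift
import Foundation

// Made-up stand-in for an expensive evaluation of one candidate outcome.
func candidateCost(_ candidate: Int) -> Double {
    var x = Double(candidate + 1)
    for _ in 0..<1_000_000 {                      // simulate real work
        x = (x * 1.000001).truncatingRemainder(dividingBy: 997.0)
    }
    return x
}

let candidates = [0, 1, 2, 3]
var costs = [Double](repeating: .infinity, count: candidates.count)

// Throw parallel hardware at the problem: evaluate all four candidates at once...
costs.withUnsafeMutableBufferPointer { buffer in
    DispatchQueue.concurrentPerform(iterations: candidates.count) { i in
        buffer[i] = candidateCost(candidates[i])
    }
}

// ...then pick the winner at the end.
let winner = costs.indices.min { costs[$0] < costs[$1] }!
print("Best candidate: \(candidates[winner]) (cost \(costs[winner]))")
```

Hardware does the same thing with transistors instead of threads: do the redundant work in parallel, keep the result you need, and throw the rest away.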
Last night I was thinking about one of the ways Apple was able to speed up the M1: the elimination of the Windows backwards compatibility that Intel’s processors have to carry. Intel’s chip progress has been burdened for years by MSFT’s demand for Windows backwards compatibility.
Corporate America was MSFT’s bread and butter, and Gates didn’t want to give the corporate world an excuse to consider other OSs. As long as the corporate world didn’t have to make major expenditures on its IT infrastructure with each new iteration of Windows, Gates was happy. This forced the other half of the Wintel duopoly, Intel, to support legacy Windows applications. That added a lot of excess baggage to Windows, Windows applications, and Intel processors.
Jobs was never afraid to cannibalize Apple’s own products if it meant offering something better. It took a while to get all the parts in place for launch day, but we are witnessing the mother of all acts of corporate cannibalism, and the other side of it is going to be wonderful.
Because the M1 Mac mini can drive your existing PC keyboard, monitor, and mouse, the consumer gets much more power than comparably priced PCs. If you are an iPhone/iPad/Apple Watch/AirPods user thinking of upgrading to a Mac (30% of Macs sold in the September quarter went to switchers), this is the machine for you. That would also explain the brief reappearance of the PC Guy. I suspect we’ll be seeing much more of him in the future.
In my opinion, Apple is making an aggressive move for market share. That means more Accessories and Services revenue as well.
We used to make a lot of money buying a 1MB memory board for $5,000 and selling it in our product as a $10,000 accessory. We declined to add a trackball to our system because the wholesale cost was about $5,000. I bought my wife a 500MB hard drive for her work for $500 and we were amazed that it was so cheap. I could go on and on. Look up the specs for the first few versions of Cray supercomputers, none of which performed as well as the current Mac mini.
I get it. By comparison, the Mac mini might be a few dollars more than some comparable machine from someone else, but … what we get for today’s inflation-devalued dollars is amazing.
But I’m complaining about the Mini now vs. the earlier Minis. I was never expecting Mac Pro performance from a Mini, but I would like it to have a cheap upgrade path (RAM and, particularly, storage). The original Mini cost $499. It used commodity RAM and storage, so you could upgrade it with 3rd-party parts a lot cheaper than Apple’s prices. And it came with a plethora of ports, so you could hook a lot of stuff to it: https://igotoffer.com/apple/mac-mini-g4-1-25
I’ve bought a bunch of Minis since that first generation, probably 7 or 8 (I’ve lost count, but that includes 2 for my mother as well as a bunch for my wife and me that we’ve used as servers as well as desktop machines). But my last Mini is a ’14 model, because I thought the later models were overpriced. I’d much prefer to see a cheaper Mini. I think $600 would have been a better price point for that level of machine.
“No disrespect, but I’m old so all of these products look ridiculously cheap to me.”
Our first computer was a 1984 Mac with 128K of RAM. It came with two programs, MacWrite and MacPaint. It had a tiny black-and-white screen. A really noisy dot matrix printer was extra, and the whole thing cost ~$3,000 – in 1984 dollars! We bought it on credit, so add more for that. Best investment Donna and I ever made, hands down.
A few hundred bucks for this jewel of a machine? Practically giving it away.
Are you all forgetting about Nuvia and the collection of ex-Apple chip designers who were allegedly going after the data center processing space?
https://www.reuters.com/article/us-apple-nuvia-lawsuit-idUSKBN1ZK16R
Likely all tied up in litigation? Except, oh wait, they are still pulling in funding:
https://nuviainc.com/news