Anandtech: Apple’s M1 is the world’s fastest CPU

From Andrei Frumusanu’s “Apple Announces The Apple Silicon M1: Ditching x86 – What to Expect, Based on A14” posted Tuesday:

Apple claims the M1 to be the fastest CPU in the world. Given our data on the A14, beating all of Intel’s designs, and just falling short of AMD’s newest Zen3 chips – a higher clocked Firestorm above 3GHz, the 50% larger L2 cache, and an unleashed TDP, we can certainly believe Apple and the M1 to be able to achieve that claim.

This moment has been brewing for years now, and the new Apple Silicon is both shocking, but also very much expected. In the coming weeks we’ll be trying to get our hands on the new hardware and verify Apple’s claims.

Intel has stagnated itself out of the market, and has lost a major customer today. AMD has shown lots of progress lately, however it’ll be incredibly hard to catch up to Apple’s power efficiency. If Apple’s performance trajectory continues at this pace, the x86 performance crown might never be regained.

My take: What Anandtech says, developers tend to believe.

25 Comments

  1. Fred Stein said:
    Next: The cloud data centers.

    Most of the cloud data centers dedicate some servers to specific kinds of workloads. Power consumption is a big cost factor: the energy to run the computers and the energy to cool them.

    We must be patient to see how and when this plays out. Remember, pre-2000, Sun’s SPARC chips ran most internet workloads. The shift to Lintel was fast and furious.

    4
    November 11, 2020
    • Roger Schutte said:
      Fred, how would you see that working out? Apple selling chips to cloud providers? Certainly not right away, but maybe in a few years. Apple selling servers to cloud providers? I’m very doubtful on that. Apple using these chips in their own data centers? Most definitely yes, due to the chips’ performance and the power savings you mention.

      Windows on Arm may happen much sooner than everyone expects (especially if Apple helps fund it) and Microsoft doesn’t have much allegiance to Intel anymore. A lot of Apple Macs might be sold to Windows users and I think Apple would be very fine with that.

      2
      November 11, 2020
      • Gregg Thurman said:
        Intel’s efforts to diversify away from its reliance on Wintel have been an absolute failure. Most notable among those failures is its attempt to compete against Qualcomm in the 5G arena. Its GPUs weren’t all that good either.

        You know MSFT is going to port Office to the M1, and it is pursuing an ARM-based version of Windows. Unless Intel develops an ARM processor that can run Windows, I think the company is toast. Even then it will have competitors (AMD, anyone?).

        1
        November 11, 2020
        • David Emery said:
          MSFT already has an ARM version of Windows running on its Surface tablets.

          0
          November 11, 2020
          • Gregg Thurman said:
            But it’s slooow and flaky with a lot of Windows programs.

            0
            November 12, 2020
      • Bart Yee said:
        IMO, IF Apple wanted to tackle the server market with relatively compact, high-efficiency, low-power servers, it could create a new corporate-facing division specifically to build servers for resale. With hard drive storage at all-time price/TB lows and medium-to-high TB SSD storage available for faster access, Apple “could” easily create server/RAID systems with very high performance and relative power efficiency.

        If Apple also were to extend the same designs and 5nm architecture over to its own memory and other support chips, it could shrink motherboards by half. Imagine half-width servers fitting two units in one rack space while using less than one unit’s power requirement. This could reduce response time and latency for AI, machine learning, AR, and general web hosting and data retrieval.

        So a lot of potential advantages for Apple’s own internal use AND a new potential revenue stream (or support headache) if Apple were to sell for external use. The big question is what type of server OS it would run under and what potential hacking would occur, especially in poorly prepared and configured systems.

        So many things to consider.

        2
        November 11, 2020
  2. George Row said:
    Saying that for AMD “… the x86 performance crown might never be regained” is a bit weird, when the whole point is that Apple has abandoned x86.

    2
    November 11, 2020
    • David Emery said:
      Intel and AMD have been handicapped for years (decades) by backwards compatibility with the old x86 architecture. It was lousy when it came out, and it has not matured particularly well. That’s true both for performance and for security. If it weren’t for the Wintel monopoly, we’d be much further along in alternative high-performance and low-power architectures.

      Frankly, only Apple could pull off a wholesale move away from x86, by virtue of both its iPhone and its system (as opposed to just hardware or software) expertise. The latest Stratechery is a very good read: https://stratechery.com/2020/apples-shifting-differentiation/ Note how the advantage shifts back and forth between HW-led and SW-led integration.

      2
      November 11, 2020
      • Gregg Thurman said:
        Robert, read the Stratechery article referenced in David’s post above. Excellent article (thanks for posting it David).

        0
        November 11, 2020
  3. Robert Paul Leitao said:
    How do Apple’s efforts on chip design impact the industry? I’m amazed at the power Apple has packed into the M1 and, at 5nm, at the chip’s efficiency. Losing Apple as a laptop/desktop customer for chips certainly isn’t a help to Intel, and AMD is locked out of gaining Apple as a customer.

    Where does this leave the PC market for the industry now that Apple is designing and sourcing its own chips?

    2
    November 11, 2020
    • Gregg Thurman said:
      “Where does this leave the PC market for the industry now that Apple is designing and sourcing its own chips?”

      Windows’ major market is corporate use, and corporations move at a glacial pace, especially when change entails large capital expenditures.

      I think changes in the PC market will be incremental, coming through further adoption of MacBooks by corporate road warriors and BYOC work environments.

      Now the creative market, that’s another story. They always have, and always will, demand more performance. The creatives will adopt M1 systems much faster than the overall corporate world.

      Your average consumers will probably hang on to their Intel-powered Macs a little longer before making the switch.

      2
      November 11, 2020
      • Bart Yee said:
        Agree, Gregg. PCs will also continue to exist because of the “same as Android” price/value issues among so many relatively undifferentiated box makers who make neither their own software nor their own hardware. Oh, the motherboards and cases are different, but they use basically off-the-shelf CPUs and support chip subsystems, so it’s a race to the bottom on price.

        0
        November 11, 2020
  4. Robert Paul Leitao said:
    I’m not an expert on chip architecture. To what degree does using the A-series chip architecture in the Mac line drive greater integration of software, apps and services across all of Apple’s device lines?

    I’m certain there are significant benefits to Apple’s strategy. Can any of our readers provide a bit of insight on the issue?

    0
    November 11, 2020
    • David Emery said:
      I believe the ARM architecture is more of a RISC approach. This makes compilers simpler, and by extension makes it easier to write good machine code optimizers. The legacy x86 architecture carried along A LOT of baggage.

      Back in the early 80s, there were a couple of different instruction set architectures. The 68000 was one; that’s what Apple used for the Mac. Intel had its 8086 architecture; that’s what Windows used. There were several other candidates, but as an over-simplification, non-Windows systems went with the 68000 (including a lot of embedded devices), and PCs went with the 8086. I was working at Siemens Research in the mid 80s, on a collaborative project with Intel using a very different approach (a capability machine derived from the Intel 432, if you know your computing history). Intel lost interest because of the massive success of the x86 line in personal computers, and then low-end servers. Too bad; the BiiN approach would have been a real step forward for -secure- computing.

      0
      November 11, 2020
    • John Konopka said:
      Difficult to say, as Apple is pretty tight-lipped about this. This is an Apple in-house product, so there are no public spec sheets.

      Clearly, Apple has vast amounts of information about how their products are used, which parts are stressed more, and where the bottlenecks are, so they could use this to inform their chip design.

      I think the huge value of the M1, and future M series chips, is in added circuitry that speeds up the Mac as opposed to just speeding up the CPU.

      A simple example is the inclusion of dedicated circuitry to encode/decode video. This is not unique to Apple but it is an example of how dedicated circuitry can offload a task from the main processor and perform it more efficiently.

      Apple says this chip has over 16 billion transistors. This is an astounding number. My guess is that Apple has identified tasks, such as the video encode/decode, which could be performed better by dedicated circuitry. With so many transistors available they could, in principle, find ways to perform operations with many fewer clock cycles. In the old days, silicon was expensive so you would try to be clever about coding algorithms that reused the same registers/ports etc. Now you can use silicon to speed things up.

      Suppose there are four possible outcomes of a problem you do over and over. Instead of working through them sequentially, you could use more silicon to do all four in parallel and pick the winner at the end, getting a 4x speedup.
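
      Just to illustrate that concept in software (the M1 would do this sort of thing in silicon), here is a minimal Swift sketch using Grand Central Dispatch; the function names and the workload are made up for the example.

      ```swift
      import Foundation

      // Hypothetical stand-in for an expensive computation with four
      // candidate outcomes; lower cost is better.
      func cost(ofCandidate i: Int) -> Double {
          (0..<1_000_000).reduce(0.0) { $0 + sin(Double($1 + i)) }
      }

      let candidates = 4
      let costs = UnsafeMutablePointer<Double>.allocate(capacity: candidates)
      costs.initialize(repeating: 0, count: candidates)
      defer { costs.deallocate() }

      // Evaluate all four candidates in parallel; each worker writes only its own slot.
      DispatchQueue.concurrentPerform(iterations: candidates) { i in
          costs[i] = cost(ofCandidate: i)
      }

      // Pick the winner at the end.
      let results = Array(UnsafeMutableBufferPointer(start: costs, count: candidates))
      let best = results.enumerated().min { $0.element < $1.element }!
      print("Best candidate: \(best.offset), cost: \(best.element)")
      ```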

      0
      November 11, 2020
  5. Gregg Thurman said:
    From the article, this quote struck me as being very important: “break old applications so that new features aren’t held back by backwards compatibility woes.”

    Last night I was thinking that one of the ways Apple was able to speed up the M1 was by eliminating the Windows backwards compatibility built into Intel’s processors. Intel’s chip progress has been burdened for years by MSFT’s demand for Windows backwards compatibility.

    Corporate America was MSFT’s bread and butter, and Gates didn’t want to give the corporate world an excuse to consider other OSs. As long as the corporate world didn’t have to make major expenditures on its IT infrastructure with each new iteration of Windows, Gates was happy. This forced the other half of the Wintel duopoly, Intel, to support legacy Windows applications. That added a lot of excess baggage to Windows, Windows applications, and Intel processors.

    Jobs was never afraid to cannibalize Apple’s own products if it meant offering something better. It took a while to get all the parts to launch day, but we are witnessing the mother of all acts of corporate cannibalism, and the other side of it is going to be wonderful.

    6
    November 11, 2020
  6. Gregg Thurman said:
    Apple’s decision to lower the price on the Mac Mini, but not on its laptops, has been rattling around in my head since the announcement. Even with slightly lower I/O, the M1-powered Mini will be a ton faster than Wintel machines just 2 years old, and an order of magnitude faster than Wintel machines older than that, AND it can run Windows (with an insignificant hit to performance) in Rosetta.

    Because it can use your existing PC keyboard, monitor, and mouse, the consumer gets much more power than with comparably priced PCs. If you are an iPhone/iPad/Apple Watch/AirPods user thinking of upgrading to a Mac (30% of Macs sold in the September quarter went to switchers), this is the machine for you. That would also explain the brief reappearance of the PC Guy. I suspect we’ll be seeing much more of him in the future.

    In my opinion Apple is making an aggressive move for share. That means more Accessories and Service revenues as well.

    4
    November 11, 2020
    • David Emery said:
      Well, I, for one, thought the last Intel Minis were badly over-priced!

      0
      November 11, 2020
      • John Konopka said:
        No disrespect, but I’m old so all of these products look ridiculously cheap to me.

        We used to make a lot of money buying a 1MB memory board for $5,000 and selling it in our product as a $10,000 accessory. We declined to add a trackball to our system because the wholesale cost was about $5,000. I bought my wife a 500MB hard drive for her work for $500 and we were amazed that it was so cheap. I could go on and on. Look up the specs for the first few versions of Cray supercomputers, none of which performed as well as the current Mac mini.

        I get it. By comparison the Mac mini might be a few dollars more than some comparable machine from someone else, but … what we get for today’s inflation-devalued dollars is amazing.

        5
        November 11, 2020
        • David Emery said:
          First, I’m old enough to remember my employer spending $15k (might have been $20k) to get a 10MB hard drive (1980)…

          But I’m complaining about the Mini now vs. the earlier Minis. I was never expecting Mac Pro performance from a Mini, but I would like it to have a cheap upgrade path (RAM and particularly the hard drive). The original Mini cost $499. It used commodity RAM and storage, so you could upgrade it with 3rd-party parts a lot cheaper than Apple prices. And it came with a plethora of ports, so you could hook a lot of stuff to it: https://igotoffer.com/apple/mac-mini-g4-1-25

          I’ve bought a bunch of Minis since buying that first generation, probably 7 or 8 (lost count, but that includes 2 for my mother as well as a bunch for myself/wife that we’ve used as servers as well as desktop machines.) But my last Mini is a ’14 model, because I thought the later models were over-priced. I’d much prefer to see a cheaper Mini. I think $600 would have been a better price point for that level of machine.

          0
          November 11, 2020
      • Gregg Thurman said:
        My bad. I confused running legacy Mac code with running Windows stuff. Thanks for the correction.

        0
        November 12, 2020
