🎵 Music for this post: https://www.youtube.com/watch?v=ILWSp0m9G2U.
Lest you forget I am a technologist, this column is for you.
I'm 52 years old. The very first computer I ever used was a DEC PDP-11 over a dial-up connection.
The year? 1980; I was twelve.
The second computer I used was a Commodore PET. Third? A friend's Sinclair ZX81, built from a kit. Fourth? A Commodore VIC-20, which was the first computer I came to own. From there, I worked with all of the famous 8-bit computers, from the Atari 800 to the Apple II to the Commodore 64.
The years 1980-1985 were inarguably the best time in history to be a computer nerd. Beyond these famous 8-bit computers, we saw so many other interesting computers:
- The Tandy TRS-80 Model III (1980)
- The Texas Instruments TI-99/4A (1981)
- The IBM PC (1981)
- The BBC Micro (1981)
- The Jupiter ACE (1982), which ran Forth!
- The Franklin Ace 1000, an Apple II clone!
- The Sinclair ZX Spectrum (1982)
- The Compaq Portable (1982)
- The IBM PC XT (1983)
- The Apple Lisa (1983)
- The Amstrad CPC 464 (1984)
- The IBM PCjr (1984)
- The Apple Macintosh (ahem, 1984)
- The IBM PC/AT (1984)
- …and the Commodore Amiga, introduced in 1985
These machines differed from one another far more than computers do today. Different ideas, different technologies, different ways of programming them, different features and benefits. Each had something that challenged the others, and standardization was not a thing just yet. The idea was to be different, and to try new things, often. This was a remarkable period of time!
If you were a teenager back then, with all that attendant attitude, one thing you did not care to own was an IBM PC of any kind. Why? Because, compared to almost all of the alternatives, it was boring. IBMs (and Apple IIs, for that matter) lacked the sprites of the Commodore 64 and the player/missile graphics of the Atari 400/800. They lacked the sound synthesis of the '64. They lacked the 256-color palette of the Ataris. The IBM PC tried to convince you that it was a great PC because it was a 16-bit machine. But it had an 8-bit data bus.
The IBM PC sucked. All the kids knew it.
You could say that all the things we kids valued were only important to gaming, but we young programmers back then would have disagreed with you. We knew that, in order to create any sort of engaging experience on a computer, sounds and visuals were important. Would you, today, enjoy using Excel if it looked like this?
Nonetheless, this sort of thing literally excited all of the adults of the period. You know, these kinds of adults:
Do you remember the phrase, "Nobody ever got fired for buying IBM"? I do. I wasn't alone in thinking: this is a strange form of suicide.
Of course, if you are in charge of Wall Street, and the world, you do have a lot of control over the markets, and what gets bought and sold.
The Lisa of 1983 and the Macintosh of 1984 were, famously, incredibly different from everything that had come before. And they were 32-bit machines with 16-bit data buses. Whoa.
But 1985 saw what was, perhaps, the most interesting computer of the entire decade: The Commodore Amiga.
With the same Motorola 68000 processor as the Mac, a palette of 4,096 colors, and tricks like https://en.wikipedia.org/wiki/Hold-And-Modify, it was the first personal computer that could approach photorealistic graphics. It could address 16MB of RAM, and its 8-bit PCM sound could, with tricks, achieve nearly CD quality. It was like nothing else that had come before it. Then came the Video Toaster, and the Amiga simply boggled every geek's mind. Many, many things led to the Amiga's demise, and you can read about that in Ars Technica's spectacular 12-part series.
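If you're curious how Hold-And-Modify squeezed 4,096 colors out of only 6 bits per pixel, here is a toy decoder. This is my sketch of the scheme as documented, not Amiga code: the function names are made up, and the real work happened per scanline in the display hardware.

```typescript
// Toy HAM6 decoder: each pixel is 6 bits. The top 2 bits select an
// operation; the bottom 4 bits are the payload. By "holding" the
// previous pixel's color and modifying one 4-bit channel at a time,
// a scanline can reach any of 16 x 16 x 16 = 4,096 colors.

type RGB = { r: number; g: number; b: number }; // each channel 0-15

function decodeHamScanline(pixels: number[], palette: RGB[]): RGB[] {
  let current: RGB = { ...palette[0] }; // the border color seeds the line
  const out: RGB[] = [];
  for (const p of pixels) {
    const control = (p >> 4) & 0b11; // top 2 bits: which operation
    const data = p & 0b1111;         // bottom 4 bits: the payload
    switch (control) {
      case 0b00: current = { ...palette[data] }; break;    // load from 16-entry palette
      case 0b01: current = { ...current, b: data }; break; // hold R,G; modify blue
      case 0b10: current = { ...current, r: data }; break; // hold G,B; modify red
      case 0b11: current = { ...current, g: data }; break; // hold R,B; modify green
    }
    out.push({ ...current });
  }
  return out;
}
```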
But: these were the days in which regularly-released new technologies would raise pulses a little bit.
A little while later, Macs got color graphics and got to full 32-bitness; IBMs got there, too. Processors got fast. Windows 95 came out. Then VAX/VMS somehow became Windows NT, and then Windows 2000, and then XP. XP became Windows 7, Windows 8, Windows 10. Then 64-bit. Etc.
Linux came out, and that was incredibly interesting and important, but it's just an operating system kernel.
Mac OS merged with NeXTSTEP and became Mac OS X.
The smartphone era arrived. The iPhone came out and changed, well, phones. That was important (and, of course, was a key step toward this very year and this very article). But pulse-quickening? Not at the same level as 1980-1985. Not to me, anyway.
Not much else has happened since 1985 to raise a pulse as high, other than the NeXT computer in 1989. What NeXTSTEP did to popularize object-oriented programming was utterly, completely, groundbreaking.
Of course, the things that you could DO with a computer in those intervening years? Sure, those have been exciting. You know, the Internet and all. But computers themselves? Meh. A processor, a graphics chip, memory, storage, some IO controllers. Sure, faster, whatever. Apple makes them. Lenovo makes them. Dell makes them. Bleh. DDR4? Yawn. PCI Express? Shoot me. Thousand+ watt power supplies! Do you lack testosterone or something? These are useful alternatives to sheep and Nyquil Pure ZZZs. They do not quicken my pulse. They are evolutionary, and maybe even devolutionary.
In 2020, my pulse quickened. It's been a long time. The Apple M1 is the most important thing to happen to the personal computing hardware landscape not just this year, but in the past 30 years.
That's why there's so much written about the M1 already. If this is news to you, I recommend you start with these three articles:
- AnandTech: The 2020 Mac Mini Unleashed: Putting Apple Silicon M1 To The Test
- Erik Engheim: Why Is Apple's M1 Chip So Fast?
- Daring Fireball's John Gruber: M1 Macs: Truth and Truthiness
As a bonus, don't miss Linus Torvalds' initial reaction to Apple's announcement back in June.
Linus has never been one to say good things about Apple, or about anyone using Mach-style microkernels, for that matter.
The M1 represents the first complete re-thinking of the general-purpose personal computer since the 1980s. It is exciting in the same way that each of those very different 1980s computers listed above was. Let me put it this way: As a result of the M1, one of two things will happen, and either of them will be profound:
- Microsoft and its hardware brethren will have to completely rethink how they deliver personal computing solutions; or
- The Mac will become the go-to personal computer solution for most people.
Why does this matter here, on The Progressive CIO?
Because apart from the geek fact that the bulk of today's personal computers are a bore, there is the human fact that personal computers still, to a large degree, have not fully met an important goal: to honor the human condition.
This week, my boss, the CEO, told me that his Lenovo laptop speaker sounded really bad. I said, "Mine too! You know why? Because the cooling fans blow hot air right over the speaker, causing the voice coil to malform. A bad design decision."
Have you spent time using Microsoft Teams and Zoom this year? Do these applications sometimes make your laptop's fans go wild after hours of use? Teams, for all its ubiquity, is a ridiculously resource-hungry app. It's written in Electron, after all. It cares not one bit for saving your computer's power. It's designed to be easy to port to multiple platforms above all other things. It's developer-focused, not end-user-focused.
Personal computers, as they have evolved (even Macs, to some degree), are janky. They still crash and burn way too often. IT people love computers that don't need a reboot, but it's 2020, and most of our computers need to be rebooted at least once per month, if not per week. They have fans that blow incredibly hot air over very important parts that don't like to get hot. They have batteries filled with ridiculous chemicals that can explode under the right conditions. They are storehouses of private information, riddled with vulnerabilities that need to be constantly patched. We spend a good portion of our computing horsepower analyzing what's going on under the hood to make sure that those vulnerabilities aren't being attacked. Our computers get hotter, and less reliable, and things just get worse in a vicious cycle. Apps like Microsoft Teams are written in Electron, which runs JavaScript through an encapsulated web browser, which calls into an API framework for the OS it's running on, which talks to the OS kernel that runs on the bare metal. That is at least six layers of abstraction away from the hardware. It's a miracle that the computers work at all.
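For a sense of how thin the "native" part of an Electron app really is, here is roughly the minimal main process, using Electron's public API (the window dimensions and file name are illustrative):

```typescript
// Roughly what every Electron app -- Teams included -- does at startup:
// even a "hello world" spins up a full Chromium renderer process.
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  // Each window is an entire embedded browser, with its own JavaScript
  // engine, layout engine, and GPU process behind it.
  const win = new BrowserWindow({ width: 1024, height: 768 });
  win.loadFile('index.html'); // your "app" is, in the end, a web page
});
```

Everything the user sees lives inside that web page, which is why the same codebase ports everywhere, and why none of it is tuned to any particular machine.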
How does the M1 change all of this? Well, for starters, the term SoC (whatever happened to Silicon on Ceramic? I digress.) has been wildly overused. The M1 seems to meet the real definition. Everything about it is designed to work together, in harmony. It encapsulates multiple CPU cores, high performance graphics with multiple cores, specialized I/O and storage controllers, image processors, fast and unified RAM, neural processing, and much more. All bespoke and specifically designed to support the heavily object-oriented and declarative design standards that Apple has promoted for a long time now, starting from NeXTSTEP and continuing up to, and through, SwiftUI.
The real kicker, of course, is how the M1's RISC instructions, being fixed-length, are so much easier to optimize in out-of-order-execution pipelines with multiple instruction decoders. And the M1 has (gasp!) eight of them. CISC ecosystems just cannot do that sort of thing. Intel's CISC approach eventually killed the RISC-based PowerPC, but only because the PowerPC couldn't scale without generating significantly more heat; Intel won Apple's heart because of the need to keep computers cool and quiet. Now the shoe is squarely on the other foot, and the endgame looks decidedly different today than it did 15 years ago.
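To make the decoder point concrete, here is a toy sketch, mine, not Apple's or Intel's, of why fixed-width instructions decode in parallel while variable-width ones decode serially. The `lengthOf` function is a hypothetical stand-in for x86's length-determination logic.

```typescript
// Fixed 4-byte instructions (ARM-style): the Nth instruction starts at
// byte 4*N, so eight decoders can each grab their slice independently.
function riscDecodeStarts(streamLength: number, decoders: number): number[] {
  const starts: number[] = [];
  for (let i = 0; i < decoders && i * 4 < streamLength; i++) {
    starts.push(i * 4); // known in advance -- fully parallel
  }
  return starts;
}

// Variable-length instructions (x86 is 1-15 bytes): you cannot know where
// instruction N+1 starts until you have worked out the length of
// instruction N. That serial dependency is what limits decode width.
function ciscDecodeStarts(
  stream: Uint8Array,
  lengthOf: (offset: number) => number, // hypothetical length decoder
): number[] {
  const starts: number[] = [];
  let offset = 0;
  while (offset < stream.length) {
    starts.push(offset);
    offset += lengthOf(offset); // must finish N before finding N+1
  }
  return starts;
}
```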
Back to the human condition:
One of the complaints that we've all heard this year is how our need to work apart, using videoconferencing applications, has reduced or removed the key cues we need in order to have truly quality conversations. We can't make eye contact. We can't hear the small gasps or intakes of breath that are passive ways of saying: hey, can I speak for a sec? Those little gasps and intakes of breath, though, are what make conversations flow, rather than feeling like this.
Whether you know it or not, our computers and smartphones spend an inordinate amount of processing power filtering out background noises. That includes not just the TV in the background, or your labradoodle barking at the Amazon driver. It includes the damn fan in your laptop, which also sounds a lot like another kind of moving air: breath.
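As a rough illustration of the kind of work involved, here is the crudest possible noise gate. This is a naive sketch, not what Teams or Zoom actually ship (real systems use spectral methods or machine-learning models), but it shows why the filtering costs cycles: every frame of audio gets analyzed, all the time.

```typescript
// Naive noise gate: estimate each frame's energy and mute frames that
// fall below a threshold, treating steady low-level sound (like a fan)
// as background noise. Frame size and threshold are illustrative.
function noiseGate(
  samples: Float32Array,
  frameSize = 256,
  threshold = 0.01,
): Float32Array {
  const out = new Float32Array(samples.length); // defaults to silence
  for (let start = 0; start < samples.length; start += frameSize) {
    const end = Math.min(start + frameSize, samples.length);
    let energy = 0;
    for (let i = start; i < end; i++) energy += samples[i] * samples[i];
    const rms = Math.sqrt(energy / (end - start));
    if (rms >= threshold) {
      // Loud enough to be speech: pass the frame through unchanged.
      for (let i = start; i < end; i++) out[i] = samples[i];
    }
  }
  return out;
}
```

The trouble, as the paragraph above suggests, is that a fan and a breath are both broadband "moving air" sounds, so naive energy gating like this will happily swallow the very cues that make conversation flow.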
When our computers run cooler, they run more reliably. They last longer. Their sound input systems don't have to waste additional power filtering out the sounds that the computers themselves are making. They don't burn our laps, or our palms. Their speakers don't melt. Their batteries last longer. We get more done, more smoothly, so that we can finish using them, and get back to our real lives.
Good computers are also fast at what they do, and honor the human condition by letting humans finish using them more quickly, with less drama. Computers are a means to an end, and if that end involves too much time or pain, they are simply not good.
The M1 is fast. It runs cool. And it is fast. Yep, I said that twice. It significantly changes the dialog about what a personal computer can be, for the first time in three decades.
That's why the M1 matters on The Progressive CIO, and why it is the Technology of the Year, 2020.
Discuss this specific post on Twitter or LinkedIn.