
Technology of the Year, 2020 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=ILWSp0m9G2U.

Lest you forget I am a technologist, this column is for you.

I’m 52 years old. The very first computer I ever used was a DEC PDP-11 over a dial-up connection.

The year? 1980; I was twelve.

The second computer I used was a Commodore PET. Third? A friend’s Sinclair ZX81, built from a kit. Fourth? A Commodore VIC-20, which was the first computer I came to own. From there, I worked with all of the famous 8-bit computers, from the Atari 800 to the Apple II to the Commodore 64.

The years 1980-1985 were inarguably the best time in history to be a computer nerd. Beyond these famous 8-bit computers, we saw so many other interesting computers:

  • The Tandy TRS-80 Model III (1980)
  • The Texas Instruments TI-99/4A (1981)
  • The IBM PC (1981)
  • The BBC Micro (1981)
  • The Jupiter ACE (1982), which ran Forth!
  • The Franklin Ace 1000 (1982), an Apple II clone!
  • The Sinclair ZX Spectrum (1982)
  • The Compaq Portable (1982)
  • The IBM PC XT (1983)
  • The Apple Lisa (1983)
  • The Amstrad CPC 464 (1984)
  • The IBM PCjr (1984)
  • The Apple Macintosh (ahem, 1984)
  • The IBM PC/AT (1984)
  • …and the Commodore Amiga, introduced in 1985

All of these machines differed from one another far more than computers do today: different ideas, different technologies, different ways of programming them, different features and benefits. Each idea had something to challenge the others, and standardization was not a thing just yet. The point was to be different, and to try new things, often. This was a remarkable period of time!

If you were a teenager — with all that attendant attitude — back then, one thing you did not care to own was an IBM PC of any kind. Why? Because, compared to almost all of the alternatives, it was boring. IBMs — and Apple IIs, for that matter — lacked the sprites of the Commodore 64 and the player/missile graphics of the Atari 400/800. They lacked the sound synthesis of the ’64. They lacked the 256-color palette of the Ataris. The IBM PC tried to convince you that it was a great PC because it was a 16-bit machine. But it had an 8-bit data bus.

The IBM PC sucked. All the kids knew it.

You could say that all the things we — the kids — valued were only important to gaming, but we young programmers back then would have disagreed with you. We knew that, in order to create any sort of engaging experience on a computer, sounds and visuals were important. Would you, today, enjoy using Excel if it looked like this?

[Lotus 1-2-3]

Nonetheless, this sort of thing genuinely excited the adults of the period. You know, these kinds of adults:

[Michael Douglas]

Do you remember the phrase, “Nobody ever got fired for buying IBM”? I do. I wasn’t alone in thinking: this is a strange form of suicide.

Of course, if you are in charge of Wall Street, and the world, you do have a lot of control over the markets, and what gets bought and sold.

The Lisa of 1983 and the Macintosh of 1984 were, famously, incredibly different from everything that had come before. And they were 32-bit machines with 16-bit data buses. Whoa.

But 1985 saw what was, perhaps, the most interesting computer of the entire decade: The Commodore Amiga.

With the same Motorola 68000 processor as the Mac, a palette of 4,096 colors, and tricks like https://en.wikipedia.org/wiki/Hold-And-Modify, it was the first personal computer that could really do photorealistic graphics. It could address 16MB of RAM and play 8-bit PCM sound with tricks that could achieve nearly CD quality. It was like nothing else that had come before it. Then came the Video Toaster, and the Amiga simply boggled every geek’s mind. Many, many things led to the Amiga’s demise, and you can read about that in Ars Technica’s spectacular 12-part series.

But: these were the days in which regularly-released new technologies would raise pulses a little bit.

A little while later, Macs got color graphics and got to full 32-bitness; IBMs got there, too. Processors got fast. Windows 95 came out. Then VAX/VMS somehow became Windows NT, and then Windows 2000, and then XP. XP became Windows 7, Windows 8, Windows 10. Then 64-bit. Etc.

Linux came out, and that was incredibly interesting and important, but it’s just an operating system kernel.

Mac OS merged with NeXTSTEP and became Mac OS X.

The smartphone era occurred. iPhone came out and changed, well, phones. That was important (and, of course, was a key step toward this very year and this very article). But pulse-quickening? Not at the same level as 1980-1985. Not to me, anyway.

Not much else has happened since 1985 to raise a pulse as high, other than the NeXT computer in 1989. What NeXTSTEP did to popularize object-oriented programming was utterly, completely groundbreaking.

Of course, the things that you could DO with a computer in those intervening years? Sure, those have been exciting. You know, the Internet and all. But computers themselves? Meh. A processor, a graphics chip, memory, storage, some IO controllers. Sure, faster, whatever. Apple makes them. Lenovo makes them. Dell makes them. Bleh. DDR4? Yawn. PCI Express? Shoot me. Thousand+ watt power supplies! Do you lack testosterone or something? These are useful alternatives to sheep and Nyquil Pure ZZZs. They do not quicken my pulse. They are evolutionary, and maybe even devolutionary.

In 2020, my pulse quickened. It’s been a long time. The Apple M1 is the most important thing to happen to the personal computing hardware landscape not just this year, but in the past 30 years.

That’s why there’s so much written about the M1 already. If this is news to you, I recommend you start with these three articles:

As a bonus, don’t miss Linus Torvalds’ initial reaction to Apple’s announcement back in June.

Linus has never been one to say good things about Apple, or about anyone using microkernels, for that matter.

The M1 represents the first complete re-thinking of the general-purpose personal computer since the 1980s. It is exciting in the same way that each of those very different 1980s computers listed above was. Let me put it this way: As a result of the M1, one of two things will happen, and either of them will be profound:

  • Microsoft and its hardware brethren will have to completely rethink how they deliver personal computing solutions; or
  • The Mac will become the go-to personal computer solution for most people.

Why does this matter here, on The Progressive CIO?

Because apart from the geek fact that the bulk of today’s personal computers are a bore, there is the human fact that personal computers still, to a large degree, have not fully met an important goal: to honor the human condition.

This week, my boss, the CEO, told me that his Lenovo laptop speaker sounded really bad. I said, “Mine too! You know why? Because the cooling fans blow hot air right over the speaker, causing the voice coil to deform. A bad design decision.”

Have you spent time using Microsoft Teams and Zoom this year? Do these applications sometimes make your laptop’s fans go wild after hours of use? Teams, for all its ubiquity, is a ridiculously resource-hungry app. It’s written in Electron, after all. It cares not one bit for saving your computer’s power. It is designed, above all else, to be easy to port to multiple platforms. It’s developer-focused, not end-user-focused.

Personal computers, as they have evolved — even Macs, to some degree — are janky. They still crash and burn way too often. IT people love computers that don’t need a reboot, but it’s 2020, and most of our computers need to be rebooted at least once per month, if not once per week. They have fans that blow incredibly hot air over very important parts that don’t like to get hot. They have batteries filled with ridiculous chemicals that can explode under the right conditions. They are bedrooms of private information, riddled with vulnerabilities that need to be constantly patched. We spend a good portion of our computing horsepower analyzing what’s going on under the hood to make sure that those vulnerabilities aren’t being attacked. Our computers get hotter and less reliable, and things just get worse in a vicious cycle.

Apps like Microsoft Teams are written in Electron, which runs JavaScript through an encapsulated web browser, which is compiled against an API framework for the OS it’s running on, which talks to the OS kernel that runs on the bare metal. That is at least six layers of abstraction away from the hardware. It’s a miracle that the computers work at all.

How does the M1 change all of this? Well, for starters, the term SoC (system on a chip; and whatever happened to Silicon on Ceramic? I digress) has been wildly overused. The M1 seems to meet the real definition. Everything about it is designed to work together, in harmony. It encapsulates multiple CPU cores, high-performance graphics with multiple cores, specialized I/O and storage controllers, image processors, fast and unified RAM, neural processing, and much more. All of it is bespoke, designed specifically to support the heavily object-oriented and declarative design standards that Apple has promoted for a long time now, starting with NeXTSTEP and continuing up to, and through, SwiftUI.
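
To make “declarative” concrete, here is a minimal SwiftUI sketch (the view and its names are invented for illustration, not taken from any Apple sample): you describe what the interface should be for a given piece of state, and the framework, and ultimately the silicon underneath it, decides how to render and update it.

    import SwiftUI

    // A tiny declarative view: we describe *what* to show for the current
    // state; SwiftUI works out how to draw and update it.
    struct CounterView: View {
        @State private var count = 0          // state the framework observes

        var body: some View {
            VStack(spacing: 12) {
                Text("Taps: \(count)")
                Button("Tap me") { count += 1 }   // state change triggers a re-render
            }
            .padding()
        }
    }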

The real kicker, of course, is how much easier the M1’s fixed-length RISC instructions are to optimize in out-of-order-execution pipelines with multiple parallel instruction decoders. And the M1 has — gasp! — eight of them. Because every instruction is the same width, the decoders can find instruction boundaries trivially and work side by side; the variable-length instructions of CISC ecosystems make that kind of wide decode far harder. Intel’s CISC approach did eventually kill off the RISC-based PowerPC in the Mac, but that was because the PowerPC couldn’t scale without generating significantly more heat. Intel won Apple’s heart because of the need to keep computers cool and quiet. Now the shoe is squarely on the other foot, and the endgame looks decidedly different today than it did 15 years ago.
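
To see why fixed-width instructions matter for wide decode, here is a toy Swift sketch (the byte widths, lengths, and function names are invented for illustration; this is not real Arm or x86 decoding): with a fixed four-byte instruction, each of eight decoders can compute where its instruction starts independently, while variable-length instructions force the boundaries to be discovered one after another.

    import Foundation

    let fixedWidth = 4  // pretend every instruction is 4 bytes, as on AArch64

    // Fixed width: decoder N knows its instruction starts at N * 4 without
    // looking at any other instruction, so all eight can work at once.
    func fixedWidthStarts(decoders: Int) -> [Int] {
        (0..<decoders).map { $0 * fixedWidth }
    }

    // Variable length (1 to 15 bytes on x86): the start of instruction N is
    // only known after the lengths of instructions 0..<N have been worked
    // out, so finding boundaries is inherently serial.
    func variableWidthStarts(lengths: [Int], decoders: Int) -> [Int] {
        var starts: [Int] = []
        var offset = 0
        for i in 0..<min(decoders, lengths.count) {
            starts.append(offset)   // depends on every previous length
            offset += lengths[i]
        }
        return starts
    }

    print(fixedWidthStarts(decoders: 8))
    // [0, 4, 8, 12, 16, 20, 24, 28]
    print(variableWidthStarts(lengths: [1, 3, 7, 2, 5, 15, 4, 6], decoders: 8))
    // [0, 1, 4, 11, 13, 18, 33, 37]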

Back to the human condition:

One of the complaints that we’ve all heard this year is how our need to work apart, using videoconferencing applications, has reduced or removed the key cues we need in order to have truly high-quality conversations. We can’t make eye contact. We can’t hear the small gasps or intakes of breath that are passive ways of saying: hey, can I speak for a sec? Those little gasps and intakes of breath, though, are what make conversations flow, rather than feeling like this.

Whether you know it or not, our computers and smartphones spend an inordinate amount of processing power filtering out background noises. That includes not just the TV in the background, or your labradoodle barking at the Amazon driver. It includes the damn fan in your laptop, which also sounds a lot like a breath: moving air.

When our computers run cooler, they run more reliably. They last longer. Their sound input systems don’t have to waste additional power filtering out the sounds that the computers themselves are making. They don’t burn our laps, or our palms. Their speakers don’t melt. Their batteries last longer. We get more done, more smoothly, so that we can finish using them, and get back to our real lives.

Good computers are also fast at what they do, and they honor the human condition by letting humans finish using them more quickly, with less drama. Computers are a means to an end, and if that end involves too much time or pain, they are simply not good.

The M1 is fast. It runs cool. And it is fast. Yep, I said that twice. It significantly changes the dialog about what a personal computer can be, for the first time in three decades.

That’s why the M1 matters on The Progressive CIO, and why it is the Technology of the Year, 2020.

Discuss this specific post on Twitter or LinkedIn.
