Categories
Current Events: 2020 Technology of the Year

Technology of the Year, 2020 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=ILWSp0m9G2U.

Lest you forget I am a technologist, this column is for you.

I’m 52 years old. The very first computer I ever used was a DEC PDP-11 over a dial-up connection.

The year? 1980; I was twelve.

The second computer I used was a Commodore PET. Third? A friend’s Sinclair ZX81, built from a kit. Fourth? A Commodore VIC-20, which was the first computer I came to own. From there, I worked with all of the famous 8-bit computers, from the Atari 800 to the Apple II to the Commodore 64.

The years 1980-1985 were inarguably the best time in history to be a computer nerd. Beyond these famous 8-bit computers, we saw so many other interesting computers:

  • The Tandy TRS-80 Model III (1980)
  • The Texas Instruments TI-99/4A (1981)
  • The IBM PC (1981)
  • The BBC Micro (1981)
  • The Jupiter ACE (1982), which ran Forth!
  • The Franklin Ace 1000 (1982), an Apple II clone!
  • The Sinclair ZX Spectrum (1982)
  • The Compaq Portable (1982)
  • The IBM PC XT (1983)
  • The Apple Lisa (1983)
  • The Amstrad CPC 464 (1984)
  • The IBM PCjr (1984)
  • The Apple Macintosh (ahem, 1984)
  • The IBM PC/AT (1984)
  • …and the Commodore Amiga, introduced in 1985

All of these were much more different from one another than computers are today. Different ideas, different technologies, different ways of programming them, different features and benefits. Each idea had something to challenge the other, and standardization was not a thing just yet. The idea was to be different, and to try new things, often. This was a remarkable period of time!

If you were a teenager — with all that attendant attitude — back then, one thing you did not care to own was an IBM PC of any kind. Why? Because, compared to almost all of the alternatives, it was boring. IBMs — and Apple IIs, for that matter — lacked the sprites of the Commodore 64 and the player-missile graphics of the Atari 400/800. They lacked the sound synthesis of the ’64. They lacked the 256-color palette of the Ataris. The IBM PC tried to convince you that it was a great PC because it was a 16-bit machine. But it had an 8-bit data bus.

The IBM PC sucked. All the kids knew it.

You could say that all the things we — the kids — valued were only important to gaming, but we young programmers back then would have disagreed with you. We knew that, in order to create any sort of engaging experience on a computer, sounds and visuals were important. Would you, today, enjoy using Excel if it looked like this?

[Lotus 1-2-3]

Nonetheless, this sort of thing positively excited the adults of the period. You know, these kinds of adults:

[Michael Douglas]

Do you remember the phrase, “Nobody ever got fired for buying IBM”? I do. I wasn’t alone in thinking: this is a strange form of suicide.

Of course, if you are in charge of Wall Street, and the world, you do have a lot of control over the markets, and what gets bought and sold.

The Lisa of 1983 and the Macintosh of 1984 were, famously, incredibly different from everything that came before them. And they were 32-bit machines with 16-bit data buses. Whoa.

But 1985 saw what was, perhaps, the most interesting computer of the entire decade: The Commodore Amiga.

With the same Motorola 68000 processor as the Mac, a palette of 4,096 colors, and tricks like https://en.wikipedia.org/wiki/Hold-And-Modify, it was the first personal computer that could do genuinely photorealistic graphics. It could address 16MB of RAM, and its 8-bit PCM sound could, with clever tricks, approach CD quality. It was like nothing else that had come before it. Then came the Video Toaster, and the Amiga simply boggled every geek’s mind. Many, many things led to the Amiga’s demise, and you can read about that in Ars Technica’s spectacular 12-part series.

But: these were the days in which regularly-released new technologies would raise pulses a little bit.

A little while later, Macs got color graphics and got to full 32-bitness; IBMs got there, too. Processors got fast. Windows 95 came out. Then VAX/VMS somehow became Windows NT, and then Windows 2000, and then XP. XP became Windows 7, Windows 8, Windows 10. Then 64-bit. Etc.

Linux came out, and that was incredibly interesting and important, but it’s just an operating system kernel.

Mac OS merged with NeXTSTEP and became Mac OS X.

The smartphone era occurred. iPhone came out and changed, well, phones. That was important (and, of course, was a key step toward this very year and this very article). But pulse-quickening? Not at the same level as 1980-1985. Not to me, anyway.

Not much else has happened since 1985 to raise a pulse as high, other than the NeXT computer in 1989. What NeXTSTEP did to popularize object-oriented programming was utterly, completely, groundbreaking.

Of course, the things that you could DO with a computer in those intervening years? Sure, those have been exciting. You know, the Internet and all. But computers themselves? Meh. A processor, a graphics chip, memory, storage, some I/O controllers. Sure, faster, whatever. Apple makes them. Lenovo makes them. Dell makes them. Bleh. DDR4? Yawn. PCI Express? Shoot me. Thousand-plus-watt power supplies! Do you lack testosterone or something? These are useful alternatives to sheep and Nyquil Pure ZZZs. They do not quicken my pulse. They are evolutionary, and maybe even devolutionary.

In 2020, my pulse quickened. It’s been a long time. The Apple M1 is the most important thing to happen to the personal computing hardware landscape not just this year, but in the past 30 years.

That’s why there’s so much written about the M1 already. If this is news to you, I recommend you start with these three articles:

As a bonus, don’t miss Linus Torvalds’ initial reaction to Apple’s announcement back in June.

Linus has never been one to say good things about Apple — or about anyone using microkernels, for that matter.

The M1 represents the first complete re-thinking of the general-purpose personal computer since the 1980s. It is exciting in the same way that each of those very different 1980s computers listed above was. Let me put it this way: As a result of the M1, one of two things will happen, and either of them will be profound:

  • Microsoft and its hardware brethren will have to completely rethink how they deliver personal computing solutions; or
  • The Mac will become the go-to personal computer solution for most people.

Why does this matter here, on The Progressive CIO?

Because apart from the geek fact that the bulk of current personal computers are a bore, there is the human fact that personal computers still, to a large degree, have not met an important goal: honoring the human condition.

This week, my boss, the CEO, told me that his Lenovo laptop speaker sounded really bad. I said, “Mine too! You know why? Because the cooling fans blow hot air right over the speaker, causing the voice coil to deform. A bad design decision.”

Have you spent time using Microsoft Teams and Zoom this year? Do these applications sometimes make your laptop’s fans go wild after hours of use? Teams, for all its ubiquity, is a ridiculously resource-hungry app. It’s written in Electron, after all. It cares not one bit for saving your computer’s power. It’s designed to be easy to port to multiple platforms above all other things. It’s developer-focused, not end-user-focused.

Personal computers, as they have evolved — even Macs, to some degree — are janky. They still crash and burn way too often. IT people love computers that don’t need a reboot, but it’s 2020, and most of our computers need to be rebooted at least once per month, if not per week. They have fans that blow incredibly hot air over very important parts that don’t like to get hot. They have batteries filled with ridiculous chemicals that can explode under the right conditions. They are bedrooms of private information, riddled with vulnerabilities that need to be constantly patched. We spend a good portion of our computing horsepower analyzing what’s going on under the hood to make sure that those vulnerabilities aren’t being attacked. Our computers get hotter and less reliable, and things just get worse in a vicious cycle. Apps like Microsoft Teams are written in Electron, which runs JavaScript in an encapsulated web browser, which is built against an API framework for the OS it’s running on, which talks to the OS kernel that runs on the bare metal. That is at least six layers of abstraction away from the hardware. It’s a miracle that the computers work at all.

How does the M1 change all of this? Well, for starters, the term SoC — system on a chip (whatever happened to Silicon on Ceramic? I digress.) — has been wildly overused. The M1 seems to meet the real definition. Everything about it is designed to work together, in harmony. It encapsulates multiple CPU cores, high-performance graphics with multiple cores, specialized I/O and storage controllers, image processors, fast and unified RAM, neural processing, and much more. All bespoke, and specifically designed to support the heavily object-oriented and declarative design standards that Apple has promoted for a long time now, starting with NeXTSTEP and continuing up to, and through, SwiftUI.

The real kicker, of course, is how the M1’s fixed-length RISC instructions are so much easier to optimize in out-of-order-execution pipelines with multiple instruction decoders. And the M1 has — gasp! — eight of them. CISC ecosystems just cannot do that sort of thing. Intel’s CISC approach eventually killed the RISC-based PowerPC, but that was because the PowerPC couldn’t scale without generating significantly more heat. Intel won Apple’s heart because of the need to keep computers cool and quiet. Now the shoe is squarely on the other foot, and the endgame looks decidedly different today than it did 15 years ago.
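For the programmers in the audience, the decoder point can be made concrete with a toy sketch. The “encodings” below are entirely made up (no real ISA works like this); the point is structural: with fixed-width instructions, every decoder knows where its instruction starts, while variable-length instructions make boundary N depend on having decoded instruction N-1.

```python
# Toy illustration only: hypothetical encodings, not a real ISA.

FIXED_WIDTH = 4  # bytes per instruction, as in a fixed-width RISC ISA

def decode_fixed(stream: bytes) -> list[tuple[int, int]]:
    """Return (offset, length) pairs. Eight decoders could each grab
    offset i*4 simultaneously, since every boundary is known up front."""
    return [(off, FIXED_WIDTH) for off in range(0, len(stream), FIXED_WIDTH)]

def decode_variable(stream: bytes) -> list[tuple[int, int]]:
    """Variable-length (CISC-style) decode: the length table is
    hypothetical. The scan is inherently serial, because the start of
    the next instruction is unknown until this one is decoded."""
    lengths = {0x01: 1, 0x02: 3, 0x03: 6}  # opcode -> instruction length
    out, off = [], 0
    while off < len(stream):
        length = lengths[stream[off]]
        out.append((off, length))
        off += length
    return out
```

Modern x86 parts mitigate this with predecode tricks, but the asymmetry sketched here is why an eight-wide decoder is far more natural for a fixed-width ISA.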

Back to the human condition:

One of the complaints that we’ve all heard this year is how our need to work apart, using videoconferencing applications, has reduced or removed the key cues we need in order to have truly quality conversations. We can’t make eye contact. We can’t hear the small gasps or intakes of breath that are passive ways of saying: hey, can I speak for a sec? Those little gasps and intakes of breath, though, are what make conversations flow, rather than feeling like this.

Whether you know it or not, our computers and smartphones spend an inordinate amount of processing power to filter out background noises. This includes not just the TV in the background, or your labradoodle barking at the Amazon driver. It includes the damn fan in your laptop, which also sounds a lot like moving air: breath.

When our computers run cooler, they run more reliably. They last longer. Their sound input systems don’t have to waste additional power filtering out the sounds that the computers themselves are making. They don’t burn our laps, or our palms. Their speakers don’t melt. Their batteries last longer. We get more done, more smoothly, so that we can finish using them, and get back to our real lives.

Good computers are also fast at what they do, and honor the human condition by allowing humans to be finished using them, more quickly, with less drama. Computers are a means to an end, and if that end involves too much time or pain, they are simply not good.

The M1 is fast. It runs cool. And it is fast. Yep, I said that twice. It significantly changes the dialog about what a personal computer can be, for the first time in three decades.

That’s why the M1 matters on The Progressive CIO, and why it is the Technology of the Year, 2020.

Discuss this specific post on Twitter or LinkedIn.

Categories
Current Events: 2020 Technology of the Year

Well, That Didn’t Take Long

🎹 Music for this post: https://www.youtube.com/watch?v=wEBlaMOmKV4.

In reference to Technology of the Year, 2020 Edition…

Here you go: Bloomberg: Microsoft Designing Its Own Chips for Servers, Surface PCs

Discuss this specific post on Twitter or LinkedIn.

Categories
Current Events: 2021 Technology of the Year

Technology of the Year, 2021 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=mrwfB4aAZZc.

“There were very few things people loved. That was one of them.” So said a dear friend and retired IT director, who introduced this year’s Technology of the Year to me many years ago. Oxymoron? No. Read on.

I assure you that this year’s Progressive CIO Technology of the Year will not be highlighted in any other similar column you will read this season. It’s unfortunate, and I can explain why.

(This is NOT a sponsored post.)


What if you could live in a world where you received no junk mail? Not occasional junk mail—no junk mail.

In 2003, Sendio Technologies introduced its “Opt-Inbox” technology, which was arguably the earliest implementation of a challenge-response technology (patented at the time) whose goal was to eliminate spam. Sendio’s earliest offering was an appliance that you would install in line with your email server. Here’s the Sendio user experience:

  1. Someone from outside your organization emails you for the first time.
  2. You do not receive that email.
  3. The sender receives a “challenge” email similar to the following:

Your email to John Doe john@doe.com is almost there!

We need to verify your email address. Click the button to instantly become a trusted sender.

✓ Verify me

John Doe’s Company is using Sendio’s Opt-Inbox technology to ensure they do not receive unwanted email. Verifying your email address will ensure that all future email from you will be delivered directly to John Doe’s inbox without delay.

Verification using the button above is instantaneous but if you prefer, simply sending a reply to this email will also work.

  4. If the sender reacts to the challenge, the email is delivered to your inbox.
  5. If the sender doesn’t react to the challenge, you do not see the email in your inbox. Automated email systems are unable to address the challenge to verify themselves. Additionally, in my experience, even human beings who receive the challenge emails often don’t click on them, for fear they are phishing.
  6. As you desire, you can visit a web page that shows you a list of emails that were not verified. After a few days of doing this, this list is almost all unsolicited email, deserving no meaningful attention. (You will find this to be delightful.)

If, on the outside chance, a real human verifies their email and it arrives in your inbox, you can go back to Sendio’s web interface and block them from ever communicating with you again.

Sendio also allows administrators to configure a greylisting filter, which further reduces the amount of email that even invokes a verification process. You can upload your contact lists to pre-identify people you want to hear from. And if you send an email to somebody before they email you for the first time, Sendio validates them automatically.
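For the technically inclined, the whole decision flow above fits in a few lines. This is my own illustrative model, not Sendio’s actual implementation; every name in it is hypothetical:

```python
from enum import Enum, auto

class Action(Enum):
    DELIVER = auto()    # goes straight to the inbox
    CHALLENGE = auto()  # hold the message, send a verification email
    DROP = auto()       # held/blocked; visible only in the unverified list

def handle_inbound(sender: str, trusted: set[str], blocked: set[str],
                   pending: set[str]) -> Action:
    """Decide what to do with an inbound message, per the steps above."""
    if sender in blocked:
        return Action.DROP        # recipient explicitly blocked this sender
    if sender in trusted:
        return Action.DELIVER     # verified, pre-loaded, or previously emailed
    if sender not in pending:
        pending.add(sender)       # first contact: challenge exactly once
        return Action.CHALLENGE
    return Action.DROP            # already challenged; awaiting verification

def handle_verification(sender: str, trusted: set[str],
                        pending: set[str]) -> None:
    """Sender clicked the verify button (or replied): promote to trusted."""
    pending.discard(sender)
    trusted.add(sender)

def handle_outbound(recipient: str, trusted: set[str]) -> None:
    """Emailing someone first auto-validates them, as noted above."""
    trusted.add(recipient)
```

The entire trick is that the trusted set is built from explicit human decisions rather than from probabilistic content analysis, which is why there is no junk folder to check.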

If you’ve read this far and think, “How is this net-net different from me properly marking email as junk and occasionally checking my junk email folder?” then you will begin to appreciate why you don’t see Sendio on any end-of-year technology lists this year. It’s very difficult to appreciate the difference between the predominant way of doing things and the Opt-Inbox way of doing things, in the same way it’s difficult to appreciate certain products (iPhones, Apple Watches, Rolexes, etc.) until you experience them first-hand.

I can attempt to convey the difference using these words: In the Opt-Inbox view of the world, there is no need for Bayesian filters or machine learning — all of which are imperfect. In the Opt-Inbox view of the world, there is nothing in your junk folder, ever. It’s a wonderful experience, and one of the few that is truly remarkable. In the Opt-Inbox view of the world, the only email you see is from people with whom you’ve definitively decided you want to communicate.

Refer back to my friend’s comment: “There were very few things people loved. [Sendio] was one of them.” Someone else once told me: “You should never love something that cannot love you back.” That’s good advice, but Sendio elicits a response in people that pushes those sorts of boundaries.

So what’s the big deal with Sendio in 2021? Sendio Opt-Inbox for Office 365 Exchange Online.

Whereas Sendio used to involve appliances and/or awkward cloud-based integrations — making it even more unlikely that you’d get a chance to experience it — in Q4 2021, Sendio made its product available to Exchange Online customers for the first time, and the implementation is brilliant. It works via Azure AD’s Application Registration system, and integrates with Exchange Online through a series of Mail Flow rules. Here are some remarkable things about this version of the product:

  1. It does not require any changes to your MX records;
  2. It plays well with Exchange Online Protection (all mail is still processed through your organization’s EOP rules);
  3. It relies on Azure AD, so it provides seamless SSO, supporting your MFA and other authentication rules, with no drama.
  4. It can be enabled for just the individual people or groups who need it, so you don’t have to introduce or train your whole enterprise on a product that many will have less need for.
  5. It is decidedly inexpensive. It starts at about $56 per user per year, with a 10-user minimum.

Hey.com introduced a nearly identical opt-inbox capability last year, and their How It Works page probably does a better job of explaining how this sort of system works than either I or Sendio have done. But Hey.com does not integrate with Exchange Online, making it a non-starter for most enterprises. Sendio also doesn’t try to reinvent your email experience with things like Hey.com’s “Feed” and “Paper Trail.” Sendio’s technology is simple, thoughtful, and completely, 100% effective at eliminating spam.

Hey.com and Sendio are pretty much the only two accessible, lucid challenge-response spam elimination products on the market. How many people do you know who use Hey.com? How many people do you know who use Sendio? Why do your answers contain such low numbers? Because it’s difficult to explain in words. It’s something that you have to experience to appreciate. And Sendio for Office 365 Exchange Online significantly reduces the barrier to entry for your organization to experience this first-hand. This is why it is The Progressive CIO Technology of the Year, 2021.

Discuss this specific post on Twitter or LinkedIn.

Categories
Blockchain Current Events: 2022 Technology of the Year

Technology of the Year, 2022 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=gPGFj-swZA0.

I’m a blockchain critic (in the company of countless others, as noted in the evergreen postscript on Blockchain Bunk), but I hope that this year’s Progressive CIO Technology of the Year doesn’t surprise you.

The Ethereum Merge was, in my eyes, the most important technology achievement of 2022.

Proof of Work was a reasonable way to kick off the cryptocurrency revolution, but it does not scale sustainably, and the decentralization ideals behind the model have proven dubious. If the evolution of blockchain structures has taught us anything, it’s that decentralization continues to be alien to us.
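To see the sustainability problem in miniature, recall what Proof of Work actually asks of the network: consensus is bought with brute-force hashing. Here is a toy sketch of the bare mechanism (nothing like Ethereum’s real, memory-hard Ethash):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce whose SHA-256 digest has `difficulty_bits`
    leading zero bits. Every attempt is wasted work except the last."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # found a qualifying proof of work
        nonce += 1
```

Each added bit of difficulty doubles the expected number of hashes — and therefore the energy — spent per block. Proof of Stake discards this lottery entirely in favor of stake-weighted validator selection.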

My professional lifespan has been bookended by philosophical engineering movements whose aim was to prove that decentralization makes more sense than centralization. Over 30 years ago, Microsoft introduced Object Linking and Embedding (OLE), followed by IBM and Apple’s collaboration on OpenDoc. During those years, I served as an IT Director for a small publishing company, whose leadership was intrigued by these ideas, which were more academically interesting than practical. It was certainly neat-o for Joe’s PowerPoint to point to an Excel range in a document resident on Jill’s hard drive as well as a paragraph from a Word document on Mark’s drive. But who would want to keep track of the existence of these documents to ensure that Joe’s PowerPoint would function properly? What if Jill wanted to trash her ideas? What if her computer was off? What if Mark was fired, and his work proved to be a toxic force? Who was responsible for ensuring good enough computer hygiene to guarantee this would all work?

In those days, I remember asking my boss: if IBM thought peer-to-peer document links were the right architecture for enterprise work, wouldn’t they have designed the IBM System/360 in accordance with these ideals? So why didn’t they? Because anxiety:

  • Would you want your home to have its contents scattered between you and your place of work?
  • Are your cooking implements found in several rooms around your house?
  • Are your toiletries in your kitchen and living room?
  • Do you use a different keychain for each key? Do you keep each key in a different place?
  • Do you have a different bank for each dollar in your savings?
  • When shopping for groceries, do you enjoy visiting more than one store to gather everything you need?

We all want related things to stay together, as much as possible. Decentralization, in the vein of OLE and OpenDoc, makes us anxious. “What if Jill’s computer dies?” “Where is this information coming from again?” “Explain this to me again? How do I know it won’t move or disappear?”

In the wake of the thought experiments of OLE and OpenDoc, the Internet happened — boom, like that — bringing renewed and awesome focus to centralized information sources. OLE went on to become ActiveX (with a very different focus), and OpenDoc died. The client-server model resonated with the human spirit, and it still does. With the Internet, every information resource had a canonical location, and it Just. Made. Sense.

Here we are, nearly a third of a century later, and the cryptocurrency movement asserts that we’ve been doing currency all wrong for 5,000+ years, calling centralization into question once again. No matter what the blockchain diehards assert, however, decentralization remains more theater than reality.

The Ethereum Merge was a risk not just in its technical execution, but in its philosophical positioning. It was the boldest of bold moves in a year overflowing with crypto failures that boggled human comprehension in more ways than one. It has advanced the public dialog about transparency and decentralization theater, eschewing decentralization religion in favor of human comprehension.

While OpenAI, ChatGPT and the like might be popular candidates for this year’s award (and, despite widespread coverage, are very much works-in-progress that demo well for a few minutes but whose limits are quickly reached—better candidates for a future award), Ethereum’s move to Proof of Stake was a bolder statement. It is an honest and transparent step toward something more controversially centralized. The model, while lacking the abstract and religious purity of the Proof of Work blockchain, is easier for people to comprehend. The Merge didn’t just reduce Ethereum’s power consumption; it made it more possible for people of average intelligence to comprehend WTF is going on with the underpinning value proposition of ETH. Beyond that, more importantly, it increases the chances that cybercurrency will either have to significantly transform itself in order to meet a real need, or disappear altogether. Either of those things will be better for humanity than what we have today.

All this said, I wouldn’t be me if I didn’t remind you: there’s a lot more work to be done. The Ethereum Merge didn’t solve many things:

  • The misapplication of blockchain to physical goods;
  • The unfounded assertions that blockchains are hack-proof;
  • The fact that today’s cryptocurrencies are actually investments with significant risks associated with them, and that they need to be regulated by bodies like the SEC;
  • The entire idea of Web3, which needs a decentralization reality check of epic proportions.

If a technology move is significant enough to drive public discourse about the state of the human universe, it deserves our attention. This is why the Ethereum Merge is the Progressive CIO Technology of the Year for 2022.

Discuss this specific post on Twitter or LinkedIn.

Categories
Current Events: 2023 Technology of the Year

Technology of the Year, 2023 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=8W_VC_BgMjo.

I published the 2022 edition of Technology of the Year on December 12, 2022, which was less than two weeks after the public release of this year’s winner, ChatGPT. What a year it has been.

ChatGPT ticks all the boxes of a successful technology:

Beyond that, there isn’t much I can add that I haven’t already shared in my April post, ChatGPT Challenges Us to Focus on Better Things. Are We Up for It?

ChatGPT is the Progressive CIO Technology of the Year for 2023.

Discuss this specific post on Twitter or LinkedIn.
