
Guess Who’s Coming to Phish?

🎹 Music for this post: https://www.youtube.com/watch?v=sh5_NSemt0Q.

By now, I’m sure you have all read about what Tribune Publishing did to its employees.

Does your organization perform internal phishing tests?

If so, do you feel you do it “better” than Tribune Publishing did?

In what way?

Is it necessary to perform phishing tests?

Why do you think so?

If you know me by now, you might have an idea where I’m going: I think your organization should consider the reasons not to run these sorts of tests at all.

Phishing, by its very nature, will get ever more convincing. That is its entire point.

You do not need to test people to discover this.

What you will discover when you test is that a select group of individuals will fall victim to it.

You will be surprised at some, and not at others.

You will “educate” them about what they did to fall victim.

You will do it again, and you will get different results.

Lather, rinse, repeat.

If you get really “good” at administering phishing tests, you will lose satisfaction with the results. You will realize that the phishers up their games all the time, and that you need to, too. And you might, in fact, wind up doing something similar to what Tribune Publishing did in order to “really show those users” how at-risk they are.

Where does that get you in the end? It puts you squarely into the “us versus them” — “IT versus users” (and I use that horrible term with purpose, here) — position that gives IT a bad name. This is the very reason why I sometimes claim that “IT is a two-letter four-letter word.”

Is that what you want?

What would it look like if you were to suggest to your IT leadership and teams that the time for phishing tests is over? What do you think they would say?

“The only way for people to really know how vulnerable they are is to do an objective empirical test that allows us to show them!”

“Those phishers are a moving target, and we need people to see how vulnerable they really are to the newest techniques!”

I am going to get vulnerable with you, in two ways.

First off, several years ago, I, too, thought these tests were novel and useful. In particular, I was interested in creating a dialog with senior leaders about their own vulnerability to phishing. It is a fairly commonly accepted fact that senior executives are the most successfully targeted people in phishing campaigns, because phishers have the most to gain from them, and because executives are generally under greater-than-average pressure to plow through their emails quickly.

But I also know that, over the years, I have come very, very close to falling for some very sophisticated phishing myself — to the point where I once performed an action that I had doubts about, and had to quickly employ technical processes to mitigate what I had done. I was very lucky.

If I were a betting man, I would bet that your IT teams feel that they would not fall for phishing as easily as the rest of your organization.

And therein lies the inflection point for your cultural conversation with your IT teams.

Let’s assume that your IT team could educate your workforce to be as “good” at avoiding phishing as they feel they are. Ask your IT teams, “Are you 100% immune to phishing?”

If they tell you, “yes,” then I think you know the work you have to do with them.

If they tell you, “no,” then ask them: “What are the best ways to protect you from that fact? Should the executive team run a phishing test on you?”

If someone says, “Yeah, that would be kind of cool!” then I suggest that you warn them it would have to be pretty compelling in order to have the desired impact. Show them what happened at Tribune Publishing. Ask them how they would feel if you did that to them.

I suspect that you and I both know where that conversation will lead.


Language of the following sort nauseates me:

“We have to educate the users so that they learn to protect themselves.”

(There it is again…isn’t the term “users” disgusting?)

Do organizations perform phishing tests primarily to benefit their employees, or primarily to benefit the organization? If a victimized employee came to you after being phished, do you suppose that their initial response would be: “Gee, I wish you had tested me so that this wouldn’t have happened!”

It is our very industry that has created the holes that attackers use to take advantage of people. With more thought, our industry could have created operating systems and protocols that anticipated human nature and reduced the need for humans to worry so much when engaging with our creations. Back in the 1970s, some significant work was done to anticipate the need for more secure operating systems, work that might have fundamentally changed the direction of personal computing, but those ideas never took off once the computer wars of the 1980s ensued.

Given that it is our industry that created the mess that we are in, is it fair to so effortlessly thrust the results of our laziness on our customers?

Now, I am certainly not the first person to write this sort of opinion piece about phishing tests. But I am fairly confident that I am the first person who will frame this topic in the following manner:

Did you ever watch Guess Who’s Coming to Dinner? It’s a powerful movie that portrays the emotions of an interracial couple — and the reactions of their parents about their desire to marry — during the Civil Rights Era. In a particularly powerful scene, the son, played by Sidney Poitier, reacts to his father’s assertion that he has to do what his father asks (not marry a white woman), simply because his father brought him into this world. Take a few minutes to watch this powerful scene:

Think of our industry as the father, and think of our customer as the son. We owe our customer everything. Why can’t we do a little more — no, a lot more — to pick up the slack?

In fact, there are proven tools to help us mitigate many types of phishing. The single most valuable tool is a well-implemented security risk assessment, wherein you identify the things that you think are vulnerable to phishing, and create practices that harden those areas.

What was the Tribune trying to do with its phishing exercise? From all appearances, they wanted to see if they could lead employees to share credentials for ostensibly nefarious use. But if those systems were hardened with multi-factor authentication, what would a phishing test achieve?
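
To make that last point concrete, here is a minimal sketch, in Python, of why credential phishing loses most of its value once TOTP-style multi-factor authentication (RFC 6238) protects the login. The check_password and lookup_totp_secret callables are hypothetical stand-ins for whatever identity store you actually run; nothing here describes Tribune Publishing’s systems.

```python
# Minimal sketch: why a phished password alone is not enough once TOTP MFA is in place.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


def login(username, password, otp, check_password, lookup_totp_secret):
    """check_password and lookup_totp_secret are hypothetical callables supplied
    by whatever credential store your organization actually uses."""
    if not check_password(username, password):
        return False
    # A phished password fails here: the attacker also needs a short-lived code
    # derived from a secret that was never typed into the fake login page.
    # (Real implementations also accept a small window of adjacent time steps.)
    return hmac.compare_digest(totp(lookup_totp_secret(username)), otp)
```

Hardware security keys go further still, because there is no code for a fake page to harvest at all. Either way, the “gotcha” test becomes far less interesting than the hardening itself.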

IT teams can spend money to embarrass people. But wouldn’t it be better to spend the same money protecting people? If it costs more to protect people than to embarrass people, then might it be worth discussing whether or not you want a culture like they have at Tribune Publishing?

I encourage all IT professionals to remember that we are like the father in Guess Who’s Coming to Dinner. We represent an industry that made imperfect choices. Giving our customers technical responsibilities that make our lives easier is distasteful and disrespectful.

To paraphrase Sidney Poitier: We owe them everything.

Discuss this specific post on Twitter or LinkedIn.


Technology of the Year, 2020 Edition

🎹 Music for this post: https://www.youtube.com/watch?v=ILWSp0m9G2U.

Lest you forget I am a technologist, this column is for you.

I’m 52 years old. The very first computer I ever used was a DEC PDP-11 over a dial-up connection.

The year? 1980; I was twelve.

The second computer I used was a Commodore PET. Third? A friend’s Sinclair ZX81, built from a kit. Fourth? A Commodore VIC-20, which was the first computer I came to own. From there, I worked with all of the famous 8-bit computers, from the Atari 800 to the Apple II to the Commodore 64.

The years 1980-1985 were inarguably the best time in history to be a computer nerd. Beyond these famous 8-bit computers, we saw so many other interesting computers:

  • The Tandy TRS-80 Model III (1980)
  • The Texas Instruments TI-99/4A (1981)
  • The IBM PC (1981)
  • The BBC Micro (1981)
  • The Jupiter ACE (1982), which ran Forth!
  • The Franklin Ace 1000, an Apple II clone!
  • The Sinclair ZX Spectrum (1982)
  • The Compaq Portable (1982)
  • The IBM PC XT (1983)
  • The Apple Lisa (1983)
  • The Amstrad CPC 464 (1984)
  • The IBM PCjr (1984)
  • The Apple Macintosh (ahem, 1984)
  • The IBM PC/AT (1984)
  • …and the Commodore Amiga, introduced in 1985

All of these were much more different from one another than computers are today. Different ideas, different technologies, different ways of programming them, different features and benefits. Each idea had something to challenge the other, and standardization was not a thing just yet. The idea was to be different, and to try new things, often. This was a remarkable period of time!

If you were a teenager — with all that attendant attitude — back then, one thing you did not care to own was an IBM PC of any kind. Why? Because, compared to almost all of the alternatives, it was boring. IBMs — and Apple IIs, for that matter — lacked the sprites and player-missile graphics of the Commodore 64 and Atari 400/800. They lacked the sound synthesis of the ’64. They lacked the 256-color palette of the Ataris. The IBM PC tried to convince you that it was a great PC because it was a 16-bit machine. But it had an 8-bit data bus.

The IBM PC sucked. All the kids knew it.

You could say that all the things we — the kids — valued were only important to gaming, but we young programmers back then would have disagreed with you. We knew that, in order to create any sort of engaging experience on a computer, sounds and visuals were important. Would you, today, enjoy using Excel if it looked like this?

[Lotus 1-2-3]

Nonetheless, this sort of thing literally excited all of the adults of the period. You know, these kinds of adults:

[Michael Douglas]

Do you remember the phrase, “Nobody ever got fired for buying IBM”? I do. I wasn’t alone in thinking: this is a strange form of suicide.

Of course, if you are in charge of Wall Street, and the world, you do have a lot of control over the markets, and what gets bought and sold.

The Lisa of 1983 and the Macintosh of 1984 were, famously, incredibly different from everything that had come before. And they were 32-bit machines with 16-bit data buses. Whoa.

But 1985 saw what was, perhaps, the most interesting computer of the entire decade: The Commodore Amiga.

With the same Motorola 68000 processor as the Mac, a palette of 4,096 colors, tricks like Hold-And-Modify (https://en.wikipedia.org/wiki/Hold-And-Modify), the ability to address 16MB of RAM, and 8-bit PCM sound that could be coaxed into nearly CD quality, it was the first personal computer that could really do photorealistic graphics. It was like nothing else that had come before it. Then came the Video Toaster, and the Amiga simply boggled every geek’s mind. Many, many things led to the Amiga’s demise, and you can read about that in Ars Technica’s spectacular 12-part series.
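
For fellow geeks, here is a rough Python sketch of how Hold-And-Modify squeezed 4,096 on-screen colors out of six bits per pixel. This is a simplified rendering of the HAM6 scheme as I understand it, not Amiga hardware or OS code, and the palette is reduced to a plain list of sixteen (r, g, b) tuples.

```python
# Rough sketch of Amiga HAM6 (Hold-And-Modify) decoding.
# Each 6-bit pixel is 2 control bits + 4 data bits. A pixel either loads one of
# 16 palette entries, or keeps ("holds") two color channels from the previous
# pixel and modifies the third with its 4 data bits.

def decode_ham6(pixels, palette):
    """pixels: iterable of 6-bit values; palette: 16 (r, g, b) tuples, 0-15 each."""
    out = []
    r, g, b = palette[0]                  # conventionally, start from the background color
    for p in pixels:
        control, data = (p >> 4) & 0b11, p & 0b1111
        if control == 0b00:               # set: ordinary 16-color palette lookup
            r, g, b = palette[data]
        elif control == 0b01:             # hold red and green, modify blue
            b = data
        elif control == 0b10:             # hold green and blue, modify red
            r = data
        else:                             # 0b11: hold red and blue, modify green
            g = data
        out.append((r, g, b))
    return out

# Only one channel changes per pixel, yet a scanline can wander through all
# 16 x 16 x 16 = 4,096 colors, far more than six bitplanes "should" allow.
```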

But: these were the days in which regularly released new technologies would raise pulses a little bit.

A little while later, Macs got color graphics and got to full 32-bitness; IBMs got there, too. Processors got fast. Windows 95 came out. Then VAX/VMS somehow became Windows NT, and then Windows 2000, and then XP. XP became Windows 7, Windows 8, Windows 10. Then 64-bit. Etc.

Linux came out, and that was incredibly interesting and important, but it’s just an operating system kernel.

Mac OS merged with NeXTSTEP and became Mac OS X.

The smartphone era occurred. iPhone came out and changed, well, phones. That was important (and, of course, was a key step toward this very year and this very article). But pulse-quickening? Not at the same level as 1980-1985. Not to me, anyway.

Not much else has happened since 1985 to raise a pulse as high, other than the NeXT computer in 1989. What NeXTSTEP did to popularize object-oriented programming was utterly, completely, groundbreaking.

Of course, the things that you could DO with a computer in those interceding years? Sure, those have been exciting. You know, the Internet and all. But computers themselves? Meh. A processor, a graphics chip, memory, storage, some IO controllers. Sure, faster, whatever. Apple makes them. Lenovo makes them. Dell makes them. Bleh. DDR4? Yawn. PCI Express? Shoot me. Thousand+ watt power supplies! Do you lack testosterone or something? These are useful alternatives to sheep and Nyquil Pure ZZZs. They do not quicken my pulse. They are evolutionary, and maybe even devolutionary.

In 2020, my pulse quickened. It’s been a long time. The Apple M1 is the most important thing to happen to the personal computing hardware landscape not just this year, but in the past 30 years.

That’s why there’s so much written about the M1 already. If this is news to you, I recommend you start with these three articles:

As a bonus, don’t miss Linus Torvalds’ initial reaction to Apple’s announcement back in June.

Linus has never been one to say good things about Apple, or about anyone using microkernels, for that matter.

The M1 represents the first complete re-thinking of the general-purpose personal computer since the 1980s. It is exciting in the same way that each of those very different 1980s computers listed above was. Let me put it this way: As a result of the M1, one of two things will happen, and either of them will be profound:

  • Microsoft and its hardware brethren will have to completely rethink how they deliver personal computing solutions; or
  • The Mac will become the go-to personal computer solution for most people.

Why does this matter here, on The Progressive CIO?

Because apart from the geek fact that the current bulk of personal computers are a bore, there is the human fact that personal computers still, to a large degree, have not fully met an important goal: to honor the human condition.

This week, my boss, the CEO, told me that his Lenovo laptop speaker sounded really bad. I said, “Mine too! You know why? Because the cooling fans blow hot air right over the speaker, causing the voice coil to malform. A bad design decision.”

Have you spent time using Microsoft Teams and Zoom this year? Do these applications sometimes make your laptop’s fans go wild after hours of use? Teams, for all its ubiquity, is a ridiculously resource-hungry app. It’s written in Electron, after all. It cares not one bit for saving your computer’s power. It’s designed to be easy to port to multiple platforms above all other things. It’s developer-focused, not end-user-focused.

Personal computers, as they have evolved — even Macs, to some degree — are janky. They still crash and burn way too often. IT people love computers that don’t need a reboot, but it’s 2020, and most of our computers need to be rebooted at least once per month, if not per week. They have fans that blow incredibly hot air over very important parts that don’t like to get hot. They have batteries filled with ridiculous chemicals that can explode under the right conditions. They are bedrooms of private information that are riddled with vulnerabilities that need to be constantly patched. We spend a good portion of our computing horsepower analyzing what’s going on under the hood to make sure that those vulnerabilities aren’t being attacked. Our computers get hotter and less reliable, and things just get worse in a vicious cycle. Apps like Microsoft Teams are written in Electron, which is running on JavaScript through an encapsulated web browser, which is compiled to an API framework for the OS it’s running on, which is talking to the OS kernel that is running on the bare metal. This is at least six layers of abstraction away from the hardware. It’s a miracle that the computers work at all.

How does the M1 change all of this? Well, for starters, the term SoC (whatever happened to Silicon on Ceramic? I digress.) has been wildly overused. The M1 seems to meet the real definition. Everything about it is designed to work together, in harmony. It encapsulates multiple CPU cores, high performance graphics with multiple cores, specialized I/O and storage controllers, image processors, fast and unified RAM, neural processing, and much more. All bespoke and specifically designed to support the heavily object-oriented and declarative design standards that Apple has promoted for a long time now, starting from NeXTSTEP and continuing up to, and through, SwiftUI.

The real kicker, of course, is how the M1’s fixed-length RISC instructions are so much easier to feed through an out-of-order-execution pipeline with multiple instruction decoders, because the hardware always knows where each instruction begins. The M1 has — gasp! — eight decoders. CISC designs, with their variable-length instructions, just cannot do that sort of thing easily. Intel’s CISC approach eventually killed the RISC-based PowerPC, but only because the PowerPC couldn’t scale without generating significantly more heat; Intel won Apple’s heart because of the need to keep computers cool and quiet. Now the shoe is squarely on the other foot, and the endgame looks decidedly different today than it did 15 years ago.
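
Here is a toy sketch of that decoding argument in Python, not real ARM or x86 machine code. With fixed-length instructions, every boundary is known before a single byte is decoded, so the work splits cleanly across eight decoders; with variable-length instructions, each boundary depends on everything decoded before it. The length_of callback is a made-up stand-in for x86’s (far messier) length-determination logic.

```python
# Toy illustration of why fixed-length instruction sets are easier to decode
# in parallel than variable-length ones. Not real ARM or x86 decoding.

def split_fixed(code, width=4):
    """Every instruction boundary is known up front, so eight decoders could
    each take a slice of this list and work independently, in parallel."""
    return [code[i:i + width] for i in range(0, len(code), width)]


def split_variable(code, length_of):
    """Boundary N depends on the lengths of instructions 0..N-1, so this walk
    is inherently serial; decoding many instructions per cycle means guessing
    where the boundaries might be and throwing away the bad guesses.

    length_of is a hypothetical callable returning an instruction's length
    from its first byte.
    """
    out, i = [], 0
    while i < len(code):
        n = length_of(code[i])
        out.append(code[i:i + n])
        i += n
    return out
```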

Back to the human condition:

One of the complaints that we’ve all heard this year is how our need to work apart, using videoconferencing applications, has reduced or removed the key cues we need in order to have truly quality conversations. We can’t make eye contact. We can’t hear the small gasps or intakes of breath that are passive ways of saying: hey, can I speak for a sec? Those little gasps and intakes of breath, though, are what make conversations flow, rather than feeling like this.

Whether you know it or not, our computers and smartphones spend an inordinate amount of processing power filtering out background noise. That includes not just the TV in the background, or your labradoodle barking at the Amazon driver. It includes the damn fan in your laptop, which, being moving air itself, sounds an awful lot like breath.

When our computers run cooler, they run more reliably. They last longer. Their sound input systems don’t have to waste additional power filtering out the sounds that the computers themselves are making. They don’t burn our laps, or our palms. Their speakers don’t melt. Their batteries last longer. We get more done, more smoothly, so that we can finish using them, and get back to our real lives.

Good computers are also fast at what they do, and they honor the human condition by letting humans finish using them more quickly, with less drama. Computers are a means to an end, and if that end involves too much time or pain, they are simply not good.

The M1 is fast. It runs cool. And it is fast. Yep, I said that twice. It significantly changes the dialog about what a personal computer can be, for the first time in three decades.

That’s why the M1 matters on The Progressive CIO, and why it is the Technology of the Year, 2020.

Discuss this specific post on Twitter or LinkedIn.


Well, That Didn’t Take Long

🎹 Music for this post: https://www.youtube.com/watch?v=wEBlaMOmKV4.

In reference to Technology of the Year, 2020 Edition

Here you go: Bloomberg: Microsoft Designing Its Own Chips for Servers, Surface PCs

Discuss this specific post on Twitter or LinkedIn.


Albums of the Year, 2020 Edition

🎹 Music for this post is the music for this post.

Always remember: work is a means to an end. The things that are not work are the most important things in life. For me, music is one of those things. I like the challenge of distilling a year’s worth of music into five favorite albums. This awful year, I gave myself permission to select six that I want to share with you, my dear readers.

1. Fontaines DC – A Hero’s Death

https://www.fontainesdc.com/

A great work, by a young post-punk band with the same raw talent you might have seen from U2 in 1980, Billy Bragg in 1984, and Nick Cave in 1986. If you need convincing, just watch.

2. Chris Cornell – No One Sings Like You Anymore (Volume One)

https://chriscornell.com/

You will miss Chris Cornell even more when you listen to this late 2020 surprise, a love letter to artists he admired. His cover of John Lennon’s “Watching the Wheels” is better than the original. If there is a Volume Two, it will be hard to live up to this. Let’s hope.

3. Wire – Mind Hive

http://www.pinkflag.com/

Wire is, inarguably, one of the best bands of all time. I cannot think of an album ever made by a band of 45-year industry veterans that is as vital as Mind Hive. A welcome surprise in this very grim year.

4. Sharon Jones & The Dap-Kings – Just Dropped In To See What Condition My Rendition Was In

https://sharonjonesandthedapkings.com/

Another great posthumous album of brilliant covers. Losing Sharon Jones was a tragedy. What a joy to hear her once more.

5. Idles – Ultra Mono

https://www.idlesband.com/

2020 was a year that was made for punk music. Reviewers seem to like this album less than 2018’s Joy As An Act of Resistance. I respectfully disagree. This is vital, timely, and just what my soul needed.

6. Kelley Stoltz – Ah! (etc)

https://www.kelleystoltz.com/

This is comfort food for people of my vintage, so it might not be your thing. Take a little of The Byrds and throw in some Robyn Hitchcock, and you have the makings of something wonderful. The vintage cover artwork is the cherry on the sundae.

Happy holidays to you and yours!

Discuss this specific post on Twitter or LinkedIn.
