Categories: Compassion, Current Events: 2020, Empathy

Guess Who’s Coming to Phish?

By now, I’m sure you have all read about what Tribune Publishing did to its employees.

Does your organization perform internal phishing tests?

If so, do you feel you do it “better” than Tribune Publishing did?

In what way?

Is it necessary to perform phishing tests?

Why do you think so?

If you know me by now, you might have an idea where I’m going: I think your organization should consider the reasons not to run these sorts of tests at all.

Phishing, by its very nature, will get ever more convincing. That is its entire point.

You do not need to test people to discover this.

What you will discover when you test is that a select group of individuals will fall victim to it.

You will be surprised at some, and not at others.

You will “educate” them about what they did to fall victim.

You will do it again, and you will get different results.

Lather, rinse, repeat.

If you get really “good” at administering phishing tests, you will lose satisfaction with the results. You will realize that phishers up their game all the time, and that you need to do the same. And you might, in fact, wind up doing something similar to what Tribune Publishing did in order to “really show those users” how at-risk they are.

Where does that get you in the end? It puts you squarely into the “us versus them” — “IT versus users” (and I use that horrible term with purpose here) — position that gives IT a bad name. This is the very reason I sometimes claim that “IT is a two-letter four-letter word.”

Is that what you want?

What would it look like if you were to suggest to your IT leadership and teams that the time for phishing tests is over? What do you think they would say?

“The only way for people to really know how vulnerable they are is to do an objective empirical test that allows us to show them!”

“Those phishers are a moving target, and we need people to see how vulnerable they really are to the newest techniques!”

I am going to get vulnerable with you, in two ways.

First off, several years ago, I, too, thought these tests were novel and useful. In particular, I was interested in creating a dialog with senior leaders about their own vulnerability to phishing. It is fairly commonly accepted that senior executives are the most successfully targeted victims of phishing campaigns, because phishers have the most to gain from them, and because executives are generally under greater-than-average pressure to plow through their email quickly.

But I also know that, over the years, I have come very, very close to falling for some very sophisticated phishing myself — to the point where I once performed an action that I had doubts about, and had to quickly employ technical processes to mitigate what I had done. I was very lucky.

If I were a betting man, I would bet that your IT teams feel that they would not fall for phishing as easily as the rest of your organization.

And therein lies the inflection point for your cultural conversation with your IT teams.

Let’s assume that your IT team could educate your workforce to be as “good” at avoiding phishing as they feel they are. Ask your IT teams, “Are you 100% immune to phishing?”

If they tell you, “yes,” then I think you know the work you have to do with them.

If they tell you, “no,” then ask them: What are the best ways to protect against that fact? Should the executive team run a phishing test on you?

If someone says, “Yeah, that would be kind of cool!” then I suggest that you warn them it would have to be pretty compelling in order to have the desired impact. Show them what happened at Tribune Publishing. Ask them how they would feel if you did that to them.

I suspect that you and I both know where that conversation will lead.


Language of the following sort nauseates me:

“We have to educate the users so that they learn to protect themselves.”

(There it is again…isn’t the term “users” disgusting?)

Do organizations perform phishing tests primarily to benefit their employees, or primarily to benefit the organization? If a victimized employee came to you after being phished, do you suppose that their initial response would be: “Gee, I wish you had tested me so that this wouldn’t have happened!”

It is our very industry that has created the holes that attackers use to take advantage of people. With more thought, our industry could have created operating systems and protocols that accounted for human nature and reduced the need for humans to worry so much when engaging with our creations. Back in the 1970s, some significant work was done to anticipate the need for more secure operating systems, work that might have fundamentally changed the direction of personal computing, but those ideas never took off once the computer wars of the 1980s ensued.

Given that it is our industry that created the mess that we are in, is it fair to so effortlessly thrust the results of our laziness on our customers?

Now, I am certainly not the first person to write this sort of opinion piece about phishing tests. But I am fairly confident that I am the first person who will frame this topic in the following manner:

Did you ever watch Guess Who’s Coming to Dinner? It’s a powerful movie that portrays the emotions of an interracial couple — and the reactions of their parents to their desire to marry — during the Civil Rights Era. In one particularly memorable scene, the son, played by Sidney Poitier, reacts to his father’s assertion that he must do what his father asks (not marry a white woman) simply because his father brought him into this world. It is worth taking a few minutes to watch that scene.

Think of our industry as the father, and think of our customer as the son. We owe our customer everything. Why can’t we do a little more — no, a lot more — to pick up the slack?

In fact, there are proven tools to help us mitigate many types of phishing. The single most valuable tool is a well-implemented security risk assessment, wherein you identify the things that you think are vulnerable to phishing, and create practices that harden those areas.

What was Tribune Publishing trying to do with its phishing exercise? From all appearances, it wanted to see whether it could lead employees to share credentials for ostensibly nefarious use. But if those systems were hardened with multi-factor authentication, what would a phishing test achieve?
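To make that point a bit more concrete, here is a minimal sketch of the general idea, assuming Python and the third-party pyotp library; the account record and login function are hypothetical, purely for illustration, and real-time phishing of one-time codes is still possible — the point is only that a captured password alone no longer opens the door.

```python
# A minimal sketch (not a production design) of why a phished password alone
# is not enough once TOTP-based multi-factor authentication is enrolled.
# Assumes the third-party "pyotp" library (pip install pyotp); the account
# record and login() function are hypothetical, for illustration only.
import pyotp

# Hypothetical stored account: a password (which would be hashed in real life)
# plus a TOTP secret enrolled in the employee's authenticator app.
ACCOUNT = {
    "password": "correct horse battery staple",   # what a phisher might capture
    "totp_secret": pyotp.random_base32(),         # stays with the server and authenticator
}

def login(password: str, one_time_code: str) -> bool:
    """Both factors must check out; a stolen password by itself fails."""
    if password != ACCOUNT["password"]:
        return False
    totp = pyotp.TOTP(ACCOUNT["totp_secret"])
    return totp.verify(one_time_code)

# A phisher who captured only the password cannot supply a valid current code:
print(login("correct horse battery staple", "000000"))  # almost certainly False

# The legitimate employee, with their authenticator in hand, can:
print(login("correct horse battery staple",
            pyotp.TOTP(ACCOUNT["totp_secret"]).now()))  # True
```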

IT teams can spend money to embarrass people. But wouldn’t it be better to spend the same money protecting people? If it costs more to protect people than to embarrass people, then might it be worth discussing whether or not you want a culture like they have at Tribune Publishing?

I encourage all IT professionals to remember that we are like the father in Guess Who’s Coming to Dinner. We represent an industry that made imperfect choices. Handing our customers technical responsibilities simply because it makes our own lives easier is distasteful and disrespectful.

To paraphrase Sidney Poitier: We owe them everything.

Discuss this specific post on Twitter or LinkedIn.

🎹 Music for this post: https://www.youtube.com/watch?v=sh5_NSemt0Q.

Categories: Compassion, Foundational Values

A Special Threshold

What is the primary emotion you feel when you watch Martin Gugino fall to the ground and bleed?

If your answer is not related to the idea of “compassion,” I accept that, and I am willing to admit that the approaches discussed among these pages might not be your thing.

I am fairly certain that “compassionate” is not at the top of the list of adjectives that are used to describe technologists. Nurses? Sure. But not us.

Why is that?

Could it have to do with the fact that we have come to be associated with soulless machines rather than people? That we are purveyors of things that are designed to supplant human beings in one way or another? That we have not found ways to accommodate and codify the irrational portions of life?

What does it mean for us to genuinely care about the people we are serving with our solutions?

It might mean that we would do well to appreciate that technology — and technologists — can elicit fear or anxiety in people in oh-so-many ways. We all can remember the first time we had to teach computer skills to a beginner. And certainly, if you are in a position to help someone solve problems by applying technology, the anxiety that accompanies the unknown future is part of the journey.

If you are your family’s technology “guru,” how many times have you been asked “What did I do wrong?” How does that question make you feel? Whenever I am asked that, before I open my mouth, I find it helpful to take a step back and ask myself, “What did we do wrong?” Typically, I can find a better answer to that. I feel so much better when I can say: “It’s not you.” And I have to tell you, the older I get, the more I find that answer to be the case.

Over the years, I cannot count the number of times I have overheard technologists, while teaching, utter, “Don’t worry. This will be easy.” How very easy it is to say that — it is the textbook definition of “dismissive.” Technology can be an affront to the senses. It is not natural or organic by any means. When we develop an appreciation for that, we are in a better position to respond to the emotions it can elicit in those we serve.

Compassion, like empathy, is hard. Compassion denotes a special threshold between doing what is easy and doing what is right. It involves more than just noticing a problem; it demands that you act on the problem. When we exhibit compassion, we make it a priority to stop to see what is wrong, and to exercise vulnerability, empathy, patience, and all of the other values we talk about on these pages. We slow ourselves down to accommodate the travails, anxieties, and fears of others, so that we can pick them up and walk with them, work with them, and guide them to a place where they can feel more comfortable.

Are you comfortable merely watching those who are afraid or hurt? Or will you get up and help them?

Discuss this specific post on Twitter or LinkedIn.

🎹 Music for this post: https://www.youtube.com/watch?v=ieQH6X_XBJo
