What do your teammates call the audience for the software you create or deploy for your organization?
If I were a betting man — or a cheap psychic — I might bet that your answer is 2 or 5. Despite my wary ways, however, I wouldn’t hesitate to bet that you didn’t answer with 3.
I confess that over my decades in this business, I would have given answer 5 many times. But with each passing year, if that word slips out of my mouth, I feel ever so slightly more sick.
I’m going to be lazy here and quote myself from an interview I did in 2019 with Phil Weinzimer:
I encourage my teams to avoid the terms “users” and “end users” whenever possible. These terms imply a class divide. Arguably, our industry has adopted these terms to help us empathize with people who we are not. But I find that to be an incomplete thought. It’s a very condescending concept when you think about it. We are all people, and we share similar limitations. If we are solving problems, and we create an “us” versus “them” scenario, we are really not putting ourselves in the same bucket as our customers. Some people will say to this, well, if I create something that works for me as an engineer, it will not work well for a non-engineer. I say: don’t create something that works for you as an engineer. Create something that works for you as a non-engineer. If you cannot get in touch with your inner non-engineer, then I believe you have further personal development to do!
Listen to your teams over several hours or days. You will likely hear the term “users” pop up from time to time, and it might just start to feel like weeds or crabgrass in your lawn or garden in mid-summer. I might argue that this is the most basic indicator of a lack of humility in our field that you can find.
It is wise to remember that we are all simple bags of blood and bones. No one of us is more special than another. We may be asked to perform duties because of particular willingness or abilities, but the people we serve deserve every minute we can give them of our being just like them. You will never — never — deliver a brilliant solution to the people you serve when you come to the table as the person you are but they are not.
If this is starting to sound like a discussion about empathy, you’d be onto something. But empathy is most attainable if we start with a foundation of humility; empathy itself is a discussion for another day.
Will you admit that you are a small, meek animal in a universe that is infinitely larger than you are? Will you admit that you really understand very little, and that you will never truly understand everything about your life?
The headline asks an interesting question, and I have an educated, but admittedly unscientific, answer: yes, men “get” impostor syndrome. All the time.
What Catherine Bennett describes as “the association of authority with traditionally male exhibitions of extreme assurance” is, from my perspective, the defining mark of a deeply buried case of male impostor syndrome.
What does true self-confidence look like?
Is it the way that Donald Trump or Boris Johnson or David Cameron or Vladimir Putin behaves?
Conversely, what about when Ronald Reagan admitted his mistakes in the Iran-Contra scandal? Or when George H. W. Bush apologized for raising taxes? Or when John F. Kennedy took responsibility for the Bay of Pigs invasion?
Are those illustrations of weakness, or of strength?
Friends, true self-confidence is marked by the ability to openly admit mistakes or lack of knowledge…true self-confidence is all about vulnerability and humility.
Why might you not want to admit mistakes? Do you feel that it might amplify a certain lack of ability, and that others might think less of you? Both of these are strong indicators of personal insecurity. If you are afraid to admit mistakes, you are, by definition, afraid of people knowing your weaknesses. All signs point to some amount of impostor syndrome at this point.
The difference between female and male instances of impostor syndrome, I think, is that women seem to feel more comfortable exploring their weaknesses at a liminal level than men do. Men instead tend to drive their insecurities down into subliminal territory, creating strong compensating facades of synthesized self-confidence, perhaps powered by the unique delusion of testosterone.
What man (or woman) who cannot admit mistakes and take corrective actions does so for any reason other than fear?
And what could that man or woman be afraid of, other than people beginning to see chinks in their armor?
As I have gotten older, I have developed a belief that most people lack self-confidence for some portion — maybe even a large portion — of their lives. Most people (including yours truly!) become aware of their relative insignificance in the universe through some lonely moments of self-reflection, and they build up ugly behaviors to compensate for it. Ironically, it is precisely because so many of the people around us can successfully fake self-confidence through “brio,” to borrow Boris Johnson’s term, that the seedlings of self-doubt are so deeply sown in ourselves.
If our default human tendency were to openly embrace and exhibit our faults — with confidence! — the world might be a very different place, don’t you think?
Does your company believe that, when you invest in software, you are investing in something that requires care and feeding and additional investment, in order to live up to its purpose, which is to reflect change in your ever-changing business?
Does your senior leadership understand that the answer to all of these questions needs to be yes?
Most importantly: Do you have an approach to address the “no” answers you have for any of the above?
In response to “Software Engineering Is a Form of Leadership,” a colleague shared that, “I once had the opportunity to have a conversation with Bill Gates. I asked him, ‘Among your software engineers, how much of the job is technical versus nontechnical?’ He responded that for leaders of software groups, the job was 80% nontechnical, and for his most technical people, the job was 50% nontechnical. Of course, you understand that very well! You have done a good job of identifying the leadership skills that are so important.”
That’s both a great story and a nice compliment, but one of my greatest challenges in writing here on The Progressive CIO is engaging with folks who are not as lucky as my peers and I are. To date, I have been preaching to the choir: my LinkedIn network is a fairly lucky and like-minded bunch, and very little of The Progressive CIO is news to them. “Software Engineering Is a Form of Leadership” was written to challenge conventional wisdom. But while I was writing, and even during my subsequent edits, I felt like it was not going to land in the right audience. This turned out to be largely true. Not many people in my limited network seemed to find anything novel or inspirational in it. Sure, I got a good handful of likes, atta-boys, and agreement from all of the usual suspects, similar to what I shared above. But my network is filled with people who work in functional, healthy, and progressive organizations, people who already work within the sort of framework we talk about on these pages. I was preaching to the choir.
In my own industry (foodservice distribution), I can say with great certainty that many organizations don’t have great answers to all of the above. Those that do — and I like to consider the organizations that I work for among them — do well in no small part because we treat technology as a key investment in the human condition. We believe that all of our employees, including technology folks, have to bring gobs of emotional intelligence, empathy, humility, and compassion to the table in order to drive our organizations forward.
But why are we a (relatively) rare breed? Well, foodservice distribution is a pennies business. Many organizations try to spend as little on technology as they possibly can. After they put in an ERP system, they want to be done paying for it, so that they can get back to doing what they do best: buying and selling food. What they too often fail to see is that they can’t buy or sell food as competitively and strategically as possible if they put blinders on to the way the world around them has changed in the past 25 years.
Your business may be in a similar situation.
If your organization is one of those who would answer “no” to any or all of the lead-in questions above, you are the target audience for these pages, and this post is here for you. Read on….
Lucky for you, dear reader, “Software Engineering Is a Form of Leadership” elicited the sort of response that I have been eager to get ever since my very first article on The Progressive CIO.
I received an adept, articulate, and perhaps even slightly angry critical response from one of my closest and most longstanding friends. It was the very sort of counterpoint to “Software Engineering Is a Form of Leadership” that I needed to help me reach the audience I had in mind. That response is more of a must-read than anything I have written to date on these pages. What tickles me most is that my friend is a fellow rhetorical theory major, as well as an excellent writer and author. That criticism is the focus here today. It sheds light on the very real — and very wide — gap that exists between what some might call my fantasy and so many other people’s reality.
There are literally thousands of companies — both big and small — that operate with, and even promote, behaviors that are self-defeating for their information customers (both internal and external), and that are at odds with the profession of software engineering (and IT) as it is currently being taught in schools and practiced in the most successful organizations.
It would be all too easy to say that these organizations work out of a sense of ignorance, or of austerity. Even though there might be some truth to that, it would be an unsympathetic assertion. I think it’s more appropriate to say that these organizations work this way out of a sense of experience, exposure, and habit. Engineers and IT people still generally have an awful reputation as propellerheads, and most people are conditioned to view us through the lens of that stereotype. There are still millions upon millions of people in the workplace who have never been exposed to well-trained and human-focused IT or software professionals, because most of the best technology professionals tend to work in a relatively small number of organizations. Universities are trying to change this, but it will take a very long time before that change is complete. Even then, just like in every profession, there will be bad apples who will perpetuate the stereotype.
Given the unfortunate wealth of this sort of experience, exposure, and habit, the questions we need to address are: 1) how do people help their organizations discover and explore alternative habits; and 2) who is responsible for initiating these alternative habits given the chicken-and-egg nature of the problem?
To begin the journey, let’s explore my friend’s encounter with their company’s experience, exposure, and habits.
So, at the end of it all, I guess my overall thought is WOW, we have had such different experiences.
Your post starts off with highlighting how great your Dad was, which was nice to know. What an amazing career he had. I loved that you called it “suppertime” because that was what it was. I was amazed that your father expressed his disdain for a computer science degree, that really surprised me. (Just recently I have been reflecting on how important a quality liberal arts education can be.)
But where you start losing me is “Software’s very nature doesn’t merely involve uncertainty. It invites it. It is its raison d’être.” Software doesn’t have a nature. It is a bunch of code designed to get things done. It is a created response to the current perceived problem. Sure, software has to keep changing because the external parameters are changing (I am hesitant to say evolving). The software and the hardware fit together so that people can be more productive.
“Software doesn’t have a nature.” That statement was like a can opener for my mind. Yep, my friend is right. Only my choir would see it that way…and everybody in our little choir would do well to think of this quite differently. Of course, a software engineering academic would want to assert that the code isn’t the important part of the dynamic here. The code is but a single step in the broader context of evaluating and solving a human problem. We all know that.
But the meta problem here is that the experience, exposure, and habits of many companies cause them to perceive that code is what the software engineer brings to the table. In that sense, then, my friend is completely correct: software itself certainly doesn’t have a nature.
As a response, I offer something vastly clearer: the situation that leads to software has a nature.
This leads me to something I have wanted to write about for a very long time. In 1968, Lloyd Bitzer, a rhetorical theorist at the University of Wisconsin–Madison, wrote a piece entitled “The Rhetorical Situation” that has become essential reading for anyone engaged in the modern study of rhetoric. Those of you visiting The Progressive CIO for the first time here might not know that my personal style in managing software engineering and IT is driven by a human-first approach that has its foundations in applied rhetorical theory. I believe that the problems in our profession are most effectively addressed through a comprehensive study and understanding of an audience and its attendant needs, calling for supreme skills in Aristotle’s rhetorical triad of logos, pathos, and ethos.
In “The Rhetorical Situation,” Bitzer attempts to describe the sorts of situations that drive the need for rhetorical discourse. What exactly is that, you ask? Well, rhetorical discourse is probably best defined as language that is employed to move or influence an audience, to bring about change. Traditionally, rhetorical discourse is the label applied to speeches and other overtly persuasive pieces of language. Modern rhetorical theory, however, acknowledges that a vast percentage of human communication can be considered rhetorical discourse…even the simplest “STOP” sign.
Here are two key paragraphs, presented without ellipsis, from the midsection of Bitzer’s “The Rhetorical Situation”:
Hence, to say that rhetoric is situational means: (1) rhetorical discourse comes into existence as a response to situation, in the same sense that an answer comes into existence in response to a question, or a solution in response to a problem; (2) a speech is given rhetorical significance by the situation, just as a unit of discourse is given significance as answer or as solution by the question or problem; (3) a rhetorical situation must exist as a necessary condition of rhetorical discourse, just as a question must exist as a necessary condition of an answer; (4) many questions go unanswered and many problems remain unsolved; similarly, many rhetorical situations mature and decay without giving birth to rhetorical utterance; (5) a situation is rhetorical insofar as it needs and invites discourse capable of participating with situation and thereby altering its reality; (6) discourse is rhetorical insofar as it functions (or seeks to function) as a fitting response to a situation which needs and invites it. (7) Finally, the situation controls the rhetorical response in the same sense that the question controls the answer and the problem controls the solution. Not the rhetor and not persuasive intent, but the situation is the source and ground of rhetorical activity — and, I should add, of rhetorical criticism.
Let us now amplify the nature of situation by providing a formal definition and examining constituents. Rhetorical situation may be defined as a complex of persons, events, objects, and relations presenting an actual or potential exigence which can be completely or partially removed if discourse, introduced into the situation, can so constrain human decision or action as to bring about the significant modification of the exigence. Prior to the creation and presentation of discourse, there are three constituents of any rhetorical situation: the first is the exigence; the second and third are elements of the complex, namely the audience to be constrained in decision and action, and the constraints which influence the rhetor and can be brought to bear upon the audience.
Lloyd Bitzer, The Rhetorical Situation, 1968
Let’s see how that reads after a bit of search-and-replace:
Hence, to say that SOFTWARE is situational means: (1) SOFTWARE comes into existence as a response to situation, in the same sense that an answer comes into existence in response to a question, or a solution in response to a problem; (2) SOFTWARE is given … significance by the situation, just as a unit of discourse is given significance as answer or as solution by the question or problem; (3) a SOFTWARE situation must exist as a necessary condition of SOFTWARE, just as a question must exist as a necessary condition of an answer; (4) many questions go unanswered and many problems remain unsolved; similarly, many SOFTWARE situations mature and decay without giving birth to SOFTWARE; (5) a situation is SOFTWARE insofar as it needs and invites SOFTWARE capable of participating with situation and thereby altering its reality; (6) discourse is SOFTWARE insofar as it functions (or seeks to function) as a fitting response to a situation which needs and invites it. (7) Finally, the situation controls the SOFTWARE response in the same sense that the question controls the answer and the problem controls the solution. Not the SOFTWARE ENGINEER and not persuasive intent, but the situation is the source and ground of SOFTWARE activity — and, I should add, of SOFTWARE criticism.
Let us now amplify the nature of situation by providing a formal definition and examining constituents. SOFTWARE situation may be defined as a complex of persons, events, objects, and relations presenting an actual or potential exigence which can be completely or partially removed if SOFTWARE, introduced into the situation, can so constrain human decision or action as to bring about the significant modification of the exigence. Prior to the creation and presentation of SOFTWARE, there are three constituents of any SOFTWARE situation: the first is the exigence; the second and third are elements of the complex, namely the audience to be constrained in decision and action, and the constraints which influence the SOFTWARE ENGINEER and can be brought to bear upon the audience.
The remarkable thing about Bitzer’s essay is how effortlessly interchangeable the ideas of rhetorical discourse and software are. The piece does not sound completely ridiculous with these simple substitutions. I suggest that this is because software is itself a form of rhetorical discourse, in that it is language that is employed to move or influence an audience, to bring about change. I suppose you could argue this over several beers in any college town in the world, but I would still tell you that software is a form of discourse. In another way of looking at it, software is an expression of thought, and it is a dialogue. It is written for humans, by humans, to express ideas and thoughts, and it is changed in response to the needs of its interlocutors.
Unfortunately for me, a Google search for “is software a form of discourse” (in quotes) as of this writing returns one of those sad “No results found” messages. There are, however, three or four references to “software is a form of discourse,” one of which seems truly interesting. I think we’re onto something. Let’s check back in a few years. But I digress…
In rhetoric, the person who responds to the exigence and creates the rhetorical discourse is known as the rhetor. The rhetor’s counterpart in our world is the software engineer. The most challenging work for both of these individuals is in the mise en place:
Both rhetoric and software seek to drive movement and change. The rhetor’s ultimate product is the speech; the software engineer’s ultimate product is the program. The wordsmithing of these two products is significant, no doubt, but the magic is all in the mise en place. If that is done poorly, the end product will be irrelevant. Nobody will be moved, and nothing will be changed. In software engineering, there is a maxim: the requirements and the documentation are more important than the code. If you lose the code but still have the requirements documented, you can create new code to address the requirements. But if you lose the requirements and have the code, all you have is a solution to a problem that nobody understands anymore.
This is why the academic practice of software engineering pays special attention to the practices of requirements elicitation, group dynamics, and human factors. This is where the key elements of Aristotle’s Rhetoric — logos, pathos, ethos — come into play, every day, all day, for software engineers. In order to be successful, software engineers have to not only look at things from a logical perspective, but they also have to empathize (deeply) and develop a sense of trust in their approaches and their responses to the situation at hand. They will not be able to create the sort of durable and lasting relationships with the people they serve if they fail in these skills. And because software is always an ongoing journey (it is designed to change, constantly, relentlessly), if you do not maintain healthy relationships between those serving and those being served, you will wind up with what, in our industry, we would call a “legacy software” situation.
My friend’s cogent response continues:
Then you say Scrum is great because it allows us to “manage the invited change and uncertainty, by taking one step at a time.” Well, not all change is invited, and it would really be nice if we could look ahead a little farther and anticipate what Hell this change is going to create when we are three steps down the line.
Side note: Several years ago, we had a “lunch and learn” at our company at which the director of the IT department explained Scrum to all of us with PowerPoint slides and everything! Basically, it was their justification for why they worked on which projects/problems. At the time, there were eight people in the IT department, and they would meet every morning to discuss their priorities for the day. If an urgent problem or issue arose, they would have to evaluate what they could set aside so that they could refocus their attention. Multitaskers they were not.
Prior to them worshiping at the altar of Scrum, they had a spreadsheet list approach. All issues and problems from all departments were catalogued (it was a Jira system). As each department became more and more frustrated that progress was never being made, the IT department decided to call a monthly meeting with the people who had entered the Jiras. We were all supposed to review the list as a company team so that we could understand their overwhelming burden and conflicting priorities. What actually happened was they lost ALL control because the individual departments started organizing their work for them, and those groups were setting the priority ordering. We never had another meeting, and they found Scrum.
Friends, we need to listen to this; these are profoundly important points that illustrate key anti-patterns our industry faces. For certain, not all change is invited. When we choose to use software to address our problems, however, we are deciding to employ something that was designed to be changed. Back to my father’s point: we use software because changing (electronic) hardware is just too difficult, and in all too many circumstances we need to allow for change, because change is the natural state of things.
So let’s get down to some experience, exposure, and habits regarding Scrum. Scrum is one of the most abused, misused, and misunderstood ideas of the modern workplace. I’m not a betting man, but I would bet that, in 80% of all implementations, Scrum is FUBAR. Is Scrum bad because it’s so easy to get wrong? Perhaps. But that would be like saying nobody should perform ballet because it’s so easy to get it wrong. Ballet is ridiculously hard to do well. But, holy cow, when it’s done well, it’s breathtaking.
The interesting thing to observe in the experience, exposure, and habits of my friend’s company is that they misunderstand who should be practicing Scrum. It’s not the IT team! And the IT team is not the one to set the priorities for what they work on! Businesspeople are the ones who are supposed to set the priorities for the organization; businesspeople decide what must stop if something else must take priority; businesspeople decide how to invest in staffing the IT teams with the quantity and quality of people who can help them achieve what they require. If you ever find an IT team with this sort of control, it means that the businesspeople, all the way up to the top, have simply abdicated their responsibilities. In Scrum, the Product Owner for an initiative is not on the IT team. This is all a behavioral anti-pattern.
It is because of this that I very much appreciate that my friend’s IT team “lost ALL control because” the operating departments “started organizing their work for them.” Good for those departments! Take control of what you are supposed to control! Unfortunately, from afar, the radio broadcast here seems to indicate that the IT team found a way to use Scrum to regain apparent control. I have no idea how this happened, but it certainly sounds like a horrible human mess. That’s not Scrum. At all.
Back to the post, now we are talking about the definitions of leadership. “Supervision is the practice of overseeing people to ensure they’re doing their assigned tasks.” Okay, I can agree to that.
“Management is the practice of nurturing someone’s career so that they can achieve what they aim to.” That may be what it is supposed to be, but that certainly isn’t my experience (and I really don’t think I am alone here). In my experience, management is about the managers relying on other people to get work done and then taking all of the credit for themselves. Support, encouragement, and/or appreciation is not offered because “you might think too much of yourself.” Management is concerned with protecting their position and their salary.
Also, the idea that management is interested in my goals is foreign to me. They are interested in their goals (see above: protecting their position and their salary). If I can help with that, then I will be included without being a truly engaged partner in the process because all information is on a need to know basis.
I’m as guilty of throwing idealistic definitions of supervision, management, and leadership around as the next guy. I’m lucky to not only believe in these definitions, but to work with others who do as well. But take a moment to absorb the above. Despite all of our best wishes, this is what happens at many organizations. Need proof? Why does Dilbert exist?
But Dilbert is not just a comic: it is an anti-pattern. If you prefer reality over cartoons, then refer to the above. It’s real.
So, if your management behaves this way, what can you do? One passive-aggressive thing to do might be to print this article and post it somewhere. But stupid people don’t read, so that’s not going to work for you.
What can you do when you feel like a mere peon but are interested in making your organization better (ahem, being a true leader)? Let’s use an example from my friend:
Last month, we had an all-staff meeting. Our new President’s big news was that our Board passed the budget. He thanked Percy, our new CFO, for what a great job he did putting the budget together, and what a great job he did presenting it to the Board. It was just so wonderful, and he did such a great job. What a testament to him that the budget was unanimously passed.
Hmmmm, interesting that it is his success. All of the managers worked on their department budgets (fussing about costs and ways to save), entering and revising all of the numbers in the budgeting software when they decided to cut all travel and training for 2021. The individuals in accounting also provided us with all kinds of reports and help.
But let’s be grateful to Percy, who, by the way, could/should have said, “I couldn’t have done it without my team” or anything that acknowledged that it was a group effort.
Nope, nothing — no surprise there.
The best guidance I can offer you from the limits of this page is this: employ the power of the question to make your leaders think. Find an opportunity to ask a question like this: “Do you have any thoughts on why our department managers’ morale is so low?” Even though you are smarter than your organization’s management, ask the question as if you don’t know the answer. Then see how it goes. If you do this enough, over a period of time, the lazy people you are talking to just might want to know if you have any ideas. And it’s always your option to say, “I have no idea, I just wanted to explore.”

If you do this enough, you will be able to get a more detailed picture of who feels what, and you will be able to do a lot with that information. Remember that human beings love to be given an opportunity to share their thoughts, and little bits of unexpectedly useful information will always come your way if you are skilled enough to shut up, sit back, and take it all in without offering anything in return. They call it “the power of the question” for a reason: collecting all of that perspective gives you the power and time to figure out how to put it to good use.
Of course, this is only one option. Another completely valid option is to find a better job. You deserve it! But please do take an opportunity to experiment with the power of the question no matter where you work, because your ability to do this will help in so many situations, even in the very best organizations, under much better circumstances.
Back to my friend:
“Leadership is the practice of taking people on a journey to an unknown place while managing their natural anxieties about this journey.” Hmmmm, that hasn’t been my experience either.
Friends, leadership is one of the most debated terms in all of business. I will tell you, unequivocally, that the very nicest organizations to work for all agree on this: leadership is the practice of moving a novel (and typically uncomfortable) idea forward, regardless of where you sit in your organization.
But remember my friend’s perspective. There are a lot of organizations that aren’t the very nicest. These definitions don’t mean much to them.
Back to software:
But next comes the real stunner: “When software engineers use frameworks like Scrum to assist them in their efforts to drive change, they are living the very highest form of leadership.” Wow, when I read that I thought, are you kidding me? Because they code software, because they use Scrum, they are leaders. What? They aren’t thinking outside the box, looking at the future, creating a path forward for the organization. They are solving the computer problem in front of them which has been tagged as a priority. You soften it a bit by saying “Everyone worthy of being called leader needs regular nourishment in these skills, software engineers in particular.” But still, WOW.
Maybe it is the person who comes to the software developer and says, this is what I am envisioning (beyond a reaction to an issue), this is what would make things better, this is what would move us forward, maybe that is the person who should be heralded.
Once again, my friend nails it. For many organizations, the programmers solve “the computer problem in front of them which has been tagged as a priority.”
Is that what your organization’s programmers do?
Software engineers are there to help with human, business problems. Not computer problems. In fact, well-trained and thoughtful software engineers will discourage the use of software when they think it’s the right thing to do, just as civil engineers will discourage the use of concrete barriers when visibility is more important than sturdiness. In many process workflows, human input is more important than automation. Software engineers who are members of ACM or IEEE have a code of ethics that underscores their responsibilities in this regard.
Software engineers are there specifically to collaborate with others to think outside the box, look at the future, and create a path forward for their organization.
Scrum — as difficult to get right as ballet — is a key tool for managing this anxiety. Scrum says: let’s take our uncertain journey one step at a time. Let’s be transparent with one another about our feelings and anxieties, let’s commit to inspecting each step of our journey, and let’s commit to adapting based upon what we find after each step.
The three pillars of Scrum — transparency, inspection, adaptation — are Leadership 101. We’re going to try something new. We don’t know exactly what’s going to happen, but that’s OK. We will talk often, review what’s going on and how we feel about it, we will make decisions about what’s next, and there will be a person to encourage us along the way.
To be fair, Scrum isn’t only practiced by technologists. It’s practiced by everyone involved in a key initiative. But technologists are the ones who have such a colorful palette of solutions to the problems discovered through Scrum. As a bonus, they (today) learn Scrum in a university setting, and are practiced not just in employing it, but in teaching it.
All too often, however, these points are lost on so many of the organizations that hire these technologists…all because of their experience, exposure, and habits.
What can you do about this?
Back to the power of the question: “Are we taking the time to learn the ins and outs of Scrum to our advantage? I hear it’s really difficult to do well, like ballet. What could we do to learn more?” Or perhaps, “Do we think that we are effectively confronting and addressing the problems we face today? What are our anxieties about what lies ahead, and what are we willing to do about those anxieties?” Or something like that. You’re smart. Take it from there.
So, what you describe is your experience; it certainly is not mine. Maybe I really don’t understand what a Real software engineer is and how hidden beneath his exterior is a true leader waiting to be acknowledged — if only given the chance and “proper care and feeding.” So much for the rest of us saps, who toil at our tasks and aren’t offered the six-figure salaries of those in the IT department. Did you consider that maybe these software engineers aren’t interested in a leadership position? I just don’t understand what makes them, and their use of Scrum, so much more worthy than the rest of us.
But, like I said when I started this, we have just had such different experiences.
The best software engineers — ahem, leaders — are trusted guides who we look forward to being with, because we know that they will get us where we need to go, and we return to them routinely to seek their guidance and help. If you have software engineers who do not appreciate this opportunity, then you have what we would call code monkeys. Code monkeys are fine. But they are not worthy of being called engineers, let alone leaders. The “programmers” of 30 or 40 years ago who never kept up with the times would be considered the code monkeys of today.
If your organization uses code monkeys and not software engineers, do you care? If you do care, what can you do about it?
I hope you agree that my friend’s willingness to share their organization’s experience, exposure, and habits was a real gift to our dialogue. I am sure, too, that when we have friends like this, we do everything possible to help them either move their organizations forward, or move on to greener pastures. Sometimes, the problem is that you are the smartest person in the room, and the best option is to leave the room.
I posed a question earlier in this post that I haven’t forgotten about:
You are! Yes, you. It’s not a chicken-and-egg thing at all. While it might appear that nothing can happen unless your management takes the first move, there is always a chance that your management needs you to make the first move. Sure, if you are wrong, then you will need to move on. But: you already care enough to read about these things. You are smart. You can employ the power of the question to lead from wherever you are today. And if you never try, then you never lead.
If you, like I, came of professional age in the earliest days of the Internet boom (we’re talking the early ’90s), you might have been exposed to the overwrought sense of intellectual entitlement and rationalization endemic to the San Francisco Bay Area. This has mushroomed in recent years to the point of ridiculousness. And you, like I, might have walked away from it.
I grew up in the New York Metro area, and am no stranger to Regional Superiority Syndrome. Like-minded people in large metro areas, living a balls-out Darwinian oval track race, trying their best to out-think one another, all the while shrouding their self-esteem varia with a veil of civic pride. “We’re a great city, filled with the best minds, surrounded by the best culture the world has to offer. Not here? Sucks to be you!”
Generations of people in Silicon Valley (and in the New York Metro) have produced important things; this is not up for dispute. But starkly missing from this sort of culture is a genuine appreciation for, and sense of, humility.
It is this lack of humility that I find myself responding to in the work that I do. An important part of a CIO’s job is to protect the companies they work for from the incredible amount of bullshit that is threaded through our industry. Promises of AI; new features that are always on the way; so much software that is apparently so great, so easy to use; anything can be solved with an integration or an API; Object Linking and Embedding is architecture of the future (or maybe it was ActiveX); Citrix is good; Cisco is great; Access is wonderful; SAP is amazing; Electron is groundbreaking; Apple is doomed; blockchain can solve issues with tracing lettuce from farm to table; every report is valuable. If you are a CIO, add your own bullshit to this list. I will not disagree.
In this vein, part of our jobs as technology leaders is to pay attention to the culture from which these issues emanate. You might be like me, and you might fail to drag yourself out of bed to do this as often as you should. I tired of it long ago, but there are moments where I take a deep breath, dive in, and catch up.
“Silicon Valley’s Safe Space” focuses on Scott Alexander Siskind, creator of Slate Star Codex, a (now-preserved) blog-cum-support-group for Silicon Valley intellectuals who shared thoughts related to rationalism in technology. In the echoes of the dialog from these Bay Area rationalists, you get the sense that these people felt that they were doing something new and different. The naïveté of that notion is amusing.
It would be fair to say that Slate Star Codexers practice (technology-) applied rationalism in the same vein that I practice (technology-) applied rhetorical theory, the principal difference being that rationalism has been regularly and predictably applied to technology throughout history, whereas rhetorical theory has definitively not.
What’s clear from reading the Times article is that many of these folks would like to shelter their discussions from scrutiny and counterpoint from less-than-likeminded individuals. This is why, despite my short summary, I think you should take the time to read the entire Times article, as well as many of the links within. Many folks might be tempted to focus on some of the right-wing versus left-wing issues in the article and in the blog’s content; that would be a waste of time, because there is little notable to be found in that aspect of the story. The bigger story is one of perspective lost to self-importance.
The post highlights a dividing line between “humanities/empathizing/intuitive” people and “sciency/systematizing/utilitarian” people (the rationalists), and treats the former with a predictable and carefully-buffered dose of contempt.
Next up is a link to a TechCrunch article titled “Geeks for Monarchy: The Rise of the Neoreactionaries.” The Times piece cites angel investor and Andreessen Horowitz general partner Balaji Srinivasan opining that he and his cohort “could not let that kind of story gain traction,” ostensibly because it might prove to be perfect fodder for the outsiders, providing a tad too much insight into who these people really are.
Ultimately, what the Times piece helps us see is that the Bay Area technologists’ rationalism is a powerful underpinning for yet-another-inwardly-focused media empire, as if the world needs more of that sort of thing. Poynter’s David Cohn summarizes this point nicely.
Rationalism is helpful. Anything remotely involving science benefits from it. What I find troubling about this era of Bay Area Philosophy is that its philosophers’ rhetoric is regressive, rather than progressive. It is Plato vs. Aristotle all over again. Recall the first line of Aristotle’s Rhetoric: “Rhetoric is the counterpart of dialectic.” In that single phrase, Aristotle acknowledges the need for dialectic, but warns us that there is more to life than logic. In Aristotle’s world, we consider not just logos, but pathos and ethos in equal measure.
Scott Siskind’s “sciency/systematizing/utilitarian” people may have a hard time with “humanities/empathizing/intuitive” people because it is more comfortable to suck the safe teat of logic. But humans are rarely logical, and we are fools to believe that we are rational. Only through humility can we come to terms with this. “Sciency/systematizing/utilitarian” people sometimes like to label matters of ethos and pathos with a dismissive epithet: “soft skills.” These cannot be empirically evaluated through rationalism, therefore they are not worth the time to pursue. What the rationalists fail to see is that this philosophy itself is a logical fallacy: an enthymeme, better known as “a syllogism where one or more of the premises are implied rather than stated.” You can thank Aristotle’s Rhetoric for that.
As Times author Cade Metz put it:
“Slate Star Codex was a window into the Silicon Valley psyche. There are good reasons to try and understand that psyche, because the decisions made by tech companies and the people who run them eventually affect millions.”
If the people who run our tech companies fail to nurture humility, vulnerability, and empathy, then they will never be able to solve humans’ thorniest problems. What we see in the Times article is a classic imbalance between objectivity and subjectivity, and a call to do more. All fields need to consider the relationship between these two viewpoints; one is not more relevant than the other.
Will our profession allow these Bay Area rationalists alone to define what gets said, and what gets funded? Or will we (hopefully) promote a more balanced discussion? Read the Times article. Get up-to-speed. Please do your part to contribute to a balanced conversation.
Lies take many forms. The most troublesome ones involve the liar taking advantage of the audience’s trust when the audience would benefit from the truth more than the liar would…and when it’s far easier for the audience to trust than it is for them to seek the truth.
Every lie’s potential success is dependent upon its audience’s trust. In too many situations, trust trumps truth, because trust is easier.
I am sometimes troubled by the lack of effort that we put into our endeavors to discern something approaching truth. How can we help others if we don’t adequately comprehend the time that we must invest to gather the details we need to truly understand them, and their circumstances? In our profession, employers all too often don’t cultivate a culture that supports this way of working.
Most people would not rush to have surgery until they have had many doctor visits, discussions and tests over several days or weeks. But when we develop or introduce technology solutions for our “patients,” why do we so often fail to allocate the time to understand them, and their problems?
That puts me back in the 1970s. At that time, photography was a mature, nearly 150-year-old practice, positioned in stark contrast to its older sibling, painting. Back then, however, even the finest photographer’s technique couldn’t adequately replicate the color range and dynamic depth of real life. Nonetheless, photographers made every attempt to get as close as they could.
The photographic frame is an incredibly small, two-dimensional window into a 360°, three-dimensional moment and place in time and space. It is always a mere selection of a world that the photographer decided was important to capture. A photograph represents a decision, more than anything else. While photos might achieve realism in a way that paintings cannot, they remain impressionistic documents, whose intent is to convey something that the human behind them wanted to convey.
When I was young, I became fascinated with photographing the nighttime version of our world. Neon lights, twirling gas station signs, dimly lit people, the lines of red and golden beacons emitted from the back and front of automobiles whizzing by. The tools available to me as a young photographer in the 1970s were limited. It was impossible to reproduce on paper what my eyes perceived.
To achieve this today, our phones take hundreds of photographs in rapid succession, computationally stitching them together and taking the richest and most optimally exposed details from each frame. The final result is a stunning composite that replicates your natural perception better than any tools we’ve ever previously had.
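The multi-frame idea can be sketched in miniature. Below is a toy exposure-fusion blend (the `fuse_exposures` helper and the tiny 2×2 “frames” are invented for illustration, not any phone’s actual pipeline): each pixel of each bracketed frame is weighted by how close it sits to mid-gray, so the blend favors whichever frame exposed that region of the scene best.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend bracketed exposures (values in [0, 1]) by weighting each
    pixel by its 'well-exposedness' -- a Gaussian peaked at mid-gray --
    then normalizing the weights across frames."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    # Weight peaks at 0.5 and falls off toward crushed blacks / blown whites.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)

# Three toy "frames" of the same 2x2 scene: under-, normally, and overexposed.
dark   = np.array([[0.02, 0.10], [0.05, 0.40]])
mid    = np.array([[0.20, 0.55], [0.45, 0.95]])
bright = np.array([[0.60, 0.98], [0.90, 1.00]])

fused = fuse_exposures([dark, mid, bright])
```

Each fused pixel is a convex combination of the three frames, so shadows borrow detail from the bright frame and highlights from the dark one; real pipelines add alignment, denoising, and tone mapping on top of this core idea.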
The film photographers of my youth would (and still do!) cry foul, asserting that these photographs are pure artificial trickery. Our eyes would disagree. This illustrates something interesting: in order to achieve something that is closer to “truth,” we must collect multiple views, from different perspectives. It seems paradoxical that conveying something as simple-seeming as truth requires such complexity. Hence the Yiddish proverb:
Photography is only one tool in our communication arsenal; the written and spoken word are far more regularly employed. Do we consider that our utterances have the same limitations as the simple photographs of yore? More often than we would like to admit, words are ambiguous, and they are always strung together by imperfect creatures. When any person speaks or writes, there is a good chance that their words will be perceived differently by different people. A simple set of words alone cannot adequately express an idea, in the same way that one simple photograph cannot convey a scene with the depth, breadth, and dimensionality of real life. Language is, at best, impressionistic.
Yet, the vast majority of public discourse today assumes that a few simple words can convey truth. If you don’t believe this, then you may have never visited Facebook or Twitter.
Just as one photograph is not enough, one word is not enough. One sentence is not enough. One paragraph is not enough. One book is not enough. To discern something worthy of being called truth, people need to gain perspectives from multiple photographs…multiple words…multiple sentences…multiple paragraphs…multiple books.
Of course, this is the essence of learning. But in recent decades, we have seen learning increasingly demonized. Should something take too much effort to understand, we are told that it is “intellectual.” Sometimes, we are told that learning is an exercise for “elites.” If something is “complex,” we are encouraged to postpone digesting it until we have time.
But learning demands that we find multiple perspectives. It requires critical thinking, looking through and past the imperfections in our written and spoken interpersonal communications. This takes time. We like things to be simple, but truth isn’t simple.
Is it fair to say that calculus should be simple? Or that it’s a bunch of bull because it’s complex?
Is it fair to say that the theory of relativity is a bunch of malarkey, by the same standards?
Is it fair to say that British history is bollocks?
The conceit of modern humanity is to believe we can distill truth from a few soundbites, whether through one or two books, or one or two social media posts. The arc of this extreme conceit arguably began in September 1982, when USA Today published its first issue. That conceit — however attractive it was at the time — continues to erode our comprehension of the difficulty of truth to this day.
No matter how hard we study language in school, every one of us struggles to successfully communicate the many goings-on in our world. Unfortunately, our choices seem to be either a) to be perceived as a complex, talkative bore, or b) to be perceived at all. We too often choose the latter. That choice promotes ego over truth.
Our egos are a bigger problem than we think. Not only do they prioritize our simple desire to be heard over producing something worthwhile to hear, but our egos lead us to believe that our capacity for language is a magical, God-given gift that makes us superior to other creatures. You and I have been programmed to believe that a paragraph like this one is vastly more sophisticated than a dog’s bark. If that were true, this might be the last essay ever written about truth.