newsletter #35 | 17-Apr-2018
The empathetic mindset is about being curious, open to learning what’s going through other people’s minds. It’s about focusing on understanding another person’s perspective and experience of the world. It’s specifically about letting go of the urge to have answers or solve problems for that person. Yet most of us have spent our lives proving to our teachers, parents, peers, bosses (and selves) that we’re good at solving problems. The culture at most technology workplaces revolves around these solutions, leaving little time for developing empathy. Even within these cultures, though, there is plenty of opportunity to stop thinking about the solution for a while–to use an empathetic mindset.
Right now in the technology world, broad representation of cultures, histories, and values among teams is rare. We’re working on improving this, actively seeking out the people who have been excluded from opportunities others have access to. Change is slow.
So, in tandem with diversifying our teams, we can intentionally exercise the empathetic mindset. We can recognize five areas where disparity and unfairness result from decisions our teams make, and step in to help everyone choose again.
A point I make in Practical Empathy is, “The empathetic mindset does not mean you have to feel warmth for another person. The words ‘understand’ and ‘comprehend’ do not mean ‘adopt’ or ‘agree with.’” Yes, it would be nice to feel warmth or “brotherhood.” But I also want to make sure professionals have the skills to wield the empathetic mindset when they feel confronted by someone who holds different values. I want professionals to have the experience of recognizing a fellow human in someone who they think is “less than” they are. If a professional has the awareness to recognize that they are reacting, judging, or being triggered, they can hit the mental pause button and decide what to do next. In some contexts, they can move into a mode of being curious about where this other person’s stances come from. (Find out the source; ask and listen.) In other contexts, they may need to step away and take time to assess what could come out of the situation for each participant. In contexts so far beyond their own boundary that they just can’t face the other person, the professional will step away completely and ask others for input.
But in the day-to-day of our work, the context is often so subtle that professionals don’t notice it.
Sometimes the opportunity for empathy is made invisible because of the conventions of operating a business, or the habits of a business or technology work culture. Here are five areas to become more aware of in your work environment, data, and assignments:
1. Cognitive bias – the tendency for the human brain to find patterns in stuff … even if the patterns are meaningless. So if your boss is interested in why the data shows females behaving in a certain way with your services (like checking more luggage on flights), recognize that it’s only convention that divides the data that way. Being female does not cause most of the behaviors you see in data. Ask your boss if you can find out the real reasons people check luggage. (Recite the mantra: “correlation is not causation.”)
2. Systemic bias – the way convention, algorithms, and models still exclude and marginalize people. One example that raised a red flag for me: “I would certainly get a credit score on any potential renter, and choose the person with the best credit.” The week before, I had heard a story from someone who had gone through a divorce; her ex had deliberately run up a huge credit card bill with the goal of ruining her credit for five or ten years. The landlord in the first statement thinks his decision is just personal, but across society such decisions add up to a brick wall against the many situations the credit model forgot.
3. ROI bias – more specifically, the bias in the corporate world toward earning a profit and bringing investors monetary reward. There are longer games that can be played which support other goals, such as equality and fairness. Support for local communities, environmental sustainability, and volunteerism in local schools already exists in some places. We can push for more, such as support for markets deemed unprofitable, or support for groups who, in a generation or two, will be the foundation for new markets. Corporations are long-lived; I could argue that aiming only for short-term profit is actually detrimental to their future.
4. Demographic assumptions – the generalized behaviors we imagine when hearing phrases like “people over 65” or “rural inhabitants.” I have been teaching people to describe audience segments by inner thinking rather than with shorthand demographic references. I also help professionals recognize red flags when encountering “tokenism” (or “caricatures”)–where a segment is painted with different ethnicity or gender (or where a segment is described using exaggerations), to symbolize the idea of “the other.” Examples are “Spanish speakers” or “low GPA students.” If you are aware of it, you can help people make changes.
5. Cultural blindness – the tendency for privileged members of any group to think of themselves as “average.” An example of this concept is “the social bubble” we’ve been learning about in the past few years. Even in your own neighborhood, there are people living a much different life, with much different access to things you assume are distributed more widely. To end this list on a positive human note, here is an empathetic-mindset research project showing photos of families and homes, by income, all around the world.
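For readers who work with data, item 1 can be made concrete. Here is a minimal, hypothetical Python sketch (all names and numbers invented, not drawn from any real dataset) in which a confounding variable–trip length–is what actually drives luggage checking, yet a naive split of the data by gender still shows a large gap:

```python
import random

random.seed(42)

def simulate(n=10_000):
    """Synthetic travelers. By construction, trip length -- not
    gender -- is what causes a bag to be checked."""
    rows = []
    for _ in range(n):
        gender = random.choice(["F", "M"])
        # Invented confounder: in this fake data, "F" travelers
        # happen to take longer trips.
        trip_days = random.randint(5, 10) if gender == "F" else random.randint(1, 6)
        checks_bag = trip_days >= 5  # the real cause: long trips need luggage
        rows.append((gender, trip_days, checks_bag))
    return rows

def rate(rows, keep):
    """Share of travelers matching `keep` who checked a bag."""
    subset = [r for r in rows if keep(r)]
    return sum(r[2] for r in subset) / len(subset)

rows = simulate()

# Naive split by gender: a large gap appears ...
naive_f = rate(rows, lambda r: r[0] == "F")
naive_m = rate(rows, lambda r: r[0] == "M")

# ... but comparing only trips of equal length, the gap vanishes.
ctrl_f = rate(rows, lambda r: r[0] == "F" and r[1] == 5)
ctrl_m = rate(rows, lambda r: r[0] == "M" and r[1] == 5)

print(naive_f, naive_m, ctrl_f, ctrl_m)
```

Splitting by gender here “finds a pattern,” but the pattern is an artifact of how the data was divided–exactly the correlation-without-causation trap the mantra warns about.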
The point of this “woke” list is to develop your awareness–to notice and recognize instances. Then give yourself and your team the power to choose again.
Q: (from Eduardo Hernandez) I saw your talk recently at Qualcomm for UX Speakeasy (Nov-2017) where you talked about the problem space. There is a lingering thought that I’ve had that I was hoping to get your feedback on. I often hear the term “create good mental models.” This statement specifies that mental models need creation rather than discovery. I am wondering isn’t it better to design an interface, for example, by discovering an existing mental model first, then building on that? Creating a design then attaching a mental model to it creates complexities, a level of abstraction, and allows you to rationalize the mental model to fit your design. Starting from a discovered or existing mental model and creating a design from there could potentially lead to better understanding. Where in the process should you concern yourself with mental models and are mental models created or discovered?
A: My initial response via email was hurried and probably didn’t address the question. Sorry, Eduardo! Yes, I agree that, when it’s a team trying to create a product, mental models should be discovered. Mental models are only created by an individual, in their mind, representing a concept of the thing they are dealing with. Teams creating products are better off discovering the various mental models of the different thinking-style segments they serve. Where in the process? Ideally up front, before product discovery cycles begin. However, in reality, this rarely happens. Luckily, teams can discover these mental models at any point, and then guide the product in a direction more supportive of one or many of the discovered models. (I usually merge several thinking-style segments’ mental models together into one diagram, if they have a lot in common.)
Also, there are many definitions of “mental model.” The definition I use is “the model your team makes of the people they are supporting.” Another common definition of mental model is “the model a user has of the tool they are trying to apply to their purpose.”