“I don’t think this is realistic,” he said. “The CEO would be an older white man.”

My colleague and I agreed that might often be the case, but explained that we wanted to focus more on Linda’s needs and motivations than on how she looked.

“Sorry, it’s just not believable,” he insisted. “We need to change it.”

I squirmed in my Aeron chair. My colleague looked out the window. We’d lost that one, and we knew it. Back at the office, “Linda” became “Michael”—a suit-clad, salt-and-pepper-haired guy. But we kept Linda’s photo in the mix, swapping it to another profile so that our personas wouldn’t end up lily-white.

A couple weeks later, we were back in that same conference room, where our client had asked us to share the revised personas with another member of his executive team. We were halfway through our spiel when executive number two cut us off.

“So, you have a divorced black woman in a low-level job,” he said. “I have a problem with that.”

Reader, I died.

Looking back, both of these clients were right: most of the CEOs who were members of their organization were white men, and representing their members this way wasn’t a good plan for their future. But what they missed—because, I recognize now, our personas encouraged them to miss it—was that demographics weren’t the point. Differing motivations and challenges were the real drivers behind what these people wanted and how they interacted with the organization. We thought adding photos, genders, ages, and hometowns would give our personas a more realistic feel. And they did—just not the way we intended. Rather than helping folks connect with these people, the personas encouraged the team to assume that demographic information drove motivations—that
”
— Sara Wachter-Boettcher, *Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech*