📚 Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech – Sara Wachter-Boettcher

Highlights from Kindle

  • An industry that is willing to invest plenty of resources in chasing “delight” and “disruption,” but that hasn’t stopped to think about who’s being served by its products, and who’s being left behind, alienated, or insulted.
  • Until the tech industry becomes more representative of the people it’s trying to serve, these problems will persist—and our products will be worse off because of it.
  • Adding to the problem, Thomas says, potential employers spend their time looking for a “culture fit”—someone who neatly matches the employees already in the company—which ends up reinforcing the status quo, rather than changing it.
  • But once you hand them out at a meeting or post them in the break room, personas can make it easy for teams to start designing only for that narrow profile.
  • This kind of narrow thinking about who and what is normal also makes its way into the technology itself, in the form of default settings.
  • Defaults also affect how we perceive our choices, making us more likely to choose whatever is presented as default, and less likely to switch to something else. This is known as the default effect.
  • Default settings can be helpful or deceptive, thoughtful or frustrating. But they’re never neutral. They’re designed. (See the first sketch after this list.)
  • But when applied to people and their identities, rather than to a product’s features, the term “edge case” is problematic—because it assumes there’s such a thing as an “average” user in the first place.
  • We didn’t call these people’s identities and scenarios “edge cases,” though. We called them stress cases.
  • In contrast, a stress case shows designers how strong their work is—and where it breaks down.
  • Identifying stress cases helps us see the spectrum of varied and imperfect ways humans encounter our products, especially taking into consideration moments of stress, anxiety and urgency. Stress cases help us design for real user journeys that fall outside of our ideal circumstances and assumptions.
  • “There is no feeling of hierarchy or urgency when news is breaking,” she told me.
  • “teaser” (a common industry term for a short, one-sentence introduction)
  • To get that underlying reasoning, though, tech companies need to talk to real people, not just gather big data about them.
  • The only thing that’s normal is diversity.
  • starts not with the form itself, but rather with a screen that specifically mentions racial profiling, and reminds users not to rely only on race.
  • But metrics are only as good as the goals and intentions that underlie them.
  • The problem is that in interaction design, metrics tend to boil down to one singular goal: engagement.
  • when everyone’s talking incessantly about engagement, it’s easy to end up wearing blinders, never asking whether that engagement is even a good thing.
  • Because, it turns out, one of the key components of having a great personality is knowing when to express it, and when to hold back.
  • “We focus on clarity over cleverness and personality,”
  • “inattentional blindness”: a phenomenon in which humans fail to perceive major things happening in front of their eyes when their attention has been focused on something else.
  • Zuckering: “deliberately confusing jargon and user-interfaces” to “trick your users into sharing more info about themselves than they really want to.”
  • In short, proxy data can actually make a system less accurate over time, not more, without you even realizing it. (See the second sketch after this list.)
  • “caretaker speech”
  • Screens built to be tapped through as quickly as possible, so you won’t notice what you’re agreeing to. Services that collect information based on proxy, and use it to make (often incorrect) assumptions about you. “Delightful” features designed to hide what’s actually under the hood. Patronizing language that treats you like a child—and tries to make you believe that tech companies know best.
  • no matter how much tech companies talk about algorithms like they’re nothing but advanced math, they always reflect the values of their creators: the programmers and product teams working in tech.
  • “Big data processes codify the past,”
  • because it “forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort.”
  • The implication is clear: exposure to difference changes perspective, and increases tolerance.
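
Two of the highlights above point at mechanisms concrete enough to sketch in code. First, the point that defaults are never neutral: in software, a default is literally a value the author chooses on the user’s behalf. A minimal sketch, with an invented function and parameter (nothing from the book):

```python
# A default is a design decision: the author, not the user, decides what
# happens when no explicit choice is made. Hypothetical function and flag.

def create_account(email: str, share_data_with_partners: bool = True) -> dict:
    """Create an account; note the opt-out (pre-checked) default."""
    return {"email": email, "share_data": share_data_with_partners}

# Most callers never pass the second argument, so most accounts end up
# opted in: the "default effect" the highlight describes.
print(create_account("user@example.com"))
# -> {'email': 'user@example.com', 'share_data': True}
```

Because most callers never override the second argument, the author’s choice quietly becomes almost everyone’s choice. That is the default effect in a single line of a signature.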
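Second, the proxy-data highlight describes a feedback loop: decisions made using a proxy determine which outcomes ever get recorded, so retraining on that record confirms the proxy instead of correcting it. A minimal sketch with an invented lending scenario and made-up data (none of this is from the book):

```python
import random

random.seed(0)

def make_person():
    # "zip_is_poor" is the proxy feature; "ability" is the unobserved truth.
    return {"zip_is_poor": random.random() < 0.5, "ability": random.random()}

def repays(person):
    # In this toy world, repayment depends only on ability, never zip code.
    return person["ability"] > 0.5

# Round 1: decisions are driven by the proxy. Denied applicants never get
# the chance to repay, so they produce no outcome data at all.
history = []
for _ in range(10_000):
    p = make_person()
    if not p["zip_is_poor"]:  # proxy-driven approval
        history.append((p["zip_is_poor"], repays(p)))

poor = [repaid for zip_poor, repaid in history if zip_poor]
rest = [repaid for zip_poor, repaid in history if not zip_poor]

print("outcome records from 'poor' zips:", len(poor))            # 0
print("observed repayment rate elsewhere:", sum(rest) / len(rest))

# "Retraining" on this history can only confirm the proxy: the denied
# group has no records, so the original bias now looks like evidence.
```

The lending scenario is hypothetical; the shape of the loop is the point. With zero outcome records for the denied group, a retrained model can only repeat the original assumption, and its accuracy for that group degrades without anyone noticing.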