Disruptive Design: Harmful Patterns and Bad Practice

A presentation at From Business To Buttons in May 2019 in Stockholm, Sweden, by Laura Kalbag

Slide 1

Disruptive Design: Harmful Patterns and Bad Practice

Laura Kalbag · ind.ie · laurakalbag.com · @laurakalbag

Slide 2

I found this terrifying image in an image search. It’s part of a person’s presentation. I couldn’t find the rest of the presentation, but I can imagine the person presenting it might’ve looked a bit different from little old me.

Slide 3

aut neca aut necare, either kill or be killed

It’s deliberately invoking the Latin phrase “aut neca aut necare”, “either kill or be killed.” Honestly, I feel like this is a common attitude in technology today.

Slide 4

A very early example of disruptive technology was the printing press, making access to knowledge and education more widely available than ever before. It disrupted the control over knowledge that was held by book scribes and wealthy people.

Slide 5

The web and the increasing availability of the internet have brought another disruptive wave of democratisation, making the acquisition of knowledge and services less limited by geographic proximity or access to schools and libraries.

I got into the web industry over a decade and a half ago not just because I loved the idea of the web, but because I could learn everything I needed to participate with the web.

Slide 6

Disruption

But let’s face it, these aren’t the disruptions we’re usually talking about when we talk about disruption.

Slide 7

Disruption or destruction

In technology today, disruption often means destruction: taking technology into other industries and taking over. Airbnb makes it harder for people to rent housing in cities. Uber is discouraging people from using public transport and putting more cars on the road.

Slide 8

Terminology matters.

We should take care to examine the words we use, the goals they represent and the effects they have on the work we do. It’s too easy to dress up what we’re doing with jargon and euphemistic terms.

Slide 9

Engagement

Like you can’t read a piece on Medium without tripping over the term “engagement.” But what does it mean? What are we trying to achieve with engagement? Views? Clicks? Interactions? Prolonged use?

Slide 10

Engagement or addiction

Engagement that doesn’t serve a user’s agenda is addiction. Their addiction makes us money.

Slide 11

Hooked by Nir Eyal, ‘How to Build Habit-Forming Products’

I’m not exaggerating. We live in a world where this book exists and was “voted one of the best business books of the year by Goodreads readers.”

Slide 12

Dear tech industry, we have problems.

They say admitting you have a problem is the first step to recovery. And dear tech industry, we have problems.

Slide 13

The title of this presentation says I’ll talk about patterns, so I’d best get into it. I’ve not got long, so I want to share with you a few patterns representative of some of the problems we have in this industry. The kinds of patterns that, like the words we use, we use so frequently we may not even stop to consider why.

Slide 14

As the author of a book about accessibility, it makes sense that I start with some patterns that affect accessibility. In case you don’t know what I mean by accessibility in technology: Accessibility is the degree to which technology is usable by as many people as possible, especially disabled people who are too easily excluded from what we build.

Slide 15

Pattern 1: Low contrast text

Low contrast text is text whose colour is low contrast against the background colour. (Like this slide)

Slide 16

That’s a more readable contrast now…

Slide 17

85.3% of homepages have text contrast issues.

A few months ago, WebAIM, a web accessibility organisation, analysed the homepages of the top 1 million websites for accessibility. Of all the issues they found on these 1 million sites, low contrast text was the most prevalent. Source: the WebAIM Million project.

Slide 18

Low contrast text affects…

Low contrast text has a perhaps surprisingly big impact. It affects:

  • people with eyesight loss and impaired vision
  • colour blind people
  • people using screens with low resolution
  • and even people using devices in bright light or spaces

Slide 19

Most often, low contrast text is found in places like footers, for secondary information or legal text. (In this screenshot specifically, the terms and conditions and the non-link text at the bottom are too low contrast…)

We use colour contrast to add information hierarchy. As designers, it’s a common choice to lighten text which we consider less important, to help people focus on the text we consider important.

Slide 20

But if you make your text so low contrast that people find it difficult to read, your decision about what content is important has an even greater impact…

Slide 21

What if you’re making important information about how LinkedIn Learning’s cookies operate harder for people to read? “These cookies enable us and third parties to track your internet navigation behaviour on our website and potentially off our website…”

I’m cynical, but not so cynical that I believe the designer did this intentionally. They probably thought “people won’t read this anyway, let’s make it easy to dismiss…”

We have to question these assumptions, and the agenda behind them.

Slide 22

Pattern 1: Low contrast text

So, summing up: low contrast text is popular, is bad for inclusivity and accessibility, but isn’t usually done maliciously.

Lucky for us all, it’s easy to fix.

Slide 23

Low text contrast is easily avoidable.

We can still use contrast and colour to distinguish hierarchy, we just need to do so within a spectrum of readable text contrast.
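If you want to check this in code: WCAG 2.x defines contrast as a ratio of relative luminances, with 4.5:1 the minimum for normal body text at level AA (3:1 for large text). Here’s a minimal TypeScript sketch of that check (my illustration, not something from the talk; the hex colours are arbitrary examples):

```ts
// WCAG 2.x contrast check, sketched from the published formulas.

function relativeLuminance(hex: string): number {
  // Parse "#rrggbb" and linearise each sRGB channel per the WCAG definition.
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(a: string, b: string): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// A light grey on white fails AA for body text...
console.log(contrastRatio("#999999", "#ffffff").toFixed(2)); // ~2.85, below 4.5:1
// ...while a darker grey passes, yet still reads as visually "secondary".
console.log(contrastRatio("#595959", "#ffffff").toFixed(2)); // ~7.00, passes
```

Hierarchy survives: both greys read as secondary next to black body text, but only one of them excludes readers.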

Slide 24

Stéphanie Walter’s Tips to Create an Accessible and Contrasted Color Palette

Stéphanie Walter has a great article with guidelines and tools on her blog.

Slide 25

Geri Coady’s Color Accessibility Workflows

[Geri Coady’s Color Accessibility Workflows](https://abookapart.com/products/color-accessibility-workflows) also covers the use and accessibility of colour and contrast in depth, which will help you create more inclusive technology.

Slide 26

Pattern 1: Low contrast text

So I’m rating all these patterns, and the overall rating I’m giving low contrast text is 😔 (a sad, pensive face).

Slide 27

Inclusive or accessible?

At this point, it’s probably good to note the difference between inclusivity and accessibility. Just in case you’re not sure which means what…

Slide 28

Accessible design

Accessibility is like adding an ugly ramp as an afterthought to the back of your newly-built shop building.

Slide 29

Inclusive design

Inclusive design is like building your shop building with a ramp and stairs from the very beginning, trying to accommodate as many needs as possible. It’s remembering to add railings for people who need support, and bright lines on the steps so people who find it difficult to see them can use the steps safely.

Slide 30

Inclusion

And of course, inclusion is not exclusive to disabled people and people with impairments. When we talk about inclusivity in technology, we are usually talking about how to better include people from a variety of marginalised groups, who are not the users usually centred in technology.

Slide 31

Which means you can have accessibility without inclusivity. But it’s short-term and somewhat superficial. We need inclusivity to make better technology.

Slide 32

Most of the people who create most of today’s technology create it to suit their own needs. I’m talking about people who are mostly white, usually men, relatively wealthy, heterosexual, cisgender, college-educated, English-speaking, and able-bodied.

Slide 33

Pattern 2: CAPTCHA

CAPTCHA is a challenge–response test used in computing to determine whether or not the user is human.

Slide 34

“Completely Automated Public Turing test to tell Computers and Humans Apart”

It’s an acronym for “Completely Automated Public Turing test to tell Computers and Humans Apart.” CAPTCHA forms are used to prevent bots from accessing a website, primarily so they cannot index a site or access its features repeatedly for free.

Slide 35

We used to have other types of CAPTCHAs, but nowadays CAPTCHAs are most likely to ask you to identify pedestrian crossings, traffic lights, and other street features that can assist with mapping. Detecting whether you’re human is mostly about the time delay and movement in your interactions…

Slide 36

4.25% of sites use CAPTCHA.

According to SimilarTech, 4.25% of sites use CAPTCHA.

Slide 37

The popularity of CAPTCHAs is a major issue for people who use screen readers (assistive technology which reads the content of the screen as audio or braille). In 2017, a WebAIM survey of screen reader users reported CAPTCHA as by far the most problematic feature when trying to access the web.

Slide 38

CAPTCHA dictates that proving you’re not a robot requires specific visual, spatial, linguistic or mathematical skills.

Slide 39

To the point where “People with assistive technologies like screen readers often report that they get classified as a bot…” –Hampus Sethfors, ‘Captchas Suck’

Slide 40

But CAPTCHA is evolving.

Slide 41

2,113,153 sites use reCAPTCHA.

We can understand why tech might misguidedly choose to use CAPTCHA, but why is it so prevalent? Over 2 million sites use reCAPTCHA, Google’s free CAPTCHA service. What’s in it for Google, offering this service for free? Why do they tell us we need it? Source

Slide 42

Right now, we’re helping Google digitise books, recognise images, and stop self-driving cars from running red lights. ‘[You are building a self driving AI without even knowing about it](https://hackernoon.com/you-are-building-a-self-driving-ai-without-even-knowing-about-it-62fadbfa5fdf)’

Slide 43

A [woman in the US even sued Google, saying they were unfairly using free labour](https://www.bizjournals.com/boston/blog/techflash/2015/01/massachusetts-womans-lawsuit-accuses-google-of.html).

Slide 44

Slide 45

“reCAPTCHA v3 returns a score for each request without user friction.” That “user friction” is people even knowing that the CAPTCHA is there: it operates invisibly on the site. So is that better for accessibility? Maybe. If it doesn’t continue to classify people using assistive technology as bots.

Slide 46

But “reCAPTCHA works best when it has the most context about interactions with your site, which comes from seeing both legitimate and abusive behavior. For this reason, we recommend including reCAPTCHA verification on forms or actions as well as in the background of pages for analytics.”

To work, reCAPTCHA needs to be deeply embedded within a site, with access to every interaction conducted on the site.
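To make that embedding concrete, here’s roughly what a reCAPTCHA v3 client integration looks like, sketched in TypeScript from Google’s public documentation (“YOUR_SITE_KEY” and the “/verify” endpoint are placeholders):

```ts
// Sketch of a reCAPTCHA v3 client call (based on Google's public docs).
// Assumes <script src="https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY">
// is loaded on every page, as the quote above recommends.
declare const grecaptcha: {
  ready(callback: () => void): void;
  execute(siteKey: string, options: { action: string }): Promise<string>;
};

grecaptcha.ready(() => {
  // No challenge is shown; the visitor is silently scored for this "action".
  grecaptcha.execute("YOUR_SITE_KEY", { action: "homepage" }).then((token) => {
    // Placeholder endpoint: your server forwards the token to Google,
    // which returns a 0.0-1.0 score for the visitor.
    void fetch("/verify", { method: "POST", body: token });
  });
});
```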

If you’re thinking Google’s unprecedented access to our interactions on websites is no problem, I have some reading material for you…

Slide 47

The Age of Surveillance Capitalism by Shoshana Zuboff.

What is surveillance capitalism? “Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”—Shoshana Zuboff, The Age of Surveillance Capitalism

Slide 48

Surveillance capitalism is defined by surveilling our behaviour, and monetising it.

If it sounds familiar, it’s because it forms the dominant business model of technology today.

Slide 49

“Google’s proprietary methods enable it to surveil, capture, expand, construct, and claim behavioral surplus, including data that users intentionally choose not to share.” —Shoshana Zuboff, The Age of Surveillance Capitalism

reCAPTCHA operating silently in the background is the perfect surveillance tool.

Slide 50

😬

I know, we’re starting to wade into deep waters here. Not the kinds of topics you’d always feel comfortable discussing with your co-workers, boss or clients.

Slide 51

Pattern 2: CAPTCHA

So CAPTCHA, it’s popular, it’s poor for inclusivity and accessibility, and the ethics are misjudged. As we get into features that are tangled with our business models and business goals, “fixability” becomes trickier.

Slide 52

If we are genuinely trying to prove a person is human, privacy-respecting alternatives are in the works. But we have to be willing to find and try alternatives, and not just use what everybody else is using. I’ll come back to this point later…
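As one illustration of the kind of alternative that exists (my sketch, not a tool named in the talk): a hidden honeypot field plus a minimum time-to-submit catches many naive bots on your own server, without sending anything about the visitor to a third party.

```ts
// Two simple, self-hosted bot heuristics (illustrative sketch).

interface Submission {
  honeypot: string;    // hidden input (display: none) -- humans, and screen
                       // readers, never encounter it; naive bots fill it in
  renderedAt: number;  // ms timestamp embedded when the form was served
  submittedAt: number; // ms timestamp when the POST arrived
}

// Assumption: a human needs at least a few seconds to fill in a real form.
const MIN_FILL_TIME_MS = 3_000;

function looksLikeBot(s: Submission): boolean {
  if (s.honeypot.trim() !== "") return true;                        // trap filled
  if (s.submittedAt - s.renderedAt < MIN_FILL_TIME_MS) return true; // too fast
  return false;
}
```

Heuristics like these are weaker against determined bots, but they fail privately, and a honeypot hidden with display: none is skipped by screen readers entirely.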

Slide 53

Pattern 2: CAPTCHA

CAPTCHA’s overall rating: 😟 (a worried face from me).

Slide 54

Pattern 3: Profiling

Profiling is the construction and application of user profiles generated by computerised data analysis.
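To show how little machinery the pattern needs, here is profiling reduced to a toy sketch of my own (not anything from the talk): fold a browsing log into an interest profile, then let the profile pick what a person is shown.

```ts
// Toy profiler: a browsing log becomes an "audience type" that picks the ad.

interface PageView {
  url: string;
  category: string; // e.g. assigned by a tracking script or data broker
}

function buildProfile(views: PageView[]): Map<string, number> {
  const interests = new Map<string, number>();
  for (const v of views) {
    interests.set(v.category, (interests.get(v.category) ?? 0) + 1);
  }
  return interests;
}

function pickAd(profile: Map<string, number>): string {
  // The person never sees or corrects this profile, but it decides what
  // they're offered -- an ad, a job listing, a housing listing.
  let top = "generic";
  let max = 0;
  for (const [category, count] of profile) {
    if (count > max) { max = count; top = category; }
  }
  return `ad-for-${top}`;
}

const ad = pickAd(buildProfile([
  { url: "/laundry-capsules", category: "household" },
  { url: "/pregnancy-tests", category: "parenting" },
  { url: "/pregnancy-tests/reviews", category: "parenting" },
]));
console.log(ad); // "ad-for-parenting" -- accurate or not, it's now "who you are"
```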

Slide 55

Profiling props up the majority of today’s “Big Tech.”

It’s a popular pattern… profiling props up the majority of today’s “Big Tech.” Here are examples from some of the most popular sites. You definitely use at least one of these sites on a regular basis…

Slide 56

Google

(Most popular homepage in the world.) Search results are not just based on relevancy to the term but are personalised based on Google’s determination of user interests (profiles).

Slide 57

YouTube

(2nd most popular site in the world.) Recommends videos based on newness and popularity, but also based on user interests and viewing history.

Slide 58

Facebook

(3rd most popular site in the world.) Facebook’s news feed shows posts in an order based on what Facebook determines to be most interesting and relevant to the user.

Slide 59

Yahoo!

(9th most popular site in the world.) Yahoo’s search results are personalised based on the user’s search history, and a home page feed of news and other content is recommended based on popularity and the user’s predicted interests.

Slide 60

Amazon

(10th most popular site in the world.) Products and product categories are recommended on the home page, including ‘featured’ recommendations, based on user purchases and views.

Slide 61

Instagram

(14th most popular site in the world.) Posts are displayed in an order Instagram determines to be most interesting and relevant to the user.

Slide 62

All of these sites use the profiles they’ve gathered to target users with advertising and content.

Slide 63

As a business, being able to customise your output based on a user’s interests makes sense. This is a quote from a marketing blog: “Understanding your customers is the secret of successful selling. Existing customers are important sources of information. The more you know about them, the easier it can be to increase their spending. And you can use the information you gather to help you find new customers.” Source

Slide 64

“I like relevant advertising.”

As customers or consumers, you might say “I like seeing advertising that is relevant to me, rather than irrelevant junk I don’t care for.” But are all the ads that you see really relevant to you?

Slide 65

I don’t give Facebook much information: my public profile is barely filled out, and I rarely Like or Share anything.

Slide 66

So I get ads aimed at a stereotypical woman in her early thirties. (Laundry capsules, shampoo, makeup, dresses, pregnancy tests…)

Slide 67

This is why facebook.com will only let you sign up as Male or Female, not non-binary or the self-declared option you get after you’ve signed up. It makes it easier to put you in a box.

Slide 68

As Eva Blum-Dumontet wrote in the New Statesman, “When profiling us, companies are not interested in our names or who we really are: they are interested in patterns of our behaviour that they believe match an audience type… So to target us more efficiently, the advertisement industry relies on a very binary vision of the world.” –‘Why we need to talk about the sexism of online ad profiling’

Slide 69

And through this, advertising continues to perpetuate stereotypes and existing prejudices: “We need to be clear that a data driven world – where artificial intelligence makes decisions based on simplistic profiles about us – isn’t going to solve prejudices: it’s going to perpetuate them.” –Eva Blum-Dumontet, ‘Why we need to talk about the sexism of online ad profiling’

This isn’t just about ads that tell women they should be focused on domestic tasks, looking pretty and getting pregnant…

Slide 70

In 2016, ProPublica discovered that Facebook was allowing advertisers to exclude users by race in ads for housing: “The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls ‘Ethnic Affinities.’ Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.” –Julia Angwin and Terry Parris Jr., ‘Facebook Lets Advertisers Exclude Users by Race’

Slide 71

And even after Facebook argued it was not liable for advertisers’ discrimination (but still settled these cases), Facebook still perpetuates inequality in its profiling. A study of whom Facebook picked to show ads to, starting from very broad target categories, found that “given a large group of people who might be eligible to see an advertisement, Facebook will pick among them based on its own profit-maximizing calculations, sometimes serving ads to audiences that are skewed heavily by race and gender.” –Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 72

“In these experiments, Facebook delivered ads for jobs in the lumber industry to an audience that was approximately 70 percent white and 90 percent men, and supermarket-cashier positions to an audience of approximately 85 percent women. Home-sale ads, meanwhile, were delivered to approximately 75 percent white users, while ads for rentals were shown to a more racially balanced group.” –Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 73

Concluding: “An ad system that is designed to maximize clicks, and to maximize profits for Facebook, will naturally reinforce these social inequities and so serve as a barrier to equal opportunity.” –Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 74

When maximising clicks to maximise profit reinforces social inequities, what does that mean for tech’s dominant funding models?

We have to ask: “When maximising clicks to maximise profit reinforces social inequities, what does that mean for tech’s dominant funding models?”

Slide 75

Web-based tracking is just the beginning…

Trackers are becoming more ubiquitous…

Slide 76

Google Nest, having a good look around your home (And the now-ubiquitous smart speakers!) Source

Slide 77

Hello Barbie, which your kids can talk to, with all that information recorded and sent to a multinational corporation… Source

Slide 78

Smart pacifier. Put a chip in your baby. Source

Slide 79

Looncup - smart menstrual cup! One of many smart things that women can put inside themselves. (Most of the internet of things companies in this genre are run by men…) Source

Slide 80

Smart dildo…

Slide 81

We Connect (the smart dildo makers) were even sued for tracking users’ habits. (Though many companies will get away with tracking like this as they’ll hide it in the terms and conditions…)

Slide 82

Have you ever wondered how many calories you’re burning during intercourse? How many thrusts? Speed of your thrusts? The duration of your sessions? Frequency? How many different positions you use in the period of a week, month or year? You want the iCondom.

And have you ever wanted to share all that information with advertisers, insurers, your government, and who knows else?

Slide 83

The cost of using technology should not be our personal data, our privacy, our power, control or agency.

I’m not saying you shouldn’t want to use any of those products as a concept. But the cost of using those products should not be our personal data, our privacy, our power, control, or agency.

Slide 84

Privacy is not about hiding, it’s about having power and agency over how your information is used.

Slide 85

This is an inclusivity issue.

This is an inclusivity issue because we have to pay greater attention to who benefits and who loses out when targeting is involved.

Slide 86

One person’s relevant advertising is another person’s grounds for discrimination.

Slide 87

This isn’t an exaggeration… health insurers are major customers for data obtained through profiling. Cracked Labs did a report looking at the profile information data brokers have obtained. See under health interests: they guess whether a person is disabled, and the likelihood that someone has no major health insurance. Potential employers could use this information to exclude people with disabilities from their targeted job ads.

Slide 88

Privacy is not a low cost for convenience.

Next time we say privacy is a low cost for convenience, we must remember the people whose data will be used against them. And what happens if the inferences about us are wrong?

Slide 89

Pattern 3: Profiling

So profiling is immensely popular. It massively and negatively impacts inclusivity and accessibility. The ethics are something we need to be concerned about. And here’s the kicker: unlike some of the other patterns, it’s not easy to fix.

Slide 90

Big tech will not be reformed.

Big tech will not be reformed because it doesn’t want to be reformed. It sees no problem with what it’s doing; it just thinks it has a public relations problem.

Slide 91

These guys? They’re six of the twelve richest people in the world. Their platforms are bigger than countries. They have more lobbying power and influence than anybody else in the world. They don’t have a problem.

Slide 92

Find your power.

This means we will not be able to reform the massive corporations we work for from the inside. People may have the best will in the world to do so, but we have to find our power elsewhere.

Slide 93

Google walkout organizers say they’re facing retaliation

This is what happens to people who try to change Big Tech from the inside: they can’t be fired outright, so they get demoted, lose responsibility, and lose the little power they already have.

Slide 94

We need new models.

We need new models. Not just new role models, but new funding models. New ways to fund technology where access to technology and civil rights are not pitted against each other.

Slide 95

Success criteria

As designers we need to question the success criteria for our design work. Are we facilitating growth at others’ expense? Why are we adding the features we’re adding? Who benefits from them? Who loses?

Slide 96

As Shoshana Zuboff says, “The threats we face are fundamental as surveillance capitalists take command of the essential questions that define knowledge, authority, and power in our time: Who knows? Who decides? Who decides who decides?” –Shoshana Zuboff, The Age of Surveillance Capitalism

Slide 97

Better isn’t always good enough.

We can make sites more usable, less hostile to the people using them. We can make sites inclusive and more accessible. But if we’re doing all that with a bad business model, all we’re doing is leading more people into exploitation.

Slide 98

As Sara Wachter-Boettcher says in her book, Technically Wrong: “I believe that making interfaces easier to use is vital work. But when designers use clean aesthetics to cover over a complex reality—to take something human, nuanced, and rife with potential for bias, and flatten it behind a seamless interface—they’re not really making it easier for you. They’re just hiding the flaws in their model, and hoping you won’t ask too many difficult questions.”

Slide 99

Pattern 3: Profiling

It’s hard to put a rating on this one. 🤐 Maybe it’s just a zipper-mouth face.

Slide 100

Everybody’s doing it.

Many of these patterns have become industry standards. We don’t question their use like we should, because everybody else uses them.

Slide 101

You can make change but change is not easy.

I speak about things like this at events like this because I know you are the people who can make change. I’ve been speaking on these issues for 7 years, and while I’ve seen awareness increase, change has been too slow. It’s a constant battle, but it’s a battle worth fighting.

Slide 102

1: Be different.

We’ve got to be comfortable being different. Be the person who creates alternatives. Be a better designer, find ways to create (and fund) these useful and fun experiences for people without them losing power and agency over their own data.

Slide 103

2: Be the advisor.

Be the advisor. Do the research on inclusive, ethical technology, make recommendations to others. Make it harder for them to make excuses.

Slide 104

3: Be the advocate.

Be the advocate. Marginalised folks shouldn’t have to risk themselves to make change. Advocate for others. Advocate for the underrepresented.

Slide 105

4: Be the questioner.

Question the norms. Try asking a business: “How do you make your money?” (Come ask me later how I make money!) Ask: “Why are we building this in this way?”

Slide 106

5: Be the gatekeeper.

Be the gatekeeper. When the advocacy isn’t getting you far enough, use your expertise to prevent unethical design from happening on your watch.

Slide 107

6: Be difficult.

Be difficult. Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour.

Slide 108

7: Be unprofessional.

Don’t let anybody tell you that standing up for the needs of yourself, and others, is unprofessional. Don’t let people tell you to be quiet. Or that you’ll get things done if you’re a bit nicer.

Slide 109

8: Be the supporter.

Be the supporter. If you are not comfortable speaking up for yourself, at least be there for those that do. Remember silence is complicity.

Slide 110

Nothing is inevitable.

Nothing is inevitable. Big tech will tell us that this is the only way to build technology. It isn’t.

Slide 111

Disrupt the disrupters.

Go forth and disrupt the disrupters.

Slide 112

Thank you.

Laura Kalbag · ind.ie · laurakalbag.com · @laurakalbag