Accessible unethical technology

A presentation at Accessibility Scotland in October 2019 in Edinburgh, UK by Laura Kalbag

Slide 1

Slide 2

Slide 2

Not all work is created equal.

Since I wrote a book about accessibility, I’ve had many people excitedly share with me their organisation’s work with accessibility. It’s great, I learn so much from it, and I love it. However, too often the organisations these good people work for are … not so good.

Slide 3

Slide 3

Ethical technology

While my book is about accessibility, and accessibility is a core value in all my work, my job is not solely advocating for accessibility. I’m not an expert like the other speakers here today. I’m the co-founder of Small Technology Foundation, a not-for-profit organisation advocating for, and building, technology that protects personhood and democracy in the digital network age. Ethical technology.

Slide 4

Slide 4

I can’t get excited about work used to exploit people.

And while inclusivity is a key part of ethical design, this still leaves me with something of a conundrum. I want to be excited about the brilliant work people are doing in accessibility, but I don’t want that work to be used for exploitative and unethical ends.

Slide 5

Slide 5

I keep coming back to a wise tweet by the great Heydon Pickering:

“Not everything should be accessible. Some things shouldn’t exist at all.”

Slide 6

Slide 6

What defines unethical technology?

But what are the things that shouldn’t exist? What defines unethical technology? What follows is an incomplete list of some ways (specifically Internet) technology can be unethical:

Slide 7

Slide 7

1: Inequality in distribution and access

Inequality in distribution and access is a key issue for those of us here today. The lack of accommodations for many disabled people could often be considered discriminatory. The same goes for the lack of access for poor people: people who can’t afford the latest devices or expensive data plans.

Slide 8

Slide 8

2: Lack of accountability and responsibility

Many of the ethical issues in technology come down to accountability and responsibility (or a lack thereof).

Slide 9

Slide 9

Misinformation

Centralised platforms encourage indiscriminate sharing, which results in the rapid and viral spread of misinformation.

Slide 10

Slide 10

Profiling

Extracting personal information from every data point of what a person shares online, their devices, their locations, their friends, their habits, their sentiments, is an invasion of a person’s privacy, and is done without their true consent.

Slide 11

Slide 11

Automated decisions

Using the extracted personal information to make automated decisions: whether someone would be the right candidate for a job, or qualify for credit, or is a potential terrorist. (Whether those decisions are based on inaccurate or discriminatory information is usually treated as irrelevant.)

Slide 12

Slide 12

Targeting

The profiling and targeting of individuals enables manipulation by the platforms themselves, or by advertisers (or data brokers, or governments) utilising information provided by the platforms.

Slide 13

Slide 13

Insufficient security

Particularly in situations where a platform collects people’s personal information, including credit card information, it is irresponsible to not adequately protect that information. It is a risk to store this information in the first place, and if we do, we must ensure it is stored securely.

Slide 14

Slide 14

Accountability and responsibility means addressing the impact of our work.

Whether it’s the business model, the management, designers and developers, or anybody else in the organisation, often organisations do not care to address the impact of their work, or worse, deliberately design for harmful outcomes that deliver them more power and/or money.

Slide 15

Slide 15

“Intent does not erase impact.”

These are topics that have been explored in depth recently by Tatiana Mac, who invokes “intent does not erase impact” to describe our often-haphazard approach to designing technology.

Slide 16

Slide 16

In her brilliant A List Apart article, ‘Canary in a Coal Mine: How Tech Provides Platforms for Hate’ she explains:

“As product creators, it is our responsibility to protect the safety of our users by stopping those that intend to or already cause them harm. Better yet, we ought to think of this before we build the platforms to prevent this in the first place.”

Slide 17

Slide 17

Ask for permission, not for forgiveness.

People usually say “ask for forgiveness, not for permission” to encourage each other to be more daring. Instead, I think we require the approach, “ask for permission, not for forgiveness.”

Slide 18

Slide 18

3: Environmental impact

Now that people are starting to take the climate crisis more seriously, it’s time to reckon with the environmental costs of Internet technology at scale. The impact of running massive server farms, of blockchain technology turning electricity into currency, of machine learning using vast quantities of energy.

Slide 19

Slide 19

4: Business ethics

There are ethical considerations in business, which apply to any industry, but are exacerbating factors when combined with inequality in distribution and access, a lack of accountability and responsibility, or environmental impact:

Slide 20

Slide 20

Proprietary lock-in

Once you sign up for a product or service, it is difficult to leave. To some, this could be a minor inconvenience, but if your privacy is being routinely violated, the need to leave becomes paramount.

Slide 21

Slide 21

Industry monopoly

What’s worse is if no alternatives exist. What if the only social platform we can use is one that is environmentally damaging? What if the only way we can participate in civil society online is inaccessible?

Slide 22

Slide 22

Tracking

Earlier I mentioned profiling being one of the key ethical issues in technology today.

Slide 23

Slide 23

Profiling is enabled by tracking.

Profiling is enabled by tracking; in order to develop profiles of you, they need data points. Those data points are obtained by tracking you, using any kind of technology available.

Slide 24

Slide 24

I block trackers.

I didn’t ever picture myself spending my days examining the worst of the web. But here I am.

Slide 25

Slide 25

I’ve worked on Better Blocker’s tracker blocking rules for the last four years, trying to work out what trackers are doing, why they’re doing it, and blocking the bad ones to protect the people who use Better Blocker. And as much as I try to uncover bad practices, block and break invasive trackers, they keep getting sneakier, and more harmful.

Slide 26

Slide 26

What is a tracker?

So what is a tracker? What is the kind of stuff I block?

Slide 27

Slide 27

I visit the City AM site, because I’m well into business news. I want to look at the third-party resources that it uses. Third-party resources are usually a good indicator of basic trackers, as third-party services can track you across the web on all the different sites that use them.
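
(If you want to try this yourself, here is a rough TypeScript sketch of how the third-party hosts a page pulls resources from can be listed with the Resource Timing API. It is deliberately crude: it treats anything not served from the page’s own hostname as third party, which is a much blunter rule than a real tracker blocker uses.)

```typescript
// Rough sketch: list the third-party hosts a page has loaded resources from.
// "Third party" here just means "not the page's own hostname", which is
// cruder than how real tracker blockers group related domains.
const firstPartyHost = location.hostname;
const thirdPartyHosts = new Set<string>();

for (const entry of performance.getEntriesByType("resource")) {
  const host = new URL(entry.name).hostname;
  if (host !== firstPartyHost && !host.endsWith(`.${firstPartyHost}`)) {
    thirdPartyHosts.add(host);
  }
}

console.log(`${thirdPartyHosts.size} third-party hosts:`, [...thirdPartyHosts]);
```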

Slide 28

Slide 28

A quick inspection of City AM reveals 31 third-party trackers. For a news site, that’s low to average. I pick out one tracker at random:

Slide 29

Slide 29

adnxs.com

I have a look at our statistics for all the sites we’ve found that call a resource from adnxs.com. It’s on 16.9% of the top 10,000 sites on the web. We class that as a pandemic.

Slide 30

Slide 30

So I chuck adnxs.com into a browser to work out what it might be. It’s “AppNexus”…

Slide 31

Slide 31

…and it “Powers The Advertising That Powers The Internet.” Apparently “Our mission is to create a better internet, and for us, that begins with advertising.” Well, I’m not sure many people would share that opinion.

Slide 32

Slide 32

I scroll on down to find their privacy policy, which is usually tucked away in a footer in tiny text that they wish was invisible.

Slide 33

Slide 33

A quick browse of the privacy policy alongside the script itself, and I come to understand that the AppNexus tracker is tracking visitors to create profiles of them. These profiles are then used to target ads to the visitors for products they might find relevant.

Slide 34

Slide 34

We all know targeted ads, when the same products follow us around the web. For example, on Facebook I get a lot of ads for women in their 30s: Laundry capsules, shampoo, makeup, dresses, pregnancy tests…

Slide 35

Slide 35

Back to the AppNexus privacy policy… Conveniently, they have a section detailing ‘What Information Do We Collect and Use?’

Slide 36

Slide 36

This includes a long long list, so I’ve picked out some particularly interesting/scary data points:

  • your device make and model
  • precise geographic location data
  • web pages or apps visited or used, and the time those web pages or apps were visited or used
  • Information about you or inferences about your interests that a seller, buyer or third-party provider has collected about you and shared with us—such as information about your interests or demographic information (your age or gender).

(They go to great lengths to emphasise that they do not collect the interests info themselves, they just get it from other people…)

Slide 37

Slide 37

What are my interests according to AppNexus?

I’m curious about what these interests might be. What might AppNexus know about me?

Slide 38

Slide 38

Cracked Labs have done multiple reports into the personal data that corporations collect, combine, analyse, trade and use.

Slide 39

Slide 39

Data Brokers

Much of the combining, analysing and trading of data is done by data brokers.

Slide 40

Slide 40

Two of the biggest data brokers are Oracle and Acxiom.

Slide 41

Slide 41

According to Cracked Labs: “Acxiom provides up to 3000 attributes and scores on 700 million people in the US, Europe, and other regions.”–Wolfie Christl

Slide 42

Slide 42

And “Oracle sorts people into thousands of categories and provides more than 30,000 attributes on 2 billion consumer profiles.” But what are those attributes and categories?

Slide 43

Slide 43

Again, I’ve picked out some of the creepiest bits of information:

  • one of nearly 200 “ethnic codes”
  • political views
  • relationship status
  • income
  • details about banking and insurance policies
  • type of home: including prison

Slide 44

Slide 44

  • likelihood whether a person is planning to have a baby or adopt a child
  • number and age of children
  • purchases, including whether a person bought pain relief products
  • whether a person is likely to have an interest in the air force, army, navy, lottery and sweepstakes, or gay and lesbian movies

Slide 45

Slide 45

  • search history, including whether a person searched about abortion, legalising drugs, or gay marriage, protests, strikes, boycotts or riots
  • the likelihood that a person is a social influencer or is socially influenced

Slide 46

Slide 46

It’s not just the data brokers.

It’s not just the data brokers that are doing this; most platforms are creating profiles of you, using them to target you and organise your “personal feeds” to keep you engaged and interested in their sites.

Slide 47

Slide 47

It’s for more than just advertising.

These attributes can be used to target you with advertising, not just for products you might like, but also the ads that political parties put on Facebook. That’s what the Cambridge Analytica scandal was about: political parties having access to not just your attributes but your friends’ attributes and your friends’ friends’ attributes.

Slide 48

Slide 48

And it’s no longer just your personal information and its patterns, or your habits. Facial recognition and sentiment analysis are also being used to create a deeper profile of you.

Slide 49

Slide 49

I could talk about this in more depth for much longer, but I’ve just not got the time. The book The Age of Surveillance Capitalism by Shoshana Zuboff contains both the history and the predicted future of these massive, complex surveillance systems:

Slide 50

Slide 50

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”–Shoshana Zuboff

Slide 51

Slide 51

How to protect ourselves (individuals)

How can we protect ourselves (as individuals)? I’m not talking about security tips (passwords, two-factor authentication and things like that…), I’m talking about how to protect yourself from tracking by the websites you’re visiting and the third-party services they employ on those websites.

Slide 52

Slide 52

1 Avoid logging in. (If you can.)

Avoid logging in. (If you can.) For example, when you’re watching videos on YouTube. Many platforms will still track you via your IP address, or fingerprinting, but whatever you can do to minimise them connecting your browsing habits with your personal information will help you stay protected.

Slide 53

Slide 53

2 Avoid providing your real name.

Avoid providing your real name. Of course this is trickier in professional settings, but where you can, use a pseudonym (a cute username!) to prevent platforms from connecting the data they collect about you to further data from other platforms.

Slide 54

Slide 54

3 Use custom email addresses.

Use custom email addresses…

Slide 55

Slide 55

For example, when you’re signing up for a platform, use an email address like twitter+email@emailprovider.com. Many email services support using custom email addresses like this, and will forward any emails to that address to your primary inbox.

Slide 56

Slide 56

4 Avoid providing your phone number.

Avoid providing your phone number. Folks in privacy and security recommend using two-factor authentication to prevent nefarious strangers from getting into your accounts. But be aware that phone numbers are a little risky for authentication…

Slide 57

Slide 57

A study into whether Facebook used personally-identifiable information for targeted advertising found that when “We added and verified a phone number for [two-factor authentication] to one of the authors’ accounts. We found that the phone number became targetable after 22 days”—Giridhari Venkatadri, Elena Lucherini, Piotr Sapiezynski, and Alan Mislove

Slide 58

Slide 58

And not long ago, Twitter admitted they did the same thing: “When an advertiser uploaded their marketing list, we may have matched people on Twitter to their list based on the email or phone number the Twitter account holder provided for safety and security purposes.”

Slide 59

Slide 59

5 Use a reputable Virtual Private Network (VPN)

Use a (reputable) Virtual Private Network (VPN). VPNs will obscure your location, making it harder to track you and associate your browsing habits with you as an individual. But make sure they don’t spy on your browser traffic themselves. (Be very suspicious of the cheap/free VPNs!)

Slide 60

Slide 60

6 Use private browsing or browsing containers.

Use private browsing or browsing containers. When you log in to a platform, do so in a private window or container. Usually this will ensure that any cookies set during your session will be discarded when you close that window or container.

Slide 61

Slide 61

7 Log out.

Log out. Help prevent platforms from continuing to track you on other sites (especially social media buttons!) by logging out. I once found a site sending the articles I was viewing back to Instagram because I was still logged in.

Slide 62

Slide 62

8 Disallow third-party cookies.

Disallow third-party cookies in your browser preferences. This might break a lot of sites, but where you can, it’ll protect you from quite a bit of tracking.

Slide 63

Slide 63

9 Use a tracker blocker.

Use a tracker blocker…

Slide 64

Slide 64

Far be it from me to recommend something I built myself, but we made Better Blocker out of necessity. It doesn’t block everything, but it does block a lot.

Slide 65

Slide 65

We also recommend uBlock Origin for platforms that support it. (Most tracker blockers will also track you; uBlock Origin is privacy-respecting.)

Slide 66

Slide 66

10 Use DuckDuckGo for search. (Not Google.)

Use DuckDuckGo, not Google search. Google has its trackers on 80% of the web’s most popular sites; don’t give it the intimate information contained in your search history. (The symptoms you’re worried about, the political issues you care about, the products you’re looking to buy…)

Slide 67

Slide 67

DuckDuckGo is a privacy-respecting alternative.

Slide 68

Slide 68

11 Don’t use Gmail.

Don’t use Gmail. Your email not only contains all your communication, but the receipts for everything you’ve bought, the confirmations of every event you’ve signed up for, and every platform, newsletter, and service you’ve joined.

Slide 69

Slide 69

There are plenty of email providers out there that are privacy-respecting. Two good options are Fastmail

Slide 70

Slide 70

and Protonmail.

Slide 71

Slide 71

12 Don’t use WhatsApp.

WhatsApp’s message contents may be encrypted, but not who you’re talking to and when you’re talking to them.

Slide 72

Slide 72

There are privacy-respecting alternatives including Wire.

Slide 73

Slide 73

13 Don’t use Facebook. (For everything.)

Don’t use Facebook (for everything). I’m very aware that there isn’t really a good alternative to Facebook when your child’s school is using it to communicate, or your friends are sharing baby photos. But when you can find alternatives for chat (Wire can replace Messenger) and Facebook Pages (your own website gives you far more control!), do it.

Slide 74

Slide 74

14 Seek alternatives.

Seek alternatives to the software you use every day.

Slide 75

Slide 75

switching.software has a great list of resources, provided by people who really care about ethical technology.

Slide 76

Slide 76

We’re not just tracked on the web.

Of course, these are all choices we can make once we’re on the web, but we need to be aware of other places where we are tracked…

Slide 77

Slide 77

  • Third-party services that websites use aren’t necessarily visible to us.
  • The webhosts hosting the websites can also track us.
  • Browsers (beware those that want you to log in or use telemetry!) can also track us.
  • Our operating systems can also track us.
  • And even our Internet Service Providers (ISPs) have records of what we’re browsing online.

Slide 78

Slide 78

Your choices affect your friends and family.

And of course, your choices affect your friends and family. Your friends’ and family’s choices also affect you. You may not be using Gmail, but if you email somebody who does, Google still gets some information about you. You may not use Facebook, but Facebook still has shadow profiles for people, added via their friends’ contact lists.

Slide 79

Slide 79

This is too much work.

This all seems like a lot of work, right? I KNOW! I advocate for privacy and I don’t have the time or the resources to do all of this all the time.

Slide 80

Slide 80

Don’t blame the victim.

That’s why it’s unfair to blame the victim for having their privacy eroded.

Slide 81

Slide 81

Our concept of privacy is being twisted.

Not to mention that our concept of privacy is getting twisted by the same people who have an agenda to erode it. One of the biggest culprits in attempting to redefine privacy: Facebook.

Slide 82

Slide 82

Here is a Facebook ad that’s recently been showing on TVs. It shows a person undressing behind towels held up by her friends on the beach, alongside the Facebook post visibility options, “Public, Friends, Only Me, Close Friends,” explaining how we each have different privacy preferences in life. It ends saying “there’s lots of ways to control your privacy settings on Facebook.”…

Slide 83

Slide 83

But it doesn’t mention that “Friends”, “Only Me”, and “Close Friends” should really read “Friends (and Facebook)”, “Only Me (and Facebook)” and “Close Friends (and Facebook)”.

Because you’re never really sharing something with “Only Me” on Facebook. Facebook Inc. has access to everything you share.

Slide 84

Slide 84

Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself. Facebook shouldn’t be trying to tell us otherwise.

Slide 85

Slide 85

Google doesn’t believe in privacy either. Ten years ago, Eric Schmidt, then CEO of Google (now Executive Chairman of Alphabet, Google’s parent corporation), famously said “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”

To which I respond: OK, Eric. Tell me about your last trip to the toilet…

Slide 86

Slide 86

Shouldn’t we be smart about what we share publicly?

Do we need to be smart about what we share publicly? Sure! Don’t go posting photos of your credit card or your home address. Maybe it’s unwise to share that photo of yourself blackout drunk when you’ve got a job interview next week. Perhaps we should take responsibility if we say something awful to another person online…

Slide 87

Slide 87

Sharing should be a social concern, not a privacy concern.

But we should be worried about what we share online for social reasons, not for privacy reasons. We shouldn’t need to be taking these steps to protect our privacy from corporations and governments.

Slide 88

Slide 88

Corporations blame us for giving up our privacy.

Right now, the corporations are more than happy to blame us for our loss of privacy. They say we signed the terms and conditions, we should read the privacy policies, it’s our fault.

Slide 89

Slide 89

However, an editorial board op-ed in the New York Times pointed out that the current state of privacy policies is not fit for use:

“The clicks that pass for consent are uninformed, non-negotiated and offered in exchange for services that are often necessary for civic life.”

Slide 90

Slide 90

It’s the same conclusion reached by a whole documentary made way back in 2013, called “Terms And Conditions May Apply.”

Slide 91

Slide 91

As part of my work looking into trackers, I often read privacy policies, but even I don’t read the entirety of all the privacy policies for the platforms I use.

In fact, “Two law professors analyzed the sign-in terms and conditions of 500 popular US websites, including Google and Facebook, and found that more than 99 percent of them were “unreadable,” far exceeding the level most American adults read at, but are still enforced.”—Dustin Patar

Slide 92

Slide 92

It is not informed consent.

It’s not informed consent. And how can we even truly consent if we don’t know how our information can be used against us?

Slide 93

Slide 93

It’s not consent if there’s not a real choice.

And it’s not true consent if it’s not a real choice. We’re not asked who should be allowed access to our information, and how much, and how often, and for how long, and when…

Slide 94

Slide 94

We’re asked to give up everything or get nothing.

We’re asked to give up everything or get nothing. That’s not a real choice.

Slide 95

Slide 95

It’s certainly not a real choice when the cost of not consenting is to lose access to social, civil, and labour infrastructure.

Slide 96

Slide 96

There’s a recent paper by Jan Fernback and Gwen Shaffer, ‘Cell Phones, Security and Social Capital: Examining How Perceptions of Data Privacy Violations among Cell-Mostly Internet Users Impact Attitudes and Behavior.’ The paper examines the privacy tradeoffs that disproportionately affect mobile-mostly internet users, looking at technology-driven inequalities.

I’ll get more into that later, but for now, I’ll leave you with an excerpt that shows the cost of not giving consent:

“Some focus group participants reported that, in an effort to maintain data privacy, they modify online activities in ways that harm personal relationships and force them to forego job opportunities.”

Slide 97

Slide 97

The technology we use is our new everyday things.

The technology we use is our new everyday things. It forms that vital social, civil, and labour infrastructure. And as (largely) helpless consumers, there’s often not much we can do to protect ourselves without a lot of free time and money.

Slide 98

Slide 98

Accessible unethical technology

Now I want to get into the stuff that I don’t often get to discuss in depth at other events. It’s why I wanted to speak today: to get your thoughts. These are things that I wrestle with, that constrain me, and I hope you’ll bear with me if I don’t quite have the right words. It’s the reasoning behind my talk. I want to talk about accessible unethical technology.

Slide 99

Slide 99

Intersectionality

When discussing inclusivity, we often discuss intersectionality.

Slide 100

Slide 100

Intersectionality is a theory introduced by Kimberlé Williams Crenshaw where she examined how black women are often let down by discrimination narratives that focus on black men as the victims of race-based discrimination, and white women as the victims of gender-based discrimination. These narratives don’t examine how black women often face discrimination compounded by both race and gender, discrimination unique to black women, and discrimination compounded by other factors such as class, sexual orientation, age and disability.

Slide 101

Slide 101

Kimberlé Williams Crenshaw uses intersectionality to describe how overlapping or intersecting social identities, particularly marginalised or minority identities, relate to systems and structures of domination, oppression or discrimination.

Slide 102

Slide 102

We see this in accessibility and inclusivity when we understand that a disabled person frequently has more than one impairment that might impact their use of technology. The impairments reported by disabled people in the UK in 2017-2018 show that many disabled people had more than one impairment affecting them over that year.

Slide 103

Slide 103

Disability also intersects with race, class, wealth, occupation…

And those impairments also intersect with a person’s race, class, wealth, job, and so many other factors. For example, a disabled person with affordable access to a useful assistive technology will likely have a very different experience from a disabled person who cannot afford the same access.

Slide 104

Slide 104

We need to talk about unethical accessible technology.

What we rarely discuss is how the ethical considerations of a product, project or business also intersect with inclusivity.

Slide 105

Slide 105

When the technology you use is a lifeline to access, you are impacted more severely by its unethical factors.

Slide 106

Slide 106

Last year, Dr Frances Ryan covered this in her article, ‘The missing link: why disabled people can’t afford to #DeleteFacebook’. After the Cambridge Analytica scandal was uncovered, many people started encouraging each other to #DeleteFacebook.

Slide 107

Slide 107

Dr Frances Ryan pointed out “I can’t help but wonder if only privileged people can afford to take a position of social media puritanism. For many, particularly people from marginalised groups, social media is a lifeline – a bridge to a new community, a route to employment, a way to tackle isolation.”

Slide 108

Slide 108

This is also echoed in that paper I mentioned earlier, by Jan Fernback and Gwen Shaffer:

Slide 109

Slide 109

“First, economically disadvantaged individuals, Hispanics, and African Americans are significantly more likely to rely on phones to access the internet, compared to wealthier, white Americans. Similarly, people of color are heavier users of social media apps compared to white Americans.”…

Slide 110

Slide 110

“Second, mobile internet use, mobile apps, and cell phones themselves leak significantly more device-specific data compared to accessing websites on a computer.”

Slide 111

Slide 111

“In light of these combined realities, we wanted to examine the kinds of online privacy tradeoffs that disproportionately impact cell mostly internet users and, by extension, economically disadvantaged Americans and people of color.”

Slide 112

Slide 112

It’s inequality.

And what they found speaks to incredible inequality:

Slide 113

Slide 113

Speaking about this paper, Gwen Shaffer explained:

“All individuals are vulnerable to security breaches, identity fraud, system errors, and hacking. But economically disadvantaged individuals who rely exclusively on their mobile phones to access the internet are disproportionately exploited…”

Slide 114

Slide 114

“Unfortunately, members of disadvantaged populations are frequent targets of data profiling by retailers hoping to sell them cheap merchandise or bait them into taking out subprime loans.”

Slide 115

Slide 115

“They may be charged higher insurance premiums or find their job applications rejected. Ultimately, the inequities they experience off-line are compounded by online privacy challenges.”

Slide 116

Slide 116

But many participants (and I think a lot of us can identify with this feeling!) felt resigned to trading their privacy for access to the Internet:

“Study participants, largely, seemed resigned to their status as having little power and minimal social capital.”

Slide 117

Slide 117

These inequalities are likely to intersect with disabled people’s lives.

And I believe that these inequalities are likely to intersect with disabled people’s lives. Particularly because disabled people already have less access to the Internet…

Slide 118

Slide 118

If we read the UK’s Office for National Statistics Internet Use report for 2019, the number of disabled adults using the internet has risen. However, there’s still a whopping 17% difference between disabled adults and non-disabled adults.

“In 2019, the proportion of recent internet users was lower for adults who were disabled (78%) compared with those who were not disabled (95%).”

Slide 119

Slide 119

If we briefly return to the topic of privacy policies, I noted that the reading level required for the average privacy policy was higher than the education afforded most Americans.

In fact, the report suggested that 498 of the 500 privacy policies examined required more than 14 years of education to understand…

Slide 120

Slide 120

What about if you have difficulty reading?

What they failed to note was how much harder those policies are to access for people who have difficulty reading.

Slide 121

Slide 121

Privacy and access

This brings me to the intersection of privacy and access.

Slide 122

Slide 122

It’s what I really wondered about when I read the results of WebAIM’s 8th Screen Reader User Survey.

Slide 123

Slide 123

The survey featured the question, “How comfortable would you be with allowing web sites (and thus web site owners) to detect whether you are using a screen reader?”

Slide 124

Slide 124

62.5% of respondents were “very or somewhat comfortable” with allowing screen reader detection, with respondents with disabilities significantly more likely to favour detection than respondents without disabilities.

Slide 125

Slide 125

The report summarises: “These responses clearly indicate that the majority of users are comfortable with revealing their usage of assistive technologies, especially if it results in a more accessible experience.”

Slide 126

Slide 126

The report also points to Marco Zehe’s blog post on that very question from 2014, where he discusses problems that could (or are even likely to) arise if a website could detect whether a visitor was using a screen reader. It’s a question that has come up again this year with Apple briefly enabling assistive technology detection in iOS Safari.

Slide 127

Slide 127

Léonie Watson laid out the case against screen reader detection very clearly in the same year, including (and I paraphrase badly here):

  • not wanting to share personal information with the websites she visits
  • not wanting to be segregated into text-only websites

Slide 128

Slide 128

  • not wanting design decisions to be based on the wrong criteria
  • wanting the focus to be on achieving screen reader accessibility through future-friendly inclusive robust standards-based development

Slide 129

Slide 129

Screen reader detection is an invasion of privacy.

Like the detection of any personal information, it’s an invasion of privacy. The decision about what you choose to share is taken away from you. The consequence is that further decisions are then made on your behalf.

Slide 130

Slide 130

A data point is more than just a data point.

Often when I discuss sharing personal information with people, they only think about the value of a specific (sometimes sensitive) data point, and not the implications of sharing that data point with others.

Slide 131

Slide 131

Marco goes into this in detail in his blog post, discussing that one data point of him using a screen reader:

“For one, letting a website know you’re using a screen reader means running around the web waving a red flag that shouts “here, I’m visually impaired or blind!” at anyone who is willing to look…”

Slide 132

Slide 132

“It would take away the one place where we as blind people can be relatively undetected without our white cane or guide dog screaming at everybody around us that we’re blind or visually impaired, and therefore giving others a chance to treat us like true equals.”

Slide 133

Slide 133

“Because let’s face it, the vast majority of non-disabled people are apprehensive in one way or another when encountering a person with a disability.”

Slide 134

Slide 134

Sam’s non-tech perspective.

Curious about the feelings of people who don’t work in tech, I asked my brother Sam. (A lazy focus group of one.) He’s a bit biased because he’s my brother, has some knowledge of tech, and has to put up with me going on about privacy all the time (and he says that has affected how he uses the web.)

Sam has cerebral palsy and learning difficulties, uses a screen reader occasionally and uses dictation software 90% of the time. As a person with a neurological condition that is visible in his physicality, he gets Marco’s perspective; he’s frustrated by how people treat him when they know he’s disabled.

Slide 135

Slide 135

Still, Sam has the same feelings as many of those screen reader users responding to the survey:

“I don’t mind if the platforms know I’m disabled, if they provide me with better access. Though I’d be bothered if they made it obvious to other users of the platform…”

Slide 136

Slide 136

We should be allowed to make the real choice.

Sam, and the 62.5% of screen reader users, should be allowed to make this choice. As long as it is a real choice. As long as we know who is allowed access to our information, and how much, and how often, and for how long, and when.

Slide 137

Slide 137

Like so many issues we have with technology, what we’re dealing with are the underlying social and systemic issues. As technologists, we often can’t help ourselves trying to fix or smooth over problems with technology. But technology can’t fix issues of domination, oppression or discrimination.

Slide 138

Slide 138

Technology amplifies social and systemic issues.

But it can make those issues worse. We can (and do) amplify and speed up systemic issues with technology.

Slide 139

Slide 139

Mike Ananny made this point recently in an article about tech platforms. We still seem to operate with the notion that online life is somehow a different life, detached from our everyday existence.

Slide 140

Slide 140

And tech platforms often take advantage of that notion by suggesting that if we don’t like “technology” (because they also imply that their approach is the one true inevitable way to build tech), we can just log out, log off, and be mindful or some other shit instead.

People with this mindset often show how shallow they are by saying “if you don’t like technology, you don’t have to use it…”

Slide 141

Slide 141

But there’s a reason we can’t escape technology:

“Platforms are societies of intertwined people and machines. There is no such thing as “online life” versus “real life.” We give massive ground if we pretend that these companies are simply having an “effect” or “impact” on some separate society.”—Mike Ananny

Slide 142

Slide 142

Technology colonialism

Which brings me to another issue rife in technology today. Given that I’m a non-disabled person currently talking about accessibility and inclusivity, I think it’s worth me also mentioning technology colonialism.

Slide 143

Slide 144

Slide 144

First explaining what colonialism is…

“European colonialism was spurred by an interest in trade with foreign nations. This involved the exchange of materials and products between the colonial powers and foreign nations.”

Slide 145

Slide 145

“Colonial powers always saw themselves as superiors over the native people whose culture was rarely recognized or respected. The colonizers saw economic value in these foreign relations, but it was always viewed as a transaction based on inequality.”…

Slide 146

Slide 146

And then comparing it to what we so often do in technology:

“Technology companies continue this same philosophy in how they present their own products. These products are almost always designed by white men for a global audience with little understanding of the diverse interests of end users.”

Slide 147

Slide 147

Can you tell why I think technology colonialism is incredibly relevant to inclusivity?

Slide 148

Slide 148

We don’t speak to users. Instead, we use analytics to design interfaces for people we’ll never try to speak to, or ask whether they even wanted to use our tech in the first place. We’ll assume we know best because we are the experts, and they are “just users.” We don’t have diverse teams, we barely even try to involve people with needs different from our own. How often do we try to design accessible interfaces without actually involving anyone who uses assistive technology?

Slide 149

Slide 149

“Nothing about us without us.”

It’s the reason disabled activists and anti-racism activists both say “Nothing about us without us.” Because assuming you know what’s best for people with different needs from your own usually results in an incorrect (and too often patronising) solution.

Slide 150

Slide 150

We have to hold our communities to account.

With all this talk of colonialism, it’s important for me to acknowledge my own position as a non-disabled person advocating for inclusivity and accessibility. I believe we have to hold our communities to account without centring ourselves.

Slide 151

Slide 151

Accessibility is not charity or kindness, it’s a responsibility.

I don’t know what’s best for anyone, but when I learn what’s harmful, I’m going to pay attention and share what I learn. Accessibility is not charity or kindness, it’s a responsibility.

Slide 152

Slide 152

We not only have a responsibility to design more inclusive and accessible technology, but to consider the impact our design has outside of its immediate interface.

Slide 153

Slide 153

Making our technology inclusive and accessible is not enough if the driving forces behind that technology are unethical.

Slide 154

Slide 154

We shouldn’t be grateful for the accessibility of unethical products.

Slide 155

Slide 155

Accessibility is only inclusive if it respects all the rights of a person.

Slide 156

Slide 156

It’s hard to advocate for change when alternatives don’t yet exist.

As the people advocating for change, we can’t exactly go around telling people to stop using this technology unless there are real, ethical alternatives.

Slide 157

Slide 157

We have the power to make that change.

That’s where you and I come in. As people who work in technology, and who create technology, we have far more power to make change. We can encourage more ethical practice. We can build alternatives.

Slide 158

Slide 158

How to build ethical technology

How do we build ethical technology?

Slide 159

Slide 159

Build small technology

As an antidote to big tech, we need to build small technology.

Slide 160

Slide 160

Everyday tools for everyday people designed to increase human welfare, not corporate profits.

Yeah, sure it’s a lofty goal, but there are practical ways to approach it.

Slide 161

Slide 161

Building small technology: approaches

Let’s make a start with some approaches. Best practices, if you will.

Slide 162

Slide 162

How to build small technology: Make it easy to use.

Slide 163

Slide 163

Plenty of privacy-respecting tools exist for nerds to protect themselves (I use some of them.) But we mustn’t make protecting ourselves a privilege only available to those who have the knowledge, time and money.

Slide 164

Slide 164

It’s why we must make easy-to-use technology that is

  • functional (this includes accessible)
  • convenient
  • reliable

Slide 165

Slide 165

How to build small technology: Make it inclusive.

Slide 166

Slide 166

We must ensure people have equal rights and access to the tools we build and the communities who build them, with a particular focus on including people from traditionally marginalised groups.

Free and open technology (a lot of those nerd tools) is particularly terrible at this, often not being accessible, and often surrounded by toxic communities.

Slide 167

Slide 167

How to build small technology: Don’t be colonial.

Slide 168

Slide 168

Our teams must reflect the intended audience of our technology.

Slide 169

Slide 169

If we can’t build teams like this (some of us work in small teams or as individuals), we must ensure people with different needs can take what we make and specialise it for their needs. We can build upon the best practices and shared experiences of others, but we should not be making assumptions about what is suitable for an audience we are not a part of.

Slide 170

Slide 170

How to build small technology: Make it personal.

Slide 171

Slide 171

We’ve got to stop our infatuation with growth and greed. We should build personal technology for everyday people, instead of spending all our attention, experience, and money on tools for startups and enterprises.

Slide 172

Slide 172

Building small technology: architecture

Next up, the architecture of the technology.

Slide 173

Slide 173

How to build small technology: Make it private by default.

Slide 174

Slide 174

This bears repeating: Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself.

Slide 175

Slide 175

Make your technology functional without personal information.

Slide 176

Slide 176

Consent: Allow people to share their information for relevant functionality only with their explicit consent.

Slide 177

Slide 177

Consent: When obtaining consent, tell the person what you are going to do with their information, who will have access to it, and how long you will keep that information stored. (This has recently become established as a requirement under the GDPR.)
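
As a sketch of what recording those answers might look like in practice, here is a hypothetical data structure. The field names are invented for illustration (they come from no standard, and not from the GDPR itself); they simply mirror the questions above.

```typescript
// Hypothetical shape for a consent record; field names are illustrative only.
interface ConsentRecord {
  purpose: string;         // what the information will be used for
  dataCollected: string[]; // which pieces of information are involved
  recipients: string[];    // who will have access to it
  retentionDays: number;   // how long it will be stored before deletion
  givenAt: Date;           // when the person agreed
  withdrawnAt?: Date;      // withdrawing should be as easy as agreeing
}

const exampleConsent: ConsentRecord = {
  purpose: "Sending a monthly email newsletter",
  dataCollected: ["email address"],
  recipients: ["our own mail server"],
  retentionDays: 365,
  givenAt: new Date(),
};
```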

Slide 178

Slide 178

Consent: Write easy-to-understand privacy policies. Don’t just copy and paste them from other sites (they probably copy-pasted them in the first place!), and keep your policy up to date with every update to your technology.

Slide 179

Slide 179

Consent: Don’t use third-party consent frameworks. Most of these aren’t GDPR-compliant, they’re awful experiences for your visitors, and they’re likely to just get you into legal trouble.

Slide 180

Slide 180

Third-party services: Don’t use third-party services if you can avoid them. (As they present a risk to you and your users.)

Slide 181

Slide 181

Third-party services: Make it your responsibility to know what they’re doing with your users’ information.

If you do use third-party services, make it your responsibility to know their privacy policies, what information they are collecting, and what they are doing with that information.

Slide 182

Slide 182

Third-party services: Self-host all the things.

If you use third-party scripts, delivery networks, videos, images and fonts, self-host them wherever possible. Ask the providers if it’s unclear whether a self-hosted option is available.

Slide 183

Slide 183

And it’s probably worth mentioning a little bit of social media etiquette:

If you know how, strip the tracking identifiers and Google AMP junk from URLs before you share them. Friends don’t let corporations invade their friends’ privacy.
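
(As an illustration, here is a rough sketch of what stripping those identifiers could look like. The parameter list is only an example and nowhere near complete.)

```typescript
// Rough sketch: strip common tracking parameters from a link before sharing.
// The list below is illustrative, not exhaustive.
const trackingParams = [
  "utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content",
  "fbclid", "gclid",
];

function cleanUrl(link: string): string {
  const url = new URL(link);
  for (const param of trackingParams) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

console.log(cleanUrl("https://example.com/article?utm_source=twitter&fbclid=abc123"));
// -> https://example.com/article
```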

Slide 184

Slide 184

Social media etiquette: If you feel you need a presence on social media or blogging platforms, don’t make it the only option. Post to your own site first, then mirror those posts on third-party platforms for the exposure you desire.

Slide 185

Slide 185

How to build small technology: Make it zero-knowledge.

Slide 186

Slide 186

Zero-knowledge tools have no knowledge of your information. The technology may store a person’s information, but the people who make or host the tools cannot access that information, even if they wanted to.

Slide 187

Slide 187

Zero-knowledge: Keep a person’s information on their device where possible.

Slide 188

Slide 188

Zero-knowledge: If a person’s information needs to be synced to another device, ensure that information is end-to-end encrypted, with only that person having access to decrypt it.
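
To make the shape of that concrete, here is a minimal sketch using the Web Crypto API (AES-GCM). It only shows encrypting on the person’s own device before anything is synced; how their other devices get hold of the key (key exchange, storage and authentication) is the hard part, and isn’t shown here.

```typescript
// Minimal sketch: encrypt data on-device before syncing it anywhere.
// Key management (getting the key to the person's other devices) is omitted.
async function encryptForSync(plaintext: string, key: CryptoKey) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // unique per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  // Only the ciphertext and IV ever leave the device; whoever stores or
  // relays them cannot read the contents.
  return { iv, ciphertext };
}

async function demo() {
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false, // not extractable
    ["encrypt", "decrypt"]
  );
  const payload = await encryptForSync("notes that stay private", key);
  console.log("bytes to sync:", payload.ciphertext.byteLength);
}

demo();
```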

Slide 189

Slide 189

How to build small technology: Make it peer-to-peer.

Slide 190

Slide 190

Peer-to-peer systems enable people to connect directly with one another without a person (usually a corporation or a government) in the middle. Often this means communicating device to device without a server in the middle.

Slide 191

Slide 191

How to build small technology: Make it interoperable.

Slide 192

Slide 192

Interoperable systems can talk to one another using well-established protocols, such as web standards. (Standards don’t always mean best practice, but that’s a discussion for another time…)

Slide 193

Slide 193

Interoperable: Make it easy for a person to export their information from your technology into another platform. (This is also required by GDPR.)
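
As a small sketch of what that could look like, here is a hypothetical export function that writes a person’s posts out as JSON Feed, an existing open format, rather than something proprietary. The Post shape and the example values are invented for illustration.

```typescript
// Sketch: export posts in an established open format (JSON Feed,
// https://jsonfeed.org) so another platform can import them.
// The Post type and example values are hypothetical.
interface Post {
  id: string;
  title: string;
  html: string;
  publishedAt: string; // ISO 8601
}

function exportAsJsonFeed(siteTitle: string, siteUrl: string, posts: Post[]): string {
  const feed = {
    version: "https://jsonfeed.org/version/1.1",
    title: siteTitle,
    home_page_url: siteUrl,
    items: posts.map((post) => ({
      id: post.id,
      title: post.title,
      content_html: post.html,
      date_published: post.publishedAt,
    })),
  };
  return JSON.stringify(feed, null, 2);
}

console.log(exportAsJsonFeed("My site", "https://example.com", [
  { id: "1", title: "Hello", html: "<p>First post</p>", publishedAt: "2019-10-25T10:00:00Z" },
]));
```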

Slide 194

Slide 194

How to build small technology: Make it share alike.

And we also have to take care with how we share our technology, and how we sustain its existence. Make it share alike.

Slide 195

Slide 195

Cultivate a healthy commons by using licences that allow others to build upon, and contribute back to your work. Don’t allow big tech to come along, make use of it and shut it off. (That’s what the MIT licence allows!)

Slide 196

Slide 196

How to build small technology: Make it non-commercial.

We also have to take care with how we sustain our technology’s existence. Make it non-commercial.

Slide 197

Slide 197

Non-commercial: Build stayups, not startups.

My partner at Small Technology Foundation, Aral Balkan, coined the term stayups for the anti-startup. We don’t need more tech companies aiming to fail fast or be sold as quickly as possible. We need long-term sustainable technology.

Slide 198

Slide 198

Non-commercial: Build not-for-profit technology.

If we are building sustainable technology for everyday people, we need a compatible funding model, not venture capital or equity-based investment.

Slide 199

Slide 199

Building small technology: personal approaches

Are you asking how you can do any of this?

Slide 200

Slide 200

It feels impossible. It probably is!

It may feel difficult or even impossible to build small technology with your current employer or organisation. It probably is! But there are steps we can take to give ourselves the opportunity to build more ethical technology.

Slide 201

Slide 201

1: If you can’t do it at work, do it at home.

If you have the time, make a personal website and practise small technology on your own projects.

Slide 202

Slide 202

2: Use small technology as job criteria.

Use small technology as a criterion when you’re looking for your next job. You don’t have to be at your current job forever.

Slide 203

Slide 203

3: Developing accessibility best practices is always a good thing.

Developing accessibility best practices is always a good thing. If you’re currently making unethical technology accessible, that’s fine. At least you can use those skills to make accessible ethical technology in the future.

Slide 204

Slide 204

Building small technology: who builds it?

So building small technology… who builds it?

Slide 205

Slide 205

Luddites, rebels and tin-foil hats.

These are all comments I’ve heard from people aiming to demean our work. I’ve been speaking about this for around seven years. I’ve been heckled by a loyal Google employee, and I’ve been called a tinfoil-hat-wearing ranter by a Facebook employee. I’ve had people tell me there just isn’t any other way, that I’m just trying to impede the “natural progress” of technology.

Slide 206

Slide 206

As Rose Eveleth wrote in a recent article on Vox, about that assertion that technology is just following its natural progress:

“The assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”

Slide 207

Slide 207

We can’t keep making poor excuses for bad practices.

Slide 208

Slide 208

We must consider who we are implicitly endorsing when we recommend their work and their products.

Slide 209

Slide 209

And, I’m sorry, I don’t give a jot about all the cool shiz coming out of unethical companies. You are not examples to be held above others. Your work is hurting our world, not contributing to it.

Slide 210

Slide 210

Our whole approach matters.

Our whole approach matters. It’s not just about how we build technology, but our approach to being a part of communities that create technology.

Slide 211

Slide 211

You might be thinking “but I’m just one person.”

Slide 212

Slide 212

But we are an industry, we are communities, we are organisations, we are groups made up of many persons. And if more of us made an effort, we could have a huge impact.

Slide 213

Slide 213

You are not your job.

We have to remember that we are more than just the organisation we work for. If you work for a big corporation that does unethical things, you probably didn’t make the decision to do that bad thing. But I think the time has come that we can no longer unquestioningly defend our employers.

Slide 214

Slide 214

We need to use our social capital, we need to be the change we want to exist.

But how? I have some ideas…

Slide 215

Slide 215

1. Be independent.

We’ve got to be comfortable being different; we can’t just follow other people’s leads when those other people aren’t being good leaders. Don’t look to heroes who can let you down, and don’t be loyal to big corporations that care nothing for you.

Slide 216

Slide 216

2. Be the advisor.

Be the advisor. Do the research on inclusive, ethical technology, make recommendations to others. Make it harder for them to make excuses. (You’re here, you’re doing it already!)

Slide 217

Slide 217

3. Be the advocate.

Be the advocate. Marginalised folks shouldn’t have to risk themselves to make change. Advocate for others. Advocate for the underrepresented.

Slide 218

Slide 218

4. Be the questioner.

Question those defaults. Ask why it was chosen to be built that way in the first place. Try asking a start-up how it makes its money!

Slide 219

Slide 219

5. Be the gatekeeper.

When the advocacy isn’t getting you far enough, use your expertise to prevent unethical things from happening on your watch. You don’t have to deploy that website, and other people might not know how to deploy it without you.

Slide 220

Slide 220

6. Be difficult.

Be difficult. Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour.

Slide 221

Slide 221

7. Be unprofessional.

Don’t let anybody tell you that standing up for the needs of yourself, and others, is unprofessional. Don’t let people tell you to be quiet. Or that you’ll get things done if you’re a bit nicer.

Slide 222

Slide 222

8. Be the supporter.

Be the supporter. If you are not comfortable speaking up for yourself, at least be there for those that do. Remember silence is complicity.

Slide 223

Slide 223

Speaking up is risky. We’re often fighting entities far bigger than ourselves. We’re putting our lives, and the way we make money, at risk…

Slide 224

Slide 224

But letting technology continue this way is riskier.

Slide 225

Slide 225

Eat your vegetables.

Someone came up to me after I gave a talk a couple of months ago, and referred to me as “the woman who comes and tells people to eat their vegetables.” But I am not your mother!

Slide 226

Slide 226

We deserve better.

I’m just here because I want to tell you that we deserve better.

Slide 227