Defying the mainstream: building technology that respects our rights

A presentation at New Adventures in January 2020 in Nottingham, UK by Laura Kalbag

Slide 1

Defying the mainstream: building technology that respects our rights

Laura Kalbag, laurakalbag.com, @LauraKalbag, Small Technology Foundation, small-tech.org

Slide 2

So I’m browsing the Pink News site, because I think we can agree that the same-sex dance on Strictly at the end of last year was a dream…

Slide 3

And as I scroll down below the article, I am presented with reams of classic clickbait: “Cork, Want To Get The Latest Vista Hearing…”

Slide 4

“25 Celebs You Didn’t Realize Are Gay - No. 8 Will Surprise Women”

Slide 5

“Drink This Before Going to Bed to Help Burn Belly Fat”

Slide 6

This clickbait is quite revealing through its chosen topics alone. It knows I’m in Cork. It knows the broad subject matter of the article I’m reading, or of the site I’m on. It appears to suspect I’m a woman. And, well, who doesn’t feel targeted by clickbait ads about belly fat?

Slide 7

This clickbait is provided to Pink News by Taboola.

Slide 8

I work on blocking trackers with a privacy tool called Better Blocker, and we’ve looked into Taboola. In our crawls of the most popular sites on the web, we found Taboola on nearly 5% of sites.

Slide 9

Taboola’s aim is to:

“Drive marketing results by targeting your audience when they are most receptive to new messages.”

Slide 10

You can do that with their:

“Data Rich Recommendations: Ensure that your brand reaches interested people by leveraging the massive amounts of user data powering the Taboola engine.” (Emphasis my own!)

Slide 11

In fact, they provide a handy graphic here showing some of the information that might be useful about a site’s visitor. “Device and operating system”… but also… “In the market for > car, fashion, electric bike”… “Interest > pet lovers, environment, entertainment, science & tech.”

Slide 12

So I scroll down to Taboola’s privacy policy, to see how they know this information about me, and what they intend to do with it. They seem to have a specific policy for “Third Party Online Advertising”, so I’ll check that out.

Slide 13

“We automatically collect User Information when Users interact with our Services that appear on our Customers’ websites and digital properties.”

Slide 14

“Taboola collects only pseudonymized data, which means we do not know who you are because we do not know or process your name, email address, or other identifiable data.”

Slide 15

Let’s debunk this for a second. “Pseudonymised data” or “anonymised data” doesn’t mean you’re unidentifiable, even though it’s a claim that privacy policies have been hanging off for years…

Slide 16

As Bruce Schneier said over a decade ago in Wired, “it takes only a small named database [(as in a database containing names)] for someone to pry the anonymity off a much larger anonymous database.” They just need to compare some data points that match in each database.

Slide 17

A recent study (and it’s not the only study) into methods to re-identify individuals from anonymised datasets found “Using our model, we find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes.” Attributes such as age, gender, ethnicity, post code, number of children, number of cars owned, location, status updates, and results on a personality quiz.
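The linkage attack Schneier describes can be sketched in a few lines. This is a toy illustration with invented records and invented field names (`age`, `gender`, `postcode` standing in for the demographic attributes above), not any real dataset or real method from the study:

```javascript
// Hypothetical sketch of re-identification: join an "anonymised" dataset
// with a small named dataset on shared demographic attributes
// (quasi-identifiers). All records below are invented for illustration.
const anonymised = [
  { id: 'a1', age: 34, gender: 'F', postcode: 'T12', interest: 'electric bikes' },
  { id: 'a2', age: 58, gender: 'M', postcode: 'NG1', interest: 'pet lovers' },
];

const named = [
  { name: 'Alice', age: 34, gender: 'F', postcode: 'T12' },
  { name: 'Bob', age: 61, gender: 'M', postcode: 'NG7' },
];

// Link any anonymised record to a name when its quasi-identifiers match.
// With enough attributes, a match is almost always unique to one person.
function reidentify(anonymisedRecords, namedRecords) {
  const matches = [];
  for (const anon of anonymisedRecords) {
    for (const person of namedRecords) {
      if (
        anon.age === person.age &&
        anon.gender === person.gender &&
        anon.postcode === person.postcode
      ) {
        matches.push({ name: person.name, interest: anon.interest });
      }
    }
  }
  return matches;
}
```

With only three attributes here, the “anonymous” interest profile snaps onto a name; fifteen attributes, per the study, is enough to do this reliably at population scale.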

Slide 18

Returning to Taboola’s privacy policy… I want to know how the interests Taboola infers compare to these kinds of demographic attributes. They’re described by Taboola as “data segments”… “A data segment is a grouping of users who share one or more attributes (e.g., travel enthusiasts). We offer a number of data segments, both proprietary and from our data partners”. Kindly, they’ve provided a link to their data partners…

Slide 19

Now two of these data partners stand out to me in particular… Acxiom and Oracle

Slide 20

And that’s because Cracked Labs have done multiple reports into the personal data that corporations collect, combine, analyse, trade and use, and into the data brokers that deal in this data.

Slide 21

…featuring two of the biggest data brokers: Oracle and Acxiom.

Slide 22

According to Cracked Labs: “Acxiom provides up to 3000 attributes and scores on 700 million people in the US, Europe, and other regions.”

Slide 23

Slide 24

I’ve picked out some of the creepiest bits of information from the Cracked Labs reports:

  • one of nearly 200 “ethnic codes”
  • political views
  • relationship status
  • income
  • details about banking and insurance policies
  • type of home: including if your home is a prison…

Slide 25

  • likelihood whether a person is planning to have a baby or adopt a child
  • number and age of children
  • purchases, including whether a person bought pain relief products
  • whether a person is likely to have an interest in the air force, army, navy, lottery and sweepstakes, or gay and lesbian movies…

Slide 26

search history, including whether a person searched about: abortion, legalising drugs, or gay marriage, protests, strikes, boycotts or riots. And the likelihood that a person is a social influencer or is socially influenced.

Slide 27

Taboola says it “does not knowingly create segments that are based upon what we consider to be sensitive information…” Hmm… Helpfully, Taboola also provides a detailed list of all their apparently not-sensitive “standard health-related segments”…

Slide 28

  • Active Health Management: Far Below Average
  • Health: I Have No Confidence in The Health Care System
  • family and parenting > motherhood > artificial insemination

Slide 29

  • First Sign of Pain, I Take Medicine
  • health and fitness > addiction
  • health and fitness > disorders > panic and anxiety

Slide 30

  • Personality - Dealing with Stress - Bottled Up
  • Personality - Dealing with Stress - Emotional
  • Personality - Dealing with Stress - Quick Fix

This isn’t exactly the kind of information you want marketers to use to sell to you… It is personal.

Slide 31

Personality attributes were also used by Cambridge Analytica, who collected them through a personality test app on Facebook that also harvested the profiles of participants’ friends and friends’ friends.

Slide 32

In this personality test app, “Users were scored on ‘big five’ personality traits – Openness, Conscientiousness, Extroversion, Agreeableness and Neuroticism – and in exchange, 40% of them consented to access to their Facebook profiles.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 33

“[Cambridge Analytica] itself claimed to be able to analyse huge amounts of consumer data and combine that with behavioural science to identify people who organisations can target with marketing material.” Source: https://www.theguardian.com/news/2018/mar/18/what-is-cambridge-analytica-firm-at-centre-of-facebook-data-breach

Slide 34

Profiling

It’s profiling.

Slide 35

Cambridge Analytica was a venture of SCL Elections, whose “expertise was in ‘psychological operations’ – or psyops – changing people’s minds not through persuasion but through ‘informational dominance’, a set of techniques that includes rumour, disinformation and fake news.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 36

Targeting.

That’s targeting.

Slide 37

The same SCL worked with Steve Bannon on the Trump election campaign. And as this neat graphic from The Guardian shows, SCL’s ventures, Cambridge Analytica and AggregateIQ, worked on multiple Brexit Leave campaigns too. Source: https://www.theguardian.com/news/2018/mar/18/what-is-cambridge-analytica-firm-at-centre-of-facebook-data-breach

Slide 38

Manipulating.

We, as citizens, could be manipulated by the profiling and targeting.

Slide 39

This is all the topic of a recent documentary on Netflix called The Great Hack. And I’d really recommend it if you want a lot of the information without having to do all the reading. It’s accessible for your friends who don’t speak tech-y too.

Slide 40

Tracking affects democracy.

It’s no exaggeration to say that tracking affects democracy. And if we use tracking, we have to consider its ethical implications.

Slide 41

I could talk about this in more depth for much longer, but I’ve just not got the time. If you want a read, the book ‘The Age of Surveillance Capitalism’ by Shoshana Zuboff contains both the history and the predicted future of these massive, complex surveillance systems.

Slide 42

Shoshana Zuboff coined the term ‘surveillance capitalism’ and describes it in this book…

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”

Slide 43

And if you take one look at the size of that book and decide to opt-out, try listening to Shoshana Zuboff interviewed on the Adam Buxton podcast instead.

Slide 44

But it’s all so convenient!

Many people argue that profiling and targeting are OK because they make technology more convenient for the majority of us.

Slide 45

Convenient exploitative technology is like fluffy handcuffs. They may look cute and fluffy, they might lead to some fun. But they’re still handcuffs, and you always want to have access to the key.

Slide 46

How to protect ourselves (individuals)

How can we protect ourselves (as individuals)? Let’s look at a few of the things you can do:

Slide 47

A. Avoid logging in.

Avoid logging in. (If you can.) For example, when you’re watching videos on YouTube.

Slide 48

Fingerprinting

However, many platforms will still track you via fingerprinting: a combination of identifiers unique to your browser that together act as a fingerprint. Identifiers like your browser height and width, the device you’re using, and (ironically) whether you have ‘Do Not Track’ set in your browser preferences.

Slide 49

In 2015, Facebook even filed a patent saying it could identify people who might know each other because they appear in photos taken by the same camera… the camera being identified by identical lens scratches and dust. Source: https://gizmodo.com/facebook-knows-how-to-track-you-using-the-dust-on-your-1821030620

Slide 50

B. Avoid providing your phone number.

Avoid providing your phone number. Folks recommend using two-factor authentication to prevent nefarious strangers from getting into your accounts. But be aware that phone numbers are a little risky for authentication…

Slide 51

A study into whether Facebook used personally-identifiable information for targeted advertising found that when “We added and verified a phone number for [two-factor authentication] to one of the authors’ accounts… the phone number became targetable [by advertising] after 22 days”

Slide 52

And not long ago, Twitter admitted they did the same thing: “When an advertiser uploaded their marketing list, we may have matched people on Twitter to their list… based on the email or phone number the Twitter account holder provided for safety and security purposes.” Source: https://www.wired.com/story/twitter-two-factor-advertising/

Slide 53

C. Disallow cookies.

Disallow cookies in your browser preferences.

Slide 54

C. Disallow cookies.

Thing is, if we block cookies, many sites fall to pieces, and usually silently. Even if we only block cookies from third parties. If a site relies on a third party for anything persistent, be it logins, preferences, even shopping baskets… that’ll probably break.

Slide 55

D. Don’t use Gmail.

Don’t use Gmail. Your email not only contains all your communication, but the receipts for everything you’ve bought, the confirmations of every event you’ve signed up for, and every platform, newsletter, and service you’ve joined.

Slide 56

From our own crawls of the web for Better Blocker, we discovered Google has its tentacles in around 80% of the popular web. Think of all the information Google can extract from those sites.

Slide 57

Your choices affect your friends and family.

Though if your friends and family use Gmail, you’re a bit stuck. It cuts both ways: your choices affect your friends and family too.

Slide 58

We’re not just tracked on the web.

Of course, these are all choices we can make once we’re on the web, but we need to be aware of other places where we are tracked…

Slide 59

Slide 60

Amazon Ring and Alexa can hear everything you say and spy on your neighbours. Source: https://reallifemag.com/false-alarm/

Slide 61

Slide 62

A smart pacifier means you can put a chip in your baby. Source: https://www.pacif-i.io

Slide 63

Of course it was only a matter of time before someone made a smart menstrual cup… Source: https://www.kickstarter.com/projects/700989404/looncup-the-worlds-first-smart-menstrual-cup

Slide 64

Slide 65

And let’s not forget the smart dildo…

Slide 66

We Connect (the smart dildo makers) were even sued for tracking users’ “habits”. Source: https://www.vocativ.com/358530/smart-dildo-company-sued-for-tracking-users-habits/

Slide 67

Have you ever wondered how many calories you’re burning during intercourse? How many thrusts? The speed of your thrusts? The duration of your sessions? Frequency? How many different positions you use in the period of a week, month or year? Then you want the iCondom. And have you ever wanted to share all that information with advertisers, insurers, your government, and who knows who else?

Slide 68

Avoiding it all is too much work.

Avoiding it all seems like a lot of work, right? I KNOW! I advocate for privacy…and I don’t have the time or the resources to do all of this all the time.

Slide 69

Don’t blame the victim.

That’s why it’s unfair to blame the victim for having their privacy eroded.

Slide 70

Our concept of privacy is being twisted.

Not to mention that our concept of privacy is getting twisted by the same people who have an agenda to erode it. One of the biggest culprits in attempting to redefine privacy: Facebook.

Slide 71

Here is a Facebook ad that’s recently been showing on TVs. It shows a person undressing behind towels held up by her friends on the beach, alongside the Facebook post visibility options, “Public, Friends, Only Me, Close Friends,” explaining how we each have different privacy preferences in life. It ends saying “there’s lots of ways to control your privacy settings on Facebook.”…

Slide 72

But it doesn’t mention that “Friends”, “Only Me”, and “Close Friends” should really read: “Friends (and Facebook)”, “Only Me (and Facebook)” and “Close Friends (and Facebook)”. Because you’re never really sharing something with “Only Me” on Facebook. Facebook Inc. has access to everything you share.

Slide 73

Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself. Facebook shouldn’t be trying to tell us otherwise.

Slide 74

Google has an interesting interpretation of privacy too. Ten years ago, Eric Schmidt, then CEO of Google (now Executive Chairman of Alphabet, Google’s parent corporation), famously said:

“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”

A lot of people will say lines like this. But would they feel comfortable sharing their own (uncleared) browser history?

Slide 75

Shouldn’t we be smart about what we share publicly?

Do we need to be smart about what we share publicly? Sure! Don’t go posting photos of your credit card or your home address. Maybe it’s unwise to share that photo of yourself blackout drunk when you’ve got a job interview next week. Perhaps we should take responsibility if we say something awful to another person online.… But this isn’t about what we knowingly share publicly.

Slide 76

Corporations blame us for giving up our privacy.

Right now, the corporations are more than happy to blame us for our loss of privacy. They say we agreed to the terms and conditions, we should read the privacy policies, it’s our fault.

Slide 77

This in itself is the subject of a whole documentary that was made way back in 2013, called “Terms And Conditions May Apply” covering the ridiculous length and legalese in terms and conditions, and how we couldn’t possibly read them for every service we use.

Slide 78

More recently, an editorial board op-ed in the New York Times pointed out the flaws in privacy policies for consent:

“The clicks that pass for consent are uninformed, non-negotiated and offered in exchange for services that are often necessary for civic life.”

Slide 79

There are studies that speak to how difficult it is to understand these policies too: “Two law professors analyzed the sign-in terms and conditions of 500 popular US websites, including Google and Facebook, and found that more than 99 percent of them were “unreadable,” far exceeding the level most American adults read at, but are still enforced.” Source: https://www.vice.com/en_us/article/xwbg7j/online-contract-terms-of-service-are-incomprehensible-to-adults-study-finds

Slide 80

It is not informed consent

It’s not informed consent when you can’t understand the terms. And how can we even truly consent if we don’t know how our information can be used against us?

Slide 81

It’s not consent if there’s not a real choice.

And it’s not true consent if it’s not a real choice. We’re not asked who should be allowed access to our information, and how much, and how often, and for how long, and when…

Slide 82

Like this fresh hell from Huffington Post. They’ve got this because the GDPR (General Data Protection Regulation) requires they ask for consent before tracking you… I go to read a post on their site and I get an option that essentially reads “Before you continue… and OK” because who is going to read that boring paragraph of text in between? I select Manage options, because I want to opt out!

Slide 83

And I’m presented with yet more text saying the same nothing and another OK button. I want to opt out, so I choose “Manage.”

Slide 84

Yet more text saying the same nothing and “Done.” Well, as much as I’m done with it all, I came here to opt out. So I guess I look at the small text that says “See how our partners use your data”. I select “Show…”

Slide 85

Ooh, a box of text saying the same nothing again, but in small white text on blue this time. I have to hover over a tiny blue “i” icon to read it. Still no option to opt out. I select “Hide” to hide that…

Slide 86

What about in “See and customise which partners can use your data”?

Slide 87

Ah nice, a list of every third-party service they use and a link to each privacy policy, many of which make no reference to the third-party use of their services. Delightful.

Slide 88

There is no choice.

At no point was there a choice for me. Nothing that resembles consent. Just lots of text saying the same thing over and over again until I give up and just select OK. (Incidentally, these interfaces have already been ruled non-compliant with the GDPR.)

Slide 89

We’re asked to give up everything or get nothing.

We’re asked to give up everything or get nothing. That’s not a real choice.

Slide 90

The cost of not consenting is to lose access to social, civil and labour infrastructure.

It’s certainly not a real choice when the cost of not consenting is to lose access to social, civil, and labour infrastructure.

Slide 91

There’s a recent paper by Jan Fernback and Gwen Shaffer, ‘Cell Phones, Security and Social Capital.’ The paper examines the privacy tradeoffs that disproportionately affect mobile-mostly internet users. What they found shows the cost of not giving consent.

Slide 92

Speaking about this paper, Gwen Shaffer explained:

“All individuals are vulnerable to security breaches, identity fraud, system errors, and hacking. But economically disadvantaged individuals who rely exclusively on their mobile phones to access the internet are disproportionately exploited…”

Slide 93

“Some focus group participants reported that, in an effort to maintain data privacy, they modify online activities in ways that harm personal relationships and force them to forego job opportunities.”

Source: https://www.miccenter.org/wp-content/uploads/2019/08/CellPhones_1a.pdf

Slide 94

The technology we use is our new everyday things.

The technology we use is our new everyday things. It forms that vital social, civil, and labour infrastructure. And as (largely) helpless consumers, there’s often not much we can do to protect ourselves without a lot of free time and money.

Slide 95

When the technology you use is a lifeline to access, you are impacted more severely by its unethical factors.

Slide 96

Two years ago, Dr Frances Ryan covered this in her article, ‘The missing link: why disabled people can’t afford to #DeleteFacebook’. After the Cambridge Analytica scandal was uncovered, many people started encouraging each other to #DeleteFacebook. In fact, it’s been happening again the last few weeks.

Slide 97

Dr Ryan pointed out:

“I can’t help but wonder if only privileged people can afford to take a position of social media puritanism. For many, particularly people from marginalised groups, social media is a lifeline – a bridge to a new community, a route to employment, a way to tackle isolation.”

Slide 98

Technology can’t fix issues of domination, oppression or discrimination.

Like so many issues we have with technology, what we’re dealing with are the underlying social and systemic issues. As technologists, we often can’t help ourselves trying to fix or smooth over problems with technology. But technology can’t fix issues of domination, oppression or discrimination.

Slide 99

Technology amplifies social and systemic issues.

Technology can make those issues worse. We can (and do) amplify and speed up systemic issues with technology.

Slide 100

Mike Ananny recently made the point in an article about tech platforms that we still seem to operate with the notion that online life is somehow a different life, detached from our everyday existence.

Slide 101

Tech platforms often take advantage of that notion by suggesting that if we don’t like “technology” we can just log out, log off, and be mindful or some other shit instead. People with this mindset often show how shallow they are by saying “if you don’t like the technology, you don’t have to use it…”

Slide 102

But we can’t escape technology:

“Platforms are societies of intertwined people and machines. There is no such thing as “online life” versus “real life.” We give massive ground if we pretend that these companies are simply having an “effect” or “impact” on some separate society.”

Source: https://www.niemanlab.org/2019/10/tech-platforms-are-where-public-life-is-increasingly-constructed-and-their-motivations-are-far-from-neutral/

Slide 103

Technology colonialism

Which brings me to another issue rife in technology today. Technology colonialism.

Slide 104

Slide 105

He started with some history:

“Colonial powers always saw themselves as superiors over the native people whose culture was rarely recognized or respected. The colonizers saw economic value in… foreign relations, but it was always viewed as a transaction based on inequality.”

Slide 106

And then comparing it to what we so often do in technology:

“Technology companies continue this same philosophy in how they present their own products. These products are almost always designed by white men for a global audience with little understanding of the diverse interests of end users.”

Slide 107

We have to reckon with our colonial history. This speaks to us politically too, but today I’m talking about our tech industry, our tech community. We have to reckon with the colonial way in which we’ve created technology…

Slide 108

We don’t speak to users. Instead, we use analytics and data to design interfaces for people we’ll never try to speak to, or ask whether they even wanted our tech in the first place. We’ll assume we know best because we are the experts, and they are “just users.” We don’t have diverse teams, we barely even try to involve people with backgrounds different from our own. We fetishise our tools, valuing the designer experience over that of the people using what we build.

Slide 109

“Intent does not erase impact.”

We can say we had the right intentions, but that truly means nothing. This has been explored in depth recently by Tatiana Mac, who invokes “intent does not erase impact” to describe our often-haphazard approach to designing technology.

Slide 110

We not only have a responsibility to design more ethical technology, but to consider the impact our design has outside of its immediate interface.

Slide 111

It’s hard to advocate for change when alternatives don’t yet exist.

As the people advocating for change, we can’t exactly go around telling people to stop using this technology unless there are real, ethical alternatives.

Slide 112

We have the power to make that change.

That’s where you and I come in. As people who work in technology, and who create technology, we have far more power to make change. We can encourage more ethical practice. We can build alternatives.

Slide 113

How to build more rights-respecting technology

How do we build more rights-respecting technology?

Slide 114

Build small technology

As an antidote to big tech, we need to build small technology.

Slide 115

Everyday tools for everyday people designed to increase human welfare, not corporate profits.

Yeah, sure it’s a lofty goal, but there are practical ways to approach it.

Slide 116

Make it easy to use.

First off, make it easy to use.

Slide 117

Protecting ourselves shouldn’t just be the privilege of those with knowledge, time and money.

Plenty of privacy-respecting tools exist for nerds to protect themselves (I use some of them). But we mustn’t make protecting ourselves a privilege only available to those who have the knowledge, time and money.

Slide 118

It’s why we must make easy-to-use technology that is

  • functional (this includes accessible. If it’s not accessible, it’s not functional)
  • convenient
  • reliable

Slide 119

Make it inclusive.

Slide 120

We must ensure people have equal rights and access to the tools we build and the communities who build them, with a particular focus on including people from traditionally marginalised groups. Current privacy-respecting tools (and most of our tools) are particularly terrible at this: they’re rarely built inclusively, and they often surround themselves with toxic communities.

Slide 121

Don’t be colonial.

Slide 122

Our teams must reflect the intended audience of our technology.

Slide 123

If we can’t build teams like this (some of us work in small teams or as individuals), we must ensure people with different needs can take what we make and specialise it for their needs. We can build upon the best practices and shared experiences of others, but we should not be making assumptions about what is suitable for an audience we are not a part of.

Slide 124

Make it personal.

Slide 125

Build technology for everyday people, not just startups and enterprises.

We’ve got to stop our infatuation with growth and greed. Focus on building personal technology for everyday people, not spending all our focus, experience, and money on tools for startups and enterprises.

Slide 126

Make it private by default.

Slide 127

I’m saying it again, to undo Facebook’s attempts at privacy-washing: Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself.

Slide 128

Make your technology functional without personal information.

You don’t need to know a person’s gender to provide them with services. You don’t need analytics that segment people into stereotypes based on guesswork.

Slide 129

Consent:

Allow people to share their information for relevant functionality only with their explicit consent.

Slide 130

When obtaining consent, tell the person how you’ll use their information, when you’ll use it, who will have access to it, and how long you will keep that information stored. (This is now a requirement under the GDPR, the EU’s General Data Protection Regulation.)
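One way to picture what such consent needs to capture is as a record like this. The shape and field names here are invented for illustration, not any standard or legal template:

```javascript
// Hypothetical shape for a consent record, capturing the points above:
// what is collected, why, who can access it, and for how long.
// Every field name is invented for illustration.
const consentRecord = {
  purpose: 'Sync your reading list across your devices',
  dataUsed: ['reading list items'],
  accessibleTo: ['you only (end-to-end encrypted)'],
  retainedFor: 'until you delete your account',
  givenAt: new Date('2020-01-23T10:00:00Z').toISOString(),
  withdrawable: true, // under the GDPR, withdrawing must be as easy as giving
};
```

If you can’t fill in every field honestly and specifically, that’s a sign the consent you’re asking for isn’t informed.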

Slide 131

Write easy-to-understand privacy policies.

Don’t just copy and paste them from other sites (they probably copy-pasted them in the first place!). Ensure the privacy policy stays up to date with every update to your technology.

Slide 132

Don’t use third-party consent frameworks.

Don’t use third-party consent frameworks. Most of these aren’t GDPR-compliant, they’re awful experiences for your visitors, and they may well get you into legal trouble.

Slide 133

Don’t use third-party services. (If you can avoid them.)

Don’t use third-party services at all if you can avoid them. (As they present a risk to you and your users.)

Slide 134

Make it your responsibility to know what they’re doing with your users’ information.

If you do use third-party services, make it your responsibility to know their terms and policies, what information they are collecting, and what they are doing with that information.

Slide 135

Self-host all the things.

If you use third-party scripts, content delivery networks, videos, images and fonts, self-host them wherever possible. Ask the providers if it’s unclear whether they provide a self-hosted option.

Slide 136

Social media etiquette:

And it’s probably worth mentioning a little bit of social media etiquette: if you know how, strip the tracking identifiers and Google AMP junk from URLs before you share them. Friends don’t let corporations invade their friends’ privacy.
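For those who want to automate the habit, stripping tracking parameters can be sketched like this. The parameter list is illustrative, not exhaustive:

```javascript
// Sketch: strip common tracking parameters from a URL before sharing it.
// This list covers the usual utm_* campaign tags plus Facebook's and
// Google's click identifiers; real-world lists are much longer.
const TRACKING_PARAMS = [
  'utm_source',
  'utm_medium',
  'utm_campaign',
  'utm_term',
  'utm_content',
  'fbclid',
  'gclid',
];

function cleanUrl(raw) {
  const url = new URL(raw);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}
```

For example, `cleanUrl('https://example.com/post?id=3&utm_source=twitter&fbclid=abc')` keeps the meaningful `id` parameter and drops the tracking ones.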

Slide 137

Post to your own site first, then mirror those posts to third-party platforms.

If you feel you need a presence on social media or blogging platforms, don’t make it the only option. Post to your own site first, then mirror those posts on third-party platforms for the exposure you desire.

Slide 138

Your basic blog is way better than Medium.

Slide 139

Make it zero-knowledge.

Slide 140

Zero-knowledge tech has no knowledge of your information. It may store a person’s information, but the people who make or host the tech cannot access that information, even if they want to.

Slide 141

Keep a person’s information on their device where possible.

Slide 142

Ensure any information synced to another device is end-to-end encrypted.

If a person’s information needs to be synced to another device, ensure that information is end-to-end encrypted, with only that person having access to decrypt it.

Slide 143

Slide 143

The adage is true: the cloud is just somebody else’s computer.

Slide 144

Slide 144

Make it share alike.

And we also have to take care with how we share our technology, and how we sustain its existence. Make it share alike.

Slide 145

Slide 145

Cultivate a healthy commons by using licences that allow others to build upon, and contribute back to, your work. Don’t let Big Tech use your work if they’re not going to contribute their changes back.

Slide 146

Slide 146

Make it noncommercial.

And the same goes for how we sustain our technology’s existence. Make it noncommercial.

Slide 147

Slide 147

Support stayups, not startups.

My partner at Small Technology Foundation, Aral Balkan, coined the term stayup for the anti-startup. We don’t need more tech companies aiming to fail fast or be sold as quickly as possible. We need long-term sustainable technology.

Slide 148

Slide 148

Support not-for-profit technology.

If we are building sustainable technology for everyday people, we need a compatible funding model, not venture capital or equity-based investment.

Slide 149

Slide 149

It feels impossible. It probably is!

It may feel difficult or even impossible to build small technology with your current employer or organisation. It probably is! But there are steps we can take to give ourselves the opportunity to build more ethical technology.

Slide 150

Slide 150

1. Use small technology as a criterion.

Use small technology as a criterion when you’re looking for your next job. You don’t have to be at your current job forever.

Slide 151

Slide 151

2. Seek alternatives.

Seek alternatives to the software you use every day.

Slide 152

Slide 152

switching.software has a great list of resources, provided by people who really care about ethical technology as well as ease of use.

Slide 153

Slide 153

3. If you can’t do it at work, do it at home.

If you can’t do it at work, do it at home. If you have the time, make a personal website and practise small technology on your own projects.

Slide 154

Slide 154

We’re trying to build tools to help people do this (without compromising their site’s visitors). This is Site.js, and we’re building it so you can build a secure personal website without all that configuration, and without relying on a third-party service that might start tracking all your visitors without your consent, or pivot to a dodgy business model. (It’s free and open small technology!)

Slide 155

Slide 155

It even runs on a Raspberry Pi, if you like the idea of holding a portable server in your hand, running off 4G.

Slide 156

Slide 156

Of course there’s no tracking. It has ephemeral server statistics so you can see which pages are popular (and which requests hit pages that aren’t there: 404s), but it does not track the site’s visitors.
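This isn’t Site.js’s actual implementation, but the idea of ephemeral, privacy-respecting statistics can be sketched in a few lines: keep only aggregate counters in memory, record nothing about who visited, and let everything vanish on restart. All the names here are hypothetical.

```python
from collections import Counter

# In-memory counters only: no IP addresses, no cookies, no per-visitor records.
# Everything disappears when the server restarts - hence "ephemeral".
page_hits = Counter()     # successful requests, by path
missing_hits = Counter()  # requested paths that returned 404

def record_request(path: str, status: int) -> None:
    """Count the request by path alone; nothing about the visitor is stored."""
    if status == 404:
        missing_hits[path] += 1
    else:
        page_hits[path] += 1

record_request("/", 200)
record_request("/", 200)
record_request("/blog", 200)
record_request("/old-page", 404)

print(page_hits.most_common(1))  # → [('/', 2)]  (the most popular page)
print(dict(missing_hits))        # → {'/old-page': 1}  (broken links worth fixing)
```

The point of the design is that the data answers the site owner’s questions (what’s popular, what’s broken) without ever being able to answer questions about individual people.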

It’s a tiny tool in a world of choice, but it is an alternative.

Slide 157

Slide 157

Tin foil hats are all the rage.

I’ve been speaking about tracking and privacy for around seven years. (Luckily it’s becoming more mainstream these days!) I’ve been heckled by a loyal Google employee and called a tinfoil-hat-wearing ranter by a Facebook employee. I’ve had people tell me there just isn’t any other way, that I’m just trying to impede the “natural progress” of technology…

Slide 158

Slide 158

As Rose Eveleth wrote in a recent article on Vox: “The assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”

Slide 159

Slide 159

We can’t keep making poor excuses for bad practices.

Slide 160

Slide 160

Divest ourselves of unethical organisations.

We must divest ourselves of unethical organisations. Consider who we are financially supporting or implicitly endorsing when we recommend their work and their products.

Slide 161

Slide 161

And, I’m sorry, I don’t give a fuck about all the cool shit coming out of exploitative companies. You are not examples to be held above others. Your work is hurting our world, not contributing to it.

Slide 162

Slide 162

Our whole approach matters.

Our whole approach matters. It’s not just about our philosophy, or how we build technology, but our approach to being part of the communities that create technology.

Slide 163

Slide 163

I’m just one person.

You might be thinking “but I’m just one person.”

Slide 164

Slide 164

But we are an industry, we are communities, we are organisations, we are groups made up of many persons. And if we work together on this, we could have a huge impact.

Slide 165

Slide 165

You are not your job.

We have to remember that we are more than just the organisation we work for. If you work for a big corporation that does exploitative things, you probably didn’t make the decision to do that bad thing. But the time has come when we can no longer unquestioningly defend our employers or clients.

Slide 166

Slide 166

We need to use our social capital (also known as privilege!); we need to be the change we want to exist.

Slide 167

Slide 167

Make change happen

There are different roles we can take in making change happen. Roles that fit us differently depending on our privilege and our position within an organisation…

Slide 168

Slide 168

1. Be independent.

We’ve got to be comfortable being different; we can’t just follow other people’s leads when those people aren’t being good leaders. Don’t look to heroes who can let you down, and don’t be loyal to big corporations that care nothing for you.

Slide 169

Slide 169

2. Be the advisor.

Do the research on inclusive, ethical technology, and make recommendations to others. Make it harder for them to make excuses.

Slide 170

Slide 170

3. Be the advocate.

Marginalised folks shouldn’t have to risk themselves to make change. Advocate for others. The overrepresented should advocate for the underrepresented.

Slide 171

Slide 171

4. Be the questioner.

Question those defaults. Ask why it was chosen to be built that way in the first place. Try asking a start-up how it makes its money!

Slide 172

Slide 172

5. Be the gatekeeper.

When the advocacy isn’t getting you far enough, use your expertise to prevent exploitative things from happening on your watch.

Slide 173

Slide 173

6. Be difficult.

Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour.

Slide 174

Slide 174

7. Be unprofessional.

Don’t let anybody tell you that standing up for the needs of yourself, and others, is unprofessional. Don’t let people tell you to be quiet. Or that you’ll get things done if you’re a bit nicer.

Slide 175

Slide 175

8. Be the supporter.

If you are not comfortable speaking up for yourself, at least be there for those who do. Remember: silence is complicity.

Slide 176

Slide 176

Speaking up is risky and hard.

It can be really fucking lonely. We’re often fighting entities far bigger than ourselves. Our lives and our ability to make a living are at risk.

Slide 177

Slide 177

But letting technology continue this way is riskier. Like society and democracy riskier.

Slide 178

Slide 178

We deserve better.

People say the talks I give are scary, but I’m not here to scare you. I’m just here because I want to tell you that we deserve better.

Slide 179

Slide 179

Thank you!

Slides are at https://noti.st/laurakalbag

Laura Kalbag, laurakalbag.com, @LauraKalbag. Small Technology Foundation, small-tech.org