8 Unbelievable Things You Never Knew About Tracking

A presentation at ffconf in November 2019 in Brighton, UK by Laura Kalbag

Slide 1

8 Unbelievable Things You Never Knew About Tracking

Laura Kalbag, laurakalbag.com, @LauraKalbag, Small Technology Foundation, small-tech.org

Slide 2

So I’m browsing the Pink News site, because I think we can agree that the same-sex dance on Strictly last weekend was a dream…

Slide 3

And as I scroll down below the article, I am presented with reams of classic clickbait: “Cork, Want To Get The Latest Vista Hearing…”

Slide 4

“25 Celebs You Didn’t Realize Are Gay - No. 8 Will Surprise Women”

Slide 5

“Drink This Before Going to Bed to Help Burn Belly Fat”

Slide 6

This clickbait is quite revealing through its chosen topics alone. It knows I’m in Cork. It knows the broad subject matter of the content of the article I’m reading, or the site I’m on. It appears to suspect I’m a woman. And well, who doesn’t feel targeted by clickbait ads about belly fat.

Slide 7

This clickbait is provided to Pink News by Taboola. On this page alone, Taboola is requested through 14 scripts, 23 images, 2 stylesheets and inside 5 frames.

Slide 8

I work on blocking trackers with a privacy tool called Better Blocker, and we’ve looked into Taboola. In our crawls of the most popular sites on the web, we found Taboola on nearly 5% of sites.

Slide 9

Taboola’s aim is to:

“Drive marketing results by targeting your audience when they are most receptive to new messages.”

Slide 10

You can do that with their:

“Data Rich Recommendations: Ensure that your brand reaches interested people by leveraging the massive amounts of user data powering the Taboola engine.”

Slide 11

In fact, they provide a handy graphic here showing some of the information that might be useful about a site’s visitor. “Device and operating system”… but also… “In the market for > car, fashion, electric bike”… “Interest > pet lovers, environment, entertainment, science & tech.”

Slide 12

So I scroll down to Taboola’s privacy policy, to see how they know this information about me, and what they intend to do with it. They seem to have a specific policy for “Third Party Online Advertising”, so I’ll check that out.

Slide 13

“We automatically collect User Information when Users interact with our Services that appear on our Customers’ websites and digital properties.”

Slide 14

“Taboola collects only pseudonymized data, which means we do not know who you are because we do not know or process your name, email address, or other identifiable data.”

Slide 15

Let’s debunk this for a second. “Pseudonymized data” or “anonymised data” doesn’t mean you’re unidentifiable, even though it’s a claim that privacy policies have hung their hats on for years…

Slide 16

As Bruce Schneier said over a decade ago in Wired, “it takes only a small named database [(as in a database containing names)] for someone to pry the anonymity off a much larger anonymous database.” They just need to compare some data points that match.
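
To make that linkage concrete, here’s a minimal sketch in TypeScript, with entirely made-up records and field names, of how a few shared attributes let a small named database pry names onto an “anonymous” one:

```typescript
// Hypothetical linkage attack: join an "anonymised" dataset with a small
// named one on shared attributes. All records and fields are made up.

interface AnonRow { postcode: string; birthYear: number; gender: string; diagnosis: string; }
interface NamedRow { name: string; postcode: string; birthYear: number; gender: string; }

function reidentify(anon: AnonRow[], named: NamedRow[]) {
  const matches: { name: string; diagnosis: string }[] = [];
  for (const row of anon) {
    // Postcode + birth year + gender is often enough to single someone out.
    const hits = named.filter(
      (n) => n.postcode === row.postcode && n.birthYear === row.birthYear && n.gender === row.gender
    );
    if (hits.length === 1) matches.push({ name: hits[0].name, diagnosis: row.diagnosis });
  }
  return matches;
}

const anonymised: AnonRow[] = [
  { postcode: "T12", birthYear: 1986, gender: "F", diagnosis: "anxiety" },
];
const electoralRoll: NamedRow[] = [
  { name: "Jane Doe", postcode: "T12", birthYear: 1986, gender: "F" },
];
console.log(reidentify(anonymised, electoralRoll));
// → [ { name: "Jane Doe", diagnosis: "anxiety" } ]
```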

Slide 17

A recent study (and it’s not the only study) into methods to re-identify individuals from anonymised datasets found “Using our model, we find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes.” Attributes such as age, gender, ethnicity, post code, number of children, number of cars owned, location, status updates, and results on a personality quiz.

Slide 18

Returning to Taboola’s privacy policy… I want to know how the interests Taboola infers compare to these kinds of demographic attributes. They’re described by Taboola as “data segments”… “A data segment is a grouping of users who share one or more attributes (e.g., travel enthusiasts). We offer a number of data segments, both proprietary and from our data partners”. Kindly, they’ve provided a link to their data partners…

Slide 19

Now two of these data partners stand out to me in particular: Acxiom and Oracle.

Slide 20

And that’s because Cracked Labs have done multiple reports into the personal data that corporations collect, combine, analyse, trade and use. And the data brokers that deal in this data.

Slide 21

…featuring two of the biggest data brokers: Oracle and Acxiom.

Slide 22

According to Cracked Labs: “Acxiom provides up to 3000 attributes and scores on 700 million people in the US, Europe, and other regions.”

Slide 23

Slide 24

I’ve picked out some of the creepiest bits of information from the Cracked Labs reports:

  • one of nearly 200 “ethnic codes”
  • political views
  • relationship status
  • income
  • details about banking and insurance policies
  • type of home: including if your home is a prison…

Slide 25

  • likelihood whether a person is planning to have a baby or adopt a child
  • number and age of children
  • purchases, including whether a person bought pain relief products
  • whether a person is likely to have an interest in the air force, army, navy, lottery and sweepstakes, or gay and lesbian movies…

Slide 26

Search history, including whether a person searched about abortion, legalising drugs, gay marriage, protests, strikes, boycotts or riots. And the likelihood that a person is a social influencer or is socially influenced.

Slide 27

Taboola says it “does not knowingly create segments that are based upon what we consider to be sensitive information…” Hmm… Helpfully, Taboola also provides a detailed list of all their apparently not-sensitive “standard health-related segments”…

Slide 28

I’ve picked out some that really jumped out at me as being quite personal:

  • Active Health Management: Far Below Average
  • Health: I Have No Confidence in The Health Care System
  • family and parenting > motherhood > artificial insemination

Slide 29

  • First Sign of Pain, I Take Medicine
  • health and fitness > addiction
  • health and fitness > disorders > panic and anxiety

Slide 30

  • Personality - Dealing with Stress - Bottled Up
  • Personality - Dealing with Stress - Emotional
  • Personality - Dealing with Stress - Quick Fix

This isn’t exactly the kind of information you want marketers to use to sell to you…

Slide 31

Cambridge Analytica

Personality attributes were also used by Cambridge Analytica. They collected them through a personality test app on Facebook that also harvested the profiles of participants’ friends and friends’ friends.

Slide 32

In this personality test app, “Users were scored on ‘big five’ personality traits – Openness, Conscientiousness, Extroversion, Agreeableness and Neuroticism – and in exchange, 40% of them consented to access to their Facebook profiles.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 33

“[Cambridge Analytica] itself claimed to be able to analyse huge amounts of consumer data and combine that with behavioural science to identify people who organisations can target with marketing material. It collected data from a wide range of sources, including social media platforms such as Facebook, and its own polling.” Source: https://www.theguardian.com/news/2018/mar/18/what-is-cambridge-analytica-firm-at-centre-of-facebook-data-breach

Slide 34

Profiling.

It’s profiling.

Slide 35

Cambridge Analytica was a venture of SCL Elections whose “expertise was in “psychological operations” – or psyops – changing people’s minds not through persuasion but through “informational dominance”, a set of techniques that includes rumour, disinformation and fake news.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 36

Targeting.

That’s targeting.

Slide 37

SCL worked with Steve Bannon on the Trump election campaign. And as this neat graphic from The Guardian shows, SCL’s ventures, Cambridge Analytica and AggregateIQ, worked on multiple Brexit Leave campaigns too. Source: https://www.theguardian.com/news/2018/mar/18/what-is-cambridge-analytica-firm-at-centre-of-facebook-data-breach

Slide 38

Manipulating.

We, as citizens, were manipulated by the profiling and targeting.

Slide 39

This is all the topic of a recent documentary on Netflix called The Great Hack. And I’d really recommend it if you want a lot of the information without having to do all the reading. It’s accessible for your not-so-techy friends too.

Slide 40

Tracking affects democracy.

And it means it’s no exaggeration to say that tracking affects democracy. If we use tracking, we have to consider its ethical implications.

Slide 41

I could talk about this in more depth for much longer, but I’ve just not got the time. If you want a read, the book ‘The Age of Surveillance Capitalism’ by Shoshana Zuboff contains both the history and the predicted future of these massive, complex surveillance systems.

Slide 42

Shoshana Zuboff coined the term ‘surveillance capitalism’ and describes it in this book… “Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”

Slide 43

And if you take one look at the size of that book and decide to opt-out, try listening to Shoshana Zuboff interviewed on the Adam Buxton podcast instead.

Slide 44

But it’s all so convenient!

Many people argue that profiling and targeting are OK because they make technology more convenient for the majority of us.

Slide 45

Convenient unethical technology is like fluffy handcuffs. They may look cute and fluffy, they might lead to some fun. But they’re still handcuffs, and you always want to have access to the key.

Slide 46

How to protect ourselves (individuals)

How can we protect ourselves (as individuals)? Let’s look at a few of the things you can do:

Slide 47

A. Avoid logging in.

Avoid logging in. (If you can.) For example, when you’re watching videos on YouTube.

Slide 48

Fingerprinting

However, many platforms will still track you via fingerprinting: a combination of identifiers that, taken together, act as a fingerprint unique to your browser. Identifiers like your browser window’s height and width, the device you’re using, and (ironically) whether you have ‘Do Not Track’ set in your browser preferences.
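
As an illustration only, here’s roughly what a naive fingerprinting script could look like in a browser. Real trackers combine far more signals (canvas rendering, installed fonts, audio processing); this sketch just shows the principle:

```typescript
// A deliberately naive fingerprinting sketch: combine a handful of browser
// signals and hash them into a stable identifier. No cookie required.

async function crudeFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                // browser and operating system
    navigator.language,                 // locale
    `${screen.width}x${screen.height}`, // display size
    new Date().getTimezoneOffset(),     // timezone
    navigator.doNotTrack ?? "unset",    // ironically, a signal in itself
  ].join("|");

  // Hash the combined signals so they look like an anonymous ID.
  const digest = await crypto.subtle.digest("SHA-256", new TextEncoder().encode(signals));
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

crudeFingerprint().then((id) => console.log(id)); // same value on every visit
```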

Slide 49

In 2015, Facebook even filed a patent saying it could identify people who might know each other because they appear in photos taken by the same camera… the camera being identified by identical lens scratches and dust. Source: https://gizmodo.com/facebook-knows-how-to-track-you-using-the-dust-on-your-1821030620

Slide 50

B. Avoid providing your phone number.

Avoid providing your phone number. Folks recommend using two-factor authentication to prevent nefarious strangers from getting into your accounts. But be aware that phone numbers are a little risky for authentication…

Slide 51

A study into whether Facebook used personally-identifiable information for targeted advertising found that when “We added and verified a phone number for [two-factor authentication] to one of the authors’ accounts… the phone number became targetable [by advertising] after 22 days”

Slide 52

And not long ago, Twitter admitted they did the same thing: “When an advertiser uploaded their marketing list, we may have matched people on Twitter to their list… based on the email or phone number the Twitter account holder provided for safety and security purposes.” Source: https://www.wired.com/story/twitter-two-factor-advertising/

Slide 53

C. Disallow third-party cookies.

Disallow third-party cookies in your browser preferences.

Slide 54

Most of us know that if you block third-party resources, many of our sites fall to pieces. Likewise, everything breaks when you disallow third-party cookies, and usually silently. If a site relies on a third party for anything persistent, be it logins, preferences, even shopping baskets, that’ll probably break.

Slide 55

D. Don’t use Gmail.

Don’t use Gmail. Your email not only contains all your communication, but also the receipts for everything you’ve bought, the confirmations of every event you’ve signed up for, and every platform, newsletter, and service you’ve joined.

Slide 56

From our own crawls of the web for Better Blocker, we discovered Google has its tentacles in around 80% of the popular web. Think of all the information Google can extract from those sites.

Slide 57

Your choices affect your friends and family.

Though if your friends and family use Gmail, you’re a bit stuck. Likewise, your choices affect your friends and family.

Slide 58

We’re not just tracked on the web.

Of course, these are all choices we can make once we’re on the web, but we need to be aware of other places where we are tracked…

Slide 59

Slide 60

Amazon Ring and Alexa can hear everything you say and spy on your neighbours. Source: https://reallifemag.com/false-alarm/

Slide 61

Slide 62

A smart pacifier means you can put a chip in your baby. Source: https://www.pacif-i.io

Slide 63

Of course it was only a matter of time before someone made a smart menstrual cup… Source: https://www.kickstarter.com/projects/700989404/looncup-the-worlds-first-smart-menstrual-cup

Slide 64

And let’s not forget the smart dildo…

Slide 65

We Connect (the smart dildo makers) were even sued for tracking users’ “habits”. Source: https://www.vocativ.com/358530/smart-dildo-company-sued-for-tracking-users-habits/

Slide 66

Have you ever wondered how many calories you’re burning during intercourse? How many thrusts? The speed of your thrusts? The duration of your sessions? Frequency? How many different positions you use in the period of a week, month or year? You want the iCondom. And have you ever wanted to share all that information with advertisers, insurers, your government, and who knows who else?

Slide 67

Avoiding it all is too much work.

Avoiding it all seems like a lot of work, right? I KNOW! I advocate for privacy…and I don’t have the time or the resources to do all of this all the time.

Slide 68

Don’t blame the victim.

That’s why it’s unfair to blame the victim for having their privacy eroded.

Slide 69

Our concept of privacy is being twisted.

Not to mention that our concept of privacy is getting twisted by the same people who have an agenda to erode it. One of the biggest culprits in attempting to redefine privacy: Facebook.

Slide 70

Here is a Facebook ad that’s recently been showing on TVs. It shows a person undressing behind towels held up by her friends on the beach, alongside the Facebook post visibility options, “Public, Friends, Only Me, Close Friends,” explaining how we each have different privacy preferences in life. It ends saying “there’s lots of ways to control your privacy settings on Facebook.”…

Slide 71

But it doesn’t mention that “Friends”, “Only Me”, and “Close Friends” should really read: “Friends (and Facebook)”, “Only Me (and Facebook)” and “Close Friends (and Facebook)”. Because you’re never really sharing something with “Only Me” on Facebook. Facebook Inc. has access to everything you share.

Slide 72

Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself. Facebook shouldn’t be trying to tell us otherwise.

Slide 73

Google has an interesting interpretation of privacy too. Ten years ago, Eric Schmidt, then CEO of Google (now Executive Chairman of Alphabet, Google’s parent corporation), famously said:

“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”

To which I respond: OK, Eric. Tell me about your last trip to the toilet…

Slide 74

Shouldn’t we be smart about what we share publicly?

Do we need to be smart about what we share publicly? Sure! Don’t go posting photos of your credit card or your home address. Maybe it’s unwise to share that photo of yourself blackout drunk when you’ve got a job interview next week. Perhaps we should take responsibility if we say something awful to another person online…

Slide 75

Corporations blame us for giving up our privacy.

Right now, the corporations are more than happy to blame us for our loss of privacy. They say we agreed to the terms and conditions, we should read the privacy policies, it’s our fault.

Slide 76

This in itself is the subject of a whole documentary that was made way back in 2013, called “Terms And Conditions May Apply” covering the ridiculous length and legalese in terms and conditions, and how we couldn’t possibly read them for every service we use.

Slide 77

More recently, an editorial board op-ed in the New York Times pointed out the flaws in privacy policies for consent:

“The clicks that pass for consent are uninformed, non-negotiated and offered in exchange for services that are often necessary for civic life.”

Slide 78

There are studies that speak to how difficult it is to understand these policies too: “Two law professors analyzed the sign-in terms and conditions of 500 popular US websites, including Google and Facebook, and found that more than 99 percent of them were “unreadable,” far exceeding the level most American adults read at, but are still enforced.” Source: https://www.vice.com/en_us/article/xwbg7j/online-contract-terms-of-service-are-incomprehensible-to-adults-study-finds

Slide 79

It is not informed consent

It’s not informed consent when you can’t understand the terms. And how can we even truly consent if we don’t know how our information can be used against us?

Slide 80

It’s not consent if there’s not a real choice.

And it’s not true consent if it’s not a real choice. We’re not asked who should be allowed access to our information, and how much, and how often, and for how long, and when…

Slide 81

Like this fresh hell from Huffington Post. They’ve got this because the GDPR (General Data Protection Regulation) requires they ask for consent before tracking you… I go to read a post on their site and I get an option that essentially reads “Before you continue… and OK” because who is going to read that boring paragraph of text in between? I select Manage options, because I want to opt out!

Slide 82

And I’m presented with yet more text saying the same nothing and another OK button. I want to opt out, so I choose “Manage.”

Slide 83

Yet more text saying the same nothing and “Done.” Well, as much as I’m done with it all, I came here to opt out. So I guess I look at the small text that says “See how our partners use your data”. I select “Show…”

Slide 84

Ooh, a box of text saying the same nothing again, but in small white text on blue this time. I have to hover over a tiny blue “i” icon to read it. Still no option to opt out. I select “Hide” to hide that…

Slide 85

What about “See and customise which partners can use your data”? Ah nice, a list of every third-party service they use, each with a link to its privacy policy. Many of those policies have no reference to the third-party use of their services. Delightful.

Slide 86

There is no choice.

At no point was there a choice for me. Nothing that resembles consent. Just lots of text saying the same thing over and over again until I give up and just select OK. (Incidentally, these interfaces have already been ruled not compliant with the GDPR.)

Slide 87

We’re asked to give up everything or get nothing.

We’re asked to give up everything or get nothing. That’s not a real choice.

Slide 88

The cost of not consenting is to lose access to social, civil and labour infrastructure. It’s certainly not a real choice when the cost of not consenting is to lose access to social, civil, and labour infrastructure.

Slide 89

There’s a recent paper by Jan Fernback and Gwen Shaffer, ‘Cell Phones, Security and Social Capital.’ The paper examines the privacy tradeoffs that disproportionately affect mobile-mostly internet users. What they found shows the cost of not giving consent.

Slide 90

Speaking about this paper, Gwen Shaffer explained:

“All individuals are vulnerable to security breaches, identity fraud, system errors, and hacking. But economically disadvantaged individuals who rely exclusively on their mobile phones to access the internet are disproportionately exploited…”

Slide 91

“Some focus group participants reported that, in an effort to maintain data privacy, they modify online activities in ways that harm personal relationships and force them to forego job opportunities.”

Source: https://www.miccenter.org/wp-content/uploads/2019/08/CellPhones_1a.pdf

Slide 92

The technology we use is our new everyday things.

The technology we use is our new everyday things. It forms that vital social, civil, and labour infrastructure. And as (largely) helpless consumers, there’s often not much we can do to protect ourselves without a lot of free time and money.

Slide 93

When the technology you use is a lifeline to access, you are impacted more severely by its unethical factors.

Slide 94

Last year, Dr Frances Ryan covered this in her article, ‘The missing link: why disabled people can’t afford to #DeleteFacebook’. After the Cambridge Analytica scandal was uncovered, many people started encouraging each other to #DeleteFacebook. In fact, it’s been happening again the last few weeks.

Slide 95

Dr Frances Ryan pointed out:

“I can’t help but wonder if only privileged people can afford to take a position of social media puritanism. For many, particularly people from marginalised groups, social media is a lifeline – a bridge to a new community, a route to employment, a way to tackle isolation.”

Slide 96

Technology can’t fix issues of domination, oppression or discrimination.

Like so many issues we have with technology, what we’re dealing with are the underlying social and systemic issues. As technologists, we often can’t help ourselves trying to fix or smooth over problems with technology. But technology can’t fix issues of domination, oppression or discrimination.

Slide 97

Technology amplifies social and systemic issues.

Technology can make those issues worse. We can (and do) amplify and speed up systemic issues with technology.

Slide 98

Mike Ananny recently made the point in an article about tech platforms that we still seem to operate with the notion that online life is somehow a different life, detached from our everyday existence.

Slide 99

Tech platforms often take advantage of that notion by suggesting that if we don’t like “technology” we can just log out, log off, and be mindful or some other shit instead. People with this mindset often show how shallow they are by saying “if you don’t like the technology, you don’t have to use it…”

Slide 100

But we can’t escape technology:

“Platforms are societies of intertwined people and machines. There is no such thing as “online life” versus “real life.” We give massive ground if we pretend that these companies are simply having an “effect” or “impact” on some separate society.”

Source: https://www.niemanlab.org/2019/10/tech-platforms-are-where-public-life-is-increasingly-constructed-and-their-motivations-are-far-from-neutral/

Slide 101

Technology colonialism

Which brings me to another issue rife in technology today. Technology colonialism.

Slide 102

Slide 103

The author of an article on technology colonialism started with some history:

“Colonial powers always saw themselves as superiors over the native people whose culture was rarely recognized or respected. The colonizers saw economic value in… foreign relations, but it was always viewed as a transaction based on inequality.”

Slide 104

And then comparing it to what we so often do in technology:

“Technology companies continue this same philosophy in how they present their own products. These products are almost always designed by white men for a global audience with little understanding of the diverse interests of end users.”

Slide 105

We have to reckon with our colonial history.

We have to reckon with our colonial history. This speaks to us politically too, but today I’m talking about our tech industry, our tech community. We have to reckon with the colonial way in which we’ve created technology…

Slide 106

We don’t speak to users. Instead, we use analytics and data to design interfaces for people we’ll never try to speak to, or ask whether they even wanted our tech in the first place. We’ll assume we know best because we are the experts, and they are “just users.” We don’t have diverse teams, we barely even try to involve people with backgrounds different from our own. We fetishise our tools, valuing the developer experience over that of the people using what we build.

Slide 107

“Intent does not erase impact.”

We can say we had the right intentions, but that truly means nothing. This has been explored in depth recently by Tatiana Mac, who invokes “intent does not erase impact” to describe our often-haphazard approach to designing technology.

Slide 108

We not only have a responsibility to design more ethical technology, but to consider the impact our design has outside of its immediate interface.

Slide 109

It’s hard to advocate for change when alternatives don’t yet exist.

As the people advocating for change, we can’t exactly go around telling people to stop using this technology unless there are real, ethical alternatives.

Slide 110

We have the power to make that change.

That’s where you and I come in. As people who work in technology, and who create technology, we have far more power to make change. We can encourage more ethical practice. We can build alternatives.

Slide 111

How to build more ethical technology

How do we build more ethical technology?

Slide 112

Build small technology

As an antidote to big tech, we need to build small technology.

Slide 113

Everyday tools for everyday people designed to increase human welfare, not corporate profits.

Yeah, sure it’s a lofty goal, but there are practical ways to approach it.

Slide 114

Make it easy to use.

First off, make it easy to use.

Slide 115

Protecting ourselves shouldn’t just be the privilege of those with knowledge, time and money.

Plenty of privacy-respecting tools exist for nerds to protect themselves (I use some of them.) But we mustn’t make protecting ourselves a privilege only available to those who have the knowledge, time and money.

Slide 116

It’s why we must make easy-to-use technology that is

  • functional (this includes accessible. If it’s not accessible, it’s not functional)
  • convenient
  • reliable

Slide 117

Make it inclusive.

Slide 118

We must ensure people have equal rights and access to the tools we build and the communities who build them, with a particular focus on including people from traditionally marginalised groups. Current privacy-respecting tools (and most of our tools) are particularly terrible at this: they don’t build inclusive technology, and they often surround themselves with toxic communities.

Slide 119

Don’t be colonial.

Slide 120

Our teams must reflect the intended audience of our technology.

Slide 121

If we can’t build teams like this (some of us work in small teams or as individuals), we must ensure people with different needs can take what we make and specialise it for their needs. We can build upon the best practices and shared experiences of others, but we should not be making assumptions about what is suitable for an audience we are not a part of.

Slide 122

Make it personal.

Slide 123

Build technology for everyday people, not just startups and enterprises.

We’ve got to stop our infatuation with growth and greed. Focus on building personal technology for everyday people, not spending all our focus, experience, and money on tools for startups and enterprises.

Slide 124

Make it private by default.

Slide 125

I’m saying it again, to undo Facebook’s attempts at privacy-washing: Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself.

Slide 126

Make your technology functional without personal information.

You don’t need to know a person’s gender to provide them with services. You don’t need analytics that segment people into stereotypes based on guesswork.

Slide 127

Consent:

Allow people to share their information for relevant functionality only with their explicit consent.

Slide 128

When obtaining consent, tell the person how you’ll use their information, when you’ll use it, who will have access to it, and how long you will keep that information stored. (This has recently become established as a requirement under the GDPR—the EU’s General Data Protection Regulation)
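
There’s no standard format for this, but one way to keep yourself honest is to sketch the consent terms as a data structure, so every question has to be answered explicitly. This is a hypothetical illustration; all field names are my own:

```typescript
// A hypothetical consent record: every field is something the person was
// actually told before agreeing. Field names are illustrative, not standard.

interface ConsentRecord {
  purpose: string;         // how their information will be used
  dataCollected: string[]; // exactly which information is collected
  usedWhen: string;        // when it will be used
  accessibleTo: string[];  // who will have access to it
  retainedUntil: Date;     // how long it will be stored
  givenAt: Date;           // when consent was given
}

const example: ConsentRecord = {
  purpose: "Sending the monthly newsletter",
  dataCollected: ["email address"],
  usedWhen: "Once a month, when the newsletter is sent",
  accessibleTo: ["Our self-hosted newsletter server"],
  retainedUntil: new Date("2020-11-08"), // or until the person unsubscribes
  givenAt: new Date(),
};
```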

Slide 129

Write easy-to-understand privacy policies.

Don’t just copy and paste them from other sites (they were probably copy-pasted in the first place!). Ensure the privacy policy is up-to-date with every update to your technology.

Slide 130

Don’t use third-party consent frameworks.

Don’t use third-party consent frameworks. Most of these aren’t GDPR-compliant, they’re awful experiences for your visitors, and they’re likely to just get you into legal trouble.

Slide 131

Don’t use third-party services. (If you can avoid them.)

Don’t use third-party services at all if you can avoid them. (As they present a risk to you and your users.)

Slide 132

Like innocent little web fonts. Last week people were discussing how Crashlytics installs a custom font to track users on iOS. As the CSS Tricks Twitter account correctly pointed out: that’s the entire reason behind Google Web Fonts’ existence. They don’t host all those fonts for you out of the kindness of their little Googly hearts.

Slide 133

Make it your responsibility to know what they’re doing with your users’ information. If you do use third-party services, make it your responsibility to know their terms and policies, what information they are collecting, and what they are doing with that information.

Slide 134

Self-host all the things.

If you use third-party scripts, content delivery networks, videos, images and fonts, self-host them wherever possible. Ask the providers if it’s unclear whether they provide a self-hosted option.

Slide 135

Social media etiquette:

And it’s probably worth mentioning a little bit of social media etiquette: if you know how, strip the tracking identifiers and Google AMP junk from URLs before you share them. Friends don’t let corporations invade their friends’ privacy.
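
If you’d like to script that, here’s a small sketch using the standard URL API; the parameter list is illustrative, not exhaustive:

```typescript
// Strip common tracking parameters from a URL before sharing it.
// This list is illustrative; real-world lists are much longer.

const TRACKING_PARAMS = [
  "utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content",
  "fbclid", // Facebook click identifier
  "gclid",  // Google click identifier
];

function cleanUrl(raw: string): string {
  const url = new URL(raw);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

console.log(cleanUrl("https://example.com/article?utm_source=twitter&fbclid=abc123"));
// → https://example.com/article
```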

Slide 136

Post to your own site first, then mirror those posts to third-party platforms. If you feel you need a presence on social media or blogging platforms, don’t make it the only option. Post to your own site first, then mirror those posts on third-party platforms for the exposure you desire.

Slide 137

Your basic blog is way better than Medium.

Slide 138

Make it zero-knowledge.

Slide 139

Zero-knowledge tech has no knowledge of your information. It may store a person’s information, but the people who make or host the tech could not access that information even if they wanted to.

Slide 140

Keep a person’s information on their device where possible.

Slide 141

Ensure any information synced to another device is end-to-end encrypted.

If a person’s information needs to be synced to another device, ensure that information is end-to-end encrypted, with only that person having access to decrypt it.
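
As an illustration, here’s a minimal sketch using the Web Crypto API, assuming a symmetric key that never leaves the person’s device. (A real system also needs key storage, and key exchange for syncing between a person’s devices.)

```typescript
// A minimal sketch of end-to-end encryption with the Web Crypto API.
// The key is generated on, and never leaves, the person's device, so a
// sync server only ever sees ciphertext it cannot read.

async function makeDeviceKey(): Promise<CryptoKey> {
  // extractable: false, so the key itself cannot be exported by script
  return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, false, [
    "encrypt",
    "decrypt",
  ]);
}

async function encryptForSync(key: CryptoKey, plaintext: string) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // unique per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  return { iv, ciphertext }; // only this pair ever leaves the device
}

async function decryptOnDevice(key: CryptoKey, iv: Uint8Array, ciphertext: ArrayBuffer) {
  const plain = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  return new TextDecoder().decode(plain);
}
```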

Slide 142

The adage is true, the cloud is just somebody else’s computer.

Slide 143

Make it share alike.

And we also have to take care with how we share our technology, and how we sustain its existence. Make it share alike.

Slide 144

Cultivate a healthy commons by using licences that allow others to build upon and contribute back to your work. Don’t let big tech use your work if they’re not going to contribute their changes back.

Slide 145

Make it noncommercial.

And, again, we have to take care with how we sustain our technology’s existence. Make it noncommercial.

Slide 146

Support stayups, not startups.

My partner at Small Technology Foundation, Aral Balkan, coined the term stayups for the anti-startup. We don’t need more tech companies aiming to fail fast or be sold as quickly as possible. We need long-term sustainable technology.

Slide 147

Support not-for-profit technology.

If we are building sustainable technology for everyday people, we need a compatible funding model, not venture capital or equity-based investment.

Slide 148

It feels impossible. It probably is!

It may feel difficult or even impossible to build small technology with your current employer or organisation. It probably is! But there are steps we can take to give ourselves the opportunity to build more ethical technology.

Slide 149

1. Use small technology as a criterion.

Use small technology as a criterion when you’re looking for your next job. You don’t have to be at your current job forever.

Slide 150

2. Seek alternatives.

Seek alternatives to the software you use every day.

Slide 151

switching.software has a great list of resources, provided by people who really care about ethical technology as well as ease-of-use.

Slide 152

3. If you can’t do it at work, do it at home.

If you can’t do it at work, do it at home. If you have the time, make a personal website and practise small technology on your own projects.

Slide 153

Let me tell you quickly about what my personal site runs on. It’s Site.js, and we’re building it so you can build a secure personal website without all that configuration, without relying on a third-party service that might start tracking all your visitors without your consent, or pivot to a dodgy business model. (It’s free and open small technology!)

Slide 154

You can create a baby web page, install Site.js, and set up a secure (HTTPS certificate and all) production server in 30 seconds. It’s designed to be easy enough for a Laura to use (I have no patience and a terrible memory for complicated terminal commands). This demo runs slightly longer than 30 seconds because of all the comments explaining how to do it; the little wait is the TLS handshake to issue the certificates, a one-off when you set up the server.

Slide 155

It also includes live reload and automatic server reloads, which means you can use it to easily test sites cross-browser. And it can be a proxy server as well. (I currently use it with Hugo, though I’m eyeing up Eleventy.)

Slide 156

And yes, it also runs on Raspberry Pi. If you like the idea of holding a server in your hand, running off 4G.

Slide 157

There’s no tracking. It even has ephemeral server statistics so you can see which pages are popular (and what’s being hit but isn’t there: 404s), but it does not track the site’s visitors. The statistics are accessible from a cryptographically secure URL (no login required), and you can share that URL with whoever you want. (But I’m not sharing mine with you; it’s like letting you look in my knicker drawer. You can already see that I need to sort out my fonts and, for some reason, I don’t have a robots.txt file.)

Slide 158

Tin foil hats are all the rage.

I’ve been speaking about tracking and privacy for around seven years. (Luckily it’s becoming more mainstream these days!) I’ve been heckled by a loyal Google employee, I’ve been called a tinfoil-hat-wearing ranter by a Facebook employee. I’ve had people tell me there just isn’t any other way, that I’m just trying to impede the “natural progress” of technology…

Slide 159

As Rose Eveleth wrote in a recent article on Vox: “The assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”

Slide 160

We can’t keep making poor excuses for bad practices.

Slide 161

Divest ourselves of unethical organisations.

We must divest ourselves of unethical organisations. Consider who we are financially supporting or implicitly endorsing when we recommend their work and their products.

Slide 162

And, I’m sorry, I don’t give a fuck about all the cool shit coming out of unethical companies. You are not examples to be held above others. Your work is hurting our world, not contributing to it.

Slide 163

Our whole approach matters.

It’s not just about how we build technology, but our approach to being a part of communities that create technology.

Slide 164

I’m just one person.

You might be thinking “but I’m just one person.”

Slide 165

But we are an industry, we are communities, we are organisations, we are groups made up of many persons. And if we work together on this, we could have a huge impact.

Slide 166

You are not your job.

We have to remember that we are more than just the organisation we work for. If you work for a big corporation that does unethical things, you probably didn’t make the decision to do that bad thing. But I think the time has come that we can no longer unquestioningly defend our employers.

Slide 167

We need to use our social capital (also known as privilege!), we need to be the change we want to exist.

Slide 168

8 (believable) ways to make change happen.

I think perhaps that was a lot more than 8 unbelievable things about tracking… How about now I give you 8 very believable ways you can make change happen?

Slide 169

1. Be independent.

We’ve got to be comfortable being different; we can’t just follow other people’s leads when those other people aren’t being good leaders. Don’t look to heroes who can let you down, and don’t be loyal to big corporations who care nothing for you.

Slide 170

2. Be the advisor.

Do the research on inclusive, ethical technology, and make recommendations to others. Make it harder for them to make excuses.

Slide 171

3. Be the advocate.

Marginalised folks shouldn’t have to risk themselves to make change. Advocate for others. Advocate for the underrepresented.

Slide 172

4. Be the questioner.

Question those defaults. Ask why it was chosen to be built that way in the first place. Try asking a start-up how it makes its money!

Slide 173

5. Be the gatekeeper.

When the advocacy isn’t getting you far enough, use your expertise to prevent unethical things from happening on your watch. You don’t have to deploy a website…

Slide 174

6. Be difficult.

Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour.

Slide 175

7. Be unprofessional.

Don’t let anybody tell you that standing up for the needs of yourself, and others, is unprofessional. Don’t let people tell you to be quiet. Or that you’ll get things done if you’re a bit nicer.

Slide 176

8. Be the supporter.

If you are not comfortable speaking up for yourself, at least be there for those who do. Remember: silence is complicity.

Slide 177

Speaking up is risky and hard.

It can be really fucking lonely. We’re often fighting entities far bigger than ourselves. Our lives and our ability to make a living are at risk.

Slide 178

But letting technology continue this way is riskier. Like society and democracy riskier.

Slide 179

We deserve better.

People say the talks I give are scary, but I’m not here to scare you. I’m just here because I want to tell you that we deserve better.

Slide 180

Thank you!

Slides are at https://noti.st/laurakalbag

Laura Kalbag, laurakalbag.com, @LauraKalbag, Small Technology Foundation, small-tech.org