This One Weird Trick Tells Us Everything About You

A presentation at SmashingConf Freiburg (Online) in September 2020 by Laura Kalbag

Slide 1

This One Weird Trick Tells Us Everything About You

Laura Kalbag: laurakalbag.com, @LauraKalbag. Small Technology Foundation: small-tech.org

Slide 2

The other day I was browsing the Pink News site, reading important news about the future of Drag Race UK, and I scrolled down…

Slide 3

I am presented with reams of classic clickbait (I mean, “sponsored ads”): “Most Mac Owners Don’t Know About This (Do It Today)”…

Slide 4

“Cork: Don’t Install Solar Panels Until You Read This”…

Slide 5

“Ireland: A 40+ Dating Site That Actually Works!”

Slide 6

This clickbait is quite revealing through its chosen topics alone. It knows I’m in Cork in Ireland, using a Mac, and suspects I’m a heterosexual person who might be 40+ (not quite there yet!).

Slide 7

This clickbait is provided to Pink News by Taboola.

Slide 8

I work on blocking trackers with a privacy tool called Better Blocker, and we’ve looked into Taboola. In our crawls of the most popular sites on the web, we found Taboola on nearly 5% of sites.

Slide 9

Taboola’s aim is to:

“Drive marketing results by targeting your audience when they are most receptive to new messages.”

Slide 10

You can do that with their:

“Data Rich Recommendations: Ensure that your brand reaches interested people by leveraging the massive amounts of user data powering the Taboola engine.” (Emphasis my own!)

Slide 11

In fact, they provide a handy graphic here showing some of the information that might be useful about a site’s visitor. “Device and operating system”… but also… “In the market for > car, fashion, electric bike”… “Interest > pet lovers, environment, entertainment, science & tech.”

Slide 12

So I scroll down to Taboola’s privacy policy, to see how they know this information about me, and what they intend to do with it. They seem to have a specific policy for “Third Party Online Advertising”, so I’ll check that out.

Slide 13

“We automatically collect User Information when Users interact with our Services that appear on our Customers’ websites and digital properties.”

Slide 14

“Taboola collects only pseudonymized data, which means we do not know who you are because we do not know or process your name, email address, or other identifiable data.”

Slide 15

Let’s debunk this for a second. “Pseudonymised data” or “anonymised data” doesn’t mean you’re unidentifiable, even though it’s a claim that privacy policies have been relying on for years…

Slide 16

As Bruce Schneier said over a decade ago in Wired, “it takes only a small named database [(as in a database containing names)] for someone to pry the anonymity off a much larger anonymous database.” They just need to compare some data points that match in each database.
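To make the linkage concrete, here’s a tiny sketch (the datasets, names, and attributes are all invented for illustration): re-identification is just a join between an “anonymous” record and a named database on the data points they share.

```javascript
// Sketch of a linkage attack: re-identifying an "anonymous" record by
// joining it with a small named database on shared attributes.
// All data here is invented for illustration.

const namedDb = [
  { name: "Alice", postcode: "T12", age: 39, gender: "F" },
  { name: "Bob",   postcode: "D04", age: 52, gender: "M" },
];

// The "anonymised" dataset: names stripped, but attributes kept.
const anonRecord = { postcode: "T12", age: 39, gender: "F", interest: "solar panels" };

// Re-identification is just a join on the attributes both datasets share.
function reidentify(record, named) {
  return named.find((person) =>
    ["postcode", "age", "gender"].every((attr) => person[attr] === record[attr])
  );
}

console.log(reidentify(anonRecord, namedDb).name); // → "Alice"
```

With 15 attributes instead of 3, the match becomes near-certain, which is exactly what the re-identification studies show.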

Slide 17

A recent study (and it’s not the only study) into methods to re-identify individuals from anonymised datasets found “Using our model, we find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes.” Attributes such as age, gender, ethnicity, post code, number of children, number of cars owned, location, status updates, and results on a personality quiz.

Slide 18

Returning to Taboola’s privacy policy… I want to know how the interests Taboola infers compare to these kinds of demographic attributes. They’re described by Taboola as “data segments”… “A data segment is a grouping of users who share one or more attributes (e.g., travel enthusiasts). We offer a number of data segments, both proprietary and from our data partners”. Kindly, they’ve provided a link to their data partners…

Slide 19

Now two of these data partners stand out to me in particular… Acxiom and Oracle

Slide 20

And that’s because Cracked Labs have done multiple reports into the personal data that corporations collect, combine, analyse, trade and use. And the data brokers that deal in this data.

Slide 21

…featuring two of the biggest data brokers: Oracle and Acxiom.

Slide 22

According to Cracked Labs: “Acxiom provides up to 3000 attributes and scores on 700 million people in the US, Europe, and other regions.”

Slide 23

Slide 24

I’ve picked out some of the creepiest bits of information from the Cracked Labs reports:

  • one of nearly 200 “ethnic codes”
  • political views
  • relationship status
  • income
  • details about banking and insurance policies
  • type of home: including whether your home is a prison
  • likelihood whether a person is planning to have a baby or adopt a child
  • number and age of children
  • purchases, including whether a person bought pain relief products
  • whether a person is likely to have an interest in the air force, army, navy, lottery and sweepstakes, or gay and lesbian movies
  • search history, including whether a person searched about: abortion, legalising drugs, or gay marriage, protests, strikes, boycotts or riots. And the likelihood that a person is a social influencer or is socially influenced.

Slide 25

Taboola says it “does not knowingly create segments that are based upon what we consider to be sensitive information…” Hmm… Helpfully, Taboola also provides a detailed list of all their apparently not-sensitive “standard health-related segments”…

Slide 26

  • Active Health Management: Far Below Average
  • Uses Internet to Research Condition / Treatment Related Content
  • Expecting Parents > Due Next 3 Months

Slide 27

  • Disability Insurance Due to Loss of Income Through Medical Causes
  • Health Attitudes and Behaviors > Weight Loss Surgery Inclined
  • health and fitness > disorders > panic and anxiety

Slide 28

  • Beauty Products > Confidence Level 1
  • Personality - Dealing with Stress - Emotional
  • Personality - Dealing with Stress - Quick Fix

This isn’t exactly the kind of information you want marketers to use to sell to you. It is personal.

Slide 29

Personality attributes were also used by Cambridge Analytica. They collected them through a personality test app on Facebook that also harvested the profiles of the participants’ friends, and their friends’ friends.

Slide 30

In this personality test app, “Users were scored on ‘big five’ personality traits – Openness, Conscientiousness, Extroversion, Agreeableness and Neuroticism – and in exchange, 40% of them consented to access to their Facebook profiles.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 31

“[Cambridge Analytica] itself claimed to be able to analyse huge amounts of consumer data and combine that with behavioural science to identify people who organisations can target with marketing material.” Source: https://www.theguardian.com/news/2018/mar/18/what-is-cambridge-analytica-firm-at-centre-of-facebook-data-breach

Slide 32

Profiling

It’s profiling.

Slide 33

Cambridge Analytica was a venture of SCL Elections whose “expertise was in “psychological operations” – or psyops – changing people’s minds not through persuasion but through “informational dominance”, a set of techniques that includes rumour, disinformation and fake news.” Source: https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump

Slide 34

Targeting.

That’s targeting.

Slide 35

Manipulating.

We, as citizens, could be manipulated by the profiling and targeting.

Slide 36

This is all the topic of a recent documentary on Netflix called The Great Hack. And I’d really recommend it if you want a lot of the information without having to do all the reading. It’s accessible for your friends who don’t speak tech-y too.

Slide 37

Tracking affects democracy.

It’s no exaggeration to say that tracking affects democracy. And if we use tracking, we have to consider its ethical implications.

Slide 38

I could talk about this in more depth for much longer, but I’ve just not got the time. If you want a read, the book ‘The Age of Surveillance Capitalism’ by Shoshana Zuboff contains both the history and the predicted future of these massive, complex surveillance systems.

Slide 39

Shoshana Zuboff coined the term ‘surveillance capitalism’ and describes it in this book…

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”

Slide 40

And if you take one look at the size of that book and decide to opt-out, try listening to Shoshana Zuboff interviewed on the Adam Buxton podcast instead.

Slide 41

But it’s all so convenient!

Many people argue that profiling and targeting are OK because they make technology more convenient for the majority of us.

Slide 42

Convenient exploitative technology is like fluffy handcuffs. They may look cute and fluffy, and they might lead to some fun. But they’re still handcuffs, and you always want to have access to the key.

Slide 43

How to protect ourselves (individuals)

How can we protect ourselves (as individuals)? Let’s look at a few of the things you can do:

Slide 44

A. Avoid logging in.

Avoid logging in. (If you can.) For example, when you’re watching videos on YouTube.

Slide 45

Fingerprinting

However, many platforms will still track you via fingerprinting: a combination of identifiers unique to your browser that together act as a fingerprint. Identifiers like your browser height and width, the device you’re using, and (ironically) whether you have ‘Do Not Track’ set in your browser preferences.

Slide 46

In 2015, Facebook even filed a patent saying it could identify people who might know each other because they appear in photos taken by the same camera… the camera being identified by identical lens scratches and dust. Source: https://gizmodo.com/facebook-knows-how-to-track-you-using-the-dust-on-your-1821030620

Slide 47

B. Avoid providing your phone number.

Avoid providing your phone number. Folks recommend using two-factor authentication to prevent nefarious strangers from getting into your accounts. But be aware that phone numbers are a little risky for authentication…

Slide 48

A study into whether Facebook used personally-identifiable information for targeted advertising found that when “We added and verified a phone number for [two-factor authentication] to one of the authors’ accounts… the phone number became targetable [by advertising] after 22 days”

Slide 49

And not long ago, Twitter admitted they did the same thing: “When an advertiser uploaded their marketing list, we may have matched people on Twitter to their list… based on the email or phone number the Twitter account holder provided for safety and security purposes.” Source: https://www.wired.com/story/twitter-two-factor-advertising/

Slide 50

C. Disallow cookies.

Disallow cookies in your browser preferences.

Slide 51

Thing is, if we block cookies, many sites fall to pieces, and usually silently. Even if we only block cookies from third-parties. If a site relies on a third-party anything persistent, be it logins, preferences, even shopping baskets… that’ll probably break.

Slide 52

D. Don’t use Gmail.

Don’t use Gmail. Your email not only contains all your communication, but the receipts for everything you’ve bought, the confirmations of every event you’ve signed up for, and every platform, newsletter, and service you’ve joined.

Slide 53

From our own crawls of the web for Better Blocker, we discovered Google has its tentacles in around 80% of the popular web. Think of all the information Google can extract from those sites.

Slide 54

Your choices affect your friends and family.

Though if your friends and family use Gmail, you’re a bit stuck. Likewise, your choices affect your friends and family.

Slide 55

We’re not just tracked on the web.

These are all choices we make on the web, but tracking really is embedded in every aspect of mainstream technology…

Slide 56

Slide 57

Amazon Ring and Alexa can hear everything you say and spy on your neighbours. Source: https://reallifemag.com/false-alarm/

Slide 58

Slide 59

A smart pacifier means you can put a chip in your baby. Source: https://www.pacif-i.io

Slide 60

Of course it was only a matter of time before someone made a smart menstrual cup… Source: https://www.kickstarter.com/projects/700989404/looncup-the-worlds-first-smart-menstrual-cup

Slide 61

Slide 62

And let’s not forget the smart dildo…

Slide 63

We Connect (the smart dildo makers) were even sued for tracking users’ “habits”. Source: https://www.vocativ.com/358530/smart-dildo-company-sued-for-tracking-users-habits/

Slide 64

Have you ever wondered how many calories you’re burning during intercourse? How many thrusts? The speed of your thrusts? The duration of your sessions? Frequency? How many different positions you use in the period of a week, month or year? Then you want the iCondom. And have you ever wanted to share all that information with advertisers, insurers, your government, and who knows who else?

Slide 65

Avoiding it all is too much work.

Avoiding it all seems like a lot of work, right? I KNOW! I advocate for privacy…and I don’t have the time or the resources to do all of this all the time.

Slide 66

Don’t blame the victim.

That’s why it’s unfair to blame the victim for having their privacy eroded.

Slide 67

Our concept of privacy is being twisted.

Not to mention that our concept of privacy is getting twisted by the same people who have an agenda to erode it. One of the biggest culprits in attempting to redefine privacy: Facebook.

Slide 68

Here is a Facebook ad that’s recently been showing on TVs. It shows a person undressing behind towels held up by her friends on the beach, alongside the Facebook post visibility options, “Public, Friends, Only Me, Close Friends,” explaining how we each have different privacy preferences in life. It ends saying “there’s lots of ways to control your privacy settings on Facebook.”…

Slide 69

But it doesn’t mention that “Friends”, “Only Me”, and “Close Friends” should really read: “Friends (and Facebook)”, “Only Me (and Facebook)” and “Close Friends (and Facebook)”. Because you’re never really sharing something with “Only Me” on Facebook. Facebook Inc. has access to everything you share.

Slide 70

Privacy is the ability to choose what you want to share with others, and what you want to keep to yourself.

Facebook shouldn’t be trying to tell us otherwise.

Slide 71

Shouldn’t we be smart about what we share publicly?

Do we need to be smart about what we share publicly? Sure! Don’t go posting photos of your credit card or your home address. Maybe it’s unwise to share that photo of yourself blackout drunk when you’ve got a job interview next week. Perhaps we should take responsibility if we say something awful to another person online… But this isn’t about what we knowingly share publicly.

Slide 72

Corporations blame us for giving up our privacy.

Right now, the corporations are more than happy to blame us for our loss of privacy. They say we agreed to the terms and conditions, we should read the privacy policies, it’s our fault.

Slide 73

This in itself is the subject of a whole documentary that was made way back in 2013, called “Terms And Conditions May Apply” covering the ridiculous length and legalese in terms and conditions, and how we couldn’t possibly read them for every service we use.

Slide 74

More recently, an editorial board op-ed in the New York Times pointed out the flaws in privacy policies for consent:

“The clicks that pass for consent are uninformed, non-negotiated and offered in exchange for services that are often necessary for civic life.”

Slide 75

There are studies that speak to how difficult it is to understand these policies too: “Two law professors analyzed the sign-in terms and conditions of 500 popular US websites, including Google and Facebook, and found that more than 99 percent of them were “unreadable,” far exceeding the level most American adults read at, but are still enforced.” Source: https://www.vice.com/en_us/article/xwbg7j/online-contract-terms-of-service-are-incomprehensible-to-adults-study-finds

Slide 76

It is not informed consent

It’s not informed consent when you can’t understand the terms. And how can we even truly consent if we don’t know how our information can be used against us?

Slide 77

It’s not consent if there’s not a real choice.

And it’s not true consent if it’s not a real choice. We’re not asked who should be allowed access to our information, and how much, and how often, and for how long, and when…

Slide 78

Cookie consents

Let’s get into cookie consents for a minute… which are not real cookies, nor real consent. There appear to be a few approaches currently in favour:

Slide 79

1: Let’s Hope They Just Give Up! Method

One example of the Let’s Hope They Just Give Up! Method from Yahoo…

Slide 80

A dialog box with so many tedious options, links to other sites, and unnecessary information that you hope the visitor will just get bored and click “I agree”. (That’s why you designed it with the fancy bold button, while the reject button is so pale and suggests there’s so much more work involved.)

Thanks Yahoo

Slide 81

2: Labyrinth Method

Which I like to call the “Let’s Hope They Just Give Up Method: Massive Corporation Edition…”

Slide 82

Spreading the information around as many screens and form elements as possible, creating a maze for the visitor which (you know) they will never have the time or inclination to explore.

Thanks Google

Slide 83

3: Haha, BYE! Method

This is a fairly popular new approach: telling the visitor they should just block cookies:

Slide 84

“The ‘help’ feature of the menu bar on most browsers will tell you how to stop accepting new cookies…” “Please note, however, that without HTTP cookies and HTML5 and Flash local storage, parts of the Site will not function properly.” (Oh and by the way, our site won’t work without them.)

Thanks Vox! (This example is from their site.)

Slide 85

4: Manipulate The Truth Method

This method is a wildcard I stumbled upon recently…

Slide 86

This honestly remarkable example comes from Ikea, where they claim that cookies are required to load images:

“Mandatory - can not be deselected. Technical Cookies are essential for this site to work properly, and are used for things such as navigation… and allowing images to load.”

Yikes, Ikea!

Slide 87

There is no choice.

At no point was there a choice for me to just opt out of being tracked. No opt-in. Nothing that resembles consent.

Slide 88

We’re asked to give up everything or get nothing.

We’re asked to give up everything or get nothing. That’s not a real choice.

Slide 89

The cost of not consenting is to lose access to social, civil and labour infrastructure.

It’s certainly not a real choice when the cost of not consenting is to lose access to social, civil, and labour infrastructure.

Slide 90

The technology we use is our new everyday things.

The technology we use is our new everyday things. It forms that vital social, civil, and labour infrastructure. And as (largely) helpless consumers, there’s often not much we can do to protect ourselves without a lot of free time and money.

Slide 91

When the technology you use is a lifeline to access, you are impacted more severely by its unethical factors.

Slide 92

Two years ago, Dr Frances Ryan covered this in her article, ‘The missing link: why disabled people can’t afford to #DeleteFacebook’. After the Cambridge Analytica scandal was uncovered, many people started encouraging each other to #DeleteFacebook. In fact, it’s been happening again the last few weeks.

Slide 93

Dr Ryan pointed out:

“I can’t help but wonder if only privileged people can afford to take a position of social media puritanism. For many, particularly people from marginalised groups, social media is a lifeline – a bridge to a new community, a route to employment, a way to tackle isolation.”

Slide 94

Technology can’t fix issues of domination, oppression or discrimination.

Like so many issues we have with technology, what we’re dealing with are the underlying social and systemic issues. As technologists, we often can’t help ourselves trying to fix or smooth over problems with technology. But technology can’t fix issues of domination, oppression or discrimination.

Slide 95

Technology amplifies social and systemic issues.

Technology can make those issues worse. We can (and do) amplify and speed up systemic issues with technology.

Slide 96

Mike Ananny recently made the point in an article about tech platforms, that we still seem to operate with the notion that online life is somehow a different life, detached from our everyday existence.

Slide 97

Tech platforms often take advantage of that notion by suggesting that if we don’t like “technology” we can just log out, log off, and be mindful or some other shit instead. People with this mindset often show how shallow they are by saying “if you don’t like the technology, you don’t have to use it…”

Slide 98

But we can’t escape technology:

“Platforms are societies of intertwined people and machines. There is no such thing as “online life” versus “real life.” We give massive ground if we pretend that these companies are simply having an “effect” or “impact” on some separate society.”

Source: https://www.niemanlab.org/2019/10/tech-platforms-are-where-public-life-is-increasingly-constructed-and-their-motivations-are-far-from-neutral/

Slide 99

Technology colonialism

Which brings me to another issue rife in technology today. Technology colonialism.

Slide 100

Slide 101

He started with some history:

“Colonial powers always saw themselves as superiors over the native people whose culture was rarely recognized or respected. The colonizers saw economic value in… foreign relations, but it was always viewed as a transaction based on inequality.”

Slide 102

And then comparing it to what we so often do in technology:

“Technology companies continue this same philosophy in how they present their own products. These products are almost always designed by white men for a global audience with little understanding of the diverse interests of end users.”

Slide 103

We have to reckon with our colonial history. This speaks to us politically too, but today I’m talking about our tech industry, our tech community. We have to reckon with the colonial way in which we’ve created technology…

Slide 104

We don’t speak to users. Instead, we use analytics and data to design interfaces for people we’ll never try to speak to, or ask whether they even wanted our tech in the first place. We’ll assume we know best because we are the experts, and they are “just users.” We don’t have diverse teams, we barely even try to involve people with backgrounds different from our own. We fetishise our tools, valuing the designer experience over that of the people using what we build.

Slide 105

“Intent does not erase impact.”

We can say we had the right intentions, but that truly means nothing. This has been explored in depth recently by Tatiana Mac, who invokes “intent does not erase impact” to describe our often-haphazard approach to designing technology.

Slide 106

We not only have a responsibility to design more ethical technology, but to consider the impact our design has outside of its immediate interface.

Slide 107

It’s hard to advocate for change when alternatives don’t yet exist.

As the people advocating for change, we can’t exactly go around telling people to stop using this technology unless there are real, ethical alternatives.

Slide 108

We have the power to make that change.

That’s where you and I come in. As people who work in technology, and who create technology, we have far more power for change. We can encourage more ethical practice. We can build alternatives.

Slide 109

How to build more rights-respecting technology

How do we build more rights-respecting technology?

Slide 110

Build small technology

As an antidote to big tech, we need to build small technology.

Slide 111

Everyday tools for everyday people designed to increase human welfare, not corporate profits.

Yeah, sure it’s a lofty goal, but there are practical ways to approach it.

Slide 112

Make it easy to use.

First off, make it easy to use.

Slide 113

Protecting ourselves shouldn’t just be the privilege of those with knowledge, time and money.

Plenty of privacy-respecting tools exist for nerds to protect themselves (I use some of them.) But we mustn’t make protecting ourselves a privilege only available to those who have the knowledge, time and money.

Slide 114

It’s why we must make easy-to-use technology that is

  • functional (this includes accessible: if it’s not accessible, it’s not functional)
  • convenient
  • reliable

Slide 115

Make it personal.

Slide 116

Build technology for everyday people, not just startups and enterprises.

We’ve got to stop our infatuation with growth and greed. Focus on building personal technology for everyday people, not spending all our focus, experience, and money on tools for startups and enterprises.

Slide 117

Make it private by default.

Slide 118

Make your technology functional without personal information.

You don’t need to know a person’s gender to provide them with services. You don’t need analytics that segment people into stereotypes based on guesswork.

Slide 119

Consent:

Allow people to share their information for relevant functionality only with their explicit consent.

Slide 120

When obtaining consent, tell the person how you’ll use their information, when you’ll use it, who will have access to it, and how long you will keep that information stored. (This has recently become established as a requirement under the GDPR—the EU’s General Data Protection Regulation)
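As a sketch only (the field names here are my own invention, not a standard schema or a GDPR-mandated format), a consent record covering those four details might look like this:

```javascript
// A minimal sketch of a consent record capturing the details the GDPR
// asks you to tell people: purpose, timing, recipients, and retention.
// The field names are invented for illustration, not a standard schema.
function recordConsent({ purpose, usedWhen, accessibleTo, retentionDays }) {
  // Refuse to record "consent" that is missing the required details.
  if (!purpose || !accessibleTo || !retentionDays) {
    throw new Error("consent record is missing required details");
  }
  return {
    purpose,        // how the information will be used
    usedWhen,       // when it will be used
    accessibleTo,   // who will have access to it
    retentionDays,  // how long it will be stored
    grantedAt: new Date().toISOString(),
  };
}

const consent = recordConsent({
  purpose: "send order confirmation emails",
  usedWhen: "at checkout",
  accessibleTo: ["our shop", "our email provider"],
  retentionDays: 90,
});
```

The point of the guard clause is that consent you can’t describe in these terms isn’t consent you should be storing.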

Slide 121

Write easy-to-understand privacy policies.

Don’t just copy and paste them from other sites (they probably copy-pasted them in the first place!). Ensure the privacy policy is up to date with every update to your technology.

Slide 122

Don’t use third-party consent frameworks.

Don’t use third-party consent frameworks. Most of these aren’t GDPR-compliant, they’re awful experiences for your visitors, and they may well get you into legal trouble.

Slide 123

Don’t use third-party services. (If you can avoid them.)

Don’t use third-party services at all if you can avoid them. (As they present a risk to you and your users.)

Slide 124

Make it your responsibility to know what they’re doing with your users’ information. If you do use third-party services, make it your responsibility to know their terms and policies, what information they are collecting, and what they are doing with that information.

Slide 125

Self-host all the things.

If you use third-party scripts, content delivery networks, videos, images and fonts, self-host them wherever possible. Ask the providers if it’s unclear whether they provide a self-hosted option.

Slide 126

Social media etiquette:

And it’s probably worth mentioning a little bit of social media etiquette: If you know how, strip the tracking identifiers and Google amp junk from URLs before you share them. Friends don’t let corporations invade their friends’ privacy.
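For example, a small helper that strips the most common tracking parameters before you share a link. The parameter list below covers well-known offenders like utm_* and fbclid, but it’s not exhaustive:

```javascript
// Sketch of stripping common tracking parameters before sharing a link.
// The junk list is illustrative, not exhaustive.
function cleanUrl(raw) {
  const url = new URL(raw);
  const junk = ["fbclid", "gclid", "mc_eid", "igshid"];
  // Copy the keys first so we can safely delete while iterating.
  for (const key of [...url.searchParams.keys()]) {
    if (key.startsWith("utm_") || junk.includes(key)) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}

console.log(cleanUrl("https://example.com/post?id=42&utm_source=twitter&fbclid=abc"));
// → "https://example.com/post?id=42"
```

The link still works; it just stops telling the destination site exactly where and how your friend found it.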

Slide 127

Post to your own site first, then mirror those posts to third-party platforms.

If you feel you need a presence on social media or blogging platforms, don’t make it the only option. Post to your own site first, then mirror those posts on third-party platforms for the exposure you desire.

Slide 128

Your basic blog is way better than Medium.

Slide 129

Make it zero-knowledge.

Slide 130

Zero-knowledge tech has no knowledge of your information. It may store a person’s information, but the people who make or host the tech cannot access that information, even if they wanted to.

Slide 131

Keep a person’s information on their device where possible.

Slide 132

Ensure any information synced to another device is end-to-end encrypted.

If a person’s information needs to be synced to another device, ensure that information is end-to-end encrypted, with only that person having access to decrypt it.

Slide 133

The adage is true: the cloud is just somebody else’s computer.

Slide 134

Make it share alike.

And we also have to take care with how we share our technology, and how we sustain its existence. Make it share alike.

Slide 135

Cultivate a healthy commons by using licences that allow others to build upon, and contribute back to your work. Don’t let Big Tech use your work if they’re not going to contribute their changes back.

Slide 136

Make it noncommercial.

How we sustain our technology’s existence matters just as much as how we share it. Make it noncommercial.

Slide 137

Support stayups, not startups.

My partner at Small Technology Foundation, Aral Balkan, coined the term stayups for the antistartup. We don’t need more tech companies aiming to fail fast or be sold as quickly as possible. We need long-term sustainable technology.

Slide 138

Support not-for-profit technology.

If we are building sustainable technology for everyday people, we need a compatible funding model, not venture capital or equity-based investment.

Slide 139

It feels impossible. It probably is!

It may feel difficult or even impossible to build small technology with your current employer or organisation. It probably is! But there are steps we can take to give ourselves the opportunity to build more ethical technology.

Slide 140

1. Use small technology as a criterion

Use small technology as a criterion when you’re looking for your next job. You don’t have to be at your current job forever.

Slide 141

2. Seek alternatives.

Seek alternatives to the software you use every day.

Slide 142

switching.software has a great list of resources, provided by people who really care about ethical technology as well as ease-of-use.

Slide 143

3. If you can’t do it at work, do it at home.

If you can’t do it at work, do it at home. If you have the time, make a personal website and practise small technology on your own projects.

Slide 144

We’re trying to build tools to help people do this (without compromising their sites’ visitors). This is Site.js, and we’re building it so you can build a secure personal website without all that configuration, without relying on a third-party service who might start tracking all your visitors without your consent, or pivot to a dodgy business model. (It’s free and open small technology!)

Slide 145

This demo of Site.js shows you a few things Site.js can do.

00:00: Starting with a very basic HTML page showing my SmashingConf badge, which loads fine just as a file in the browser.

00:15: But what if I want to run that page on a local server? Site.js can do this! I go to the Site.js website and install Site.js in my terminal.

00:46: Then, all I need to do is run site in the website’s folder. You can see that Site.js then sets up my site on a local https server, as well as providing TLS certificates. I can then open localhost in my browser and see my site running on a local server.

01:11: And if I want to change the colour of the text on the page? I can change the CSS so the colour of my text is now white, and it instantly updates on the page without needing to refresh, because Site.js also has LiveReload.

01:36: When it’s time to go live (I already have Site.js set up on my server), I can use site push to push my site from the local folder to the server. Site.js syncs the folders quickly, and then I can visit my site live.

01:56: Site.js also includes ephemeral statistics. If I go to my unique statistics URL (which I can keep private or share with whoever I want), I can see the statistics that are actually useful for running a site (not intrusive, privacy-invading analytics!), such as my most popular pages, what’s missing from my site, and how many hits my site has gotten.
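The local-to-live workflow in the demo boils down to two commands. A rough sketch (the install instructions are on the Site.js website; run these from inside your website’s folder):

```shell
# Serve the current folder at https://localhost with LiveReload.
# Site.js generates locally trusted TLS certificates for you.
site

# When it's time to go live (with Site.js already set up on your
# server), sync the local folder to the server:
site push
```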

Slide 146

Slide 146

Site.js is a tiny tool in a world of choice, but it is an alternative, and we have big plans for it.

Slide 147

Slide 147

Tin foil hats are all the rage.

I’ve been speaking about tracking and privacy for around seven years. (Luckily it’s becoming more mainstream these days!) I’ve been heckled by a loyal Google employee, I’ve been called a tinfoil-hat-wearing ranter by a Facebook employee. I’ve had people tell me there just isn’t any other way, that I’m just trying to impede the “natural progress” of technology…

Slide 148

Slide 148

As Rose Eveleth wrote in a recent article on Vox: “The assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”

Slide 149

Slide 149

We can’t keep making poor excuses for bad practices.

Slide 150

Slide 150

Divest ourselves of unethical organisations.

We must divest ourselves of unethical organisations. Consider who we are financially supporting or implicitly endorsing when we recommend their work and their products.

Slide 151

Slide 151

And, I’m sorry, I don’t give a fuck about all the cool shit coming out of exploitative companies. You are not examples to be held above others. Your work is hurting our world, not contributing to it.

Slide 152

Slide 152

Our whole approach matters.

Our whole approach matters. It’s not just about our philosophy, or how we build technology, but our approach to being a part of communities that create technology.

Slide 153

Slide 153

I’m just one person.

You might be thinking “but I’m just one person.”

Slide 154

Slide 154

But we are an industry, we are communities, we are organisations, we are groups made up of many persons. And if we work together on this, we could have a huge impact.

Slide 155

Slide 155

You are not your job.

We have to remember that we are more than just the organisation we work for. If you work for a big corporation that does exploitative things, you probably didn’t make the decision to do that bad thing. But the time has come when we can no longer unquestioningly defend our employers or clients.

Slide 156

Slide 156

We need to use our social capital (also known as privilege!), we need to be the change we want to exist.

Slide 157

Slide 157

Make change happen

There are different roles we can take in making change happen. Roles that fit us differently depending on our privilege and our position within an organisation…

Slide 158

Slide 158

1. Be independent.

We’ve got to be comfortable being different; we can’t just follow other people’s leads when those other people aren’t being good leaders. Don’t look to heroes who can let you down, and don’t be loyal to big corporations that care nothing for you.

Slide 159

Slide 159

2. Be the advisor.

Do the research on inclusive, ethical technology, make recommendations to others. Make it harder for them to make excuses.

Slide 160

Slide 160

3. Be the advocate.

Marginalised folks shouldn’t have to risk themselves to make change. Advocate for others. The overrepresented should advocate for the underrepresented.

Slide 161

Slide 161

4. Be the questioner.

Question those defaults. Ask why it was built that way in the first place. Try asking a start-up how it makes its money!

Slide 162

Slide 162

5. Be the gatekeeper.

When the advocacy isn’t getting you far enough, use your expertise to prevent exploitative things from happening on your watch.

Slide 163

Slide 163

6. Be difficult.

Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour.

Slide 164

Slide 164

7. Be unprofessional.

Don’t let anybody tell you that standing up for your own needs, and those of others, is unprofessional. Don’t let people tell you to be quiet, or that you’ll get things done if you’re a bit nicer.

Slide 165

Slide 165

8. Be the supporter.

If you are not comfortable speaking up for yourself, at least be there for those that do. Remember silence is complicity.

Slide 166

Slide 166

Speaking up is risky and hard.

It can be really fucking lonely. We’re often fighting entities far bigger than ourselves. Our lives, our ability to make a living, are at risk.

Slide 167

Slide 167

But letting technology continue this way is riskier. Like society and democracy riskier.

Slide 168

Slide 168

We deserve better.

People say the talks I give are scary, but I’m not here to scare you. I’m just here because I want to tell you that we deserve better.

Slide 169

Slide 169

Thank you!

Slides are at https://noti.st/laurakalbag

Laura Kalbag: laurakalbag.com, @LauraKalbag. Small Technology Foundation: small-tech.org