Small Technology

A presentation at Think About! in May 2019 in Cologne, Germany by Laura Kalbag

Slide 1

Small Technology

Laura Kalbag and Aral Balkan, Ind.ie @laurakalbag @aral

(These slides cover the first half of the talk. Small Technology: The Problem)

Slide 2

We feel like we know Facebook’s game: they show us adverts, and we get to socialise for free. I don’t give Facebook much information. My public profile is barely filled out, and I rarely Like or Share anything.

Slide 3

So I get ads aimed at a stereotypical woman in her early thirties. (Makeup, laundry capsules, shampoo, dresses, pregnancy tests…)

Slide 4

Then four years ago, my mum died. And then Facebook suggested I might be interested in this… Goodwill Family Funeral Directors. It’s quite specific but surely a coincidence?

Slide 5

So I asked my siblings if any of them had posted something on Facebook… “Did any of you tell your friends on Facebook? Facebook has started showing me creepy funeral director ads…”

Slide 6

“Not me!” said Jess. Yeah, it might just be a strange coincidence…

Slide 7

Nini: “sorry, that was me I think – I face booked Madds cos she’s in Australia now xxx”

Well, it might just be a strange coincidence. But maybe it’s too close to be a coincidence.

Slide 8

My sister had told her friend via private message.

Slide 9

She’d used some key words in a Facebook message to her friend Maddy. Facebook may have made the connection that I’m her sister, figured out that I might want a funeral director, and stuck that ad in my feed.

Slide 10

It just goes to show how much Facebook knows, and how much complexity it can grasp, despite my not telling it anything.

Slide 11

Are they listening?

We hear stories like this all the time. Particularly where people think their devices must be listening to them through their microphones. They open their phone and start seeing ads for a product they were just talking about in person.

Slide 12

A group of scientists at Northeastern University in the US did a study of 17,000 apps on Android devices, to see if apps were secretly recording people with the microphone. While they did find that a lot of apps used third-party services that recorded video and took screenshots of the screen, they found no evidence of an app unexpectedly activating the microphone or sending audio out when not prompted to do so. It turns out that (for now at least!) apps are unlikely to be listening to and recording everything we do.

Slide 13

But the truth is actually more insidious. “What people don’t seem to understand is that there’s a lot of other tracking in daily life that doesn’t involve your phone’s camera or microphone that give a third party just as comprehensive a view of you.” said David Choffnes, one of the authors of the paper.

Slide 14

If I was browsing funeral-related sites, maybe Facebook would have that information if those sites had Facebook Share Buttons or Facebook login. As this article in SBS News points out, those Facebook buttons track you across the web, violating your privacy.

Slide 15

You can find out some of the things Facebook knows about you in your Ad Preferences. This is a screenshot of how Ad Preferences used to look, now you don’t get such a specific insight into the categories it has put you in. You can see it has noted I live “away from family”, I’m an “ex-pat (UK)”, I’m a “frequent international traveller”, as well as detailed categories on the devices, platforms and connections I use to access Facebook.

Slide 16

This is why facebook.com will only let you sign up as Male or Female, not non-binary or the self-declared gender option you get after you’ve signed up. It makes it easier to put you in a box.

Slide 17

As Eva Blum-Dumontet wrote in the New Statesman, “When profiling us, companies are not interested in our names or who we really are: they are interested in patterns of our behaviour that they believe match an audience type… So to target us more efficiently, the advertisement industry relies on a very binary vision of the world.”—‘Why we need to talk about the sexism of online ad profiling’

Slide 18

And through this, advertising continues to perpetuate stereotypes and existing prejudices: “We need to be clear that a data driven world – where artificial intelligence makes decisions based on simplistic profiles about us – isn’t going to solve prejudices: it’s going to perpetuate them.”—Eva Blum-Dumontet, ‘Why we need to talk about the sexism of online ad profiling’

This isn’t just about ads that tell women they should be focused on domestic tasks, looking pretty and getting pregnant…

Slide 19

And through this, advertising continues to perpetuate stereotypes and existing prejudices. In 2016, ProPublica discovered that Facebook was allowing advertisers to exclude users by race in ads for housing: “The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls ‘Ethnic Affinities.’ Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.”—Julia Angwin and Terry Parris Jr., ‘Facebook Lets Advertisers Exclude Users by Race’

Slide 20

And even after Facebook argued it was not liable for advertisers’ discrimination (while still settling these cases), Facebook continues to perpetuate inequality in its profiling. A study of whom Facebook picked to show ads to, from very broad target audiences, showed that “given a large group of people who might be eligible to see an advertisement, Facebook will pick among them based on its own profit-maximizing calculations, sometimes serving ads to audiences that are skewed heavily by race and gender.”—Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 21

“In these experiments, Facebook delivered ads for jobs in the lumber industry to an audience that was approximately 70 percent white and 90 percent men, and supermarket-cashier positions to an audience of approximately 85 percent women. Home-sale ads, meanwhile, were delivered to approximately 75 percent white users, while ads for rentals were shown to a more racially balanced group.”—Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 22

Concluding: “An ad system that is designed to maximize clicks, and to maximize profits for Facebook, will naturally reinforce these social inequities and so serve as a barrier to equal opportunity.”—Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’

Slide 23

Facebook

And this profiling, and these algorithms, are used in the majority of technology today. It’s not just Facebook showing you ads and determining whose posts you see based on what it thinks is most important to you.

Slide 24

Google

(The most popular homepage in the world.) Search results are not just based on relevance to the search term, but are personalised based on Google’s determination of user interests (profiles).

Slide 25

YouTube

(2nd most popular site in the world.) Recommends videos based on newness and popularity, but also based on user interests and viewing history.

Slide 26

Yahoo!

(9th most popular in the world.) Yahoo’s search results are personalised based on user’s search history, and a home page feed of news and other content is recommended based on popularity and the user’s predicted interests.

Slide 27

Amazon

(10th most popular site in the world.) Products and product categories are recommended on the home page, including ‘featured’ recommendations, based on user purchases and views.

Slide 28

Instagram

(14th most popular site in the world.) Posts displayed in an order Instagram determines to be most interesting and relevant to the user.

Slide 29

All of these sites use the profiles they’ve gathered to target users with advertising and content.

Slide 30

And these categories can be combined to have an incredibly complex view of you and how you might be manipulated.

Slide 31

Cambridge Analytica

This brings us to Cambridge Analytica.

Slide 32

According to whistleblower Christopher Wylie, Cambridge Analytica is a “propaganda machine” and a “psychological warfare weapon”, hired by political campaigns to create advertising to influence voters’ decisions. The kind of advertising we might consider “fake news”. How did they do this?

Slide 33

They built algorithms to process profile data from Facebook that could determine how a person was likely to vote, and how their decision could be changed or reinforced through targeted advertising.

Slide 34

Cambridge Analytica harvested profile data through Facebook personality test apps, which had special permission to harvest particular data…

Slide 35

… not just from the person who used the app or joined the app, but also from their entire friend network: it would pull out all the friends’ data as well.

Slide 36

“we’d only need to touch a couple hundred thousand people to expand into their entire social network, which would scale to most of America”—Christopher Wylie, ex-employee of Cambridge Analytica

Slide 37

Using data such as “status updates, likes, in some cases private messages…”

Slide 38

When all this information came out, Facebook blamed Cambridge Analytica and the researcher who created the personality test app, claiming the use was against Facebook’s terms. However, Facebook doesn’t have a problem with Facebook using data in the same way.

Slide 39

A data scientist who previously worked at Facebook said: “The fundamental purpose of most people at Facebook working on data is to influence and alter people’s moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site.”

Slide 40

“Facebook knows you better than members of your own family”—Sarah Knapton, Science Editor, The Daily Telegraph.

Slide 41

Psychologists and a computer scientist from the Universities of Cambridge and Stanford discovered that computer-based personality judgments are more accurate than those made by humans.

Slide 42

“The team found that their software was able to predict a study participant’s personality more accurately than a work colleague by analysing just 10 ‘Likes’.”

Slide 43

“Inputting 70 ‘Likes’ allowed it to obtain a truer picture of someone’s character than a friend or room-mate, while 150 ‘Likes’ outperformed a parent, sibling or partner.”

Slide 44

“It took 300 ‘Likes’ before the programme was able to judge character better than a spouse.”

Slide 45

With the introduction of Facebook reactions, there’s even more Facebook can know about you because they have more specific data.

Slide 46

Slide 47

“Belgian police now says that the site [Facebook] is using them as a way of collecting information about people and deciding how best to advertise to them. As such, it has warned people that they should avoid using the buttons if they want to preserve their privacy.”—Andrew Griffin, The Independent

Slide 48

“By limiting the number of icons to six, Facebook is counting on you to express your thoughts more easily so that the algorithms that run in the background are more effective,” the post continues. “By mouse clicks you can let them know what makes you happy.”—Andrew Griffin, The Independent

Slide 49

And as Julia Angwin, Terry Parris Jr. and Surya Mattu found: “What the [Facebook ads] page doesn’t say is that those sources include detailed dossiers obtained from commercial data brokers about users’ offline lives. Nor does Facebook show users any of the often remarkably detailed information it gets from those brokers.”—Julia Angwin, Terry Parris Jr. and Surya Mattu, ‘Facebook Doesn’t Tell Users Everything It Really Knows About Them’

Slide 50

These platforms have become vital social infrastructure

One of the reasons this is so scary is because platforms like Facebook have become a vital part of our social infrastructure. Leaving has big consequences if you find it harder to keep in touch with your family across the world, if your kids’ school uses Facebook for important school updates, if you have to use Facebook for work.

Slide 51

And even if you left or you never joined in the first place, that doesn’t mean you escape Facebook. Facebook has a shadow profile on you.

Slide 52

If a friend or acquaintance has used the Find Friends functionality, or used Facebook Messenger on their phone, they gave Facebook access to their contacts. Facebook uses those names, email addresses, and phone numbers to build its shadow profiles. And companies you buy things from upload the details they have on you to Facebook in order to better target their own ads.

Slide 53

“Right now commenters across the Internet will be saying, Don’t join Facebook or Delete your account. But it appears that we’re subject to Facebook’s shadow profiles whether or not we choose to participate.”—Violet Blue, ‘Firm: Facebook’s shadow profiles are ‘frightening’ dossiers on everyone’

Slide 54

“I feel like we’re only beginning to understand why Facebook’s data is so very valuable to advertisers, governments, app makers and malicious entities.”

Slide 55

And it’s not just the web and apps that are tracking you and your behaviour. Nearly all “smart” and “cloud” internet-connected technology needs that internet connectivity to send your information back to the company’s servers.

Slide 56

Like the “smart” speakers that are now so ubiquitous in our homes.

Slide 57

Google Nest, having a good look around your home.

Slide 58

Hello Barbie, which your kids can talk to, with all that information recorded and sent to a multinational corporation…

Slide 59

Smart pacifier. Put a chip in your baby.

Slide 60

Looncup - smart menstrual cup! One of many smart things that women can put inside themselves. (Most of the internet of things companies in this genre are run by men…)

Slide 61

Smart dildo…

Slide 62

We Connect (the smart dildo makers) were even sued for tracking users’ habits. (Though many companies will get away with tracking like this as they’ll hide it in the terms and conditions…)

Slide 63

Have you ever wondered how many calories you’re burning during intercourse? How many thrusts? Speed of your thrusts? The duration of your sessions? Frequency? How many different positions you use in the period of a week, month or year? You want the iCondom.

And have you ever wanted to share all that information with advertisers, insurers, your government, and who knows else?

Slide 64

And it’s not just tech for individuals that is affected. Our governments are sharing our health data with tech corporations.

Slide 65

Slide 66

This is not a dystopian future. This is now.

Slide 67

And it was not an accident, it was very much on purpose.

Slide 68

This is Surveillance Capitalism.

A term coined by Shoshana Zuboff, a professor at the Harvard Business School, who has recently authored a whole book on this topic.

Slide 69

The Age of Surveillance Capitalism by Shoshana Zuboff.

What is surveillance capitalism? “Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are…fabricated into prediction products that anticipate what you will do now, soon, and later.”—Shoshana Zuboff, The Age of Surveillance Capitalism

Slide 70

Surveillance capitalism is defined by surveilling our behaviour, and monetising it. If it sounds familiar, it’s because it forms the dominant business model of technology today.

Slide 71

Business models are more than just a feature.

As the people who work in and around technology, we have to understand that surveillance capitalism is not a feature we can replace, or phase out when we’ve found another way to make money. The monetisation of our data is the business model. The investment has been made on the premise that our behaviour will be shaped and pushed towards making the businesses more money in the future.

Slide 72

We can’t reform big tech.

Slide 73

Not least because these guys don’t want to reform big tech, and they’re its wealthiest CEOs. Not just that: they are six of the twelve richest people on the planet. They have more people and power than many countries.

Slide 74

They don’t have a problem. They’re doing fine.

Slide 75

venture capital ❤ surveillance capitalism

Venture capital loves surveillance capitalism. And that is why businesses with venture capital can’t change their models. The business has been sold to the investors, and investors have been promised year-on-year exponential growth. More users. More data. More money.

Slide 76

The thing is, exponential growth is not sustainable. We live on a planet made up of finite resources. This is why we are living in a climate catastrophe.

Slide 77

And much like in the climate catastrophe, the most vulnerable and marginalised people are affected by surveillance capitalism first.

Slide 78

Look at the information that data brokers have about you. This is data that Acxiom and Oracle have obtained. They get information in the same way Facebook does. And they’ve profiled you as disabled, or poor, or LGBTQ+. They know that you take an interest in protests, and what your political interests are. And any government can ask for that information. They’ve profiled you as prone to gambling, drinking alcohol, or smoking. They know what your health purchases are, and what symptoms and ailments you think you might have. And any corporation can buy that information.

Slide 79

We can’t continue in this way. We need to do something.

Slide 80

The Solution

And Aral will now explain some of the things we can do about it…