I worked as a designer for a parental tech start-up. A customer support case came in.
In 2017, shortly before joining Ushahidi, I was working as a designer at a ‘tech for parents’ start-up. They made a useful product and had a good mission: to help parents and kids talk better about time spent on devices and to curb ‘screen addiction’.
I had asked during my interview, and wondered in my first few weeks, whether the young people being monitored and limited through their devices had always truly consented, and how that power was exercised by a parent.
I asked, ‘Has there ever been a case of abuse, where a parent has stopped a kid from seeking information they should have a right to?’
I thought about myself as a younger teen, back before the internet was thoroughly used by my parents’ generation. I was finding information about LGBT+ issues and discovering an identity that was mine to explore internally, and I wondered: would I have been stopped from viewing that?
I was assured that there were policies in place and considerations being made, and I was placated.
Then a customer support request came through: the friend of a woman was asking how to remove the app, because her friend’s partner was using it to monitor, track and control her device and communication. The team was asked: what should we do? How should we respond? What was our responsibility here?
There was a quiet kind of panic, the kind that happens when something you’d hoped never to face arises. These are the problems that keep you awake at night thinking, what if?
We talked as a team: how do we address this? Some were more involved than others, citing company policy that ‘only she (the abused partner) can request the removal of the software, with his (the abusive partner’s) consent’. So the friend asking on the abused woman’s behalf was in no position to support her. Can we delete the accounts? Can we block them? Lots of ideas on how to ‘solve’ it.
I spoke up, as a person who had experienced partner abuse, and said that any action that roused the abuser’s suspicion could put the woman in more danger; we couldn’t know what might happen if we suddenly removed his tool of control. We needed to support the friend and the woman in getting to a safe situation before doing anything that could result in violence. I worked with the customer support rep to find the closest charity organisation that could ensure her safety before any technology solutions were attempted.
Was this enough? We’ll never be sure, and while I attempted to have conversations afterwards about safeguarding others, the issue was largely ignored as an anomaly, an ‘edge case’.