Technology today is demanding, as anyone with a smartphone knows. It is constantly buzzing and beeping: “read me”, “pay attention to me”, “this is important”, “you’re missing out…”. When was the last time you lost track of time while checking Facebook, LinkedIn, YouTube or Instagram, and the next thing you knew 15 minutes had disappeared? Tech companies have invested millions in making their services addictive, producing the most seductive thing to show you next so that you stay on their platform.
- Push notifications – Instagram, Twitter and Facebook all send us “push” notifications encouraging us to view our friends’ recent posts and to build and interact with our social network.
- Single sign-on via social media – Google, Facebook and now Apple let you give apps permission to log in with your existing credentials so that you don’t need to remember a new username and password. The cost is that you give those platforms knowledge of which sites you visit and which apps you use.
- Fear of missing out – all social media platforms play this game, sending you enticing messages that someone has updated something. The only way to find out what is to log on to their platform.
- Gamification – whether it is playing Candy Crush, using Apple Health challenges to compete with your work colleagues, or earning rewards for being healthy, e.g. Vitality’s cinema and Starbucks rewards.
- “Variable ratio schedule” – Twitter and dating apps like Tinder reward you when you swipe down to refresh. In those few milliseconds before the feed refreshes you don’t know what you are going to get, much like a slot-machine spin. The delay is deliberate, designed to build anticipation.
- Repackaging content – Spotify does this cleverly, analysing the music you listen to and generating customised playlists. It also inserts disruptive ads that push people towards a subscription.
Tracking technologies follow us from one website to the next. That toaster we were looking at starts to appear everywhere we go, and the persuasive recommendation engines on e-commerce sites seem to know us better than we know ourselves. This can be painful when we are dealing with a difficult life event, like the loss of a child or a loved one.
Gillian Brockell, a staff writer at the Washington Post, suffered the loss of her child. In the days that followed she had to deal with ads and promotions haunting her web browsing and reminding her of her lost baby. She wrote an open letter to the tech companies highlighting how deeply upsetting this technology can be in circumstances like hers.
There are, of course, ways to change these ads, but that is the last thing on the mind of someone dealing with a tragedy. The average user is not aware of these settings and would not know where to look for them.
Deradicalising the radical
The technology that tracks your desires and interests can also be used to divert people away from hateful content. What starts as a casual slur between gamers can develop into ever more toxic content; the cycle of radicalisation begins with a slippery slope of enquiry into the dark corners of the net. There is an interesting analysis of this in a New York Times article.
Google’s Jigsaw project has created a number of initiatives that analyse search terms and tracking data in order to subtly present deradicalising information. The goal is to give people the truth, not just propaganda, so that they have the full facts before it is too late.
The video below gives an insight into some of the work of Project Jigsaw in subtly presenting ads to people who are seeking hardline material.
Dealing with harassment
Why are people so mean on the internet? Is it the feeling of anonymity? Nothing poisons an online discussion faster than threats and bullying. Machine learning can flag a comment to a moderator when it is potentially inappropriate: the Perspective API lets sites that host comments and discussion threads get feedback on the toxicity of a particular comment.
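In outline, a site using Perspective sends each comment to the API’s `comments:analyze` endpoint and receives back a toxicity probability it can compare against its own moderation threshold. A minimal Python sketch of that flow might look like the following (the endpoint and `TOXICITY` attribute come from Google’s public documentation, but the API key is a placeholder, the threshold is an arbitrary choice, and the network call is mocked out here):

```python
import json

# Endpoint documented by Google; a real call also needs ?key=YOUR_API_KEY.
PERSPECTIVE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
)

def build_request(comment_text):
    """Build the JSON body Perspective expects for a toxicity check."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def needs_review(response, threshold=0.8):
    """Flag a comment for a human moderator when the returned
    toxicity probability meets or exceeds the chosen threshold."""
    score = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return score >= threshold

# A mocked response in the shape the API returns, so the flow can be
# demonstrated without a network call or an API key.
mock_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.92, "type": "PROBABILITY"}}
    }
}

body = build_request("You are an idiot and should leave this forum.")
print(json.dumps(body))
print(needs_review(mock_response))  # high score, so flag for a moderator
```

Note that the API returns a probability, not a verdict: the design choice that matters is routing borderline comments to a human moderator rather than deleting them automatically.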
But what about sexual harassment, or other forms of abuse that people are too frightened to report? Thankfully, there are tech entrepreneurs working on this using AI.
- Vault Platform – founded by Neta Meridav, it uses a blockchain-powered app to encourage individuals to speak up. Reports are time-stamped and tamper-proof, held on the user’s phone, and can be used as evidence.
- Spot – uses AI to help people record evidence of harassment in writing, drawing on the best police interview techniques. It aims to get around the problem of people feeling judged, and their fear of what might happen as a result of reporting the offence.
“Individuals can feel safe and anonymous and have control over the information they provide.” – Dr Julia Shaw, Spot founder
Working towards a safer online world
The internet is undoubtedly a force for good: we have access to more information than ever, and we can communicate with the whole world. But human nature is fundamentally imperfect, and we end up hurting each other, whether intentionally or accidentally. There need to be checks and balances in the technology we use to keep the bad guys, and the worst of humanity, in check.
Everyone has a responsibility to hold themselves accountable for how they use technology. There is an underlying problem here to do with privacy and the right to be left alone. On one hand, we can say we have nothing to hide, so what’s the harm? On the other, do we really understand the value of what we are giving away?
We need to challenge the tech giants using persuasive tactics, combined with a deep understanding of our online personas, to sell their wares; the internet trolls sowing hate out of spite; and the political parties or governments seeking to manipulate and coerce a population towards an outcome that benefits a minority.
It is good to see initiatives like these providing checks and balances. But fundamentally these questions need to be answered by every individual: we must understand the moral and ethical decisions we take every day.
What are your thoughts? How should technology be used to provide these checks and balances? Are the big tech companies acting like a nanny state, or are they a force for good?