Trust in a post-truth world

Trust and integrity are among the most important values in life. Relationships are at the heart of everything we do, and trust is essential to all of them: the parent/child relationship in the home, the teacher/student relationship in the classroom, the employee/employer relationship in the workplace, and romantic relationships. A balanced society depends on these basic relationships of trust working.

A wide range of business models has been built on trust. From eBay and Amazon to Airbnb and Uber, these services only function because of the collective honesty of their participants. The business logic behind them relies on people using them in predictable and honest ways.


Creatures of habit.

However, what would happen if we all became too predictable? We are creatures of habit, after all. What would it mean if our behaviour could be accurately predicted by algorithms? That is the world we are fast approaching, if it has not already arrived. We can all testify to how adverts track us around the web after we casually look for something, and how emails and push notifications then bombard us with tempting offers for the same product or service.

Advertisers have been using these techniques for decades. What is different today is the abundance of data that we give away in exchange for “free” services. We use free email, we share our browsing habits with our Internet Service Provider, and our social media accounts record our interactions with other people: likes and comments, follows and hashtags. It is becoming increasingly possible to identify and target people by their particular viewpoints.

Persuasive social media.

Facebook and other social media platforms want to sustain engagement, and a key part of that is serving us content they know we want to see. Facebook, for example, will prioritise content based on the comments you post, the reactions you make, your replies to comments and the links you share with groups of friends over Messenger.

Researchers have already shown how data as innocuous as your profile image can reveal your sexual orientation. A team from Stanford University developed an algorithm by analysing people’s self-declared orientation on their dating profiles. Using this predictive algorithm, the researchers were able to identify an individual’s sexual orientation with 81% accuracy for men and 74% for women.

What if a government or a corporate entity decided to use this as a means to discriminate against or persecute people based on their sexual orientation or religious beliefs? Combined with technologies such as facial recognition, this has frightening implications. That world is already here: the technology exists, and all it needs is for someone to start bringing it together. The frightening world of Minority Report is not far off.

Political influence in a post-truth world.

The recent election in the UK saw the Conservative party gain a majority in the House of Commons. No one could say that any of the political parties simply used the truth to win votes; each party manipulated and selectively chose “facts”, mixed with emotion, to persuade and influence the electorate. This, of course, is nothing new; politicians have been doing it for centuries. What makes it particularly potent today is the combination of social media and the presentation of selected facts, in both social and traditional media, to selected demographics in order to influence voting. The politicians and campaigns most savvy with this technology of persuasion can be the most successful.

The sad reality is that people would rather be lied to in line with their biases than seek the truth. We may never know how the targeted presentation of information on social media influenced the outcome. But the sound-bite culture and the lack of substance in the debates put the whole nature of democracy, truth and justice in question.

The 2016 word of the year was “Post-Truth”. The Cambridge dictionary gives this definition of “Post-truth”:

relating to a situation in which people are more likely to accept an argument based on their emotions and beliefs rather than one based on facts

* The referendum was the first major vote in the era of post-truth politics.

* He dubs the current administration a “post-truth” White House.

Definition of “post-truth”, Cambridge Dictionary

A definition of truth

Happily, the definition of truth has been a source of discussion for philosophers and theologians for centuries. Understanding these arguments and having a discerning view is all the more crucial in a “post-truth” world. I like the way Ravi Zacharias summarises what truth is with some basic tests:

What makes you so sure there is an absolute reality for which there is a moral system to define? Ravi Zacharias
  1. Logical consistency: What is the truth? Truth by definition is exclusive; it excludes the contrary argument. An argument and its contrary cannot both be true.
  2. Two tests for truth: how do I know what is true?
    1. Empirical adequacy: correspondence theory holds that particular claims, when tested against reality, must correspond to reality as it actually is.
    2. Experiential relevance: coherence theory holds that the questions we ask, and the confluence of answers we receive from multiple sources, must cohere.
  3. Four questions of truth:
    1. Origin, where did we come from?
    2. Meaning, what is the meaning of life?
    3. Morality, how do we define what is right and wrong?
    4. Destiny, what happens when we die?
  4. Five subjects of truth:
    1. God: how do I define God?
    2. Reality: what is reality?
    3. Knowledge: what is knowledge?
    4. Ethics: what is ethics?
    5. Anthropology: how do we account for cultural diversity?

Challenge for AI

Some of the challenges facing AI today are questions that have been debated for centuries. In the language of AI there are deepfakes, bias in algorithms, the explainability of AI decisions and the privacy of our data, all of which have trust as their common denominator. We need trust and integrity in the technology we use every day. We can’t have a self-driving car making decisions based on feelings!


“Deep fakes”: see the last blog post, Lies, damned lies and Politics, for an example.

“Deepfake video is created by using two competing AI systems — one is called the generator and the other is called the discriminator. Basically, the generator creates a fake video clip and then asks the discriminator to determine whether the clip is real or fake. Each time the discriminator accurately identifies a video clip as being fake, it gives the generator a clue about what not to do when creating the next clip.

Together, the generator and discriminator form something called a generative adversarial network (GAN). The first step in establishing a GAN is to identify the desired output and create a training dataset for the generator. Once the generator begins creating an acceptable level of output, video clips can be fed to the discriminator.”

Deepfake definition from WhatIs.com
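The generator/discriminator feedback loop described above can be sketched as a toy. This is a deliberately simplified illustration, not a real GAN: the “real data” is just numbers near a target mean, the discriminator is a simple distance check, and the corrective “clue” is computed directly rather than flowing back through the discriminator as gradients.

```python
import random

REAL_MEAN = 5.0  # the "real data" distribution the generator tries to imitate


def discriminator(sample):
    # Naive discriminator: flags a sample as fake if it is far from real data.
    return abs(sample - REAL_MEAN) > 0.5  # True means "fake detected"


def train_generator(rounds=2000, seed=42):
    rng = random.Random(seed)
    gen_mean = 0.0  # the generator starts out producing obviously fake samples
    for _ in range(rounds):
        sample = gen_mean + rng.gauss(0, 0.3)  # the generator's fake sample
        if discriminator(sample):
            # The "clue" about what not to do: nudge the generator toward
            # the region the discriminator accepts. In a real GAN this
            # direction comes from gradients, not from peeking at REAL_MEAN.
            gen_mean += 0.01 * (REAL_MEAN - sample)
    return gen_mean
```

After enough rounds the generator’s samples sit close enough to the real data that the discriminator can no longer reliably tell them apart, which is exactly the dynamic the quoted definition describes for video clips.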

Similarly, audio, video and the written word can all be manipulated using the same principles. It is getting more and more difficult to tell the fake from the real news.


Bias in AI: this can stem from machine learning of human bias. For example, in recruiting, certain organisations may have a bias towards a certain type of candidate. Automated recruitment based on machine learning from previous candidates can reproduce that bias.

Other examples come from criminal sentencing in law courts, where bias in the machine learning influences the sentences judges give to people found guilty of a crime.
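A minimal, hypothetical sketch of how this happens: if the historical records a system learns from are biased, the “model” simply re-encodes the bias. The groups and numbers below are invented for illustration, and the model here is nothing more than a per-group hire rate, but the same effect appears in far more sophisticated learners.

```python
from collections import defaultdict

# Invented historical hiring records: (candidate_group, was_hired).
# The past decisions favoured group "A" regardless of merit.
history = ([("A", True)] * 80 + [("A", False)] * 20 +
           [("B", True)] * 30 + [("B", False)] * 70)


def fit_hire_rates(records):
    # "Training" here is just measuring past hire rates per group;
    # a real ML model would absorb the same signal through its features.
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, was_hired in records:
        total[group] += 1
        hired[group] += was_hired
    return {g: hired[g] / total[g] for g in total}


def model_recommends(rates, group, threshold=0.5):
    # The automated system recommends whoever the past favoured.
    return rates[group] >= threshold
```

With this data the fitted rates are 0.8 for group A and 0.3 for group B, so the automated system recommends A candidates and rejects B candidates: the historical bias has become the model.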


Explainability: “Computer says no.” AI is increasingly seen as a black box that makes important decisions about our lives. A whole new discipline of explainable AI has sprung up to expose the reasoning behind those decisions. Everyone has a right to know the logical basis of a decision made about them.
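One way to push back against “computer says no” is to make every automated decision carry its reasons. A small illustrative sketch, where the criteria and thresholds are entirely invented and not taken from any real lender:

```python
def loan_decision(income, debt, history_score):
    """Return an approve/decline decision together with human-readable
    reasons, so the "no" can at least say why (illustrative only)."""
    reasons = []
    if income < 20000:
        reasons.append("income below the 20,000 threshold")
    if debt / max(income, 1) > 0.4:
        reasons.append("debt-to-income ratio above 40%")
    if history_score < 600:
        reasons.append("credit history score below 600")
    approved = not reasons  # approve only when no rule was triggered
    return approved, reasons
```

A rule list like this is trivially explainable; the hard problem the discipline tackles is extracting equally clear reasons from models that are not built from legible rules.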

Privacy of our data:

In the European Union, the use of personal data is protected by the General Data Protection Regulation. Other markets, notably China and the US, have less stringent restrictions on how personal data is used. In a world that increasingly monetises our data for corporate or political ends, there needs to be legislation to protect personal data wherever you live.


Technology advances are exciting, and everyone wants the latest and the greatest, but there is a danger of throwing the baby out with the bathwater. Our society is based on trust and the rule of law. Decisions need to be explainable and demonstrably unbiased. The ongoing debate about how we use AI in this “post-truth” world is one we are only at the beginning of. There are clearly opportunities, and clear chances of abuse. We will see how it develops, but we each have a collective responsibility to understand and question what is going on and to seek the truth each day.
