Tech & Truth: The Filtered Reality

Truth is under threat. In the past year alone we’ve seen countless instances of this: from fake news to the results of a US election being called into question. 

These are big and impactful examples of truth being subverted, but there are smaller, more insidious seams of untruth that run throughout our daily lives.

They’re present in the pictures we see on Instagram, the phishing messages in our inboxes, the 5-star reviews we skim on Amazon, and the advertisements we mistake for articles.

Not all mistruths we see online are created with the intent to deceive. Many of the people building deepfake software or automatically generating website copy don’t fully realise the harm it will do.

What’s the Harm?

These sorts of things may seem innocuous, but layering mistruth upon mistruth has a corrosive effect on public discourse and commerce. If enough people don’t trust what they read or see, how will inclusive civil discourse work? If you can’t trust search results, news, or stores, how will commerce work? If it’s trivial for anyone without expertise to pass themselves off as an expert, what comes next? The attack on the US Capitol is a small taste of the potential power of mistruth.

As awareness of online mistruth grows, distrust is likely to grow with it. Consumers will start to seek new indicators of truth and authenticity, and to shun not just fakery but also anything that could plausibly have been faked.

Can We Reverse the Tide?

So, what’s to be done? We see pushback happening already. Google has spent years assessing “E-A-T” (expertise, authoritativeness and trustworthiness) in search results, and Apple now requires app developers to explicitly declare how their apps protect user privacy. Other large companies, such as Facebook, are under increasing pressure to follow suit.

Again, these large-scale policy changes will trickle down to changes in how we, as individuals, act. We’ll look for new ways to present the most authentic image of ourselves online, in order to increase our trustworthiness. Research already suggests that the clearer and higher-quality our self-presentation, the more trustworthy we are perceived to be. A joint USC and Australian National University study in 2018 found that users with clearer audio were perceived as more intelligent and more likeable, and their message as more important.

COVID Spawned a New Era

For those of us lucky enough to work remotely, we’ve spent more time sequestered at home recently than ever before. This isn’t going to go away ‘after’ COVID: it’s too convenient and cost-effective. Many businesses will move to a ‘hybrid’ office model, with employees enjoying the benefits of spending more time at home whilst keeping the option to work from an office where necessary.

With this continued distance from each other, there’s greater opportunity for bad actors to profit from online mistruths. Most of us now know at least one person we’ve never met in real life. Our communications with those people are confined to a small screen, and communicating authentically there is far harder than it is face-to-face in the same room.

People Prefer the Truth

It’s crucial, then, that we take all the steps we can to preserve trustworthiness in our remote interactions. Even small, seemingly insignificant, toe-in-the-water mistruths such as fake virtual backgrounds muddy the waters. Last year, the Harvard Business Review reported that, aside from looking cheap, virtual backgrounds undermine viewer impressions of authenticity, expertise, and trustworthiness. Viewers show an extremely strong preference for real backgrounds in all of these areas, irrespective of the nature of the real background.

Instead of using techniques that involve fakery, we should look to other options, and in the case of one’s background on a video call there are plenty. Cropping your video to show off your background, for example, is a much simpler and more authentic approach. Your true background is much more likely to endear you to the person you’re speaking with, even if it’s messy. 

About Aidan Fitzpatrick, Founder & CEO of Reincubate

Aidan founded Reincubate in 2008 after building the world’s first iPhone data recovery tool, iPhone Backup Extractor. He’s led Reincubate to win the UK’s highest business honor twice, has spoken at Google on entrepreneurship, and is a graduate of the Entrepreneurs’ Organization’s Leadership Academy. His research has been cited in over 20 scholarly works. He previously served as CTO at Artspages, which supplied Apple with music for the iTunes Store and built what was at the time the largest SAN in Norway. As European Development Manager at CNET, he led technical teams across Europe (London, Munich, Paris) and key projects including the unification of EU publishing systems, and he served as CTO at Wiggle through to its $230m breakout exit. Visit: https://reincubate.com/
