We’re living in an extraordinary age: the age of trust.
We believe the intricate system we’ve constructed will remain reliable and never falter.
We’re confident enough to place much of our life online.
We expect the world to work for us.
Every day we expect feats to be accomplished that would have been impossible, even twenty years ago.
Our expectations are heightened.
When you step back and consider it, our lives are wondrous in what we take for granted.
We’re in danger of becoming the sort of people who complain about the food on the flight without taking into account they’re hurtling through the air.
We take the advancements for granted.
Everyone’s parent has a new knee.
We trust software all day, every day.
We trust the language of algorithms.
We trust software to correct our most personal creative mistakes.
‘You’re recording a song and find a note that is really quite out of tune,’ Brian Eno told The Vinyl Factory a few years ago.
‘In the past, you’d have said, it’s a great performance, so we’ll just live with it.
‘What you do now is retune that note.’
We trust we’ll be able to make things better, even at a cost.
‘So you’re always asking yourself, have we lost something of the tension of the performance, of the feeling of humanity and vulnerability and organic truth or whatever, by making these corrections?’
Maybe so. But we trust that, overall, these developments are helpful.
From the moment the alarms on our phones sound in the morning to the moment we set them again at night, we trust in a varied and intricate web of interactions.
We slide money back and forth online, consult navigational systems, forgo old technologies, such as keys, for the fobs in our pockets.
We touch screens all day without understanding touchscreen technology.
We lift the hoods of our cars and fail to recognize what’s inside.
We trust that proprietary software will get us from A to B.
We’re taught as children not to get into the cars of strangers.
Now, Uber in hand, we slide into strange cars every day and expectantly look at the driver in the front seat.
‘Your name is Jason, right?’
*
These are shifting, counter-intuitive times.
Imagine telling your ten-year-old self: ‘When you get older, everyone will voluntarily carry with them a tracking device at all times.
‘For many, it’ll be the first thing they look at in the morning and the last thing they consult at night.’
There is risk and wonder embedded within the banality of our normal working days.
But you can’t be in awe of it 24/7.
You can’t walk around permanently wide-eyed.
Who wants to live their life thanking the motherboard each time the computer surges to life?
Still: acknowledge that trust is more widespread than ever.
We’ve suddenly become very, very trusting beings.
It’s happened in only a handful of years.
Would you like Google to remember this password for you?
Sure.
I trust it with everything else.
The friendly interface of technology obscures the magnitude of what is going on here.
If we take for granted the importance, and the strangeness, of these interactions, we risk losing a chance to comprehend what we’ve gained.
Wake up with your phone, glance at an Apple watch, employ Shazam to grab a song flitting past you in the coffee shop, trace your pathway with GPS, slide that CVS rewards card into the reader, pay with FasTrak or E-ZPass, use Nest to control the thermostat from your phone or computer.
It evokes a multi-part symphony of life.
Apple TV, Fire Stick, Roku, Kindle and NOOK.
WeTransfer is part of this new behaviour.
My company is part of this growing din.
I speak of this trust from the inside.
We’re a part of the lives of billions of people.
Users send files.
But these are not simply strings of code or operational instructions.
These are memories, secrets, unpublished manuscripts, personal photos, business ideas.
Users entrust part of their life to us.
They might not understand how that file shows up in exactly the expected destination, but they trust that it will.
Not only do we trust new devices and services, we also trust the companies who own them.
We trust companies with our details, our data, our choices, our shopping records, our interactions in shoe stores and grocery stores, and everything in between, even our movements in space.
We’ve been dealing with them long enough to wonder what happens when the trust breaks down.
Because we’re living our lives online, it’s worth asking the question: Is the internet heading in a more trustworthy direction?
Every day, new evidence emerges to say: Maybe not.
There’s an onslaught.
In October 2018, as reported by Business Insider, news broke at Google of ‘a bug in the company’s Google+ social network that affected an estimated 500,000 people and exposed information that users intended to keep private’.
That’s just one particular week.
It seems to me that rather than saying companies are trustworthy or not, it’s worth examining a new kind of trust that is being asked of us.
Increasingly, there’s something that separates the interactions we have online from those we have in real life.
A weaker version of trust has emerged online in the past twenty years.
It’s a shadow of what’s expected in life offline.
It’s not what it used to be.
It’s almost deserving of a new definition: a lighter sort of trust, a Trust-Lite.
There’s a version that has been crafted, accepted and then taken advantage of by tech companies.
Where has this taken us?
What have we lost in the process?
Who is pointing out this erosion?
Who is keeping silent?
And what’s the next step?
*
To answer these questions we’ll have to take a quick look at one of the greatest inventions of our tech-heavy age.
It’s one that demonstrates just what is asked of us.
It’s not the iPhone.
Rather, something even more important.
It’s the latest incarnation of the terms and conditions.
In a politically divided time, one activity truly brings us all together.
No one sits down and reads the terms and conditions.
People used to.
Or at least they used to when they were at the bank, ready to sign a mortgage, or sitting across the desk from an insurance salesperson.
Small print was not a daily occurrence.
You paid attention when it came along.
Now we all refuse to scroll to the bottom.
We can’t make that journey.
We don’t read the T&Cs, the Privacy Policies (PP), the Terms of Service (ToS).
Whatever you want to call that onslaught of technical prose.
You’ve probably faced an update already today.
You’ve probably just blindly accepted some Ts and Cs.
We trust the companies that keep breaking our trust.
In 2017, Deloitte surveyed 2,000 consumers in the US. They found that 91 per cent of people consented to legally binding terms and conditions of service without reading them.
For those aged eighteen to thirty-four, a whopping 97 per cent agreed to conditions without even reading them.
*
Most of us would consider ourselves mildly suspicious, or at least unwilling to give away our first-born children.
But powerful techniques are honed to gain our trust.
In another study, two academics, Rainer Böhme of the University of California, Berkeley, and Stefan Köpsell of Technische Universität Dresden, looked at the wording of consent forms. They offered 80,000 participants alternative wordings of a simple consent request.
Some participants were mildly bullied – the kind of bullying we’ve become used to in the Trust-Lite age.
They were told their consent was required and then offered an ‘I agree’ button.
‘They went along 26% more often than did other users, who had been politely asked to participate (with phrases like “we would appreciate very much your assistance” and both “yes” and “no” options represented by lookalike buttons),’ wrote David Berreby in The Guardian.
When we’re given the chance to consider how much we should trust a company, some of us do pause and weigh the options.
But offering up our trust blindly has become the habit of the age.
We sigh, occasionally chastise ourselves, but go along with what has become habitual behaviour.
‘Ubiquitous EULAs [end user license agreements] have trained even privacy-concerned users to click on “accept” whenever they face an interception that reminds them of a EULA,’ Böhme and Köpsell wrote in their study.
There must be a solution.
There must be a way we can all work together.
Berreby ended his article by saying: ‘Perhaps society could subject internet agreements to industry-wide codes of conduct. You don’t have a contract with a doctor, but you can expect her to adhere to the Hippocratic oath …’
And sure, that could happen in a perfect world.
There are loose initiatives and blue sky thoughts.
What if there was an industry-wide agreement?
What if tech companies segmented their T&Cs?
What if a bubble popped up each time they wanted access to something new?
The terms could be presented to the user as and when they become relevant.
What if?
What’s evident in the loose solutions bandied about is that they’re built on a fragile stem of hope.
I’ve trusted this company blindly.
I hope they’ll do the right thing.
(Cut to: Zuckerberg blinking in front of the cameras at the Senate hearings.)
*
The T&Cs reveal something about us: our laziness, our eagerness to get to the good stuff, our willingness to hand over … whatever.
But they also act as revealing portraits of the companies themselves.
After all, this is the one interaction where some brutal honesty occurs.
In this interaction, there’s no well-edited video with a soundtrack of plucked guitars.
There’s no one dancing around to sell you the product or remind you we’re better off connected.
It’s legal text.
For companies that fetishize smooth surfaces, clean design, short sentences, simple messaging and bright colours, here’s the spot where they get down to it.
In 2019, a journalist from the New York Times analysed the ‘length and readability of privacy policies from nearly 150 popular websites and apps’.
Using a test to measure the complexity of the text, he found Facebook’s privacy policy more difficult to read than Stephen Hawking’s A Brief History of Time.
‘Only Immanuel Kant’s famously difficult Critique of Pure Reason registers a more challenging readability score than Facebook’s privacy policy,’ wrote Kevin Litman-Navarro.
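(A note for the technically curious: readability tests like the one the Times used reduce prose to simple arithmetic over sentence length and word length. The sketch below is purely illustrative; it uses the classic Flesch reading-ease formula rather than whatever measure the Times applied, and the sample sentence is invented, not quoted from any real policy.)

# Illustrative only: scoring a block of text with the Flesch reading-ease formula.
# Higher scores read more easily; dense legal prose tends to score very low.
import re

def count_syllables(word):
    # Crude syllable estimate: count runs of vowels, minimum one per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# A made-up sentence in the style of a privacy policy (not a real quotation).
sample = ("We may collect, process and share information about your interactions, "
          "preferences and devices with affiliates and service providers as described herein.")
print(round(flesch_reading_ease(sample), 1))

Long, Latinate, single-breath sentences sink the score; short, plain ones lift it.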
The list is long, like a rap sheet.
In 2012, Instagram told its users it could now sell their photos to … whoever.
This was one instance where users suddenly cared about those T&Cs.
For a while.
Half the users left, but they seem to have all trickled back.
‘TwitPic takes credit for your photos,’ CNN reported on the now-defunct firm, ‘the company keeps your deleted images and you can’t sue the company after a year.’
Even oldies but goodies like Sears and Kmart made a play back in 2007.
After offering participants a measly ten bucks to join their online community, they snooped.
They tracked users as they surfed the web, banked, checked up on prescriptions, and just lived their lives.
The justification was buried in the T&Cs.
Eventually, the retailer’s parent company settled charges in 2009.
There’s the invasiveness of Snapchat.
There’s the way OnStar, the roadside-assistance service, would keep tracking your movements even after you cancelled the subscription.
Writers may think they’re using Twitter as a real-time diary, but guess who holds a sweeping, royalty-free licence to all those tweets?
My favourite, though, for the sheer scope of T&C villainy, comes from the apps.
One striking example was the Brightest Flashlight Free app for Android.
Its brilliantly ignoble terms of service said the app would periodically collect information.
Its makers didn’t trust you, the user, enough to make it crystal clear they’d track your every move and deliver the information to third-party advertisers.
Most villainous of all was that you didn’t actually have to accept.
The app began recording your whereabouts before you even offered your permission.
I sometimes envision these T&Cs personified.
They’re villains who move undetected through the Trust-Lite society.
And when they all get together they can’t believe how easy it’s become to crack this societal safe.
Who knew all it would take was a block of legalese text?
It was like a skeleton key.
It would give you access to anything.
An example of this strange new devalued definition of the word ‘trust’ comes from the Apple T&Cs.
Apple is sleek in its messaging.
It carries the feeling of a company that should be trusted.
And we do trust it – we have to.
You don’t get to become a trillion-dollar company by acting like the suspect Brightest Flashlight Free app.
But Apple’s T&Cs are the equivalent of a partner who always needs to know where you are.
Look more closely.
This is just one facet of the Trust-Lite life.
*
Let’s move from the fine print to the ratings.
What about how we view the forms we’ve trusted in the past?
We used to trust book reviews.
Now, authors sneakily write their own Amazon reviews, and their parents get in there as well.
We used to trust restaurant reviews.
Now, everyone knows Yelp can’t be trusted because whingers and complainers flood the reviews, in a cloud of anger, after their meal arrives five minutes late.
We know this isn’t the truth of a dining experience.
Ever seen a person grapple with a bad Airbnb review?
‘They’re talking about … me. My house. My life.’
We’re used to viewing people through the prism of their Airbnb reviews.
We know what an Uber rating means.
And then there’s TaskRabbit and other personal service apps, and the much-bashed Peeple, the app that promised a way to rate everyone around you but, unsurprisingly, has yet to find much of a user base.
What happens to our interpersonal relationships and our expectations when we start relying on scores?
Our lives unfold, day by day, interaction by interaction.
We’re constantly asked to enter our details, enter our new passwords, give secondary contact details, enter text in the box below.
Now that we’ve all bought into this ongoing process, there’s not much choice other than to go along with it, say yes to the next update, click the box at the end of the most recent set of terms and conditions.
Trust is forced upon us.
Trust is demanded from us.
Trust accretes around us.
Its upkeep is important.
Does it wear us down to keep handing over so much personal data?
Are we getting tired of revealing so much for so little?
We have become dependent on this trusting environment.
But we’ve also become reflections of it.
Integers of trust.
We’ve become our unshakeable names.
I remember the first time I saw The Crucible.
It wasn’t a particularly good production.
Lots of talk of witchcraft, not a lot of subtlety.
There’s the moment when John Proctor decides to confess to engaging in witchcraft but refuses to sign the written confession the Judge wants to post on the church door.
Why?
‘Because it is my name!’ the actor bellowed. ‘Because I cannot have another in my life!’
That line remains, even when bellowed, unvarnished and primal.
It’s one of those moments – watch the Daniel Day-Lewis version on YouTube – when raw exposition works, when a character’s primal self is revealed.
But compared with today, John Proctor had an option.
Or, I should say, someone comparable, with a fast horse and an escape route, had an option – you could outrun your name.
Move a town or two away.
Reinvention was possible.
There were opportunities for a new start; the new town meant a chance to start over, to rebuild trust with a new group of people, to not be absolutely bound to the decisions of the past.
To remake one’s name.
Look at the horror stories we tell ourselves now.
Is it too soon to turn to Black Mirror as a valid image?
The chills of the recent ‘Nosedive’ episode are derived from a nightmare scenario in which every human interaction comes with a rating.
The protagonist, Lacie, is a young woman living in a slightly amped-up version of our current situation.
The characters all live in a glib world of incessant, unstoppable ratings.
If Lacie were to cry out ‘But it’s my name!’ she’d be met by her peers with a shrug, because they’re all aware of her name, and her rating – a lingering, overshadowing, overpowering number that strikes such fear in them all because it’s simply the endpoint of all the other aggregates we’ve become so well versed in reading.
*
(We’re all highly attuned to the meaning of ratings now.
One friend told me he used to be able to look at a wind pressure number and understand the conditions out on the ocean.
Now he spends more time looking at his Instagram likes.
He’s developed a similar relationship with them.
He reads their meaning.
He understands the possible social weather conditions that lead to 54 likes instead of 78.)
It’s a strange sort of boomerang.
We want to quantify and rate everything around us, we want trustworthy digits.
We want our movies – works of subtlety and ambiguity – to be turned into a cold percentage on Rotten Tomatoes.
But we also want to escape this beautiful system, this finely tuned machine.
We aren’t passive participants in this world of trust.
What’s most terrifying about the Black Mirror episode is the unending niceness.
The world we’ve created doesn’t turn off, won’t give us a break.
The ratings system never stops.
‘I feel like I’ve changed,’ an Airbnb owner told me, not so long ago.
‘I’ve become a performative version of myself.
‘Trust me, trust me, trust me.’
She was late to pick up a young couple – that service was included. ‘And I remember,’ she said, ‘the car ride back.
‘It was so uncomfortable.
‘Beneath all the performed niceties was this punitive undertone.
‘I could tell they were angry and entitled and weren’t particularly interested in accepting an apology.
‘And sure enough, in rolled the negative review.
‘So can people trust me?
‘Does the ambiguity of the situation matter to an aggregation of reviews?’
And could this ever be changed?
The world of trust has expanded.
Proponents would say this is a very good thing.
Corporations are acting better.
People are acting better.
(Except neither of them is.)
What’s wrong with a little corrective?
If you’re so worried about your name, don’t do anything wrong.
We trust you’ll do the right thing.
Is there a financial imperative to the trust we hand over?
My dad loves restoring cars.
It’s one of his great passions in life.
Most enjoyable for him is the moment when he opens up the hood and finds out for himself what’s wrong.
The garage sometimes offers good advice.
But he doesn’t always trust those who work at the garage to advise him, because sometimes they’ll act hastily or misdiagnose the problem.
They’re nice guys.
But they’re driven by an ulterior motive.
Their immediate impulse is to tell him to replace a part.
This carburettor? Throw it away. Replace it.
Often, after returning from the garage, he’ll suspect he might be able to fix the part himself.
At the very least he wants to take a look, weigh the options, judge the evidence.
Recently he bought an Alfa Romeo.
I sensed weariness in his voice when he told me about it.
‘You open up the engine,’ he said, ‘and you can’t really get to the engine. It’s cordoned off. It’s covered, encased.’
If you buy a Tesla, at some point that car is going to speak to you and let you know something is wrong and needs to be replaced.
You’re going to have to trust that the system is making the right diagnosis.
This interaction is going to involve an act of trust on your part.
There is no way to eyeball it.
There’s no way to appraise the evidence at hand.
A new battery might not be necessary.
An engine overhaul might not be the only remedy.
It’s no secret the technological sensors we trust are tilted.
They’re naturally tilted towards consumerism.
In 2017, Apple confirmed it intentionally slows down older iPhones, ‘a feature introduced last year to protect against problems caused by ageing batteries’, The Guardian reported.
We trust our information won’t simply be used to get us to buy more.
We trust, we trust, we trust.
For many, a relationship with Amazon is an inescapable fact these days.
Our relationships with Amazon call for another type of trust.
I was no different.
I trusted them blindly.
I had gotten to the point where I no longer looked up the price of anything, because the expectation was that Amazon would be the cheapest.
If it’s available from Prime and shows up the next day, it’s a double bonus.
It beats having to actually remove my credit card from my wallet.
A while ago, I noticed I’d simply stopped looking at what my daily needs cost.
A little while later, I began to question whether this subservience to Amazon is healthy.
I’m blindly trusting Amazon to offer the best prices and I’m losing any sense of the price of things.
It feels like I’m losing another skill.
We’re losing the ability to question a company like Amazon.
We’re losing the skill to discern.
*
We think of Amazon as an American company.
But I tend to think of them as a great Russian novelist.
Their ability to construct a psychological profile is far more intricate than Facebook’s.
Run through their research capabilities.
Alexa, their web-ranking engine, gives them the global rank of every website.
They own the overview of all that traffic.
They’re able to monitor the flows of browsing history each day.
IMDb tracks and monitors what we’re all watching, which gives them insight into tastes for Amazon Prime and Amazon Studios.
Goodreads was bought in 2013.
Why try to imagine what a person’s diet might consist of?
When you’ve bought Whole Foods, you know.
In 2018 they acquired a small online pharmacy called PillPack.
The amount of insight is stunning.
These insights provide the broad strokes of the story.
But then look at the material we’re submitting to them so they’ll be able to plot out our own biographies.
My Amazon history discloses where I am in life.
These are not simply purchases, but predictive markers of the next chapter.
Amazon will know the month you stop buying tampons.
And if Amazon moves into pharmaceuticals – which has been suggested – then the moment you stop buying Tampax, you’ll start getting recommendations for diapers. Or hormones.
Those who have long awaited this day are out there making noise.
Those who fear the machine in all its permutations are out there making noise.
What about those who think: It’s handy. I enjoy it.
But I can’t shake the feeling that there’s a consequence to one company being given the ability to tell such a detailed story of a life.
Is this truly ease we’re feeling?
Or is it intrusion?
Do we trust Amazon with the stuff of ourselves?
Just as we know writers and film-makers will push the way our stories are told, and experiment with form, so too will those who are telling our stories these days.
It’s not dangerous.
It’s not malicious.
What I’ve noticed as the prevailing impulse in the tech world is: How far can I get with this?
There is push and retreat.
Boundaries are pressed against.
Pressure is exerted.
Is there any force that will stop Amazon at any point?
Where will that force come from?
What form will it take?
For companies, the challenge is evident and seductive.
I know, because my own company is ambitious in its own way.
A company embraces motion.
They will ask: Can I keep running?
And if they can keep running, and if they take over half the world, a company will ask: What do I do with it now?
A company like Amazon, guided by data, views us as characters with predictable needs.
In response, I’m starting to view them as a character.
To some degree, they’ve become a superhero.
They’ve got power that is now, at this point, difficult to wrest from them.
I feel jealousy, pride and respect for what they’ve achieved.
As with superheroes, a certain section of the populace is cheering them on.
We, as a populace, have enabled them.
As consumers, we’ve enjoyed the ride.
But as human beings, I sense a growing worry.
And, in some cases, I’m feeling something more serious when it comes to my compatriots in the world of tech.
I feel betrayed.
I’m in a community and I don’t see my own community living up to its ‘best self’, as they say.
I feel betrayed when I see Google brokering deals with China.
In Nick Bilton’s Hatching Twitter, a gossipy recollection of those heady days when Twitter emerged, much is made of the anarchism: the coders who sit through the ‘standing meeting’ and stand when the boss asks everyone to sit.
Jack Dorsey’s nose ring, the sake nights, the punk bands.
But somehow we got from there to what we have now.
I speak not as an outsider, but as someone in the world.
These were my heroes – at least, I, like others, bought into the early pronouncements.
I can still chart the progress: from thinking how unabashedly necessary it was for the forerunners of this generation to adopt ‘Don’t be evil’ as a motto, all the way across the spectrum to that grim moment, not so long ago, when the now-defunct phrase, peeled from the HQ walls, became drenched in irony as The Intercept broke news of Google’s China plans.
My company is not perfect.
My ideas are not as slick as Malcolm Gladwell’s.
I haven’t read it all.
I’m not going to act like I’m in an amateur production of Network, go full Peter Finch and write the rest of the book in caps because I’M MAD AS HELL AND I’M NOT GOING TO TAKE THIS … etc.
Tech has been good to me.
It’s been good to you, too.
It’s not worth looking at what we’ve built in black and white.
In response to the hacking, the flood of fake news, Cambridge Analytica, the … I could go on, Facebook released a series of treacly, manipulative, sentimental ads that told us Facebook needed to get back to what it was all about.
And yes, of course, at that moment I said under my breath: ‘Data scraping.’
But there was more to it, obviously.
The video made it clear the betrayal is not total.
There could be a way back.
What have we done to trust?
How can we go from Trust-Lite back to something more solid?