Rachel Botsman: In algorithms we trust

“We are outsourcing our capacity to trust to an algorithm. Technology is changing the way we trust strangers.” Technological disruption is moving collaborative consumption into the era of trust leaps, and Rachel Botsman wants us to slow down and think before we check that box.

“The first wave of technological disruption gave birth to the collaborative economy. Companies focused on extracting new value from existing assets – empty seats in cars, empty bedrooms. To utilize this value, new marketplaces needed to be created.”

eBay, Airbnb, and Uber have allowed us to scale markets for assets that never had marketplaces. “Technology creates the efficiency to match supply and demand. It also pushes us to do something new or different. These trust leaps are happening at a higher and faster rate than ever before.”

The increased levels of trust we place in the platforms that power our daily lives are correspondingly eroding trust in institutions. Botsman believes the traditional pillars of our establishment – government, media, et al. – are not designed for trust in a digital age.

“Trust is like energy. It hasn’t disappeared, it has shifted to a new form. Humanity started with local trust – the immediate members of our community. Institutions came to prominence when society expanded. Distributed trust in ideas, platforms, and individuals who use them is the third distinct chapter of trust.”

As the internet ramps up the number of transactions that involve offline interactions between people, trust is becoming a currency in and of itself. The crucial question is whether ultimate accountability for trust lies with the individuals using platforms – or with the platforms themselves.

“We never thought Uber would send us a serial killer. This is what the residents of Kalamazoo, Michigan said when Jason Dalton turned out to be their driver in 2016. Platforms can no longer be neutral pathways for connecting supply and demand.

“Instead, platforms need to be proactively accountable by inventing the tools that stop bad things happening, and reactively accountable by being there when things go wrong. Facebook has a monopoly on our time and information – are they using it in the right way?”

As the last presidential election in the United States demonstrated, “Trust and truth are not the same things. Technology tends to amplify human biases – it is the responsibility of technology companies to recognize that their behavior affects how people behave on their platforms.”
