Improving one tool could help Airbnb cut prejudice

By Kitty Knowles 8 September 2017

Cracking the science of bias.

Racism is the greatest challenge faced by Airbnb – co-founder and CEO Brian Chesky admitted as much last year.

Since then, the business has blocked discriminatory hosts from hacks that helped them ‘pick and choose’ guests, and has made users sign up to a new policy which goes above and beyond what is required by law.

But sharing economy companies like Airbnb aren’t just blighted by outright bigotry; scientists have shown there are subtle, subconscious prejudices that impact our decisions every day.

Emphasising one single detail of an online profile can, however, cut this bias to great effect.

A simple tool?

‘Homophily’ is the term used to describe humans’ natural tendency to develop trustful relationships with people similar to themselves.

You can imagine how the flip-side of this can be damaging for people using Airbnb – if hosts consistently prioritise guests they see as similar to themselves, or guests only choose hosts they see themselves in.

But these biases around race, gender, age and so on can be counteracted simply by prioritising ‘reputation’, say Stanford scientists in a new study.

This boils down to improving tools like star ratings and reviews.

The study

For their experiment, researchers invited nearly 9,000 Airbnb users to use a mock-up of the travel booking platform.

Two groups were shown sets of profiles and invited to invest ‘credits’ into their favourites.

The first group, whose profiles carried no reputational information (star ratings and so on), confirmed the homophily bias: the more similar a profile was to a participant’s own, the more that participant trusted it.

A second group, however, was shown profiles with positive ratings and reviews, and invested significantly more in users whose characteristics were completely different from their own.

The researchers then analysed 1 million actual interactions between hosts and guests on the real Airbnb, and confirmed that hosts with better reputations attracted more demographically diverse guests.
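The pattern the study describes can be sketched as a toy trust model. This is purely illustrative, not the researchers’ actual model: the function, the weights and the scores below are all invented for demonstration. The idea is that when reputation is hidden, trust tracks similarity alone (homophily); when reputation is visible, it dilutes the similarity effect.

```python
def trust(similarity, reputation=None, rep_weight=0.7):
    """Return a toy 0-1 trust score for a profile.

    similarity: 0-1 overlap between the viewer and the profile.
    reputation: 0-1 star-rating signal, or None if hidden.
    rep_weight: how strongly visible reputation outweighs similarity
                (an invented parameter for this sketch).
    """
    if reputation is None:
        # No reputational information: homophily dominates.
        return similarity
    # Reputation visible: it dilutes the similarity effect.
    return (1 - rep_weight) * similarity + rep_weight * reputation


# A dissimilar but highly rated host vs. a similar but poorly rated one.
dissimilar_high_rep = trust(similarity=0.2, reputation=0.9)  # 0.69
similar_low_rep = trust(similarity=0.8, reputation=0.3)      # 0.45

print(dissimilar_high_rep > similar_low_rep)  # True
```

In this sketch, once ratings are on display, a well-reviewed host who looks nothing like the guest out-scores a similar-looking host with weak reviews, mirroring the second group’s behaviour in the experiment.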

Putting the findings to use

While a deeply prejudiced person is unlikely to be swayed by a rating, tools to counteract everyday homophily can offset harmful bias.

Currently Airbnb has a star ratings system and reviews, shown in brief near the top of a host’s profile and more fully beneath each property listing. But perhaps more could be done.

There is no star rating when you click on the personal profile of a host, for example. There may also be ways to incentivise guests to leave more reviews, or to make reputation a more defining factor on every part of the platform.

Many other ‘sharing’ platforms fail to even do the basics: TaskRabbit has reviews but no star ratings, BorrowMyDoggy also swerves stars, while TrustedHousesitters has ratings for house sitters but no ratings for homeowners.

Given that these are all online spaces where users browse photo profiles to find and provide personal services, homophily will play a role.

“These platforms can engineer tools that have great influence in how people perceive each other and can make markets fairer, especially to users from underrepresented minorities,” says study author Bruno Abrahao, a visiting assistant professor at Stanford’s Institute for Research in the Social Sciences.

Next time you book someone or their services through the sharing economy, we hope it’s their reputation, not their image, that sways you the most.