
Can A.I. detect "narcissism, Machiavellianism, or psychopathy"? Airbnb hopes so.

They might scan your social media to find out.


Airbnb has had problems in the past with guests throwing unauthorized parties and people using the app for scams. The rental service even had to ban “party houses” after five people were shot and killed at one last year. It’s perhaps not surprising that a service that gives strangers access to your home comes with problems. According to the Evening Standard, one way Airbnb may be cracking down on undesirable guests is by scanning their social media.

Airbnb filed a patent last year for what appears to be an artificial intelligence system that scans a user’s online presence to look for evidence of “narcissism, Machiavellianism, or psychopathy.” We’d imagine it wouldn’t have trouble finding some narcissists on social media.

Business Insider reached out to Airbnb, and the company said it’s not “currently implementing all of the software’s screening methods as described in the patent filing.” That would seem to mean it could be using some of them.

“As with any other company, there are a number of patents we file, ranging from searching listings to automating booking availability, and it does not mean we necessarily implement all or part of what’s in them,” an Airbnb spokesperson said.

Airbnb’s website says the company uses “predictive analytics and machine learning to instantly evaluate” users, which makes it sound like it’s already using the kind of technology described in the patent.

The patent claims this A.I. system scores individuals based on a number of factors. It looks at whether the personal information in their Airbnb profile matches what’s found online, whether they appear to use drugs or alcohol, whether they use “negative language,” and multiple other factors. The patent says the system takes all of the data points it collects and forms a “person graph” that can determine whether someone is a desirable guest. I can’t say I’d want to see my graph.
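To make the patent’s description a bit more concrete, here is a minimal, entirely hypothetical sketch of what a weighted trait-scoring step like this could look like. The trait names, weights, threshold, and the `score_guest` function are all illustrative assumptions for this article, not anything disclosed in Airbnb’s filing.

```python
# Hypothetical sketch of a weighted trait-scoring step, loosely modeled
# on the patent's public description. All trait names, weights, and
# thresholds are invented for illustration; none come from Airbnb.

TRAIT_WEIGHTS = {
    "profile_mismatch": -2.0,      # profile info doesn't match what's found online
    "drug_alcohol_signals": -1.5,  # apparent references to drug or alcohol use
    "negative_language": -1.0,     # "negative language" detected in posts
    "verified_identity": 1.5,      # identity documents check out
    "positive_reviews": 1.0,       # good reviews from past hosts
}

def score_guest(signals: dict) -> float:
    """Sum the weights of whichever signals were detected for this user."""
    return sum(
        weight
        for trait, weight in TRAIT_WEIGHTS.items()
        if signals.get(trait, False)
    )

# A toy "person graph": detected data points attached to a single user.
person_graph = {
    "user_123": {
        "profile_mismatch": False,
        "drug_alcohol_signals": False,
        "negative_language": True,
        "verified_identity": True,
        "positive_reviews": True,
    },
}

if __name__ == "__main__":
    for user, signals in person_graph.items():
        score = score_guest(signals)
        # A negative total flags the user for further review in this sketch.
        verdict = "flag for review" if score < 0 else "ok"
        print(f"{user}: score={score:+.1f} -> {verdict}")
```

Even in this toy version, the core worry is visible: the verdict depends entirely on which signals the system thinks it detected and on weights someone chose by hand.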

As with any algorithm-based software that makes judgments about people, there’s a very real concern that such a system could be biased. Though we tend to assume technology makes decisions less biased, the creators of algorithms like this can accidentally bake in sexism, racism, transphobia, and more if they don’t train the algorithm on people from a variety of backgrounds and account for systemic biases.


What’s also concerning is that this algorithm would be making psychological judgments. It’s not hard to imagine how someone who deals with depression, anxiety, or any number of other psychological struggles could be deemed undesirable simply because the algorithm decides they’re dangerous based on how they express themselves on social media.

If Airbnb is using this software, it’s possible the company only uses it to raise a red flag so an employee can look into that user, but too often companies rely on these types of technologies to operate without human involvement to save money. If Airbnb wants to dispel these concerns, it may need to be a little more transparent.
