Air Canada Argues in Court that Its AI Chatbot Is a ‘Separate Legal Entity Responsible for Its Own Actions’


A recent small claims court decision found Air Canada liable for incorrect advice given by its website chatbot that led to a grieving customer paying more for plane tickets. Incredibly, the airline argued that its AI chatbot is “a separate legal entity responsible for its own actions.”

CBC reports that Jake Moffatt was looking to book last-minute plane tickets from British Columbia to Toronto after the death of his grandmother. While on Air Canada’s website, he asked the customer service chatbot about bereavement fares, and the chatbot advised that if he booked full-price tickets immediately, he could submit a refund application within 90 days to receive the cheaper bereavement rate.


Trusting this advice, Moffatt booked tickets for a total of $1,630. However, after completing the trip, Air Canada informed him that bereavement discounts don’t apply retroactively to already-purchased tickets. When Moffatt complained and showed a screenshot of the chatbot’s guidance, an Air Canada rep admitted the chatbot had provided “misleading words” but said the company had updated the system.

Unhappy with this resolution, Moffatt took Air Canada to small claims court. In a legal filing, Air Canada argued that its chatbot is “a separate legal entity responsible for its own actions.” But adjudicator Christopher Rivers called this “a remarkable submission,” noting that Air Canada owns its website and chatbot.

Ultimately, Rivers ruled that Air Canada failed to take reasonable care to ensure its chatbot provided accurate information. He ordered the airline to pay Moffatt $812 — the difference between the full-price and bereavement fares.

In its defence, Air Canada claimed customers could find the right bereavement policy details elsewhere on its website. But Rivers said there was no reason Moffatt should have known one section was accurate while another was wrong.

Read more at CBC here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
