
After months of resistance, Air Canada was forced to provide a partial refund to a grieving passenger who was misled by the airline's chatbot, which inaccurately described the airline's bereavement travel policy.

The day Jake Moffatt’s grandmother died, Moffatt immediately visited Air Canada’s website to book a flight from Vancouver to Toronto. Unsure of how Air Canada’s bereavement rates worked, Moffatt asked Air Canada’s chatbot to explain.

The chatbot provided inaccurate information, encouraging Moffatt to book the flight immediately and then request a refund within 90 days. In reality, Air Canada’s policy explicitly states that the airline will not provide refunds for bereavement travel after the flight has been booked. Moffatt dutifully tried to follow the chatbot’s advice and request a refund, only to be surprised when the request was denied.

Moffatt spent months trying to convince Air Canada that a refund was owed, sharing a screenshot of the chatbot’s response, which clearly stated:

If you need to travel immediately or have already traveled and wish to submit your ticket for a reduced bereavement rate, please do so within 90 days of the date your ticket was issued by completing our ticket refund request form.

Air Canada argued that because the chatbot’s response linked elsewhere to a page with the actual bereavement travel policy, Moffatt should have known that bereavement rates could not be requested retroactively. Instead of a refund, the best Air Canada would offer was a promise to update the chatbot and a $200 coupon for Moffatt to use on a future flight.

Unhappy with that resolution, Moffatt refused the coupon and filed a small claims complaint with Canada’s Civil Resolution Tribunal.

According to Air Canada, Moffatt never should have relied on the chatbot, and the airline should not be liable for the chatbot’s misleading information because, as Air Canada essentially argued, “the chatbot is a separate legal entity that is responsible for its own actions,” the tribunal’s decision said.

Experts told the Vancouver Sun that Moffatt’s case appears to be the first time a Canadian company has tried to argue that it is not responsible for information provided by its chatbot.

Tribunal member Christopher Rivers, who decided the case in Moffatt’s favor, called Air Canada’s defense “remarkable.”

“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives, including a chatbot,” Rivers wrote. “It does not explain why it believes that is the case” or “why the webpage titled ‘Bereavement Travel’ was inherently more trustworthy than its chatbot.”

Further, Rivers found that Moffatt had “no reason” to believe that one part of Air Canada’s website would be accurate and another would not.

Air Canada “does not explain why customers should have to double-check information found in one part of its website on another part of its website,” Rivers wrote.

In the end, Rivers ruled that Moffatt was entitled to a partial refund of $650.88 CAD (about $482 USD) off the original fare of $1,640.36 CAD (about $1,216 USD), as well as additional damages to cover interest on the airfare and Moffatt’s tribunal fees.

Air Canada told Ars it will comply with the decision and considers the matter closed.

Air Canada’s chatbot appears to be inactive.

When Ars visited Air Canada’s website on Friday, no chatbot support appeared to be available, suggesting that Air Canada has disabled the chatbot.

Air Canada did not respond to Ars’ request to confirm whether the chatbot is still part of the airline’s online support offerings.
