Air Canada is being forced to make good on a promise made by its AI-powered chatbot.

A Canadian customer reportedly asked the company's chatbot for clarification on its bereavement rates when his grandmother died. The chatbot told him: "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form," Ars Technica reports.

The customer followed the advice, booked a ticket, and then reached out to Air Canada for a refund. As it turns out, the company's policy actually states that it won't provide refunds for bereavement travel after a flight is booked.

Initially, Air Canada refused to honor the chatbot's promise, arguing that the bot also linked to the company's bereavement policy, which, if read, would have shown its initial statement was false. The company did, however, offer the customer a $200 credit to use on a future flight.

The customer ended up filing a complaint in small claims court, where the airline further argued that "the chatbot is a separate legal entity that is responsible for its own actions," presumably the first time a Canadian company has tried to prove it's not liable for the actions of its chatbot.
The court ultimately sided with the customer, forcing Air Canada to issue the passenger a partial refund of $650.88 and to pay his court fees.

The case also seems to have affected the chatbot's employment: as of this morning, it is no longer operational on Air Canada's site.