BLOG POST

Tech & Sourcing @ Morgan Lewis

TECHNOLOGY TRANSACTIONS, OUTSOURCING, AND COMMERCIAL CONTRACTS NEWS FOR LAWYERS AND SOURCING PROFESSIONALS

An ever-increasing number of companies are choosing to use chatbots on their websites, in their sales organizations, and to help with customer service. In fact, according to Vantage Market Research, the chatbot market will grow over 23% by 2030. A chatbot can be a useful tool for consumers looking for quick and easy access to information, as well as for companies looking to provide a high level of attention and service while allowing their employees to focus on other demands. However, companies should remain aware of, and monitor, the information their chatbots are sharing.

Recently, a tribunal in Canada decided that Air Canada was responsible for the discount offered to a passenger by the chatbot on the airline’s website even though the airline said that information was inaccurate. A passenger had requested information through the airline’s chatbot on discounted bereavement fares. The chatbot responded to the passenger with a link to the airline’s policy as well as the statement that the passenger could submit their ticket for a reduced bereavement rate “within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.” Relying on this response, the passenger booked their ticket and, as directed, submitted their paperwork for the bereavement rate within the specified time period.

To the detriment of the passenger, the chatbot’s response regarding the time to submit a claim was inconsistent with the language of the airline’s policy, which was also available on the website and linked in the chatbot’s response. The airline therefore rejected the application for the bereavement fare.

The tribunal’s decision in favor of the passenger touched on the airline’s duty of care and its obligation to “take reasonable care to ensure their representations are accurate and not misleading.” The tribunal pointed out that the passenger should not be obligated to fact-check the policy against the chatbot’s instructions, and its decision appears to support the idea that the airline is responsible for the information provided by its chatbot, whether accurate or not.

Although based on Canadian law, this dispute provides an example of the claims that may arise when chatbots are left unmonitored and distribute incorrect information, and it may foreshadow similar claims or decisions in other jurisdictions. The decision further underscores the importance of diligently and continuously testing, improving, and updating the chatbot tools a company chooses to make available to its customers or website visitors.

A company that chooses to leverage chatbot tools should also put a process in place to verify the chatbot’s accuracy in order to mitigate the risk of providing false or misleading information to consumers.