As AI-powered chatbots become increasingly integrated into everyday life and business operations, it is easy to be drawn in by their speed and convenience. But Simon Clayton of Reftech urges a moment of caution.
The event industry is adopting artificial intelligence, and chatbots are often seen as a simple and effective first step into AI. These “digital assistants” promise efficiency, cost savings and 24-hour support. However, a recent incident involving Air Canada should serve as a warning to businesses considering handing over customer interactions to AI without adequate supervision.
Air Canada’s chatbot failure
In a widely reported case, Air Canada was ordered to give a passenger a partial refund after its chatbot misrepresented the airline’s bereavement fare policy. The chatbot incorrectly told the customer that he could apply retroactively for the discounted fare, a statement directly at odds with the airline’s actual policy. When the customer later requested the refund, Air Canada refused, and relented only after a tribunal held the company liable for the incorrect information generated by its AI system. The ruling dismissed Air Canada’s argument that the chatbot was a separate entity and confirmed that the company is responsible for all information presented on its website, whether written by a human or by AI.
Why this matters for the event industry
The implications are clear for event organizers and venues. While chatbots may seem an easy way to answer attendee queries quickly and assist exhibitors, they come with risk. Organizations may be liable if AI-driven systems provide misinformation about stand pricing, refund policies, venue availability, event access, or health and safety regulations.
Unlike people, who can apply discretion and verify details, chatbots operate on predefined logic and data. If that data is outdated or incomplete, the chatbot can confidently give false answers, leaving customers frustrated and the company legally exposed.
Lessons for event businesses
The event industry thrives on trust and relationships. Chatbots can play a role in customer support, but they should not be treated as replacements for humans. Here’s how to mitigate the risk:
1. Update and fact-check regularly.
An AI model is only as good as the information it is trained on. Make sure your chatbot’s answers are reviewed and updated regularly to reflect current policies, pricing and terms.
2. Always provide human backup.
A chatbot should not be the sole point of contact for customer queries, particularly those involving financial transactions and bookings. It is essential to provide a clear, easy route for escalating a problem to a human.
3. Monitor and audit responses.
Regular audits of chatbot interactions can help you identify recurring inaccuracies. If your chatbot consistently gives misleading answers, it may be time to retrain it or rethink where it is used in customer interactions.
4. Be transparent about AI’s limitations.
Customers should know when they are interacting with a chatbot rather than a human. Clearly stating that the chatbot is an automated system, and advising users to verify important information, helps manage expectations and reduces the risk of disputes.
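The safeguards above can also be enforced in the software itself. The sketch below is a hypothetical illustration (the names `answer`, `ESCALATION_TOPICS` and the keyword-matching rule are assumptions, not part of any real chatbot framework): sensitive topics such as refunds and pricing are routed to a human, every other reply carries an automated-system disclaimer, and all exchanges are logged for later audit.

```python
import logging
from dataclasses import dataclass

# Illustrative sketch only: in a real system the escalation rule would be
# more sophisticated than simple keyword matching.

AUDIT_LOG = logging.getLogger("chatbot.audit")
logging.basicConfig(level=logging.INFO)

DISCLAIMER = ("I am an automated assistant. Please verify pricing, "
              "refund and policy details with our team before booking.")

# Topics that should always go to a human (safeguard 2).
ESCALATION_TOPICS = {"refund", "pricing", "cancellation", "payment"}

@dataclass
class Reply:
    text: str
    escalated: bool

def answer(question: str, model_reply: str) -> Reply:
    """Wrap a raw model reply with the safeguards described above:
    escalate sensitive topics, attach a transparency disclaimer
    (safeguard 4), and log the exchange for audit (safeguard 3)."""
    lowered = question.lower()
    if any(topic in lowered for topic in ESCALATION_TOPICS):
        reply = Reply("This touches on bookings or money, so I'm "
                      "passing you to a member of our team.",
                      escalated=True)
    else:
        reply = Reply(f"{model_reply}\n\n{DISCLAIMER}", escalated=False)
    AUDIT_LOG.info("Q: %s | A: %s | escalated=%s",
                   question, reply.text, reply.escalated)
    return reply
```

A refund query would be escalated rather than answered by the model, while a routine venue question would be answered with the disclaimer attached and logged.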
The Air Canada case serves as a cautionary tale for any industry deploying AI in a customer service role. Chatbots can increase efficiency, but they should not be left unsupervised. Overreliance on AI without proper monitoring can lead to misinformation, customer complaints, and legal liability.
The lesson is clear: AI can be a useful tool, but it must be implemented with safety nets of careful oversight, regular monitoring, and human expertise. Otherwise, businesses risk finding themselves in the same predicament as Air Canada, forced to own up to AI-generated mistakes, whatever the cost.