Dan Gillmor on Nostr:
An Air Canada "AI"-powered chatbot gave a customer bogus information. Then the airline said it wouldn't honor what it had told the customer, claiming the bot wasn't actually part of the airline (despite being run from its website).
A court said "give me a break" and ordered the airline to honor what the bot said.
https://bc.ctvnews.ca/air-canada-s-chatbot-gave-a-b-c-man-the-wrong-information-now-the-airline-has-to-pay-for-the-mistake-1.6769454
This needs to become the rule for all such things.
A court said "give me a break" and ordered the airline to honor what the bot said.
https://bc.ctvnews.ca/air-canada-s-chatbot-gave-a-b-c-man-the-wrong-information-now-the-airline-has-to-pay-for-the-mistake-1.6769454
This needs to become the rule for all such things.