“AI advice kills 1000s”

Dan Roy @roydanroy  Amazing news for researchers working on hallucinations. x.com/mlittmancs/sta…
Michael Littman @mlittmancs Airline installs chatbot. Customer gets bad information from it. Customer asks for refund. Airline says “the chatbot is a separate legal entity that is responsible for its own actions” (!). Court says no. (Whew.) Airline makes refund and turns off chatbot. https://arstechnica.com/tech-policy/2024/02/air-canada-must-honor-refund-policy-invented-by-airlines-chatbot/
Replying to @roydanroy

I think the developers are also liable, should be held accountable, and heavily penalized, to teach them not to use smoke-and-mirrors methods. Who convinced the airline to buy their “AI” with no certification or audit? GPTs do not keep good records and would fail most independent audits.

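The record-keeping point above can be made concrete. A minimal sketch of the kind of log an independent auditor would expect a deployed chatbot to keep: an append-only, hash-chained record of every exchange, so any later alteration is detectable. All names here (`AuditLog`, the model string) are hypothetical illustrations, not any vendor's actual API.

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only, hash-chained log of chatbot exchanges.

    Each record commits to the previous record's hash, so editing
    or deleting any entry breaks the chain and shows up in an audit.
    """

    GENESIS = "0" * 64

    def __init__(self):
        self.records = []
        self._prev_hash = self.GENESIS

    def record(self, prompt, response, model="example-model-v1"):
        # Capture what was asked, what was answered, and by which model.
        entry = {
            "ts": time.time(),
            "model": model,
            "prompt": prompt,
            "response": response,
            "prev": self._prev_hash,
        }
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = digest
        self._prev_hash = digest
        self.records.append(entry)
        return digest

    def verify(self):
        """Recompute the whole chain; True only if nothing was altered."""
        prev = self.GENESIS
        for rec in self.records:
            body = {k: v for k, v in rec.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if rec["prev"] != prev or recomputed != rec["hash"]:
                return False
            prev = rec["hash"]
        return True
```

Usage: after logging each exchange with `record()`, an auditor replays `verify()`; silently rewriting a past answer (as in the Air Canada refund dispute) would make verification fail.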

Richard K Collins

About: Richard K Collins

The Internet Foundation: Internet policies, global issues, global open lossless data, global open collaboration

