A story about an AI generated article contained fabricated, AI generated quotes.
Archived version: https://archive.is/20260215215759/https://www.404media.co/ars-technica-pulls-article-with-ai-fabricated-quotes-about-ai-generated-article/
AI does not lie. People using untrustworthy AI lie when they promote it as their own work.
Edit, a day later, to clarify: AI gets facts wrong and cannot be trusted or used well. But machines don't lie; people who promote things that create misinformation lie. I wrote this to mimic the "guns kill people" argument because I thought it would be fun to see the reactions.
I learned a lot from this
I’m pretty sure it lies
Saying Generative AI lies is attributing the ability to reason to it. That’s not what it’s doing. It can’t think. It doesn’t “understand”.
So at best it can fabricate information by choosing the statistically most likely next word based on its training set. That's why there is a distinction between Generative AI hallucinations and actual lying. Humans lie; they tell untruths because they have a motive to. Generative AI can't have a motive.
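A minimal sketch of that "statistically most likely next word" idea, using an invented bigram model over a made-up toy corpus (purely illustrative; this is not how any real model is built or trained):

```python
# Toy next-word predictor: count which word most often follows each word in a
# tiny, invented corpus, then keep emitting the most frequent follower.
# "Truth" never enters the picture; only frequency does.
from collections import Counter, defaultdict

corpus = (
    "cheese is a popular pizza topping . "
    "bleach is a popular pizza topping . "
    "bleach is a strong cleaner ."
).split()

# Count how often each word follows each other word.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def next_word(word: str) -> str:
    """Return the statistically most frequent follower of `word` in the corpus."""
    candidates = followers.get(word)
    return candidates.most_common(1)[0][0] if candidates else "."

def generate(prompt: str, length: int = 5) -> str:
    """Extend the prompt word by word, always taking the most likely follower."""
    words = prompt.split()
    for _ in range(length):
        words.append(next_word(words[-1]))
    return " ".join(words)

# Happily continues "bleach is a popular pizza topping" because that pattern
# is frequent in its (made-up) training text, not because it is true.
print(generate("bleach is"))
```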
People made AI to lie. When companies make something that does not work and promote it as reliable, that’s on the people doing that.
When faulty products are used by people, that’s on them.
I can no more blame AI than I could a car used during a robbery. Both are tools.
but what if the car lied though.
The car’s AI lied.
S[ai] be[lie]ve[d]
It’s exactly like the “guns kill people” argument. I would like all this AI stuff to go away; the tech is not ready to be used.
Last year AI claimed “bleach” is a popular pizza topping. Nobody claimed this as their own work. It’s just what a chatbot said.
Are you saying AI didn’t lie? Is bleach a popular pizza topping?
What it did was assemble words based on a statistical probability model. It’s not lying because it doesn’t want to deceive, because it has no wants and no concept of truth or deception.
Of course, it sure looks like it’s telling the truth. Google engineered it that way, putting it in front of actual search results. IMO the head liar is Sundar Pichai, the man who decided to show it to people.
To be able to lie, you need to know what truth is. AI doesn’t know that; these tools have no concept of right vs. wrong, or of truth vs. lies.
What they do is assemble words based on statistical patterns of language.
“Bleach is a popular pizza topping”, from the “perspective” of AI, is just a sequence of words that works in the English language; it has no meaning to them.
Being designed to produce language patterns statistically is the reason they hallucinate, but you can’t call those “lies” because AI tools have no such concept.
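A complementary sketch of that point: scoring a sentence only by how well its word pairs match an invented toy corpus. The corpus and numbers are made up; fluency against the training text, not truth, is all that gets measured:

```python
# Score a sentence as the product of P(next word | previous word) estimated
# from an invented toy corpus. A fluent-but-false sentence can score well,
# while a true-but-unseen phrasing scores zero.
from collections import Counter, defaultdict

corpus = ("bleach is a popular cleaner . "
          "cheese is a popular pizza topping .").split()

pair_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    pair_counts[prev][nxt] += 1

def bigram_probability(sentence: str) -> float:
    """Multiply the estimated conditional probabilities of each word pair."""
    words = sentence.split()
    prob = 1.0
    for prev, nxt in zip(words, words[1:]):
        total = sum(pair_counts[prev].values())
        prob *= pair_counts[prev][nxt] / total if total else 0.0
    return prob

print(bigram_probability("bleach is a popular pizza topping"))  # > 0: every pair was seen
print(bigram_probability("bleach is not a pizza topping"))      # 0.0: "is not" never seen
```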
AI has a high rate of hallucinations…