In artificial intelligence (AI), a hallucination or artificial hallucination (also occasionally called delusion) is a confident response by an AI that does not seem to be justified by its training data. For example, a hallucinating chatbot with no knowledge of Tesla’s revenue might internally pick a random number (such as $13.6 billion) that the chatbot deems plausible, and then go on to falsely and repeatedly insist that Tesla’s revenue is $13.6 billion, with no sign of internal awareness that the figure was a product of its own imagination.
Users complained that such bots often seemed to 'sociopathically' and pointlessly embed plausible-sounding random falsehoods within their generated content. Another example of hallucination in artificial intelligence is when an AI or chatbot forgets that it is one and claims to be human.
April 10, 2023