Blog

šŸ˜µā€šŸ’« Hallucinations… or creativity? šŸ’”

This article by Xavier Vinaixa Roselló, published on January 14, 2026, proposes a fundamental rethinking of how "hallucinations" in artificial intelligence (AI) are perceived. The author argues that what is commonly classified as a critical error in large language models (LLMs), such as inventing data or citations, could in fact be a manifestation of their creativity and the mechanism by which they generate novelty.

The mechanical creativity of AI and thermodynamics

The text draws a parallel between the mechanical creativity of AI and thermodynamics, using the concept of temperature in the model’s softmax function.

  1. Low temperature (Cold): The system becomes deterministic, choosing the statistically most probable option (absolute order). This produces precise but boring results.
  2. High temperature (Heat): Energy and stochasticity are injected into the system, allowing the model to ā€œrescueā€ less probable options. This controlled deviation from the statistically optimal path is what generates novelty, whether labelled as ā€œerrorā€ (false data) or ā€œcreativityā€ (unusual association).
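The temperature mechanism described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the article; the logit values are made up for demonstration. Dividing the logits by the temperature before applying softmax sharpens the distribution when T < 1 (order, determinism) and flattens it when T > 1 (stochasticity, rescued low-probability options).

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution.

    Dividing by the temperature before exponentiating sharpens
    the distribution (T < 1) or flattens it (T > 1).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for four candidate next tokens
logits = [4.0, 2.0, 1.0, 0.5]

cold = softmax(logits, temperature=0.2)  # near-deterministic: top token dominates
hot = softmax(logits, temperature=2.0)   # flatter: improbable tokens get a real chance

print(max(cold) > 0.99)  # the "cold" system almost always picks the same token
print(max(hot) < 0.6)    # the "hot" system spreads probability mass around
```

Sampling from the "hot" distribution is where both the false citation and the unusual association come from: the same mechanism, judged differently after the fact.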

Deep network hallucination

The author points out that eliminating the ability to hallucinate would be equivalent to limiting generative capacity, since neural noise is necessary even in the human brain. As evidence, he cites the work of David Baker, winner of the 2024 Nobel Prize in Chemistry, who designed de novo proteins through "deep network hallucination", demonstrating that error (the invention of non-existent molecules) can be a tool for expanding knowledge.

In art, artists such as Refik Anadol are already exploring this aesthetic of chaos through their machine hallucinations series.

Hallucination is the price of originality

The main conclusion is that generative AI should be seen as a serendipity machine rather than an infallible encyclopedia, and that the pursuit of absolute truth is counterproductive for novelty. In short, hallucination is the price of originality; as Andrej Karpathy put it, in generative models hallucination is a feature, not a bug.
