Hallucinations… or creativity?
This article by Xavier Vinaixa Roselló, published on January 14, 2026, proposes a fundamental revision of how "hallucinations" in artificial intelligence (AI) are perceived. The author argues that what is commonly classified as a critical error in large language models (LLMs), such as inventing data or citations, could actually be the manifestation of their creativity and the mechanism by which they generate novelty.
The mechanical creativity of AI and thermodynamics
The text draws a parallel between the mechanical creativity of AI and thermodynamics, using the concept of temperature in the model's softmax function (see the sketch after this list):
- Low temperature (Cold): The system becomes deterministic, choosing the statistically most probable option (absolute order). This produces precise but boring results.
- High temperature (Heat): Energy and stochasticity are injected into the system, allowing the model to "rescue" less probable options. This controlled deviation from the statistically optimal path is what generates novelty, whether labelled as "error" (false data) or "creativity" (an unusual association).
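To make the temperature analogy concrete, here is a minimal Python sketch (not from the article; the function name and toy logits are illustrative) of temperature-scaled softmax sampling. Each logit z_i is divided by the temperature T before the softmax, p_i = exp(z_i/T) / Σ_j exp(z_j/T), so a low T sharpens the distribution toward the most probable token while a high T flattens it, letting unlikely tokens be sampled:

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits after temperature scaling.

    T -> 0 approaches greedy decoding (deterministic, "cold" order);
    T > 1 flattens the distribution ("hot" stochasticity), rescuing
    less probable options.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
    scaled -= scaled.max()          # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Toy 4-token vocabulary: count how often each token is picked.
logits = [4.0, 2.0, 1.0, 0.5]
for t in (0.1, 1.0, 2.0):
    draws = [sample_with_temperature(logits, t) for _ in range(1000)]
    print(f"T={t}:", np.bincount(draws, minlength=len(logits)))
```

At T=0.1 nearly all 1,000 draws land on the highest-logit token; at T=2.0 the counts spread across the vocabulary, which is exactly the "controlled deviation" the author describes.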
Deep network hallucination
- The author points out that eliminating the ability to hallucinate would be equivalent to limiting generative capacity, since neural noise is necessary even in the human brain. As proof, the work of David Baker, winner of the 2024 Nobel Prize in Chemistry, is cited: he designed de novo proteins through "deep network hallucination", demonstrating that error (the invention of non-existent molecules) can be a tool for expanding knowledge.
In art, artists such as Refik Anadol are already exploring this aesthetic of chaos, as in his Machine Hallucinations series.
Hallucination is the price of originality
The main conclusion is that generative AI should be seen as a serendipity machine rather than an infallible encyclopedia: the pursuit of absolute truth is counterproductive for novelty. In summary, hallucination is the price of originality; as Andrej Karpathy put it, in generative models hallucination is a feature, not a bug.