Temperature is a sampling parameter that controls the randomness of an LLM's output. High temperature results in more creative and diverse responses, while low temperature produces more deterministic and focused answers.
The name is borrowed from statistical thermodynamics: the parameter divides the model's logits before the softmax during sampling, mirroring the role of temperature in the Boltzmann distribution.
It is commonly exposed as a user-adjustable parameter in AI interfaces for tuning creativity versus precision.
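As a minimal sketch of the mechanism: dividing the logits by the temperature before applying the softmax sharpens the distribution when T < 1 (approaching greedy, argmax-like selection) and flattens it when T > 1 (more varied output). The NumPy example below illustrates this; the function name and toy logit values are illustrative, not taken from any particular library.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits after temperature scaling."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    # Subtract the max for numerical stability before exponentiating.
    scaled -= scaled.max()
    probs = np.exp(scaled)
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Toy example: three candidate tokens with fixed logits (hypothetical values).
logits = [2.0, 1.0, 0.1]
print(sample_with_temperature(logits, temperature=0.2))  # almost always index 0
print(sample_with_temperature(logits, temperature=2.0))  # noticeably more varied
```

In the limit T → 0 this reduces to always picking the highest-logit token, which is why low temperatures feel deterministic and focused.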