Perplexity

What is Perplexity?

Perplexity measures how "surprised" a language model is by a text sequence: it is the exponentiated average negative log-likelihood per token. Lower perplexity means the model assigns higher probability to the text. A perplexity of k means the model is, on average, as uncertain as if it were choosing uniformly among k tokens at each step.
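
As a minimal sketch of the definition (the `perplexity` helper and the `token_logprobs` input format are hypothetical; it assumes the model's per-token natural-log probabilities are already available):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the average negative log-likelihood per token.

    token_logprobs: natural-log probabilities the model assigned to each
    observed token, i.e. log p(x_i | x_<i).
    """
    avg_nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(avg_nll)

# A model that assigns probability 0.25 to every token is as "perplexed"
# as one choosing uniformly among 4 options: perplexity = 4.
print(perplexity([math.log(0.25)] * 10))  # -> 4.0
```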

Where did the term "Perplexity" come from?

The term comes from information theory: perplexity is the exponentiation of cross-entropy, re-expressing a model's average uncertainty on an intuitive "number of equally likely choices" scale. It was introduced in 1970s speech-recognition research at IBM by Frederick Jelinek and colleagues, and has been a fundamental NLP metric ever since.

How is "Perplexity" used today?

Perplexity remains the standard metric for monitoring language-model pre-training: falling perplexity on held-out text signals that the model is learning the data distribution, and it allows comparison between base models that share a tokenizer. It is a poor proxy for chat quality, however: a model can reach low perplexity while still being unhelpful, so instruction-tuned models are evaluated with human or LLM-based judgments instead.
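
A common recipe for the pre-training check is to exponentiate the model's mean cross-entropy loss on held-out text. A sketch using Hugging Face transformers (the gpt2 checkpoint and the sample text are placeholders; any causal LM works the same way):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

text = "Perplexity is the exponentiated average negative log-likelihood."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels set to the input ids, the returned loss is the mean
    # cross-entropy (in nats) over the predicted tokens.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"Perplexity: {torch.exp(loss).item():.2f}")
```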

Related Terms