Alignment

What is Alignment?

Alignment is the field of AI safety focused on ensuring that AI systems' goals and behaviors match human intent and values.

Where did the term "Alignment" come from?

The term gained currency in the 2010s as AI safety research grew alongside capabilities research. Researchers such as Stuart Russell popularized "value alignment" to describe the problem of ensuring that an AI system's objectives match the values of its designers and users, and the shorter "alignment" became standard within the field.

How is "Alignment" used today?

Alignment is now treated as a core requirement for deploying safe and trustworthy AI systems. In practice, techniques such as reinforcement learning from human feedback (RLHF) are used to align large language models with human preferences, and "alignment research" broadly refers to work on making AI behavior match human intent.

Related Terms