Vector embeddings are numerical representations of data (like words or images) in a high-dimensional space. Similar concepts are placed closer together in this space, so geometric distance can stand in for semantic similarity.
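To make the geometry concrete, here is a minimal sketch using toy 4-dimensional vectors (real embeddings typically have hundreds or thousands of dimensions, and the values below are invented for illustration): semantically related items have a high cosine similarity, unrelated items a low one.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: "cat" and "kitten" point in similar directions,
# while "car" points elsewhere in the space.
cat    = np.array([0.8, 0.1, 0.6, 0.2])
kitten = np.array([0.7, 0.2, 0.5, 0.3])
car    = np.array([0.1, 0.9, 0.1, 0.8])

print(cosine_similarity(cat, kitten))  # high (~0.98): related concepts
print(cosine_similarity(cat, car))     # much lower (~0.31): unrelated
```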
The approach was popularized by Word2Vec (Mikolov et al., 2013) and extended by subsequent embedding techniques.
Embeddings are the backbone of semantic search, recommendation systems, and retrieval-augmented generation (RAG) architectures; a sketch of semantic search follows below.
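The sketch below shows the core semantic-search loop: embed a corpus, embed a query, and rank documents by cosine similarity. It assumes the third-party sentence-transformers library is installed and that the "all-MiniLM-L6-v2" model is available; any embedding model could stand in, and the sample documents are invented.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "How to reset a forgotten password",
    "Best practices for database indexing",
    "Troubleshooting login failures",
]

# Embed the corpus once; in production these vectors would be cached
# or stored in a vector database.
doc_vectors = model.encode(documents, normalize_embeddings=True)

# Embed the query and rank documents by cosine similarity.
# With L2-normalized vectors, cosine similarity reduces to a dot product.
query_vector = model.encode(["user cannot sign in"], normalize_embeddings=True)[0]
scores = doc_vectors @ query_vector

# Print documents from most to least similar to the query.
for i in np.argsort(-scores):
    print(f"{scores[i]:.3f}  {documents[i]}")
```

Note that the query shares no words with the top result ("Troubleshooting login failures"); the match comes entirely from proximity in embedding space, which is exactly what keyword search cannot do.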