Vector Embeddings

What are Vector Embeddings?

Vector embeddings are numerical representations of data (words, sentences, images) as points in a high-dimensional space. By mapping similar concepts closer together, embeddings allow computers to 'understand' semantic meaning and relationships mathematically.
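The geometric idea above can be sketched with cosine similarity, the standard measure of how close two embeddings point in the same direction. The three-dimensional vectors below are made-up toy values (real embedding models produce hundreds or thousands of dimensions), chosen only to illustrate that related concepts score higher than unrelated ones:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings; values are illustrative, not from a real model.
king = [0.9, 0.8, 0.1]
queen = [0.9, 0.7, 0.2]
apple = [0.1, 0.2, 0.9]

print(cosine_similarity(king, queen))  # high: semantically related concepts
print(cosine_similarity(king, apple))  # lower: unrelated concepts
```

Because similarity reduces to arithmetic on coordinates, "meaning" becomes something a program can compare, sort, and search over.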

Where did the term "Vector Embeddings" come from?

The term was popularized by Google's Word2Vec (2013) and later by Transformer-based embedding models such as BERT and OpenAI's embedding models.

How is "Vector Embeddings" used today?

Vector embeddings are the engine behind modern semantic search, RAG (Retrieval-Augmented Generation), and recommendation systems.
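The retrieval step shared by search and RAG can be sketched as nearest-neighbor lookup over precomputed embeddings. In this minimal sketch the document and query vectors are invented placeholders; a real system would obtain them from an embedding model and store them in a vector database rather than a dict:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical precomputed document embeddings (illustrative values only).
documents = {
    "refund policy": [0.8, 0.1, 0.3],
    "shipping times": [0.2, 0.9, 0.1],
    "return process": [0.78, 0.12, 0.36],
}

def retrieve(query_embedding, top_k=2):
    # Rank every document by similarity to the query and keep the best top_k.
    ranked = sorted(documents.items(),
                    key=lambda item: cosine_similarity(query_embedding, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A query embedding standing in for a question like "how do I return an item?"
print(retrieve([0.75, 0.15, 0.35]))
```

In a RAG pipeline, the retrieved documents would then be passed to a language model as context for generating an answer.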

Related Terms