Vector embeddings are numerical representations of data (words, sentences, images) as points in a high-dimensional space. Because semantically similar items are mapped to nearby points, embeddings let computers compare meaning mathematically, typically via cosine similarity or Euclidean distance.
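A minimal sketch of the idea, using hypothetical 4-dimensional toy vectors (real embedding models produce hundreds or thousands of dimensions, and these values are invented for illustration, not taken from any actual model):

```python
import numpy as np

# Toy embeddings: each concept is a point in a shared vector space.
# Values are illustrative only, not from a real embedding model.
embeddings = {
    "cat":   np.array([0.9, 0.8, 0.1, 0.0]),
    "dog":   np.array([0.8, 0.9, 0.2, 0.1]),
    "plane": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))    # high: related concepts
print(cosine_similarity(embeddings["cat"], embeddings["plane"]))  # low: unrelated concepts
```

Because similarity reduces to vector arithmetic, "find related concepts" becomes "find nearby points", which is exactly what similarity search engines compute.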
Popularized by Google's Word2Vec (2013) and later by Transformer-based embedding models such as BERT (2018) and OpenAI's embedding models.
The engine behind modern Search, RAG (Retrieval-Augmented Generation), and Recommendation Systems.
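In Search and RAG, this mechanism boils down to ranking stored document embeddings by similarity to a query embedding. A hedged sketch with invented toy vectors (a production retriever would use a real embedding model and an approximate nearest-neighbor index rather than a brute-force scan):

```python
import numpy as np

# Hypothetical document embeddings and a query embedding (toy 3-d vectors,
# invented for illustration, not produced by a real model).
docs = {
    "doc_pasta":  np.array([0.9, 0.1, 0.0]),
    "doc_pizza":  np.array([0.8, 0.2, 0.1]),
    "doc_rocket": np.array([0.0, 0.1, 0.9]),
}
query = np.array([0.85, 0.15, 0.05])  # imagine: an embedding of "Italian food"

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Brute-force retrieval: rank every document by similarity to the query.
ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
print(ranked)  # the food documents rank above the unrelated one
```

A RAG pipeline then feeds the top-ranked documents into an LLM prompt as context; recommendation systems apply the same ranking with user and item embeddings.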