Retrieval
Embedding
A numeric vector representation of text used for similarity search.
An embedding is a fixed-length vector of floating-point numbers (typically 768 to 3,072 dimensions) produced by an embedding model. Two pieces of text with similar meaning produce vectors that are close together in that high-dimensional space. Embeddings are the foundation of semantic search and RAG: you embed your documents once, embed the user query at runtime, and retrieve the documents whose vectors are closest to the query's. Embedding model selection matters; text-embedding-3-large, voyage-3, and nomic-embed-text-v1.5 are common 2026 choices.
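The embed-once, query-at-runtime loop can be sketched in a few lines. This is a minimal illustration using tiny hand-made 4-dimensional vectors (real embedding models produce hundreds to thousands of dimensions) and cosine similarity as the distance measure; the document names and vectors are invented for the example, and in practice the vectors would come from an embedding model API.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" standing in for model output.
doc_vectors = {
    "refund policy":   [0.9, 0.1, 0.0, 0.2],
    "shipping times":  [0.1, 0.8, 0.3, 0.0],
    "api rate limits": [0.0, 0.2, 0.9, 0.4],
}

# Pretend this vector came from embedding the query
# "how do I get my money back?" at runtime.
query_vector = [0.85, 0.15, 0.05, 0.1]

# Rank documents by similarity to the query -- the core of semantic search.
ranked = sorted(doc_vectors.items(),
                key=lambda kv: cosine_similarity(query_vector, kv[1]),
                reverse=True)

print(ranked[0][0])  # the closest document to the query
```

At real scale the brute-force `sorted` pass is replaced by an approximate nearest-neighbor index (e.g. HNSW) in a vector database, but the similarity computation is the same.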