Vector
In modern AI systems, models convert text, images, audio, and other modalities into numerical vectors called embeddings. These vectors capture semantic meaning, enabling similarity search, retrieval, and recommendation without handcrafted rules.
When vectors are L2‑normalized, enVector’s Inner Product (IP) scoring corresponds to cosine similarity. This makes “semantic closeness” measurable with fast linear algebra while keeping data encrypted end‑to‑end.
Definition
Type: Real‑valued vectors (floating point)
Dimension: 32–4096 (inclusive)
Similarity: Inner Product (IP); with L2‑normalized vectors, IP ≡ cosine similarity
Storage: Encrypted at rest and used for encrypted computation
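The IP ≡ cosine equivalence stated above can be checked with a short sketch. This is plain Python with no SDK involved; it only demonstrates the math, not enVector itself:

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length (L2 norm = 1)."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def inner_product(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine_similarity(a, b):
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return inner_product(a, b) / (na * nb)

a = [3.0, 4.0]
b = [1.0, 2.0]

# After L2 normalization, inner product and cosine similarity agree.
ip = inner_product(l2_normalize(a), l2_normalize(b))
cos = cosine_similarity(a, b)
assert abs(ip - cos) < 1e-12
```

This is why normalizing embeddings before ingestion lets a pure inner-product index return cosine-similarity rankings.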
Ingestion
Clients must encrypt vectors with the SDK before transmission; indexes accept only ciphertext.
The server persists encrypted vectors and never requires access to client secret keys.
Metadata can be attached alongside vectors for later retrieval.
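The ingestion flow above can be sketched as follows. Note this is an illustrative data-flow sketch only: the function and field names (`encrypt_vector`, `build_ingest_record`, `ciphertext`, `metadata`) are hypothetical, not the real enVector SDK API, and the stub "encryption" is just a tagged wrapper, not a real homomorphic scheme:

```python
# Hypothetical names throughout -- this shows the data flow only:
# the vector is encrypted client-side, and what leaves the client
# is ciphertext plus plaintext metadata, never the raw vector.

def encrypt_vector(vector, key_id):
    """Stand-in for the SDK's client-side encryption step.

    A real implementation would produce a homomorphic ciphertext
    that supports encrypted inner products; this stub only tags
    the payload to mark where encryption happens in the flow.
    """
    return {"scheme": "placeholder", "key_id": key_id, "payload": list(vector)}

def build_ingest_record(vector, metadata, key_id):
    """Assemble the record sent to the server: ciphertext + metadata."""
    return {
        "ciphertext": encrypt_vector(vector, key_id),
        "metadata": metadata,  # returned alongside matches at query time
    }

record = build_ingest_record([0.12, -0.53, 0.81], {"doc_id": "a1"}, "pk-demo")
assert sorted(record) == ["ciphertext", "metadata"]
```

The design point: encryption happens before transmission, so the server can store and index the record without ever holding a decryption key.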
Querying
Query vectors may be sent as plaintext (faster) or ciphertext (more private).
Similarity is computed using Inner Product (IP) over encrypted representations.
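The scoring semantics can be illustrated with a plaintext sketch. In production this computation runs over encrypted representations; the plain-Python version below only shows what the ranking computes (inner product, highest first), with hypothetical helper names:

```python
import math

def l2_normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def top_k_by_ip(query, indexed, k=2):
    """Rank stored vectors by inner product with the query, highest first."""
    scored = [(sum(q * x for q, x in zip(query, vec)), key)
              for key, vec in indexed.items()]
    scored.sort(reverse=True)
    return [key for _, key in scored[:k]]

# Toy index of three normalized vectors (keys are hypothetical IDs).
index = {
    "doc-a": l2_normalize([1.0, 0.0, 0.0]),
    "doc-b": l2_normalize([0.7, 0.7, 0.0]),
    "doc-c": l2_normalize([0.0, 0.0, 1.0]),
}
query = l2_normalize([1.0, 0.1, 0.0])

print(top_k_by_ip(query, index, k=2))  # prints ['doc-a', 'doc-b']
```

Because all vectors are L2-normalized, this IP ranking is also a cosine-similarity ranking; an encrypted query would follow the same scoring logic over ciphertexts.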