MCP Guide
Introduction
The Model Context Protocol (MCP) is a protocol for communicating with LLMs, primarily used for exchanging information (context) or performing actions.
enVector supports using MCP with various AI applications, letting you interact with LLMs such as Claude through natural-language inputs.
This guide explains in detail how to get started with the enVector MCP server.
Prerequisites
Python version >= 3.10
A running enVector instance: Docker Compose or enVector Cloud
Cloned GitHub Repository of enVector MCP server: https://github.com/CryptoLabInc/envector-mcp-server
Note: The base MCP library requires Python 3.10+ while pyenvector supports 3.9+.
Supporting tools
- get_index_list: Get the list of indexes in enVector.
- get_index_info: Get information about a specific index in enVector.
- create_index: Create an index in enVector.
- insert: Insert vectors and the corresponding metadata into an enVector index. An embedding model can be specified to generate the vectors to insert.
- search: Perform vector search and retrieve metadata from enVector. An embedding model can be specified to generate the query vectors.
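Under the hood, MCP tools are invoked as JSON-RPC 2.0 tools/call requests. The sketch below builds such a payload for the search tool; the argument names (index_name, query, top_k) are illustrative assumptions, not the server's exact schema.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request body for an MCP tools/call invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical invocation of the search tool; the argument names here
# are illustrative only -- consult the server's tool schema for the real ones.
payload = make_tool_call(1, "search", {"index_name": "docs", "query": "what is FHE?", "top_k": 5})
```

In practice your AI application builds these requests for you from natural-language input; the sketch only shows what travels over the wire.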
Getting Started
Run enVector server via Docker Compose or enVector Cloud.
Clone the envector-mcp-server repository and set up the Python environment.
See more details in envector-mcp-server manuals.
(Optional) Run the MCP server with the following example command:
This step is optional when you run the MCP server via an AI application (e.g. Claude Desktop). See more details in Run MCP Server.
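For example, assuming the repository exposes a module named envector_mcp_server (the exact entry-point name may differ; check the repository README), a local launch might look like:

```shell
# Launch the MCP server over HTTP on the default host/port,
# pointing it at a local enVector instance.
# Entry-point name and enVector endpoint are placeholders.
python -m envector_mcp_server --mode http --host 127.0.0.1 --port 8000 \
  --envector-address <envector-host>:<envector-port>
```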
Connect your AI application (e.g. Claude) to the MCP server by configuring its settings, then chat with enVector.
See more details in Connect MCP Server.
Connect MCP Server
Connect to Remote MCP Server
Run the MCP server in your Python environment with the following example command:
See CLI Options for the full list of options for running the MCP server.
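For instance, a launch that targets an enVector Cloud cluster might look like the following sketch; the entry-point name, endpoint, and token are placeholders:

```shell
# Serve MCP over HTTP and connect to an enVector Cloud cluster.
python -m envector_mcp_server --mode http --address 0.0.0.0:8000 \
  --envector-address <cluster-id>.clusters.envector.io \
  --envector-cloud-access-token <your-access-token>
```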
Configure your MCP settings file .some-mcp-config.json in your application.
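The exact schema depends on your AI application. For clients that accept a URL for remote MCP servers, the entry might look like this sketch; the server name, URL, and path are illustrative assumptions:

```json
{
  "mcpServers": {
    "envector": {
      "url": "http://127.0.0.1:8000/mcp"
    }
  }
}
```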
Run MCP Server via AI application
Some AI applications, e.g. Claude Desktop, support running the MCP server directly from the application. Configure your MCP settings file .some-mcp-config.json in your application.
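For applications that launch stdio-based servers themselves, the configuration typically follows a command/args shape, as in this sketch; the server name, script path, and endpoint are assumptions:

```json
{
  "mcpServers": {
    "envector": {
      "command": "python",
      "args": ["-m", "envector_mcp_server", "--mode", "stdio",
               "--envector-address", "<envector-host>:<envector-port>"]
    }
  }
}
```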
MCP Server Options
CLI Options
Arguments for running the Python script:
MCP execution
- --mode: MCP execution mode; supports http (default) and stdio transports.
- --host: MCP HTTP bind host. The default is 127.0.0.1.
- --port: MCP HTTP bind port. The default is 8000.
- --address: MCP HTTP bind address. Overrides --host and --port if provided.
- --server-name: MCP server name. The default is envector_mcp_server.
enVector connection
- --envector-address: enVector endpoint address ({host}:{port} or an enVector Cloud endpoint ending with .clusters.envector.io).
- --envector-cloud-access-token: Access token for enVector Cloud.
enVector options
- --envector-key-id: enVector key ID (identifier).
- --envector-key-path: Path to the enVector key files.
- --envector-eval-mode: enVector FHE evaluation mode. The default mp mode is recommended for more flexible usage.
- --encrypted-query: Whether to encrypt the query vectors. The index is encrypted by default.
⚠️ Note: The MCP server holds the key for homomorphic encryption, since the MCP server is an enVector client.
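Putting the connection and key options together, a launch with an explicit key might look like the following sketch; the entry-point name and all values are placeholders:

```shell
# Connect to enVector with a specific FHE key and the default mp evaluation mode.
python -m envector_mcp_server \
  --envector-address <envector-host>:<envector-port> \
  --envector-key-id <your-key-id> \
  --envector-key-path /path/to/keys \
  --envector-eval-mode mp
```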
Embedding options
- --embedding-mode: Mode of the embedding model. Supports femb (FastEmbed; default), hf (Hugging Face), sbert (SBERT; sentence-transformers), and openai (OpenAI API). For openai, the environment variable OPENAI_API_KEY must be set.
- --embedding-model: Name of the embedding model to use with enVector. The default is sentence-transformers/all-MiniLM-L6-v2, whose embedding dimension is 384.
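As an example, switching to OpenAI embeddings might look like the sketch below; the entry-point name is a placeholder, and the model name is only illustrative (check which OpenAI models the server actually supports):

```shell
# openai mode requires the OPENAI_API_KEY environment variable.
export OPENAI_API_KEY=<your-openai-api-key>
python -m envector_mcp_server --embedding-mode openai \
  --embedding-model text-embedding-3-small
```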
Use environment variables
Copy .env.example to .env and configure it as needed.
A detailed description of each option can be found in CLI Options.
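The setup step above amounts to:

```shell
# Create your local environment file from the template, then edit it.
cp .env.example .env
```

The variable names available in .env are listed in .env.example itself.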
Next Steps