Easily add relevant documents, chat history memory, and rich user data to your LLM app's prompts. Zep understands chat messages, roles, and user metadata, not just text and embeddings. Zep Memory and VectorStore implementations ship with your favorite frameworks: LangChain, LangChain.js, LlamaIndex, and more. Automatically embed text and messages using state-of-the-art open source models or OpenAI, or bring your own vectors. Zep's local embedding models and async enrichment ensure a snappy user experience.
Features
- Designed for building conversational LLM applications
- Vector Database with Hybrid Search
- Batteries Included Embedding & Enrichment
- Fast, low-latency APIs and stateless deployments
- Python & TypeScript/JS SDKs
- TypeScript/JS SDK supports edge deployment
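To illustrate the "Vector Database with Hybrid Search" feature above, here is a minimal, self-contained sketch of the general hybrid-search idea: blending vector (semantic) similarity with keyword (lexical) matching into one ranking score. All names and the scoring formula are illustrative assumptions, not Zep's actual API or algorithm.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    """Fraction of query terms that appear in the document text."""
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def hybrid_search(query, query_vec, docs, alpha=0.5):
    """Rank (text, vector) docs by a weighted blend of both scores.

    alpha controls the semantic/lexical trade-off; 0.5 weights them
    equally. This linear blend is one common choice, used here only
    as an example.
    """
    scored = []
    for text, vec in docs:
        score = (alpha * cosine(query_vec, vec)
                 + (1 - alpha) * keyword_score(query, text))
        scored.append((score, text))
    return [text for score, text in sorted(scored, reverse=True)]

# Toy corpus with hand-made 2-d "embeddings" for demonstration.
docs = [
    ("zep stores chat history", [1.0, 0.0]),
    ("weather is sunny today", [0.0, 1.0]),
]
results = hybrid_search("chat history", [0.9, 0.1], docs)
print(results[0])  # the chat-history document ranks first
```

A real deployment would replace the toy vectors with model-generated embeddings and run both scoring passes inside the vector database rather than in application code.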
Categories
Large Language Models (LLM)
License
Apache License V2.0