Easily add relevant documents, chat history memory, and rich user data to your LLM app's prompts. Zep understands chat messages, roles, and user metadata, not just text and embeddings. Zep Memory and VectorStore implementations ship with your favorite frameworks: LangChain, LangChain.js, LlamaIndex, and more. Automatically embed texts and messages using state-of-the-art open source models or OpenAI, or bring your own vectors. Zep's local embedding models and async enrichment keep the user experience snappy.

Features

  • Designed for building conversational LLM applications
  • Vector Database with Hybrid Search
  • Batteries Included Embedding & Enrichment
  • Fast, low-latency APIs and stateless deployments
  • Python & TypeScript/JS SDKs, with edge deployment support
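To illustrate the role-aware chat memory model described above, here is a minimal, self-contained Python sketch of a session-scoped message store. This is not the Zep SDK; all names (`Message`, `MemoryStore`, `add_memory`, `get_memory`) are hypothetical and chosen only to show the idea of messages carrying roles and metadata rather than bare text:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a chat-memory data model: messages carry
# roles and per-message metadata, not just raw text. Illustrative
# only -- not the real Zep API.

@dataclass
class Message:
    role: str                  # e.g. "human" or "ai"
    content: str
    metadata: dict = field(default_factory=dict)

@dataclass
class MemoryStore:
    sessions: dict = field(default_factory=dict)

    def add_memory(self, session_id: str, messages: list[Message]) -> None:
        # Append messages to the session's history, creating it if needed.
        self.sessions.setdefault(session_id, []).extend(messages)

    def get_memory(self, session_id: str, last_n: int = 10) -> list[Message]:
        # Return the most recent messages for prompt construction.
        return self.sessions.get(session_id, [])[-last_n:]

store = MemoryStore()
store.add_memory("session-1", [
    Message(role="human", content="What is Zep?", metadata={"user_id": "u42"}),
    Message(role="ai", content="A memory store for LLM apps."),
])
print(len(store.get_memory("session-1")))  # 2
```

In the real service, a store like this would additionally embed messages asynchronously and run enrichment, which is why recall stays low-latency from the caller's perspective.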

License

Apache License V2.0


Additional Project Details

Programming Language

Go

Related Categories

Go Large Language Models (LLM)

Registered

2023-08-25