ChatGLM.cpp is a C++ implementation of the ChatGLM-6B model, enabling efficient local inference without requiring a Python environment. It is optimized for running on consumer hardware.

Features

  • Provides a C++ implementation of ChatGLM-6B
  • Supports running models on CPU and GPU
  • Optimized for low-memory hardware and edge devices
  • Allows quantization for reduced resource consumption
  • Works as a lightweight alternative to Python-based inference
  • Offers real-time chatbot capabilities
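The quantization and local-inference workflow above can be sketched as a few shell commands. This is an illustrative sequence based on the project's commonly documented usage; the repository URL, converter script path, and quantization type (`q4_0`) are assumptions and may differ between versions:

```shell
# Clone the project with its submodules and build the native binary
git clone --recursive https://github.com/li-plus/chatglm.cpp
cd chatglm.cpp
cmake -B build && cmake --build build -j

# Quantize the original ChatGLM-6B weights to 4-bit GGML format.
# This one-time conversion step uses Python; inference itself does not.
python3 chatglm_cpp/convert.py -i THUDM/chatglm-6b -t q4_0 -o chatglm-ggml.bin

# Run an interactive chat session entirely in C++
./build/bin/main -m chatglm-ggml.bin -i
```

Quantizing to 4 bits is what makes the model fit on low-memory consumer hardware, at some cost in output quality.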


License

MIT License


Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

C++

Related Categories

C++ Large Language Models (LLM), C++ Natural Language Processing (NLP) Tool, C++ AI Models, C++ LLM Inference Tool

Registered

2025-01-21