ChatGLM.cpp is a C++ implementation of the ChatGLM-6B model, enabling efficient local inference without requiring a Python environment. It is optimized for running on consumer hardware.

Features

  • Provides a C++ implementation of ChatGLM-6B
  • Supports running models on CPU and GPU
  • Optimized for low-memory hardware and edge devices
  • Allows quantization for reduced resource consumption (see the sketch after this list)
  • Works as a lightweight alternative to Python-based inference
  • Offers real-time chatbot capabilities
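
Quantization in GGML-style runtimes generally works by grouping weights into small blocks and storing each block as low-bit integers together with a per-block scale factor. The sketch below illustrates the idea with a self-contained 4-bit packing routine; the struct layout, block size of 32, and function names are illustrative assumptions and do not mirror chatglm.cpp's actual internals.

    // Illustrative 4-bit block quantization, in the spirit of GGML-style
    // runtimes. All names, the block size, and the packing layout here are
    // hypothetical examples, not chatglm.cpp's actual internal code.
    #include <algorithm>
    #include <cmath>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    constexpr int kBlockSize = 32;              // weights are quantized per block

    struct QuantBlock {
        float scale;                            // per-block scale factor
        uint8_t packed[kBlockSize / 2];         // two 4-bit values per byte
    };

    // Map a block of 32 floats to 4-bit integers plus one float scale.
    QuantBlock quantize_block(const float *w) {
        float amax = 0.0f;
        for (int i = 0; i < kBlockSize; ++i) {
            amax = std::max(amax, std::fabs(w[i]));
        }
        QuantBlock b{};
        b.scale = amax / 7.0f;                  // map [-amax, amax] onto [-7, 7]
        const float inv = b.scale != 0.0f ? 1.0f / b.scale : 0.0f;
        for (int i = 0; i < kBlockSize; i += 2) {
            int lo = static_cast<int>(std::lround(w[i] * inv)) + 8;
            int hi = static_cast<int>(std::lround(w[i + 1] * inv)) + 8;
            b.packed[i / 2] = static_cast<uint8_t>(((hi & 0x0F) << 4) | (lo & 0x0F));
        }
        return b;
    }

    int main() {
        std::vector<float> weights(kBlockSize, 0.5f);
        QuantBlock b = quantize_block(weights.data());
        // 32 floats (128 bytes) shrink to 16 packed bytes plus a 4-byte scale.
        std::printf("scale=%.4f packed_bytes=%zu\n", b.scale, sizeof(b.packed));
        return 0;
    }

At roughly 4 bits per weight plus a small per-block scale, a 6B-parameter model that needs on the order of 12 GB in FP16 shrinks to a few gigabytes, which is what makes inference on consumer hardware practical.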

License

MIT License

Additional Project Details

Operating Systems: Linux, Mac, Windows

Programming Language: C++

Related Categories: C++ Large Language Models (LLM), C++ Natural Language Processing (NLP) Tool, C++ AI Models, C++ LLM Inference Tool

Registered: 2025-01-21