Browse free open source LLM Inference tools and projects for Mac below.
Port of OpenAI's Whisper model in C/C++
Port of Facebook's LLaMA model in C/C++
Run Local LLMs on Any Device. Open-source and available for commercial use
ONNX Runtime: cross-platform, high-performance ML inferencing
A high-throughput and memory-efficient inference and serving engine
User-friendly AI Interface
OpenVINO™ Toolkit repository
High-performance neural network inference framework for mobile
Protect and discover secrets using Gitleaks
Self-hosted, community-driven, local OpenAI-compatible API
State-of-the-art diffusion models for image and audio generation
The official Python client for the Hugging Face Hub
Visual Instruction Tuning: Large Language-and-Vision Assistant
Easiest and laziest way to build multi-agent LLM applications
Replace OpenAI GPT with another LLM in your app
Fast inference engine for Transformer models
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4(V)
LLMs as Copilots for Theorem Proving in Lean
Easy-to-use deep learning framework with 3 key features
Everything you need to build state-of-the-art foundation models
Bring the notion of Model-as-a-Service to life
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
Open-Source AI Camera. Empower any camera/CCTV with state-of-the-art AI
Lightweight Python library for adding real-time multi-object tracking to any detector
An RWKV management and startup tool, fully automated, only 8 MB