Browse free open source Python LLM Inference Tools and projects below.
Open-source tool for running local LLMs on any device
A high-throughput and memory-efficient inference and serving engine
Deep learning optimization library: makes distributed training easy
Ready-to-use OCR with 80+ supported languages
Lightweight anchor-free object detection model
State-of-the-art diffusion models for image and audio generation
Library for OCR-related tasks powered by Deep Learning
The official Python client for the Huggingface Hub
20+ high-performance LLMs with recipes to pretrain and finetune at scale
A unified framework for scalable computing
Visual Instruction Tuning: Large Language-and-Vision Assistant
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction
Large Language Model Text Generation Inference
Bring the notion of Model-as-a-Service to life
PyTorch domain library for recommendation systems
A set of Docker images for training and serving models in TensorFlow
Everything you need to build state-of-the-art foundation models
Training and deploying machine learning models on Amazon SageMaker
Replace OpenAI GPT with another LLM in your app
Unified Model Serving Framework
Standardized Serverless ML Inference Platform on Kubernetes
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
Easiest and laziest way to build multi-agent LLM applications
A Pythonic framework to simplify AI service building
OpenMMLab Model Deployment Framework
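Several of the serving engines listed above (for example vLLM, Text Generation Inference, LMDeploy, and Xinference) can expose an OpenAI-compatible HTTP endpoint once launched. As a rough, hypothetical sketch, querying such a local endpoint from Python might look like the following; the host, port, and model name are assumptions that depend on how the server was started.

import requests

# Assumed: a local inference server (e.g. vLLM or LMDeploy) is already running
# with an OpenAI-compatible chat endpoint at http://localhost:8000, serving a
# model registered under the hypothetical name "my-model".
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "my-model",
        "messages": [
            {"role": "user", "content": "What does an LLM inference server do?"}
        ],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()

# The response body follows the OpenAI chat-completions schema.
print(resp.json()["choices"][0]["message"]["content"])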