Browse free open source LLM Inference tools and projects for Windows below.
Port of OpenAI's Whisper model in C/C++
ONNX Runtime: cross-platform, high-performance ML inferencing
Run local LLMs on any device; open source
Port of Facebook's LLaMA model in C/C++
C++ library for high-performance inference on NVIDIA GPUs
High-performance neural network inference framework for mobile
Self-hosted, community-driven, local OpenAI-compatible API (see the example sketch after this list)
Ready-to-use OCR with 80+ supported languages
A high-throughput and memory-efficient inference and serving engine
OpenVINO™ Toolkit repository
User-friendly AI Interface
Protect and discover secrets using Gitleaks
PArallel Distributed Deep LEarning: Machine Learning Framework
Lightweight anchor-free object detection model
Training and deploying machine learning models on Amazon SageMaker
Library for OCR-related tasks powered by Deep Learning
The deep learning toolkit for speech-to-text
Run serverless GPU workloads with fast cold starts on bare metal
Low-latency REST API for serving text embeddings
An RWKV management and startup tool: fully automated, only 8 MB
State-of-the-art diffusion models for image and audio generation
State-of-the-art Parameter-Efficient Fine-Tuning
Integrate, train, and manage any AI model or API with your database
Bring the notion of Model-as-a-Service to life
Open standard for machine learning interoperability
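Several of the projects listed above (LocalAI, vLLM, and llama.cpp's bundled server, for example) expose an OpenAI-compatible HTTP API, so the same client code can be pointed at any of them. The sketch below shows one way to send a chat request to such a server using only the Python standard library; the base URL, port, and model name are placeholders chosen for illustration and will differ per installation.

# Minimal sketch: query a locally hosted, OpenAI-compatible chat endpoint.
# Assumes a server such as LocalAI, vLLM, or llama.cpp's server is already
# running at http://localhost:8080/v1 and serves a model named "local-model";
# both values are placeholders, not defaults of any particular project.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"    # assumed local server address
payload = {
    "model": "local-model",               # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Summarize what an inference engine does."}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

# OpenAI-compatible responses nest the generated text under choices[0].message.
print(reply["choices"][0]["message"]["content"])

Swapping BASE_URL (and, where required, adding an Authorization header) is typically all that changes when moving between these servers, which is what makes the OpenAI-compatible API a convenient common denominator for local LLM inference.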