Triton Inference Server is open-source inference serving software that streamlines AI inferencing. Triton enables teams to deploy any AI model from multiple deep learning and machine learning frameworks, including TensorRT, TensorFlow, PyTorch, ONNX, OpenVINO, Python, RAPIDS FIL, and more. Triton supports inference across cloud, data center, edge, and embedded devices on NVIDIA GPUs, x86 and ARM CPUs, or AWS Inferentia, and delivers optimized performance for many query types, including real-time, batched, ensembles, and audio/video streaming. It provides a Backend API for adding custom backends and pre/post-processing operations, and supports model pipelines built with ensembles or Business Logic Scripting (BLS). Its HTTP/REST and gRPC inference protocols are based on the community-developed KServe protocol, and a C API and Java API allow Triton to link directly into your application for edge and other in-process use cases.
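
The HTTP/REST and gRPC protocols can be exercised from Python with the tritonclient package. The snippet below is a minimal sketch of a single HTTP inference request; the server address, the model name "densenet_onnx", and the tensor names "data_0" and "fc6_1" are illustrative assumptions, not details taken from this listing.

```python
# Minimal sketch of a Triton HTTP client request.
# Assumes a Triton server on localhost:8000 serving a hypothetical
# "densenet_onnx" model with input "data_0" and output "fc6_1".
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request: one FP32 input tensor and one requested output.
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
inputs = [httpclient.InferInput("data_0", list(image.shape), "FP32")]
inputs[0].set_data_from_numpy(image)
outputs = [httpclient.InferRequestedOutput("fc6_1")]

# Send the inference request and read the result back as a NumPy array.
response = client.infer(model_name="densenet_onnx", inputs=inputs, outputs=outputs)
print(response.as_numpy("fc6_1").shape)
```

The same request can be issued over gRPC by swapping tritonclient.http for tritonclient.grpc and pointing the client at the server's gRPC port (8001 by default).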

Features

  • Supports multiple deep learning frameworks
  • Supports multiple machine learning frameworks
  • Concurrent model execution
  • Dynamic batching (see the configuration sketch after this list)
  • Sequence batching and implicit state management for stateful models
  • Provides a Backend API that allows adding custom backends and pre/post-processing operations
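
Concurrent model execution and dynamic batching are enabled per model through its config.pbtxt in the model repository. The sketch below is illustrative only, assuming an ONNX Runtime model; the model name, tensor names, shapes, and batch sizes are placeholders rather than values from this listing.

```
# config.pbtxt -- illustrative sketch, not a configuration shipped with Triton.
# instance_group runs two copies of the model on GPU 0 (concurrent execution);
# dynamic_batching lets Triton group individual requests into server-side batches.
name: "resnet50_onnx"
backend: "onnxruntime"
max_batch_size: 8

input [
  {
    name: "input"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]

instance_group [
  { count: 2, kind: KIND_GPU, gpus: [ 0 ] }
]

dynamic_batching {
  preferred_batch_size: [ 4, 8 ]
  max_queue_delay_microseconds: 100
}
```

With a configuration like this, Triton schedules requests across the two model instances and may hold a request for up to 100 microseconds while it waits to form a preferred batch.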

License

BSD License

Additional Project Details

Operating Systems

Windows

Programming Language

Python

Related Categories

Python Machine Learning Software, Python Deep Learning Frameworks, Python LLM Inference Tool

Registered

2022-08-03