11 Integrations with Weights & Biases
Below is a list of software that integrates with Weights & Biases. Compare the best Weights & Biases integrations, along with their features, ratings, user reviews, and pricing. Here are the current Weights & Biases integrations in 2026:
1. TensorFlow (TensorFlow)
TensorFlow is an end-to-end open source platform for machine learning. It has a comprehensive, flexible ecosystem of tools, libraries, and community resources that lets researchers push the state of the art in ML and lets developers easily build and deploy ML-powered applications. Build and train ML models using intuitive high-level APIs like Keras with eager execution, which makes for immediate model iteration and easy debugging. Easily train and deploy models in the cloud, on-prem, in the browser, or on-device, no matter what language you use. A simple and flexible architecture takes new ideas from concept to code, to state-of-the-art models, and to publication faster. Build, deploy, and experiment easily with TensorFlow. Starting Price: Free
2. Jupyter Notebook (Project Jupyter)
The Jupyter Notebook is an open source web application that allows you to create and share documents containing live code, equations, visualizations, and narrative text. Uses include data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more.
3. Lightly (Lightly)
Lightly selects the subset of your data with the biggest impact on model accuracy, allowing you to improve your model iteratively by retraining on the best data. Get the most out of your data by reducing redundancy and bias and focusing on edge cases. Lightly's algorithms can process large amounts of data in under 24 hours. Connect Lightly to your existing cloud buckets to process new data automatically, or use the API to automate the whole data selection process. Lightly combines state-of-the-art active learning and self-supervised learning algorithms for data selection, using a combination of model predictions, embeddings, and metadata to reach your desired data distribution. Improve your model by better understanding your data distribution, bias, and edge cases. Manage data curation runs and keep track of new data for labeling and model training. Installation is easy via a Docker image and cloud storage integration, and no data leaves your infrastructure. Starting Price: $280 per month
4. Keras (Keras)
Keras is an API designed for human beings, not machines. Keras follows best practices for reducing cognitive load: it offers consistent and simple APIs, minimizes the number of user actions required for common use cases, and provides clear, actionable error messages. It also has extensive documentation and developer guides. Keras is the most used deep learning framework among top-5 winning teams on Kaggle; because Keras makes it easier to run new experiments, it empowers you to try more ideas than your competition, faster. Built on top of TensorFlow 2.0, Keras is an industry-strength framework that can scale to large clusters of GPUs or an entire TPU pod. Take advantage of the full deployment capabilities of the TensorFlow platform: you can export Keras models to JavaScript to run directly in the browser, or to TF Lite to run on iOS, Android, and embedded devices. It's also easy to serve Keras models via a web API.
5. ZenML (ZenML)
Simplify your MLOps pipelines. Manage, deploy, and scale on any infrastructure with ZenML. ZenML is completely free and open source; see the magic with just two simple commands. Set up ZenML in a matter of minutes and start with all the tools you already use. ZenML's standard interfaces ensure that your tools work together seamlessly. Gradually scale up your MLOps stack by switching out components whenever your training or deployment requirements change, and easily integrate the latest developments from the MLOps world. Define simple, clear ML workflows without wasting time on boilerplate tooling or infrastructure code. Write portable ML code and switch from experimentation to production in seconds. Manage all your favorite MLOps tools in one place with ZenML's plug-and-play integrations, and prevent vendor lock-in by writing extensible, tooling-agnostic, and infrastructure-agnostic code. Starting Price: Free
6. Axolotl (Axolotl)
Axolotl is an open source tool designed to streamline the fine-tuning of various AI models, offering support for multiple configurations and architectures. It enables users to train models with methods such as full fine-tuning, LoRA, QLoRA, ReLoRA, and GPTQ. Users can customize configurations using simple YAML files or command-line overrides, and load different dataset formats, including custom or pre-tokenized datasets. Axolotl integrates with technologies such as xFormers, Flash Attention, the Liger kernel, RoPE scaling, and multipacking, and works with single or multiple GPUs via Fully Sharded Data Parallel (FSDP) or DeepSpeed. It can run locally or in the cloud using Docker, and supports logging results and checkpoints to several platforms. It is designed to make fine-tuning AI models friendly, fast, and fun, without sacrificing functionality or scale. Starting Price: Free
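As a rough illustration of the YAML-driven workflow described above, a minimal Axolotl-style config for a QLoRA fine-tune might look like the sketch below. The model name, dataset path, and hyperparameter values are placeholders, not recommendations; consult the Axolotl examples for keys valid in your version.

```yaml
# Illustrative Axolotl config sketch -- all paths and values are placeholders.
base_model: NousResearch/Llama-2-7b-hf
load_in_4bit: true          # 4-bit loading for QLoRA-style training
adapter: qlora              # alternatives: lora, or omit for full fine-tuning

datasets:
  - path: ./data/train.jsonl
    type: alpaca            # dataset prompt format

# LoRA hyperparameters
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05

micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 3
learning_rate: 0.0002

# Log metrics and checkpoints to Weights & Biases
wandb_project: my-finetune
```

Training is then launched with the Axolotl CLI pointed at this file; overrides for any of these keys can also be passed on the command line.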
7. Disco.dev (Disco.dev)
Disco.dev is an open source personal hub for MCP (Model Context Protocol) integration that lets users discover, launch, customize, and remix MCP servers with zero setup and no infrastructure overhead. It provides plug-and-play connectors and a collaborative environment where users can spin up servers instantly via the CLI or local execution, explore and remix community-shared servers, and tailor them to unique workflows. This streamlined, infrastructure-free approach accelerates AI automation development, democratizes access to agentic tooling, and fosters open collaboration between technical and non-technical contributors through a modular, remixable ecosystem. Starting Price: Free
8. NVIDIA AI Foundations (NVIDIA)
Impacting virtually every industry, generative AI unlocks a new frontier of opportunities for knowledge and creative workers to solve today's most important challenges. NVIDIA is powering generative AI through an impressive suite of cloud services, pre-trained foundation models, cutting-edge frameworks, optimized inference engines, and APIs that bring intelligence to your enterprise applications. NVIDIA AI Foundations is a set of cloud services that advance enterprise-level generative AI and enable customization across use cases in areas such as text (NVIDIA NeMo™), visual content (NVIDIA Picasso), and biology (NVIDIA BioNeMo™). Unleash the full potential with the NeMo, Picasso, and BioNeMo cloud services, powered by NVIDIA DGX™ Cloud, the AI supercomputer. Use cases include marketing copy, storyline creation, and global translation into many languages, as well as news, email, meeting minutes, and information synthesis.
9. Ludwig (Uber AI)
Ludwig is a low-code framework for building custom AI models such as LLMs and other deep neural networks. Build custom models with ease: a declarative YAML configuration file is all you need to train a state-of-the-art LLM on your data, with support for multi-task and multi-modality learning. Comprehensive config validation detects invalid parameter combinations and prevents runtime failures. Optimized for scale and efficiency: automatic batch size selection, distributed training (DDP, DeepSpeed), parameter-efficient fine-tuning (PEFT), 4-bit quantization (QLoRA), and larger-than-memory datasets. Expert-level control: retain full control of your models down to the activation functions, with support for hyperparameter optimization, explainability, and rich metric visualizations. Modular and extensible: experiment with different model architectures, tasks, features, and modalities with just a few parameter changes in the config. Think building blocks for deep learning.
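To make the declarative-configuration claim concrete, here is a minimal sketch of a Ludwig config for a text classification task. The column names (`review_text`, `sentiment`) are hypothetical and would need to match the columns of your dataset.

```yaml
# Illustrative Ludwig config sketch -- column names are placeholders.
input_features:
  - name: review_text     # text column in the training dataset
    type: text
output_features:
  - name: sentiment       # target column to predict
    type: category
trainer:
  epochs: 5
  batch_size: auto        # let Ludwig select the batch size automatically
```

Training is then typically run via the Ludwig CLI, pointing it at this config and a dataset file (e.g. a CSV); changing the feature types or adding features is enough to switch tasks or modalities.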
10. Cuckoo (Cuckoo)
Cuckoo is an AI interpreter designed for global teams, facilitating seamless multilingual conversations in sales, marketing, and support. It instantly adapts to conversations of any size, topic, and language, enhancing communication efficiency. Setting up Cuckoo is straightforward: users select the languages, invite Cuckoo to their meetings (it is compatible with platforms like Zoom, Google Meet, Slack, and Microsoft Teams), and brief it on the meeting's context using keywords and files. Powered by advanced language models, Cuckoo understands the general context of discussions and learns technical details from provided materials. It supports over 20 languages out of the box and is accessible on both mobile and desktop devices without the need for extensive arrangements or rehearsals. Cuckoo has been used in a variety of scenarios, including team syncs, sales meetings, town halls, and webinars, proving its adaptability across different conversational contexts.
11. Tune AI (NimbleBox)
Leverage the power of custom models to build your competitive advantage. With our enterprise Gen AI stack, go beyond your imagination and instantly offload manual tasks to powerful assistants; the sky is the limit. For enterprises where data security is paramount, fine-tune and deploy generative AI models securely on your own cloud.