MPT-7B
MosaicML

Mistral NeMo
Mistral AI

About (MPT-7B)

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Now you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
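
As a concrete starting point for the train, finetune, and deploy workflow described above, the sketch below loads the released MPT-7B checkpoint with Hugging Face Transformers and samples a short completion. The repository id mosaicml/mpt-7b, the trust_remote_code flag, and the pairing with the EleutherAI/gpt-neox-20b tokenizer follow the public model card, but treat them as assumptions to verify against current documentation.

```python
# Minimal sketch: load the open MPT-7B checkpoint and sample a completion.
# Assumptions to verify: the Hub id "mosaicml/mpt-7b", the custom modeling code
# enabled via trust_remote_code, and the model card's pairing with the
# EleutherAI/gpt-neox-20b tokenizer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b"  # swap for mpt-7b-instruct, mpt-7b-chat, or mpt-7b-storywriter
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # ~14 GB of weights; fits a single large GPU
    trust_remote_code=True,       # MPT ships its modeling code with the checkpoint
).to(device)
model.eval()

prompt = "MPT-7B is a decoder-only transformer that"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The finetuned variants mentioned above (MPT-7B-Instruct, MPT-7B-Chat, MPT-7B-StoryWriter-65k+) are presumably loaded the same way by swapping the repository id; the StoryWriter model card additionally describes raising max_seq_len in the loaded config to take advantage of its long context.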

About (Mistral NeMo)

Mistral NeMo is our new best small model: a state-of-the-art 12B model with a 128k-token context window, built in collaboration with NVIDIA and released under the Apache 2.0 license. Its reasoning, world knowledge, and coding accuracy are state-of-the-art in its size category. Because it relies on a standard architecture, Mistral NeMo is easy to use and is a drop-in replacement in any system using Mistral 7B. We have released pre-trained base and instruction-tuned checkpoints under the Apache 2.0 license to promote adoption by researchers and enterprises. Mistral NeMo was trained with quantization awareness, enabling FP8 inference without any loss of performance. The model is designed for global, multilingual applications and is trained on function calling. Compared to Mistral 7B, it is much better at following precise instructions, reasoning, and handling multi-turn conversations.
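
Because Mistral NeMo is positioned as a drop-in replacement for Mistral 7B, a minimal inference sketch looks much like any other Transformers causal LM; mainly the checkpoint id changes. The Hub id mistralai/Mistral-Nemo-Instruct-2407 and the generation settings below are assumptions based on the public release and should be checked against the official documentation (a 12B model in bf16 also needs roughly 24 GB of GPU memory for the weights alone).

```python
# Minimal sketch: chat with an instruction-tuned Mistral NeMo checkpoint via Transformers.
# Assumptions to verify: the Hub id "mistralai/Mistral-Nemo-Instruct-2407" and
# enough GPU memory for a 12B model in bf16 (device_map="auto" requires the
# accelerate package).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Instruct-2407"  # assumed Hub id for the instruct checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# The instruct checkpoint expects chat-formatted prompts, so use the chat template.
messages = [{"role": "user", "content": "Summarize the Apache 2.0 license in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    out = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.3)
print(tokenizer.decode(out[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

FP8 inference and function calling are properties of the serving stack and the instruct template rather than of this snippet; in practice they are typically exercised through an FP8-capable engine (for example NVIDIA's inference stack) or Mistral's hosted API.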

Platforms Supported (MPT-7B)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (Mistral NeMo)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (MPT-7B)

AI and LLM developers and engineers

Audience (Mistral NeMo)

Users looking for a language model tool to power their AI-driven applications

Support (MPT-7B)

Phone Support
24/7 Live Support
Online

Support (Mistral NeMo)

Phone Support
24/7 Live Support
Online

API (MPT-7B)

Offers API

API (Mistral NeMo)

Offers API

Pricing (MPT-7B)

Free
Free Version
Free Trial

Pricing (Mistral NeMo)

Free
Free Version
Free Trial

Reviews/Ratings (MPT-7B)

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings (Mistral NeMo)

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Training (MPT-7B)

Documentation
Webinars
Live Online
In Person

Training (Mistral NeMo)

Documentation
Webinars
Live Online
In Person

Company Information (MPT-7B)

MosaicML
Founded: 2021
United States
www.mosaicml.com/blog/mpt-7b

Company Information (Mistral NeMo)

Mistral AI
Founded: 2023
France
mistral.ai/news/mistral-nemo/

Alternatives (MPT-7B)

Alpaca (Stanford Center for Research on Foundation Models, CRFM)

Alternatives (Mistral NeMo)

Jamba (AI21 Labs)
Dolly (Databricks)
Mistral Small (Mistral AI)
Falcon-40B (Technology Innovation Institute, TII)
Olmo 2 (Ai2)
Llama 2 (Meta)
Qwen2.5-1M (Alibaba)
Mistral 7B (Mistral AI)

Integrations (MPT-7B)

1min.AI
302.AI
AI-FLOW
AiAssistWorks
Axolotl
GMTech
LM-Kit.NET
LibreChat
Mammouth AI
MindMac
Mistral AI
MosaicML
OpenLIT
Overseer AI
PromptPal
Rust
Simplismart
Visual Basic
Wordware
thisorthis.ai

Integrations (Mistral NeMo)

1min.AI
302.AI
AI-FLOW
AiAssistWorks
Axolotl
GMTech
LM-Kit.NET
LibreChat
Mammouth AI
MindMac
Mistral AI
MosaicML
OpenLIT
Overseer AI
PromptPal
Rust
Simplismart
Visual Basic
Wordware
thisorthis.ai