MPT-7B (MosaicML) vs. MiniMax-M1 (MiniMax)

Related Products

  • Vertex AI (726 ratings)
  • Google AI Studio (9 ratings)
  • LM-Kit.NET (17 ratings)
  • Seedance
  • Ango Hub (15 ratings)
  • ClickLearn (65 ratings)
  • myACI (461 ratings)
  • Datasite Diligence Virtual Data Room (469 ratings)
  • Buildxact (225 ratings)
  • Partful (16 ratings)

About MPT-7B

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Now you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
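
For readers who want to try the base model, here is a minimal sketch of loading MPT-7B with the Hugging Face transformers library. The repo id mosaicml/mpt-7b and the GPT-NeoX tokenizer match the public release, but treat this as an illustration rather than official MosaicML code.

```python
# Minimal sketch: load MPT-7B from the Hugging Face Hub and generate text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# MPT-7B ships custom model code, so trust_remote_code=True is required.
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
# MPT-7B was trained with the EleutherAI GPT-NeoX-20B tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

inputs = tokenizer("MosaicML is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```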

About MiniMax-M1

MiniMax‑M1 is a large‑scale hybrid‑attention reasoning model released by MiniMax AI under the Apache 2.0 license. It supports an unprecedented 1 million‑token context window and up to 80,000-token outputs, enabling extended reasoning across long documents. Trained using large‑scale reinforcement learning with a novel CISPO algorithm, MiniMax‑M1 completed full training on 512 H800 GPUs in about three weeks. It achieves state‑of‑the‑art performance on benchmarks in mathematics, coding, software engineering, tool usage, and long‑context understanding, matching or outperforming leading models. Two model variants are available (40K and 80K thinking budgets), with weights and deployment scripts provided via GitHub and Hugging Face.
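
As a rough illustration of the Hugging Face distribution mentioned above, the sketch below fetches the model weights with huggingface_hub. The repo id MiniMaxAI/MiniMax-M1-80k is an assumption based on the two published thinking-budget variants (40K and 80K); check the Hub for the exact names, and note the full model is far too large for a single consumer GPU.

```python
# Hedged sketch: download MiniMax-M1 weights from the Hugging Face Hub.
from huggingface_hub import snapshot_download

# Repo id is an assumption based on the published 40K/80K variants;
# verify the exact name on the Hub before running.
local_dir = snapshot_download(repo_id="MiniMaxAI/MiniMax-M1-80k")
print(f"Weights downloaded to: {local_dir}")
```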

Platforms Supported (MPT-7B)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (MiniMax-M1)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (MPT-7B)

AI and LLM developers and engineers

Audience (MiniMax-M1)

AI researchers, developers, and enterprises that need an LLM capable of long-context reasoning, efficient compute, and integration via function calls

Support (MPT-7B)

Phone Support
24/7 Live Support
Online

Support (MiniMax-M1)

Phone Support
24/7 Live Support
Online

API (MPT-7B)

Offers API

API (MiniMax-M1)

Offers API
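
Both listings indicate an API is offered. As one hedged example of what client code might look like, the sketch below queries a model behind an OpenAI-compatible endpoint, for instance MiniMax-M1 served locally with vLLM; the base URL, port, and model id are illustrative assumptions, not documented values.

```python
# Hedged sketch: query a model behind an OpenAI-compatible endpoint.
from openai import OpenAI

# base_url and api_key are placeholders for a locally served model;
# adjust to match your actual deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="MiniMaxAI/MiniMax-M1-80k",  # assumed model id; match your server
    messages=[{"role": "user", "content": "Summarize the key risks in this clause."}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```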


Pricing (MPT-7B)

Free
Free Version
Free Trial

Pricing (MiniMax-M1)

No information available.
Free Version
Free Trial

Reviews/Ratings (MPT-7B)

Overall: 0.0 / 5
Ease of use: 0.0 / 5
Features: 0.0 / 5
Design: 0.0 / 5
Support: 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings (MiniMax-M1)

Overall: 0.0 / 5
Ease of use: 0.0 / 5
Features: 0.0 / 5
Design: 0.0 / 5
Support: 0.0 / 5

This software hasn't been reviewed yet.

Training (MPT-7B)

Documentation
Webinars
Live Online
In Person

Training (MiniMax-M1)

Documentation
Webinars
Live Online
In Person

Company Information (MPT-7B)

MosaicML
Founded: 2021
United States
www.mosaicml.com/blog/mpt-7b

Company Information (MiniMax-M1)

MiniMax
Founded: 2021
Singapore
github.com/MiniMax-AI/MiniMax-M1

Alternatives (MPT-7B)

  • Alpaca (Stanford Center for Research on Foundation Models, CRFM)

Alternatives (MiniMax-M1)

  • OpenAI o1 (OpenAI)
  • Dolly (Databricks)
  • Falcon-40B (Technology Innovation Institute, TII)
  • Qwen-7B (Alibaba)
  • Llama 2 (Meta)


Integrations (MPT-7B)

Axolotl
GitHub
Hugging Face
MosaicML
SiliconFlow

Integrations (MiniMax-M1)

Axolotl
GitHub
Hugging Face
MosaicML
SiliconFlow