MPT-7B
MosaicML

Xgen-small
Salesforce


About

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Now you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
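
The released checkpoints can be loaded with standard tooling. Below is a minimal sketch, assuming the four checkpoints are published on the Hugging Face Hub under the `mosaicml/` organization; the `load_kwargs` helper is hypothetical and only collects the arguments one would pass to `from_pretrained()`:

```python
# Hypothetical sketch: assembling load arguments for the MPT-7B family.
# Repo ids below follow the public Hugging Face Hub naming for these
# checkpoints; MPT ships custom modeling code, so trust_remote_code
# must be enabled when loading through transformers.

MPT_CHECKPOINTS = {
    "base": "mosaicml/mpt-7b",
    "instruct": "mosaicml/mpt-7b-instruct",
    "chat": "mosaicml/mpt-7b-chat",
    "storywriter": "mosaicml/mpt-7b-storywriter",
}

def load_kwargs(variant: str) -> dict:
    """Return the from_pretrained() arguments for one MPT-7B variant."""
    return {
        "pretrained_model_name_or_path": MPT_CHECKPOINTS[variant],
        "trust_remote_code": True,   # MPT uses custom model code
        "torch_dtype": "bfloat16",   # ~14 GB of weights at 16-bit precision
    }

# Actual loading (downloads weights; needs a GPU with roughly 16 GB memory):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   model = AutoModelForCausalLM.from_pretrained(**load_kwargs("instruct"))
#   tok = AutoTokenizer.from_pretrained(MPT_CHECKPOINTS["instruct"])
```

The `bfloat16` choice is one reasonable default for a 7B model; drop it to load in full precision, or swap in a quantized configuration for smaller GPUs.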

About

Xgen-small is an enterprise-ready compact language model developed by Salesforce AI Research, designed to deliver long-context performance at a predictable, low cost. It combines domain-focused data curation, scalable pre-training, length extension, instruction fine-tuning, and reinforcement learning to meet the complex, high-volume inference demands of modern enterprises. Unlike traditional large models, Xgen-small processes extensive contexts efficiently, enabling the synthesis of information from internal documentation, code repositories, research reports, and real-time data streams. Available in 4B and 9B parameter sizes, it balances cost efficiency, privacy safeguards, and long-context understanding, making it a sustainable and predictable choice for deploying enterprise AI at scale.
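
The long-context synthesis workflow described above can be sketched as a token-budget packer that fills a single prompt with as many source documents as the context window allows. This is a hypothetical illustration, not Salesforce code: `pack_documents` is an invented helper, and whitespace word counts stand in for a real tokenizer's token counts:

```python
# Hypothetical sketch of a long-context ingestion step: greedily pack
# documents (in order) into one prompt without exceeding the model's
# context budget. A real deployment would count tokens with the
# model's own tokenizer rather than splitting on whitespace.

def pack_documents(docs: list[str], budget: int) -> list[str]:
    """Select a prefix of docs whose combined approximate token
    count stays within the given context budget."""
    packed: list[str] = []
    used = 0
    for doc in docs:
        cost = len(doc.split())  # crude stand-in for a token count
        if used + cost > budget:
            break                # budget exhausted; stop packing
        packed.append(doc)
        used += cost
    return packed
```

A production pipeline would typically rank or chunk documents before packing; the greedy in-order pass here just makes the budget arithmetic concrete.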

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience

AI and LLM developers and engineers

Audience

IT leaders and AI practitioners seeking a compact, efficient language model capable of processing long-context information while ensuring cost-effectiveness and data privacy

Support

Phone Support
24/7 Live Support
Online

Support

Phone Support
24/7 Live Support
Online

API

Offers API

API

Offers API

Pricing

Free
Free Version
Free Trial

Pricing

No information available.
Free Version
Free Trial

Reviews/Ratings

Overall 0.0 / 5
Ease 0.0 / 5
Features 0.0 / 5
Design 0.0 / 5
Support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings

Overall 0.0 / 5
Ease 0.0 / 5
Features 0.0 / 5
Design 0.0 / 5
Support 0.0 / 5

This software hasn't been reviewed yet.

Training

Documentation
Webinars
Live Online
In Person

Training

Documentation
Webinars
Live Online
In Person

Company Information

MosaicML
Founded: 2021
United States
www.mosaicml.com/blog/mpt-7b

Company Information

Salesforce
Founded: 1999
United States
www.salesforce.com/blog/xgen-small-enterprise-ready-small-language-models/

Alternatives

  • Alpaca
    Stanford Center for Research on Foundation Models (CRFM)

Alternatives

  • Dolly
    Databricks
  • Falcon-40B
    Technology Innovation Institute (TII)
  • Llama 2
    Meta
  • Mistral NeMo
    Mistral AI
  • Kimi K2
    Moonshot AI

Integrations

Axolotl
MosaicML

Integrations

Axolotl
MosaicML