LongLLaMA vs. MPT-7B


About LongLLaMA

This repository contains the research preview of LongLLaMA, a large language model capable of handling contexts of 256k tokens or more. LongLLaMA is built on the foundation of OpenLLaMA and fine-tuned using the Focused Transformer (FoT) method; the LongLLaMA code builds on the foundation of Code Llama. We release a smaller 3B base variant of the LongLLaMA model (not instruction-tuned) under a permissive Apache 2.0 license, along with inference code supporting longer contexts, on Hugging Face. The model weights can serve as a drop-in replacement for LLaMA in existing implementations (for short contexts up to 2048 tokens). Additionally, we provide evaluation results and comparisons against the original OpenLLaMA models.
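
For reference, a minimal inference sketch using Hugging Face transformers. The checkpoint id syzymon/long_llama_3b and the float32/trust_remote_code settings follow the project's README; treat them as assumptions rather than canonical usage if the repository has since changed.

    # Minimal LongLLaMA inference sketch; assumes transformers and torch
    # are installed and the checkpoint id below (per the README) is current.
    import torch
    from transformers import LlamaTokenizer, AutoModelForCausalLM

    tokenizer = LlamaTokenizer.from_pretrained("syzymon/long_llama_3b")
    model = AutoModelForCausalLM.from_pretrained(
        "syzymon/long_llama_3b",
        torch_dtype=torch.float32,
        trust_remote_code=True,  # LongLLaMA ships custom modeling code
    )

    prompt = "The Focused Transformer method extends context by"
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))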

About MPT-7B

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Now you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
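
Likewise, a minimal loading sketch for MPT-7B with Hugging Face transformers. The checkpoint id mosaicml/mpt-7b and the GPT-NeoX tokenizer pairing follow the public model card; verify both before relying on them.

    # Minimal MPT-7B inference sketch; assumes transformers and torch
    # are installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    # The model card pairs MPT-7B with the EleutherAI GPT-NeoX tokenizer.
    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    model = AutoModelForCausalLM.from_pretrained(
        "mosaicml/mpt-7b",
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,  # MPT ships custom modeling code
    )

    inputs = tokenizer("MosaicML's MPT-7B was trained on", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))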

Platforms Supported (LongLLaMA)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (MPT-7B)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (LongLLaMA)

Users who need an open large language model with long-context support

Audience (MPT-7B)

AI and LLM developers and engineers

Support (LongLLaMA)

Phone Support
24/7 Live Support
Online

Support (MPT-7B)

Phone Support
24/7 Live Support
Online

API (LongLLaMA)

Offers API

API (MPT-7B)

Offers API

Pricing (LongLLaMA)

Free
Free Version
Free Trial

Pricing (MPT-7B)

Free
Free Version
Free Trial

Reviews/Ratings (LongLLaMA)

No reviews yet.

Reviews/Ratings (MPT-7B)

No reviews yet.

Training (LongLLaMA)

Documentation
Webinars
Live Online
In Person

Training (MPT-7B)

Documentation
Webinars
Live Online
In Person

Company Information (LongLLaMA)

LongLLaMA
github.com/CStanKonrad/long_llama

Company Information (MPT-7B)

MosaicML
Founded: 2021
United States
www.mosaicml.com/blog/mpt-7b

Alternatives (LongLLaMA)

  • Llama 2 (Meta)

Alternatives (MPT-7B)

  • Alpaca (Stanford Center for Research on Foundation Models (CRFM))
  • Dolly (Databricks)
  • Mistral NeMo (Mistral AI)
  • Falcon-40B (Technology Innovation Institute (TII))
  • Llama 2 (Meta)
  • StarCoder (BigCode)

Integrations (LongLLaMA)

Axolotl
MosaicML

Integrations (MPT-7B)

Axolotl
MosaicML