LTM-1 vs. MiniMax-M1 Comparison

LTM-1 (Magic AI) vs. MiniMax-M1 (MiniMax)

About LTM-1

Magic’s LTM-1 enables context windows 50x larger than those of standard transformers. Magic has trained a Large Language Model (LLM) that can take in this enormous amount of context when generating suggestions. For its coding assistant, this means Magic can now see your entire code repository. Larger context windows allow AI models to reference more explicit, factual information as well as their own action history, and Magic hopes to use this research to improve reliability and coherence.
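To make the claim about repository-scale context concrete, here is a rough back-of-the-envelope sketch (not Magic's method; the ~4 characters-per-token heuristic, the 2-million-character repository size, and the 128K "typical transformer" window are illustrative assumptions) showing why a multi-million-token window can hold an entire codebase:

```python
# Illustrative only: estimate whether a code repository fits in a
# context window, using the common ~4 characters-per-token heuristic.

def estimate_tokens(total_chars: int, chars_per_token: float = 4.0) -> int:
    """Approximate token count for a body of text."""
    return int(total_chars / chars_per_token)

def fits_in_window(total_chars: int, window_tokens: int) -> bool:
    """True if the estimated token count fits inside the window."""
    return estimate_tokens(total_chars) <= window_tokens

# Hypothetical mid-sized repository: ~2 million characters of source code.
repo_chars = 2_000_000

print(estimate_tokens(repo_chars))            # ~500,000 tokens
print(fits_in_window(repo_chars, 128_000))    # typical transformer window: False
print(fits_in_window(repo_chars, 5_000_000))  # LTM-1's advertised window: True
```

Under these assumptions the repository overflows a conventional window by roughly 4x but uses only a tenth of a 5 million-token window.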

About MiniMax-M1

MiniMax‑M1 is a large‑scale hybrid‑attention reasoning model released by MiniMax AI under the Apache 2.0 license. It supports an unprecedented 1 million‑token context window and up to 80,000-token outputs, enabling extended reasoning across long documents. Trained using large‑scale reinforcement learning with a novel CISPO algorithm, MiniMax‑M1 completed full training on 512 H800 GPUs in about three weeks. It achieves state‑of‑the‑art performance on benchmarks in mathematics, coding, software engineering, tool usage, and long‑context understanding, matching or outperforming leading models. Two model variants are available (40K and 80K thinking budgets), with weights and deployment scripts provided via GitHub and Hugging Face.
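The numbers above imply a simple budgeting trade-off between input and output: reserving a larger thinking budget leaves fewer tokens for the prompt. A minimal sketch (the helper function is hypothetical, not part of MiniMax's tooling; only the 1 million-token window and the 40K/80K budgets come from the description above):

```python
# Illustrative sketch: how much prompt room remains in MiniMax-M1's
# 1 million-token context window after reserving a thinking budget.

CONTEXT_WINDOW = 1_000_000  # tokens, per the model description

def max_prompt_tokens(thinking_budget: int, window: int = CONTEXT_WINDOW) -> int:
    """Tokens left for input after reserving the output/thinking budget."""
    return window - thinking_budget

print(max_prompt_tokens(40_000))  # 960000 -- with the 40K-budget variant
print(max_prompt_tokens(80_000))  # 920000 -- with the 80K-budget variant
```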

Platforms Supported (LTM-1)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (MiniMax-M1)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (LTM-1)

Developers interested in a large language model with a 5 million-token context window

Audience (MiniMax-M1)

AI researchers, developers, and enterprises that need an LLM capable of long-context reasoning, efficient compute, and integration via function calls

Support (LTM-1)

Phone Support
24/7 Live Support
Online

Support (MiniMax-M1)

Phone Support
24/7 Live Support
Online

API (LTM-1)

Offers API

API (MiniMax-M1)

Offers API


Pricing (LTM-1)

No information available.
Free Version
Free Trial

Pricing (MiniMax-M1)

No information available.
Free Version
Free Trial

Reviews/Ratings (LTM-1)

Overall 0.0 / 5
Ease 0.0 / 5
Features 0.0 / 5
Design 0.0 / 5
Support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings (MiniMax-M1)

Overall 0.0 / 5
Ease 0.0 / 5
Features 0.0 / 5
Design 0.0 / 5
Support 0.0 / 5

This software hasn't been reviewed yet.

Training (LTM-1)

Documentation
Webinars
Live Online
In Person

Training (MiniMax-M1)

Documentation
Webinars
Live Online
In Person

Company Information (LTM-1)

Magic AI
Founded: 2022
United States
magic.dev/blog/ltm-1

Company Information (MiniMax-M1)

MiniMax
Founded: 2021
Singapore
github.com/MiniMax-AI/MiniMax-M1

Alternatives (LTM-1)

LTM-2-mini (Magic AI)

Alternatives (MiniMax-M1)

OpenAI o1 (OpenAI)
Baichuan-13B (Baichuan Intelligent Technology)
Qwen-7B (Alibaba)
Llama 2 (Meta)
Claude Pro (Anthropic)


Integrations (LTM-1)

GitHub
Hugging Face
Quickwork
SiliconFlow

Integrations (MiniMax-M1)

GitHub
Hugging Face
Quickwork
SiliconFlow