Mixtral 8x7B (Mistral AI) vs. OpenLLaMA

About Mixtral 8x7B
Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights, licensed under Apache 2.0. It outperforms Llama 2 70B on most benchmarks with 6x faster inference, making it the strongest permissively licensed open-weight model and the best overall in terms of cost/performance trade-offs. In particular, it matches or outperforms GPT-3.5 on most standard benchmarks.
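
To illustrate what "sparse mixture of experts" means here, below is a toy top-k routing layer in PyTorch. This is an illustrative sketch only, not Mistral's implementation; the only detail carried over from Mixtral's published design is 8 experts with 2 active per token.

```python
# Toy sparse-MoE layer (illustrative only, not Mistral's code).
# A router scores experts per token; only the top_k experts run for each token.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToySparseMoE(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)      # gating network
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])
        self.top_k = top_k

    def forward(self, x):                              # x: (tokens, dim)
        scores = self.router(x)                        # (tokens, num_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

if __name__ == "__main__":
    layer = ToySparseMoE()
    print(layer(torch.randn(4, 64)).shape)             # torch.Size([4, 64])
```

Because only 2 of the 8 expert MLPs run per token, the per-token compute is far lower than the total parameter count suggests, which is where the "6x faster inference than Llama 2 70B" claim comes from.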

About OpenLLaMA
OpenLLaMA is a permissively licensed open-source reproduction of Meta AI's LLaMA 7B, trained on the RedPajama dataset. The model weights can serve as a drop-in replacement for LLaMA 7B in existing implementations, and a smaller 3B variant of the model is also provided.
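
A minimal sketch of the "drop-in replacement" claim using the Hugging Face transformers API. The checkpoint ID openlm-research/open_llama_7b (and the Mixtral ID in the comment) are assumptions about the published weights rather than something stated on this page, and running a 7B model requires a suitably large GPU or plenty of CPU RAM.

```python
# Minimal sketch: loading OpenLLaMA wherever a LLaMA-style causal LM is expected.
# Checkpoint IDs are assumptions; "mistralai/Mixtral-8x7B-v0.1" could be swapped in
# for Mixtral, at a much higher memory cost. device_map="auto" needs accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openlm-research/open_llama_7b"

# The OpenLLaMA README suggests the slow tokenizer for its early releases.
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Q: What is the largest animal?\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```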

Platforms Supported
Mixtral 8x7B: Windows, Mac, Linux, Cloud, On-Premises, iPhone, iPad, Android, Chromebook
OpenLLaMA: Windows, Mac, Linux, Cloud, On-Premises, iPhone, iPad, Android, Chromebook

Audience
Mixtral 8x7B: AI developers
OpenLLaMA: LLM and AI developers

Support
Mixtral 8x7B: Phone Support, 24/7 Live Support, Online
OpenLLaMA: Phone Support, 24/7 Live Support, Online

API
Mixtral 8x7B: Offers API
OpenLLaMA: Offers API

Pricing
Mixtral 8x7B: Free (Free Version, Free Trial)
OpenLLaMA: Free (Free Version, Free Trial)

Training
Mixtral 8x7B: Documentation, Webinars, Live Online, In Person
OpenLLaMA: Documentation, Webinars, Live Online, In Person

Company Information
Mixtral 8x7B: Mistral AI, founded 2023, France; mistral.ai/news/mixtral-of-experts/
OpenLLaMA: OpenLLaMA, founded 2023; github.com/openlm-research/open_llama

Integrations
Mixtral 8x7B: APIPark, Airtrain, F#, Graydient AI, HTML, JavaScript, Klee, Langflow, Literal AI, ManagePrompt
OpenLLaMA: APIPark, Airtrain, F#, Graydient AI, HTML, JavaScript, Klee, Langflow, Literal AI, ManagePrompt