PanGu-Σ

Huawei

Related Products

  • LM-Kit.NET (24 Ratings)
  • Google AI Studio (11 Ratings)
  • Vertex AI (827 Ratings)
  • Windsurf Editor (159 Ratings)
  • Google Cloud BigQuery (1,939 Ratings)
  • Boozang (15 Ratings)
  • Concord (237 Ratings)
  • LTX (141 Ratings)
  • BidJS (33 Ratings)
  • RunPod (205 Ratings)

About

The Megatron-Turing Natural Language Generation model (MT-NLG) is the largest and most powerful monolithic transformer English language model, with 530 billion parameters. This 105-layer, transformer-based model improves upon prior state-of-the-art models in zero-, one-, and few-shot settings, and it demonstrates strong accuracy across a broad set of natural language tasks, including completion prediction, reading comprehension, commonsense reasoning, natural language inference, and word sense disambiguation. To accelerate research on the largest English language model to date and to enable customers to experiment with and apply such a model to downstream language tasks, NVIDIA is pleased to announce an Early Access program for its managed API service for the MT-NLG model.
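The zero-, one-, and few-shot settings mentioned above can be illustrated with a small prompt-construction sketch. The helper and the example task below are hypothetical illustrations, not part of the MT-NLG API or its benchmark suite:

```python
# Hypothetical sketch: building the kind of zero-/few-shot prompts used
# to evaluate large language models such as MT-NLG.

def build_prompt(instruction, examples, query):
    """Format a few-shot prompt: instruction, solved examples, then the query.

    With examples=[] this degenerates to the zero-shot case; one example
    gives the one-shot case.
    """
    parts = [instruction]
    for q, a in examples:
        parts.append(f"Q: {q}\nA: {a}")
    parts.append(f"Q: {query}\nA:")  # model completes after the final "A:"
    return "\n\n".join(parts)


# Two-shot sentiment prompt (illustrative task, not from the paper).
prompt = build_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [("Great acting and a gripping plot.", "positive"),
     ("Dull and far too long.", "negative")],
    "A wonderful surprise from start to finish.",
)
print(prompt)
```

The model's answer is then read off as its completion of the trailing `A:`, which is what the zero-/one-/few-shot evaluations measure.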

About

Significant advances in natural language processing, understanding, and generation have been achieved by scaling up large language models. This work introduces a system that uses Ascend 910 AI processors and the MindSpore framework to train PanGu-Σ, a language model with over a trillion parameters (1.085T). Building on the foundation laid by PanGu-α, the model transforms the traditionally dense Transformer into a sparse one using a technique called Random Routed Experts (RRE). The model was trained efficiently on a dataset of 329 billion tokens using Expert Computation and Storage Separation (ECSS), which yields a 6.3-fold increase in training throughput via heterogeneous computing. Experiments indicate that PanGu-Σ sets a new state of the art in zero-shot learning on various downstream Chinese NLP tasks.
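The core idea behind routing to experts can be sketched in a toy form: replace one dense feed-forward layer with several experts and send each token to a single expert via a fixed random mapping instead of a learned gating network. Everything below (the table sizes, the dummy expert function) is an illustrative simplification of expert routing, not Huawei's RRE implementation:

```python
import random

# Toy sketch of random expert routing: each token id is assigned to one
# expert by a fixed random table decided before training, so no gating
# network has to be learned. Sizes and math are illustrative only.

VOCAB_SIZE = 1000
NUM_EXPERTS = 4

# Fix the random token -> expert routing table once, up front.
rng = random.Random(0)
routing_table = [rng.randrange(NUM_EXPERTS) for _ in range(VOCAB_SIZE)]

def expert_ffn(expert_id, x):
    """Stand-in for expert `expert_id`'s feed-forward transform."""
    return [v * (expert_id + 1) for v in x]  # dummy computation

def rre_layer(token_ids, hidden_states):
    """Route each token's hidden state to its fixed expert."""
    outputs = []
    for tok, h in zip(token_ids, hidden_states):
        expert = routing_table[tok]
        outputs.append(expert_ffn(expert, h))
    return outputs

# A given token always lands on the same expert, so routing is
# deterministic and adds no parameters or auxiliary balancing losses.
out = rre_layer([5, 5, 42], [[1.0], [2.0], [3.0]])
```

Because the token-to-expert mapping is static, each expert's parameters can also live on separate devices, which is the property that storage/computation separation schemes like ECSS exploit.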

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience

Developers interested in a powerful English large language model

Audience

AI developers

Support

Phone Support
24/7 Live Support
Online

Support

Phone Support
24/7 Live Support
Online

API

Offers API

API

Offers API

Screenshots and Videos

No images available

Screenshots and Videos

No images available

Pricing

No information available.
Free Version
Free Trial

Pricing

No information available.
Free Version
Free Trial

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Training

Documentation
Webinars
Live Online
In Person

Training

Documentation
Webinars
Live Online
In Person

Company Information

NVIDIA
Founded: 1993
United States
developer.nvidia.com/megatron-turing-natural-language-generation

Company Information

Huawei
Founded: 1987
China
huawei.com

Alternatives

  • Cerebras-GPT (Cerebras)

Alternatives

  • LTM-1 (Magic AI)
  • DeepSpeed (Microsoft)
  • PanGu-α (Huawei)
  • Chinchilla (Google DeepMind)
  • DeepSeek-V2 (DeepSeek)
  • NVIDIA NeMo (NVIDIA)
  • VideoPoet (Google)
  • OPT (Meta)

Categories

Categories

Integrations

PanGu Chat

Integrations

PanGu Chat