ChatGLM
Zhipu AI

PanGu-Σ
Huawei


About

ChatGLM-6B is an open-source, bilingual (Chinese-English) dialogue language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards (as little as 6 GB of VRAM is required at the INT4 quantization level). ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese question answering and dialogue. After training on roughly 1T tokens of bilingual Chinese and English text, followed by supervised fine-tuning, feedback bootstrapping, reinforcement learning from human feedback (RLHF), and other techniques, the 6.2-billion-parameter ChatGLM-6B is able to generate answers that align well with human preferences.
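
For context, a minimal local-deployment sketch follows. It assumes the publicly released Hugging Face checkpoint THUDM/chatglm-6b and the chat/quantize helpers shipped with its remote code; the model ID, helper methods, and INT4 setting come from the model's public release, not from this page.

    # Minimal sketch: load ChatGLM-6B with INT4 quantization so the weights fit in ~6 GB of VRAM.
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    model = (
        AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
        .quantize(4)   # INT4 weight quantization helper provided by the model's remote code
        .half()
        .cuda()
        .eval()
    )

    # Multi-turn chat helper exposed by the model's remote code.
    response, history = model.chat(tokenizer, "你好", history=[])
    print(response)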

About

Significant advances in natural language processing, understanding, and generation have been achieved by scaling up large language models. The accompanying study describes a system that uses Ascend 910 AI processors and the MindSpore framework to train a language model with over a trillion parameters (1.085T), named PanGu-Σ. The model builds on the foundation laid by PanGu-α and turns the traditionally dense Transformer into a sparse one using a scheme called Random Routed Experts (RRE). It was trained efficiently on 329 billion tokens using a technique called Expert Computation and Storage Separation (ECSS), which yields a 6.3-fold increase in training throughput via heterogeneous computing. Experiments indicate that PanGu-Σ sets a new state of the art in zero-shot learning on a range of downstream Chinese NLP tasks.
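
To illustrate the sparse-expert idea behind RRE at a very high level, here is a minimal sketch of random (non-learned) expert routing. It assumes PyTorch and a single feed-forward expert layer; the class name, dimensions, and token-level routing granularity are illustrative assumptions, not Huawei's implementation.

    import torch
    import torch.nn as nn

    class RandomRoutedExperts(nn.Module):
        """Illustrative sketch: each token id is assigned to one expert via a
        fixed random mapping, so no learned gating network or load-balancing
        loss is needed."""

        def __init__(self, vocab_size, num_experts, d_model, d_ff):
            super().__init__()
            # Fixed random token-id -> expert mapping, drawn once and frozen.
            self.register_buffer(
                "token_to_expert", torch.randint(0, num_experts, (vocab_size,)))
            self.experts = nn.ModuleList([
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                              nn.Linear(d_ff, d_model))
                for _ in range(num_experts)
            ])

        def forward(self, hidden, token_ids):
            # hidden: (batch, seq, d_model); token_ids: (batch, seq)
            expert_ids = self.token_to_expert[token_ids]
            out = torch.zeros_like(hidden)
            for e, expert in enumerate(self.experts):
                mask = expert_ids == e            # tokens routed to expert e
                if mask.any():
                    out[mask] = expert(hidden[mask])
            return out

    # Tiny usage example with made-up sizes.
    layer = RandomRoutedExperts(vocab_size=1000, num_experts=4, d_model=32, d_ff=64)
    x = torch.randn(2, 8, 32)
    ids = torch.randint(0, 1000, (2, 8))
    print(layer(x, ids).shape)  # torch.Size([2, 8, 32])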

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience

Developers interested in a powerful Chinese-English large language model

Audience

AI developers

Support

Phone Support
24/7 Live Support
Online

Support

Phone Support
24/7 Live Support
Online

API

Offers API

API

Offers API


Pricing

Free
Free Version
Free Trial

Pricing

No information available.
Free Version
Free Trial

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Training

Documentation
Webinars
Live Online
In Person

Training

Documentation
Webinars
Live Online
In Person

Company Information

Zhipu AI
Founded: 2019
China
chatglm.cn/

Company Information

Huawei
Founded: 1987
China
huawei.com

Alternatives

Baichuan-13B (Baichuan Intelligent Technology)

Alternatives

LTM-1 (Magic AI)
Qwen (Alibaba)
DeepSeek-V2 (DeepSeek)
PanGu-α (Huawei)
Llama 2 (Meta)
OPT (Meta)
DeepSeek R2 (DeepSeek)

Categories

Categories

Integrations

APIPark
LLaMA-Factory
PanGu Chat

Integrations

APIPark
LLaMA-Factory
PanGu Chat