Audience

Developers and researchers interested in training large language models

About GPT-NeoX

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

This repository records EleutherAI's library for training large-scale language models on GPUs. Our current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed as well as some novel optimizations. We aim to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models, and accelerate research into large-scale training.
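For context, training runs in GPT-NeoX are driven by YAML configuration files passed to the repository's launcher script. The fragment below is an illustrative sketch in the style of the small example configs shipped in the repo's `configs/` directory; the specific key names and values shown are assumptions and may differ between versions.

```yaml
# Illustrative GPT-NeoX config fragment (hypothetical values;
# consult the configs/ directory in the repository for real examples).
{
  # Degree of pipeline and tensor (model) parallelism across GPUs
  "pipe-parallel-size": 1,
  "model-parallel-size": 1,

  # Architecture of a small GPT-style autoregressive model
  "num-layers": 12,
  "hidden-size": 768,
  "num-attention-heads": 12,
  "seq-length": 2048,
  "max-position-embeddings": 2048
}
```

A config like this is typically combined with a setup config and handed to the DeepSpeed-based launcher, e.g. `python ./deepy.py train.py -d configs small.yml local_setup.yml` (the file names here are hypothetical).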

Pricing

Starting Price:
Free
Pricing Details:
Open source
Free Version:
Available


Company Information

EleutherAI
Founded: 2020
github.com/EleutherAI/gpt-neox


Product Details

Platforms Supported
Cloud
On-Premises
Training
Documentation

GPT-NeoX Frequently Asked Questions

Q: What kinds of users and organization types does GPT-NeoX work with?
Q: What languages does GPT-NeoX support in its product?
Q: What other applications or services does GPT-NeoX integrate with?
Q: What type of training does GPT-NeoX provide?
Q: How much does GPT-NeoX cost?

GPT-NeoX Product Features