About RedPajama

Foundation models such as GPT-4 have driven rapid improvement in AI. However, the most powerful models are closed commercial models or only partially open. RedPajama is a project to create a set of leading, fully open-source models. Today, we are excited to announce the completion of the first step of this project: the reproduction of the LLaMA training dataset of over 1.2 trillion tokens. The most capable foundation models today are closed behind commercial APIs, which limits research, customization, and their use with sensitive data. Fully open-source models hold the promise of removing these limitations, if the open community can close the quality gap between open and closed models. Recently, there has been much progress along this front. In many ways, AI is having its Linux moment. Stable Diffusion showed that open-source can not only rival the quality of commercial offerings like DALL-E but can also lead to incredible creativity from broad participation by communities.
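
As a rough illustration of how the reproduced dataset can be pulled into a pipeline, the snippet below loads a published sample split with the Hugging Face datasets library. The dataset id "togethercomputer/RedPajama-Data-1T-Sample" and the "text" field are assumptions about how the corpus is hosted, not details stated on this page.

    # Minimal sketch, assuming the corpus is mirrored on the Hugging Face Hub
    # as "togethercomputer/RedPajama-Data-1T-Sample" with a "text" field per record.
    from datasets import load_dataset

    redpajama_sample = load_dataset("togethercomputer/RedPajama-Data-1T-Sample", split="train")
    print(redpajama_sample[0]["text"][:200])  # peek at the first document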

About TinyLlama

The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. With proper optimization, this can be achieved in a span of "just" 90 days on 16 A100-40G GPUs. TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, so it can be dropped into many open-source projects built on Llama. At only 1.1B parameters, it is also compact enough for applications that demand a restricted compute and memory footprint.
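
Because TinyLlama reuses Llama 2's architecture and tokenizer, it can be loaded through the standard Llama code path in Hugging Face Transformers. The sketch below assumes a checkpoint published as "TinyLlama/TinyLlama-1.1B-Chat-v1.0"; the model id is illustrative, not taken from this page.

    # Minimal sketch: load an assumed TinyLlama checkpoint with the standard Llama classes.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer("The capital of France is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))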

Platforms Supported (RedPajama)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (TinyLlama)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (RedPajama)

AI and LLM developers

Audience (TinyLlama)

Developers interested in a small language model

Support (RedPajama)

Phone Support
24/7 Live Support
Online

Support (TinyLlama)

Phone Support
24/7 Live Support
Online

API (RedPajama)

Offers API

API (TinyLlama)

Offers API

Pricing (RedPajama)

Free
Free Version
Free Trial

Pricing (TinyLlama)

Free
Free Version
Free Trial

Reviews/Ratings (RedPajama)

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings (TinyLlama)

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Training (RedPajama)

Documentation
Webinars
Live Online
In Person

Training (TinyLlama)

Documentation
Webinars
Live Online
In Person

Company Information (RedPajama)

RedPajama
Founded: 2023
www.together.xyz/blog/redpajama

Company Information (TinyLlama)

TinyLlama
github.com/jzhang38/TinyLlama

Alternatives (RedPajama)

Alpaca (Stanford Center for Research on Foundation Models, CRFM)

Alternatives (TinyLlama)

Llama 2 (Meta)
Dolly (Databricks)
Falcon-40B (Technology Innovation Institute, TII)
Falcon-7B (Technology Innovation Institute, TII)
Llama (Meta)

Integrations (RedPajama)

RunPod
WebLLM

Integrations (TinyLlama)

RunPod
WebLLM