Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch. It will also contain CLIP for ranking the generations.

Kobiso, a research engineer from Naver, has trained on the CUB200 dataset, using both full and DeepSpeed sparse attention.

You can also skip training the VAE altogether by using the pretrained model released by OpenAI. The wrapper class takes care of downloading and caching the model for you automatically.

Alternatively, you can use the pretrained VAE offered by the authors of Taming Transformers. Currently only the VAE with a codebook size of 1024 is offered, with the hope that it may train a little faster than OpenAI's, which has a codebook size of 8192. In contrast to OpenAI's VAE, it also has an extra layer of downsampling, so the image sequence length is 256 instead of 1024. Since attention cost scales quadratically with sequence length, this works out to a 16x reduction in training cost.
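As a quick sanity check of that arithmetic (a sketch, assuming the quadratic self-attention cost dominates training):

```python
# Self-attention cost grows with the square of the sequence length, so
# shrinking the latent grid from 32x32 tokens (OpenAI's VAE) to 16x16
# tokens (the Taming Transformers VQGAN) quarters the sequence length
# and cuts the attention cost by 4^2 = 16.
openai_tokens = 32 * 32  # 1024 image tokens
vqgan_tokens = 16 * 16   # 256 image tokens

cost_ratio = (openai_tokens / vqgan_tokens) ** 2
print(cost_ratio)  # -> 16.0
```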

Features

  • Train DALL-E with pretrained VAE
  • The default VQGAN is the codebook-size-1024 one trained on ImageNet
  • Adjust text conditioning strength
  • Rank the generations
  • Train with Microsoft DeepSpeed's Sparse Attention


License

MIT License



Additional Project Details

Programming Language

Python

Related Categories

Python AI Image Generators, Python Generative AI

Registered

2022-08-02