Common recurrent neural architectures scale poorly due to the intrinsic difficulty of parallelizing their state computations. In this work, we propose the Simple Recurrent Unit (SRU), a light recurrent unit that balances model capacity and scalability. SRU is designed to provide expressive recurrence, enable a highly parallelized implementation, and come with careful initialization to facilitate the training of deep models. We demonstrate the effectiveness of SRU on multiple NLP tasks. SRU achieves a 5--9x speed-up over cuDNN-optimized LSTM on classification and question answering datasets, and delivers stronger results than LSTM and convolutional models. We also obtain an average improvement of 0.7 BLEU over the Transformer model on translation by incorporating SRU into the architecture. The experimental code and SRU++ implementation are available on the dev branch, which will be merged into master later.
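The parallelism claim rests on the structure of the SRU recurrence: every matrix multiplication depends only on the input x_t, so those can be batched across all time steps at once, and only a cheap elementwise recurrence remains sequential. A minimal single-dimension sketch of that recurrence, using scalar weights (the function and parameter names here are illustrative only, not the package API):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sru_cell(xs, w, wf, bf, wr, br, vf=0.0, vr=0.0):
    """One-dimensional sketch of the SRU recurrence (per the paper):

        f_t = sigmoid(wf*x_t + vf*c_{t-1} + bf)   # forget gate
        c_t = f_t*c_{t-1} + (1 - f_t)*(w*x_t)     # internal state
        r_t = sigmoid(wr*x_t + vr*c_{t-1} + br)   # reset gate
        h_t = r_t*c_t + (1 - r_t)*x_t             # highway output

    All x_t-dependent terms could be precomputed in parallel; only the
    elementwise update of c_t is sequential.
    """
    c, hs = 0.0, []
    for x in xs:
        f = sigmoid(wf * x + vf * c + bf)
        r = sigmoid(wr * x + vr * c + br)
        c = f * c + (1.0 - f) * (w * x)
        hs.append(r * c + (1.0 - r) * x)
    return hs
```

With zero gate weights both gates sit at 0.5, so the state is an exponential moving average of the input and the output blends it with a highway connection.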

Features

  • SRU is a recurrent unit that can run over 10 times faster than cuDNN LSTM, without loss of accuracy on the tasks tested
  • Implements the paper "Simple Recurrent Units for Highly Parallelizable Recurrence"
  • SRU can be installed as a regular package via python setup.py install or pip install, or used directly from source without installation
  • The usage of SRU is similar to that of nn.LSTM


License

MIT License


Additional Project Details

Programming Language

Python

Related Categories

Python Machine Learning Software, Python Natural Language Processing (NLP) Tool

Registered

2022-08-09