TCNs exhibit longer memory than recurrent architectures of the same capacity, and they perform better than LSTMs/GRUs on a broad range of tasks (Sequential MNIST, the Adding Problem, Copy Memory, word-level PTB...). They offer parallelism (convolutional layers process time steps in parallel), a flexible receptive field size (you can specify how far back the model can see), and stable gradients (no backpropagation through time, so no vanishing-gradient problem). The usual way to use the library is to import the TCN layer and use it inside a Keras model. The receptive field is defined as the maximum number of steps back in time from the current sample at time T that a filter (from any block, layer, or stack of the TCN) can reach, i.e. the effective history, plus 1; it can be calculated from the network's hyperparameters. Once keras-tcn is installed as a package, you can get a glimpse of what is possible to do with TCNs; example tasks are available in the repository for this purpose.
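As a sketch of the receptive-field calculation described above, the following assumes the common TCN layout of two dilated causal convolutions per residual block; the exact formula for a given build of keras-tcn may differ, so treat this as illustrative rather than the library's own computation:

```python
def tcn_receptive_field(kernel_size, dilations, nb_stacks=1):
    """Receptive field of a TCN, assuming each residual block contains
    two dilated causal conv layers (a common layout, not necessarily
    the exact one used by every keras-tcn version).

    Each conv with dilation d extends the visible history by
    (kernel_size - 1) * d steps; the +1 counts the current step itself.
    """
    return 1 + 2 * (kernel_size - 1) * nb_stacks * sum(dilations)


# Example: kernel size 3, one stack of dilations 1, 2, 4, 8, 16, 32
print(tcn_receptive_field(3, [1, 2, 4, 8, 16, 32]))  # → 253
```

Doubling the dilation at each level makes the receptive field grow exponentially with depth, which is why TCNs can cover long histories with few layers.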

Features

  • Tested with TensorFlow 2.6, 2.7, 2.8 and 2.9.0rc2.
  • Includes notes for macOS M1 (Apple Silicon) users.
  • The usual way is to import the TCN layer and use it inside a Keras model.
  • A ready-to-use TCN model is also provided (cf. the tasks in the repository for examples).
  • Takes a 3D input tensor, following the Keras sequence convention of (batch_size, timesteps, input_dim).
  • Provides parameters (e.g. kernel size, dilations, number of stacks) to configure your TCN layer.
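To illustrate the building block behind the layer, here is a minimal pure-Python sketch of a dilated causal convolution, the core operation a TCN stacks. This is an illustrative toy, not the library's implementation:

```python
def dilated_causal_conv(x, weights, dilation=1):
    """1D dilated causal convolution over a sequence of scalars.

    The output at time t depends only on x[t], x[t - dilation],
    x[t - 2*dilation], ... — never on future steps (causality).
    Taps that reach before the start of the sequence are zero-padded.
    """
    k = len(weights)
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - (k - 1 - i) * dilation  # index into the past only
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out


# Kernel [1, 1] sums the current step and one dilated step back:
print(dilated_causal_conv([1, 2, 3, 4], [1, 1], dilation=2))
# → [1.0, 2.0, 4.0, 6.0]
```

With dilation 2, the output at t = 3 combines x[3] and x[1]; earlier outputs fall back to zero padding where the past tap does not exist. Stacking such layers with growing dilations is what gives the TCN its long effective history.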

License

MIT License

Additional Project Details

Programming Language

Python

Related Categories

Python Networking Software, Python Machine Learning Software

Registered

2022-08-15