This release comes with several new features, alongside a significant push for better documentation, examples, and unit testing.
Features
- ed.KLqp's score function gradient now performs more intelligent (automatic) Rao-Blackwellization for variance reduction.
- Automated transformations are enabled for all inference algorithms that benefit from them [tutorial].
- Added the Wake-Sleep algorithm (ed.WakeSleep); a minimal usage sketch follows this list.
- Many minor bug fixes.
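The sketch below shows the standard Edward inference interface on a toy Normal-Normal model, which is where the Rao-Blackwellized ed.KLqp gradient applies; the model, data, and hyperparameters are illustrative, and ed.WakeSleep is assumed to accept the same ({latent: approx}, data=...) constructor as the other inference classes (check the 1.3.4 API docs for the exact signature).

```python
# Minimal sketch (not from the release notes): fitting a toy Normal-Normal
# model with ed.KLqp, whose score-function gradient is now automatically
# Rao-Blackwellized. ed.WakeSleep is assumed to drop in the same way.
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Normal

# Toy data: 50 draws from N(3, 1).
x_train = np.random.normal(3.0, 1.0, size=50).astype(np.float32)

# Model: unknown mean mu with a broad Normal prior; unit-variance likelihood.
mu = Normal(loc=0.0, scale=10.0)
x = Normal(loc=tf.ones(50) * mu, scale=1.0)

# Variational approximation to the posterior over mu.
qmu = Normal(loc=tf.Variable(0.0),
             scale=tf.nn.softplus(tf.Variable(1.0)))

# Standard Edward inference interface; ed.WakeSleep({mu: qmu}, data={x: x_train})
# would be swapped in here for wake-sleep training.
inference = ed.KLqp({mu: qmu}, data={x: x_train})
inference.run(n_iter=1000)
```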
Examples
- All Edward examples now rely on the Observations library for data loading (no "official" public release yet; still in alpha); a data-loading sketch follows this list.
- Added LSTM language model for text8 (examples/lstm.py).
- Added deep exponential family for modeling topics in NIPS articles (examples/deep_exponential_family.py).
- Added sigmoid belief network for Caltech-101 silhouettes (examples/sigmoid_belief_network.py).
- Added stochastic blockmodel on Karate club (examples/stochastic_block_model.py).
- Added Cox process on synthetic spatial data (examples/cox_process.py).
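As a rough illustration of the Observations pattern the examples now follow, each dataset is a one-line loader. The mnist loader and its return convention below are taken from the library's README and are an assumption here; since Observations is still in alpha, the exact loaders and signatures may change.

```python
# Minimal sketch of the Observations data-loading pattern used by the
# updated examples. Loader names and return shapes may change while the
# library is in alpha.
from observations import mnist

# Downloads (if needed) and caches the dataset under the given directory.
(x_train, y_train), (x_test, y_test) = mnist('~/data')
print(x_train.shape, x_test.shape)
```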
Documentation & Testing
- Sealed all undocumented functions and modules in Edward.
- Added a parser and BibTeX pipeline to auto-generate the API docs.
- Added unit tests for (nearly) all Jupyter notebooks; a generic notebook-testing sketch follows this list.
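The notebook tests themselves are not reproduced in these notes; one generic way to smoke-test a notebook is to execute it end to end with nbconvert and fail on any cell error. The sketch below shows that generic pattern, not Edward's actual test harness, and the notebook path is hypothetical.

```python
# Generic notebook smoke test (not Edward's actual harness): execute a
# notebook end to end and raise if any cell errors.
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

def run_notebook(path):
    """Execute the notebook at `path` and raise on the first cell error."""
    with open(path) as f:
        nb = nbformat.read(f, as_version=4)
    ep = ExecutePreprocessor(timeout=600, kernel_name='python3')
    ep.preprocess(nb, {'metadata': {'path': '.'}})

# Hypothetical notebook path for illustration.
run_notebook('notebooks/getting_started.ipynb')
```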
Acknowledgements
- Thanks go to Matthew Feickert (@matthewfeickert), Alp Kucukelbir (@akucukelbir), Romain Lopez (@romain-lopez), Emile Mathieu (@emilemathieu), Stephen Ra (@stephenra), Kashif Rasul (@kashif), Philippe Rémy (@philipperemy), Charles Shenton (@cshenton), Yuto Yamaguchi (@yamaguchiyuto), @evahlis, @samnolen, @seiyab.
We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.