Releases · larq/zoo
v0.5.0
⚠️ Breaking Changes ⚠️
- ⬆️ [email protected], which drops support for TensorFlow 1.13 (#87) @lgeiger
🎉 Features
- Add names to all models (#88) @koenhelwegen
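As a rough illustration of this change, the sketch below checks a model's Keras `name` attribute; the constructor and the `weights="imagenet"` argument follow the usual larq_zoo pattern but are assumptions, not quotations from these notes.

```python
# Minimal sketch, assuming larq_zoo exposes a BiRealNet constructor with a
# keras.applications-style `weights` argument; #88 gives each model an
# explicit, descriptive name instead of an auto-generated one.
import larq_zoo as lqz

model = lqz.BiRealNet(weights="imagenet")
print(model.name)  # a model-specific name rather than a generic "model_1"
```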
👷 Internal Improvements
- Run unit tests in parallel using `pytest-xdist` (#83) @lgeiger
- Test against TensorFlow 1.15.0 stable (#81) @lgeiger
- Use GitHub Actions to run unit tests and auto-release (#80) @lgeiger
⬆️ Dependencies
- Bump larq to 0.8.1 (#90) @leonoverweel
- Bump scipy from 1.3.2 to 1.3.3 (#89) @dependabot-preview
- Bump scipy from 1.3.1 to 1.3.2 (#85) @dependabot-preview
- Upgrade black (#84) @lgeiger
- Bump pillow from 6.2.0 to 6.2.1 (#82) @dependabot-preview
- Bump pillow from 6.1.0 to 6.2.0 (#78) @dependabot-preview
- Bump tensorflow==2.0.0 (#77) @lgeiger
v0.4.2
DoReFa-Net
Pretrained weights for DoReFa-Net with 1-bit weights and 2-bit activations
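A minimal sketch of loading these weights, assuming the model is exposed as `larq_zoo.DoReFaNet` with a keras.applications-style `weights` argument; neither the constructor name nor its signature is quoted from this release.

```python
# Minimal sketch: DoReFa-Net with 1-bit weights and 2-bit activations,
# loading the pretrained ImageNet weights shipped with this release.
import larq_zoo as lqz

model = lqz.DoReFaNet(weights="imagenet")
model.summary()
```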
v0.4.1
Bi-Real Net
Pretrained weights for the corrected Bi-Real Net architecture (the previous version accidentally contained an additional layer)
v0.4.0
🎉 Features
- Made compatible with Zookeeper >= 0.5 (#62)
- Add `ResNetE18` and `BinaryDenseNet{28,37,45}` (#60); see the sketch below
- Add `BinaryDenseNet37Dilated` (#66)
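A minimal sketch of instantiating the architectures added above; the constructor names mirror the model names in these notes (the ResNetE variant may be exposed as `BinaryResNetE18`), and the keras.applications-style arguments are assumptions rather than documented API.

```python
# Minimal sketch, assuming each new architecture is a top-level larq_zoo
# constructor. `weights=None` builds a freshly initialised network; pass
# "imagenet" to use the pretrained weights released below.
import larq_zoo as lqz

resnet_e = lqz.BinaryResNetE18(weights=None)
densenet28 = lqz.BinaryDenseNet28(weights=None)
dilated37 = lqz.BinaryDenseNet37Dilated(weights=None)  # dilated variant from #66
```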
📖 Documentation
- Use ImageNet dataset version 5.0.0 (#53)
- Remove irrelevant docstrings (#58)
- Cleanup docs build (#67)
👷 Internal Improvements
- Bump scipy from 1.3.0 to 1.3.1 (#56)
Binary DenseNet
Pretrained weights for BinaryDenseNet
ResNetE18
Pretrained weights for ResNetE18
v0.3.0
🎉 Features
- Use improved preprocessing for pretrained models (#52); see the sketch after this list:
- All models are retrained from scratch and now exceed the accuracies claimed by the respective papers: https://larq.dev/models/#available-models
- All models include plots of the training process in the docs: https://larq.dev/models/api/
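A minimal end-to-end sketch of the improved preprocessing path, assuming larq_zoo follows the keras.applications convention of exposing `preprocess_input` and `decode_predictions` helpers; the image file and the specific model are placeholders.

```python
# Minimal sketch: classify one image with a pretrained larq_zoo model using
# the library's own preprocessing. Helper names and the example file are
# assumptions, not quotations from this release.
import numpy as np
import larq_zoo as lqz
from tensorflow.keras.preprocessing import image

img = image.load_img("cat.jpg", target_size=(224, 224))  # placeholder image
x = lqz.preprocess_input(image.img_to_array(img))

model = lqz.BinaryResNetE18(weights="imagenet")
preds = model.predict(np.expand_dims(x, axis=0))
print(lqz.decode_predictions(preds, top=5))
```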
📖 Documentation
- Add missing `numpy` import (#51)
XNOR-Net
Pretrained weights for XNOR-Net