keras-unet-collection

The tensorflow.keras implementation of U-net, V-net, U-net++, UNET 3+, Attention U-net, R2U-net, ResUnet-a, U^2-Net, TransUNET, and Swin-UNET with optional ImageNet-trained backbones.


keras_unet_collection.models contains functions that configure keras models with hyper-parameter options.

  • Pre-trained ImageNet backbones are supported for U-net, U-net++, UNET 3+, Attention U-net, and TransUNET.
  • Deep supervision is supported for U-net++, UNET 3+, and U^2-Net.
  • See the User guide for other options and use cases; a brief usage sketch follows the table and note below.

| keras_unet_collection.models | Name | Reference |
|------------------------------|------|-----------|
| unet_2d | U-net | Ronneberger et al. (2015) |
| vnet_2d | V-net (modified for 2-d inputs) | Milletari et al. (2016) |
| unet_plus_2d | U-net++ | Zhou et al. (2018) |
| r2_unet_2d | R2U-Net | Alom et al. (2018) |
| att_unet_2d | Attention U-net | Oktay et al. (2018) |
| resunet_a_2d | ResUnet-a | Diakogiannis et al. (2020) |
| u2net_2d | U^2-Net | Qin et al. (2020) |
| unet_3plus_2d | UNET 3+ | Huang et al. (2020) |
| transunet_2d | TransUNET | Chen et al. (2021) |
| swin_unet_2d | Swin-UNET | Cao et al. (2021) |

Note: the two Transformer-based models (TransUNET and Swin-UNET) are incompatible with NumPy 1.20; NumPy 1.19.5 is recommended.
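
For illustration, the sketch below configures two of the listed models. The hyper-parameter names used here (filter_num, n_labels, output_activation, batch_norm, backbone, weights, freeze_backbone) are assumptions based on typical usage of this package, not a verified API listing; consult the User guide for the authoritative signatures.

```python
from keras_unet_collection import models

# Plain 2-d U-net: four down-/up-sampling levels, two-class softmax head.
# Argument names are assumptions (see the lead-in above), not a verified API.
unet = models.unet_2d(
    (128, 128, 3),                   # input size: (height, width, channels)
    filter_num=[64, 128, 256, 512],  # number of convolution kernels per level
    n_labels=2,                      # number of output classes
    output_activation='Softmax',
    batch_norm=True,
    name='unet')

# Attention U-net with an ImageNet-trained VGG16 encoder; backbones are
# supported for U-net, U-net++, UNET 3+, Attention U-net, and TransUNET.
att_unet = models.att_unet_2d(
    (128, 128, 3), filter_num=[64, 128, 256, 512], n_labels=2,
    output_activation='Sigmoid',
    backbone='VGG16', weights='imagenet', freeze_backbone=True,
    name='attunet')
```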


keras_unet_collection.base contains functions that build the base architecture (i.e., without model heads) of U-net variants for model customization and debugging.

| keras_unet_collection.base | Notes |
|----------------------------|-------|
| unet_2d_base, vnet_2d_base, unet_plus_2d_base, unet_3plus_2d_base, att_unet_2d_base, r2_unet_2d_base, resunet_a_2d_base, u2net_2d_base, transunet_2d_base, swin_unet_2d_base | Functions that accept an input tensor and the hyper-parameters of the corresponding model, and produce the output tensors of the base architecture. |
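
As a sketch of custom-head usage, the snippet below assumes unet_2d_base mirrors the hyper-parameter names of models.unet_2d and returns the last decoder tensor; treat these assumptions as illustrative only.

```python
from tensorflow import keras
from keras_unet_collection import base

# Hypothetical sketch; hyper-parameter names are assumed, not verified.
X_in = keras.layers.Input((128, 128, 3))
X = base.unet_2d_base(X_in, filter_num=[64, 128, 256, 512], batch_norm=True)

# Attach a custom single-channel sigmoid head in place of the packaged model head.
X_out = keras.layers.Conv2D(1, kernel_size=1, activation='sigmoid')(X)
model = keras.models.Model(inputs=X_in, outputs=X_out, name='unet_custom_head')
```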

keras_unet_collection.activations and keras_unet_collection.losses provide additional activation layers and loss functions.

| keras_unet_collection.activations | Name | Reference |
|-----------------------------------|------|-----------|
| GELU | Gaussian Error Linear Units (GELU) | Hendrycks et al. (2016) |
| Snake | Snake activation | Liu et al. (2020) |

| keras_unet_collection.losses | Name | Reference |
|------------------------------|------|-----------|
| dice | Dice loss | Sudre et al. (2017) |
| tversky | Tversky loss | Hashemi et al. (2018) |
| focal_tversky | Focal Tversky loss | Abraham et al. (2019) |
| ms_ssim | Multi-scale Structural Similarity Index (MS-SSIM) loss | Wang et al. (2003) |
| iou_seg | Intersection over Union (IoU) loss for segmentation | Rahman and Wang (2016) |
| iou_box | (Generalized) IoU loss for object detection | Rezatofighi et al. (2019) |
| triplet_1d | Semi-hard triplet loss (experimental) | |
| crps2d_tf | CRPS loss (experimental) | |
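
A hedged sketch of using the listed losses follows; it assumes they take the usual Keras loss signature loss(y_true, y_pred) and can therefore be combined into a custom callable or passed directly to model.compile.

```python
import numpy as np
import tensorflow as tf
from keras_unet_collection import losses

# Assumed Keras-style signature: loss(y_true, y_pred) -> scalar tensor.
def hybrid_loss(y_true, y_pred):
    # Illustrative weighting of Dice and focal Tversky losses.
    return losses.dice(y_true, y_pred) + 0.5 * losses.focal_tversky(y_true, y_pred)

# Quick check on dummy segmentation masks and predictions.
y_true = tf.constant(np.random.randint(0, 2, size=(2, 128, 128, 1)), dtype=tf.float32)
y_pred = tf.random.uniform((2, 128, 128, 1))
print(float(hybrid_loss(y_true, y_pred)))

# The same callable can be used as model.compile(loss=hybrid_loss, optimizer='adam').
```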

Installation and usage

pip install keras-unet-collection

from keras_unet_collection import models
# e.g. models.unet_2d(...)
  • Note: Currently supported backbone models are: VGG[16,19], ResNet[50,101,152], ResNet[50,101,152]V2, DenseNet[121,169,201], and EfficientNetB[0-7]. See Keras Applications for details.

  • Note: Neural networks produced by this package may contain customized layers that are not part of TensorFlow. It is recommended to save and load model weights (rather than entire serialized models); see the sketch after this list.

  • Changelog
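
A short sketch of the weights-only save/load pattern that the note above refers to; Model.save_weights and Model.load_weights are standard tf.keras calls, while the model-building arguments are the assumed ones from the earlier sketch.

```python
from keras_unet_collection import models

# Build and (after training) save weights only; models with customized layers
# may not round-trip cleanly as fully serialized model files.
unet = models.unet_2d((128, 128, 3), filter_num=[64, 128, 256, 512], n_labels=2)
unet.save_weights('unet_weights.h5')

# Later: rebuild the architecture with identical hyper-parameters, then load.
unet_restored = models.unet_2d((128, 128, 3), filter_num=[64, 128, 256, 512], n_labels=2)
unet_restored.load_weights('unet_weights.h5')
```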

Examples

  • Jupyter notebooks are provided as examples:

    • Attention U-net with VGG16 backbone [link].

    • UNET 3+ with deep supervision, classification-guided module, and hybrid loss [link].

    • Vision-Transformer-based examples are in progress and are available at keras-vision-transformer.

Dependencies

  • TensorFlow 2.5.0, Keras 2.5.0, NumPy 1.19.5.

  • (Optional for examples) Pillow, matplotlib, etc.

Overview

U-net is a convolutional neural network with an encoder-decoder architecture and skip connections, loosely defined under the concept of "fully convolutional networks." U-net was originally proposed for the semantic segmentation of medical images and has since been modified to solve a wider range of gridded learning problems.

U-net and many of its variants take three- or four-dimensional tensors as inputs and produce outputs of the same shape. One technical highlight of these models is the skip connections from downsampling to upsampling layers, which benefit the reconstruction of high-resolution, gridded outputs.
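
To make the skip-connection idea concrete, here is a minimal, hypothetical one-level encoder-decoder in plain tf.keras (not this package's implementation): the encoder feature map is concatenated with the upsampled decoder tensor, so fine spatial detail reaches the high-resolution output.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Minimal one-level encoder-decoder with a single skip connection (illustration only).
x_in = layers.Input((128, 128, 3))

enc = layers.Conv2D(32, 3, padding='same', activation='relu')(x_in)      # encoder features
down = layers.MaxPooling2D(2)(enc)                                       # downsampling
bottleneck = layers.Conv2D(64, 3, padding='same', activation='relu')(down)

up = layers.UpSampling2D(2)(bottleneck)                                   # upsampling
up = layers.Concatenate()([up, enc])                                      # skip connection
x_out = layers.Conv2D(1, 1, activation='sigmoid')(up)                     # same spatial shape as input

model = tf.keras.Model(x_in, x_out)
model.summary()
```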

Contact

Yingkai (Kyle) Sha <yingkai@eoas.ubc.ca> <yingkaisha@gmail.com>

License

MIT License

Citation

@misc{keras-unet-collection,
  author = {Sha, Yingkai},
  title = {Keras-unet-collection},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/yingkaisha/keras-unet-collection}},
  doi = {10.5281/zenodo.5449801}
}
