Open Access
Article
UNet and Variants for Medical Image Segmentation
Walid Ehab
Lina Huang
Yongmin Li*
Submitted: 22 Sept 2023 | Accepted: 25 Dec 2023 | Published: 26 Jun 2024

Abstract

Medical imaging plays a crucial role in modern healthcare by providing non-invasive visualisation of internal structures and abnormalities, enabling early disease detection, accurate diagnosis, and treatment planning. This study aims to explore the application of deep learning models, particularly focusing on the UNet architecture and its variants, in medical image segmentation. We seek to evaluate the performance of these models across various challenging medical image segmentation tasks, addressing issues such as image normalization, resizing, architecture choices, loss function design, and hyperparameter tuning. The findings reveal that the standard UNet, when extended with a deep network layer, is a proficient medical image segmentation model, while the Res-UNet and Attention Res-UNet architectures demonstrate smoother convergence and superior performance, particularly when handling fine image details. The study also addresses the challenge of high class imbalance through careful preprocessing and loss function definitions. We anticipate that the results of this study will provide useful insights for researchers seeking to apply these models to new medical imaging problems and offer guidance and best practices for their implementation.
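The abstract notes that high class imbalance is addressed partly through the loss function definition. As an illustration only (the paper's exact formulation is not given here), a common choice for imbalanced segmentation is the soft Dice loss, which scores overlap rather than per-pixel accuracy and so is not dominated by the background class. A minimal NumPy sketch, assuming a binary foreground/background task:

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2|P * T| / (|P| + |T|).

    pred:   predicted foreground probabilities in [0, 1].
    target: binary ground-truth mask of the same shape.
    eps:    smoothing term to avoid division by zero on empty masks.
    """
    pred = pred.ravel()
    target = target.ravel()
    intersection = np.sum(pred * target)
    return 1.0 - (2.0 * intersection + eps) / (np.sum(pred) + np.sum(target) + eps)

# A sparse mask: only a 2x2 foreground patch in an 8x8 image.
mask = np.zeros((8, 8))
mask[2:4, 2:4] = 1.0

print(round(dice_loss(mask, mask), 4))               # perfect prediction -> 0.0
print(round(dice_loss(np.zeros((8, 8)), mask), 4))   # all-background prediction -> 1.0
```

Because the score is normalised by the size of the foreground region, predicting all background on a sparse mask incurs (near-)maximal loss, whereas plain pixel-wise cross-entropy would reward it; this is why Dice-style losses are a frequent default under heavy class imbalance.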

How to Cite
Ehab, W., Huang, L., & Li, Y. (2024). UNet and Variants for Medical Image Segmentation. International Journal of Network Dynamics and Intelligence, 3(2), 100009. https://doi.org/10.53941/ijndi.2024.100009
Copyright & License
Copyright (c) 2024 by the authors.

This work is licensed under a Creative Commons Attribution 4.0 International License.
