Open Access Article

Deep Learning-Based Segmentation of the Brachial Plexus in Ultrasound Images: A Cross-Device Generalization Assessment

  • Dingcheng Tian,   
  • Xinlong Zhao *

Received: 04 Aug 2025 | Revised: 11 Nov 2025 | Accepted: 25 Nov 2025 | Published: 02 Dec 2025

Abstract

Ultrasound-guided brachial plexus regional anesthesia is a widely used clinical technique for analgesia in upper limb surgery, offering excellent efficacy and a high degree of safety. However, because ultrasound images contain substantial noise and the surrounding anatomy is complex, accurate identification of the nerves depends heavily on the operator's experience, and precise nerve identification is crucial for patient recovery. Deep learning-based image segmentation can automatically locate the brachial plexus in ultrasound images, thereby assisting clinicians in performing brachial plexus nerve blocks. In this study, we systematically compared three neural network architectures for brachial plexus segmentation in ultrasound images: CNN-based, Transformer-based, and Mamba-based models. The experimental data comprise ultrasound images acquired by three different devices (eSaote, Sonosite, and Butterfly). All models were trained on data from the eSaote device and tested on images from all three devices. The results indicate that on the eSaote test set, UNet and ConvUNeXt achieved the highest mean Intersection over Union (mIoU), with scores of 0.9043 and 0.9027, respectively, whereas in cross-device testing, TransUNet and VMUNet exhibited better generalization; on the low-quality Butterfly test set, TransUNet maintained strong segmentation performance. In addition, the models showed limitations in data dependency and cross-domain adaptability, and possible directions for improvement are suggested. This study can serve as a reference for selecting and optimizing ultrasound nerve segmentation models.
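The mean Intersection over Union (mIoU) used to rank the models above can be sketched as follows. This is a minimal illustration, not the authors' evaluation code: masks are represented as flat lists of per-pixel class labels (0 = background, 1 = brachial plexus, both labels illustrative), and the IoU is averaged over the classes present in either mask.

```python
# Minimal sketch of mean Intersection over Union (mIoU) for segmentation
# masks, assuming flat lists of per-pixel class labels.

def miou(pred, target, num_classes=2):
    """Average IoU over classes, skipping classes absent from both masks."""
    ious = []
    for c in range(num_classes):
        # Pixels predicted as class c AND labeled as class c.
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        # Pixels predicted as class c OR labeled as class c.
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union > 0:
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Example: a 2x4 mask flattened to 8 pixels (1 = nerve region).
pred   = [0, 0, 1, 1, 0, 1, 1, 0]
target = [0, 0, 1, 1, 1, 1, 0, 0]
print(round(miou(pred, target), 4))  # 0.6
```

In practice the per-image masks come from the network's argmax output, and the mIoU reported in the results is averaged over the test set of each device.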


How to Cite
Tian, D.; Zhao, X. Deep Learning-Based Segmentation of the Brachial Plexus in Ultrasound Images: A Cross-Device Generalization Assessment. AI Medicine 2025, 2 (2), 7. https://doi.org/10.53941/aim.2025.100007.
Copyright & License
Copyright (c) 2025 by the authors.