HA-Net: Hierarchical Attention Network Based on Multi-Task Learning for Ciliary Muscle Segmentation in AS-OCT

Bing Yang, Xiaoqing Zhang, Sanqian Li, Risa Higashita, Jiang Liu

Research output: Journal Publication › Article › peer-review

3 Citations (Scopus)

Abstract

Ciliary muscle segmentation in Anterior Segment Optical Coherence Tomography (AS-OCT) images is of critical significance, yet challenging due to ambiguous boundaries. In this letter, we propose a hierarchical attention multi-task network, HA-Net, based on U-Net for ciliary muscle segmentation in AS-OCT images. The network comprises a primary task for ciliary muscle segmentation and two auxiliary tasks for signed distance map regression and key point localization. The signed distance map is employed to incorporate shape priors into the model and delineate the ciliary muscle boundary, while key point localization guides the model to focus on ambiguous regions. Notably, in contrast to widely used multi-task models that generate results in parallel, we introduce a hierarchical attention module that exploits the affiliation prior among the three tasks to generate outputs serially. Experimental results on the CM544 dataset demonstrate that HA-Net outperforms state-of-the-art methods in ciliary muscle segmentation, with a Dice score of 0.9178 and an HD95 of 7.11 pixels. Additionally, as a by-product of the multi-task model, key point localization facilitates the measurement of ciliary muscle thickness in clinical analysis.
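The abstract describes the architecture only at a high level, so the following is a minimal PyTorch sketch of the serial multi-task idea: a shared U-Net-style backbone feeding key-point, signed-distance-map, and segmentation heads one after another, with earlier outputs re-weighting the shared features. The class name `HANetSketch`, the single-level backbone, the channel widths, and the simple multiplicative gating are illustrative assumptions, not the authors' HA-Net or its hierarchical attention module as defined in the paper.

```python
# Minimal sketch (not the authors' implementation): a tiny U-Net-like backbone
# with three heads arranged serially, mirroring the hierarchical multi-task idea.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class HANetSketch(nn.Module):
    def __init__(self, in_ch=1, feat=32):
        super().__init__()
        # Shared encoder-decoder backbone (heavily simplified: one down/up level).
        self.enc1 = conv_block(in_ch, feat)
        self.down = nn.MaxPool2d(2)
        self.enc2 = conv_block(feat, feat * 2)
        self.up = nn.ConvTranspose2d(feat * 2, feat, 2, stride=2)
        self.dec1 = conv_block(feat * 2, feat)
        # Task heads: key-point heatmap -> signed distance map -> segmentation.
        self.kp_head = nn.Conv2d(feat, 1, 1)        # key point localization (heatmap)
        self.sdm_head = nn.Conv2d(feat + 1, 1, 1)   # signed distance map regression
        self.seg_head = nn.Conv2d(feat + 1, 1, 1)   # ciliary muscle segmentation (logits)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.down(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        # Serial (hierarchical) prediction: each auxiliary output re-weights the
        # shared features before the next head, instead of branching in parallel.
        kp = torch.sigmoid(self.kp_head(d1))                               # attention on ambiguous regions
        sdm = torch.tanh(self.sdm_head(torch.cat([d1 * (1 + kp), kp], 1))) # shape prior in [-1, 1]
        seg = self.seg_head(torch.cat([d1 * (1 + kp), sdm], 1))
        return seg, sdm, kp


if __name__ == "__main__":
    model = HANetSketch()
    seg, sdm, kp = model(torch.randn(2, 1, 128, 256))  # grayscale AS-OCT patch
    print(seg.shape, sdm.shape, kp.shape)
```

In such a setup the three outputs would typically be supervised jointly (e.g., Dice or cross-entropy loss for segmentation, an L1 or L2 loss for the signed distance map, and a heatmap regression loss for the key points), with the serial ordering encoding the affiliation prior the letter refers to.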

Original language: English
Pages (from-to): 1342-1346
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 30
DOIs
Publication status: Published - 2023
Externally published: Yes

Keywords

  • AS-OCT
  • Ciliary muscle segmentation
  • multi-task learning

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
  • Applied Mathematics
