Salient Object Detection with Capsule-Based Conditional Generative Adversarial Network

Chao Zhang, Fei Yang, Guoping Qiu, Qian Zhang

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

Salient Object Detection (SOD) is a significant research area closely tied to human visual attention. Most current CNN-based approaches to SOD build on a U-Net architecture. In this paper, we propose a novel capsule-based salient object detection framework that integrates novel capsule blocks into both the generator and the discriminator of a GAN architecture. Experimental results show that our approach generates accurate saliency maps and highlight the effectiveness of the capsule blocks. We also provide a challenging SOD dataset of 3,299 images with difficult foreground objects and complex background contents.
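The paper's code is not reproduced on this page; as a rough illustration of the architecture the abstract describes (capsule blocks placed inside both the generator and the conditional discriminator of a GAN), the following is a minimal PyTorch sketch. All names and sizes here (CapsuleBlock, Generator, Discriminator, the capsule counts, and layer widths) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a capsule-based conditional GAN for saliency maps.
# Everything here (class names, capsule counts, layer sizes) is an
# assumption for illustration, not the paper's released code.
import torch
import torch.nn as nn

def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: preserves vector orientation,
    maps vector length into [0, 1)."""
    sq_norm = (s * s).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)

class CapsuleBlock(nn.Module):
    """Primary-capsule-style block: a conv layer whose output channels
    are grouped into capsule vectors and squashed."""
    def __init__(self, in_ch, n_caps=8, caps_dim=16):
        super().__init__()
        self.n_caps, self.caps_dim = n_caps, caps_dim
        self.conv = nn.Conv2d(in_ch, n_caps * caps_dim, 3, padding=1)

    def forward(self, x):
        b, _, h, w = x.shape
        u = self.conv(x).view(b, self.n_caps, self.caps_dim, h, w)
        return squash(u, dim=2).reshape(b, -1, h, w)

class Generator(nn.Module):
    """U-Net-like encoder/decoder with a capsule block in the middle;
    outputs a 1-channel saliency map."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU())
        self.caps = CapsuleBlock(64, n_caps=8, caps_dim=16)  # 8*16 = 128 channels out
        self.dec = nn.Sequential(nn.ConvTranspose2d(128, 1, 4, 2, 1), nn.Sigmoid())

    def forward(self, img):
        return self.dec(self.caps(self.enc(img)))

class Discriminator(nn.Module):
    """Conditional discriminator: scores (image, saliency map) pairs,
    also routed through a capsule block."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(4, 64, 4, 2, 1), nn.LeakyReLU(0.2))
        self.caps = CapsuleBlock(64, n_caps=8, caps_dim=16)
        self.head = nn.Conv2d(128, 1, 3, padding=1)  # patch-level real/fake logits

    def forward(self, img, sal):
        return self.head(self.caps(self.enc(torch.cat([img, sal], dim=1))))

# Usage sketch:
# G, D = Generator(), Discriminator()
# img = torch.randn(2, 3, 64, 64)
# fake = G(img)           # (2, 1, 64, 64) predicted saliency map
# logits = D(img, fake)   # patch-level conditional real/fake scores
```

The design choice sketched here follows the abstract: the discriminator is conditioned on the input image (image and saliency map are concatenated channel-wise), and the squash non-linearity is the standard capsule-network operation; how the paper actually wires capsules into each sub-network may differ.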

Original language: English
Title of host publication: 2019 IEEE International Conference on Image Processing, ICIP 2019 - Proceedings
Publisher: IEEE Computer Society
Pages: 81-85
Number of pages: 5
ISBN (Electronic): 9781538662496
DOIs
Publication status: Published - Sept 2019
Event: 26th IEEE International Conference on Image Processing, ICIP 2019 - Taipei, Taiwan, Province of China
Duration: 22 Sept 2019 - 25 Sept 2019

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2019-September
ISSN (Print): 1522-4880

Conference

Conference: 26th IEEE International Conference on Image Processing, ICIP 2019
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 22/09/19 - 25/09/19

Keywords

  • Capsule Net
  • Generative Adversarial Network
  • Image-level Saliency
  • Salient Object Detection
  • cGAN

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing
