TY - GEN
T1 - Outlier-Suppressed Triplet Loss with Adaptive Class-Aware Margins for Facial Expression Recognition
AU - Tian, Yi
AU - Wen, Zhiwei
AU - Xie, Weicheng
AU - Zhang, Xi
AU - Shen, Linlin
AU - Duan, Jinming
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/9
Y1 - 2019/9
N2 - Triplet loss has been proposed to increase the inter-class distance and decrease the intra-class distance for various image recognition tasks. However, for the facial expression recognition (FER) problem, a fixed margin parameter does not fit the diversity of scales between different expressions. Meanwhile, the strategy of selecting the hardest triplets can introduce noisy guidance information, since different persons may present significantly different expressions. In this work, we propose a new triplet loss based on class-aware margins and outlier-suppressed triplets for FER, where each pair of expressions, e.g. 'happy' and 'fear', is assigned an adaptive margin parameter and abnormally hard triplets are discarded according to the feature distance distribution. Experimental results on the FER2013 and CK+ expression databases show that the proposed network achieves much better accuracy than both the original triplet loss and the network without the proposed strategies, and competitive performance compared with state-of-the-art algorithms.
AB - Triplet loss has been proposed to increase the inter-class distance and decrease the intra-class distance for various image recognition tasks. However, for the facial expression recognition (FER) problem, a fixed margin parameter does not fit the diversity of scales between different expressions. Meanwhile, the strategy of selecting the hardest triplets can introduce noisy guidance information, since different persons may present significantly different expressions. In this work, we propose a new triplet loss based on class-aware margins and outlier-suppressed triplets for FER, where each pair of expressions, e.g. 'happy' and 'fear', is assigned an adaptive margin parameter and abnormally hard triplets are discarded according to the feature distance distribution. Experimental results on the FER2013 and CK+ expression databases show that the proposed network achieves much better accuracy than both the original triplet loss and the network without the proposed strategies, and competitive performance compared with state-of-the-art algorithms.
KW - class-aware margin
KW - facial expression recognition
KW - outlier suppression
KW - triplet loss
UR - http://www.scopus.com/inward/record.url?scp=85076819298&partnerID=8YFLogxK
U2 - 10.1109/ICIP.2019.8802918
DO - 10.1109/ICIP.2019.8802918
M3 - Conference contribution
AN - SCOPUS:85076819298
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 46
EP - 50
BT - 2019 IEEE International Conference on Image Processing, ICIP 2019 - Proceedings
PB - IEEE Computer Society
T2 - 26th IEEE International Conference on Image Processing, ICIP 2019
Y2 - 22 September 2019 through 25 September 2019
ER -
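
The abstract above describes the method only at a high level, so the following is a minimal, hedged sketch of how class-aware margins and outlier suppression could be combined with batch-hard triplet mining in PyTorch. The margin lookup table margins[anchor_class, negative_class], the mean + k*std cutoff on anchor-positive distances, and all function and parameter names are assumptions made for illustration; they are not taken from the authors' implementation.

import torch

def class_aware_triplet_loss(embeddings, labels, margins, k=2.0):
    # embeddings: (B, D) float features; labels: (B,) long expression ids
    # margins:    (C, C) float tensor, margin for each (anchor class, negative class) pair (assumed form)
    # k:          assumed cutoff factor on the anchor-positive distance distribution
    dist = torch.cdist(embeddings, embeddings)              # (B, B) pairwise Euclidean distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)       # (B, B) same-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=embeddings.device)

    pos_mask = same & ~eye                                  # valid positives per anchor
    big = 1e6 * same.float()                                # push same-class pairs out of the negative search

    d_pos = (dist * pos_mask).max(dim=1).values             # hardest (farthest) positive per anchor
    d_neg = (dist + big).min(dim=1).values                  # hardest (closest) negative per anchor
    neg_cls = labels[(dist + big).argmin(dim=1)]            # class of that hardest negative

    # Outlier suppression (assumed rule): drop anchors whose hardest positive is
    # abnormally far, judged against the batch's positive-distance distribution.
    valid = pos_mask.any(dim=1)
    cutoff = d_pos[valid].mean() + k * d_pos[valid].std()
    keep = valid & (d_pos <= cutoff)

    # Class-aware margin for each surviving triplet, looked up per expression pair.
    m = margins[labels, neg_cls]
    loss = torch.relu(d_pos - d_neg + m)[keep]
    return loss.mean() if keep.any() else embeddings.sum() * 0.0

Under these assumptions, margins would be a (C, C) tensor holding one adaptive margin per ordered expression pair (e.g. a larger value for easily confused pairs such as 'happy' vs. 'fear' in the abstract's example), and the cutoff keeps batch-hard mining from being dominated by person-specific outlier triplets.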