Detecting Small Faces in the Wild Based on Generative Adversarial Network and Contextual Information

Yongqiang Zhang, Mingli Ding, Yancheng Bai, Bernard Ghanem

Research output: Contribution to journal › Article › peer-review


Abstract

Face detection techniques have been developed for decades, and one of the remaining open challenges is detecting small faces in unconstrained conditions. The reason is that tiny faces often lack detailed information and appear blurry. In this paper, we propose an algorithm that directly generates a clear high-resolution face from a small blurry one by adopting a generative adversarial network (GAN). Basic GAN formulations achieve this by super-resolving and refining sequentially (e.g., SR-GAN and Cycle-GAN), whereas we design a novel network that addresses super-resolution and refinement jointly. Moreover, we introduce new training losses (i.e., a classification loss and a regression loss) that encourage the generator network to recover fine details of small faces and guide the discriminator network to distinguish face vs. non-face while refining the location simultaneously. Additionally, considering the importance of contextual information when detecting tiny faces in crowded scenes, the context around face regions is incorporated when training the proposed GAN-based network to mine very small faces from unconstrained scenarios. Extensive experiments on the challenging WIDER FACE and FDDB datasets demonstrate the effectiveness of the proposed method in restoring a clear high-resolution face from a small blurry one, and show that it outperforms previous state-of-the-art methods by a large margin.
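
To make the joint training objective described in the abstract more concrete, below is a minimal PyTorch-style sketch (not the authors' released code) of how a generator loss combining a pixel-wise term, an adversarial term, and the added classification and regression terms could be assembled. The module name GeneratorLoss, the loss weights, and the discriminator output names are illustrative assumptions.

```python
# Minimal sketch, assuming a discriminator with an adversarial branch,
# a face/non-face classification branch, and a box-regression branch.
# All names, weights, and shapes here are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeneratorLoss(nn.Module):
    """Combines pixel-wise, adversarial, classification and regression terms."""
    def __init__(self, w_adv=0.001, w_cls=1.0, w_reg=1.0):
        super().__init__()
        self.w_adv, self.w_cls, self.w_reg = w_adv, w_cls, w_reg

    def forward(self, sr_faces, hr_faces, d_adv_logits, d_cls_logits,
                cls_targets, d_reg_preds, reg_targets):
        # Pixel-wise MSE: push the super-resolved face toward the real HR face.
        pixel_loss = F.mse_loss(sr_faces, hr_faces)
        # Adversarial term: the generator tries to make D label SR faces as "real".
        adv_loss = F.binary_cross_entropy_with_logits(
            d_adv_logits, torch.ones_like(d_adv_logits))
        # Classification term: face vs. non-face branch of the discriminator.
        cls_loss = F.binary_cross_entropy_with_logits(d_cls_logits, cls_targets)
        # Regression term: bounding-box refinement branch (smooth L1).
        reg_loss = F.smooth_l1_loss(d_reg_preds, reg_targets)
        return (pixel_loss + self.w_adv * adv_loss
                + self.w_cls * cls_loss + self.w_reg * reg_loss)
```

In this sketch the classification and regression terms are computed from the discriminator's outputs on the generated faces, which is one way the extra losses can both sharpen the generator's outputs and supervise the discriminator's face/non-face and localization branches, as the abstract describes.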
Original language: English (US)
Pages (from-to): 74-86
Number of pages: 13
Journal: Pattern Recognition
Volume: 94
DOIs
State: Published - May 15 2019
