TY - GEN
T1 - Two generator game: Learning to sample via linear goodness-of-fit test
AU - Ding, Lizhong
AU - Yu, Mengyang
AU - Liu, Li
AU - Zhu, Fan
AU - Liu, Yong
AU - Li, Yu
AU - Shao, Ling
N1 - KAUST Repository Item: Exported on 2020-10-09
Acknowledgements: This work was supported in part by National Natural Science Foundation of China (No. 61703396), the CCF-Tencent Open Fund and Shenzhen Government (GJHZ20180419190732022).
PY - 2019/1/1
Y1 - 2019/1/1
N2 - Learning the probability distribution of high-dimensional data is a challenging problem. To solve this problem, we formulate a deep energy adversarial network (DEAN), which casts the energy model learned from real data into an optimization of a goodness-of-fit (GOF) test statistic. DEAN can be interpreted as a GOF game between two generative networks, where one explicit generative network learns an energy-based distribution that fits the real data, and the other implicit generative network is trained by minimizing a GOF test statistic between the energy-based distribution and the generated data, such that the underlying distribution of the generated data is close to the energy-based distribution. We design a two-level alternating optimization procedure to train the explicit and implicit generative networks, such that the hyper-parameters can also be automatically learned. Experimental results show that DEAN achieves high-quality generations compared to the state-of-the-art approaches.
AB - Learning the probability distribution of high-dimensional data is a challenging problem. To solve this problem, we formulate a deep energy adversarial network (DEAN), which casts the energy model learned from real data into an optimization of a goodness-of-fit (GOF) test statistic. DEAN can be interpreted as a GOF game between two generative networks, where one explicit generative network learns an energy-based distribution that fits the real data, and the other implicit generative network is trained by minimizing a GOF test statistic between the energy-based distribution and the generated data, such that the underlying distribution of the generated data is close to the energy-based distribution. We design a two-level alternating optimization procedure to train the explicit and implicit generative networks, such that the hyper-parameters can also be automatically learned. Experimental results show that DEAN achieves high-quality generations compared to the state-of-the-art approaches.
UR - http://hdl.handle.net/10754/665496
UR - http://papers.nips.cc/paper/9304-two-generator-game-learning-to-sample-via-linear-goodness-of-fit-test.pdf
UR - http://www.scopus.com/inward/record.url?scp=85090172488&partnerID=8YFLogxK
M3 - Conference contribution
BT - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
PB - Neural Information Processing Systems Foundation
ER -