Research Article Details

Article ID: A29277
PMID: 34310290
Source: IEEE Trans Pattern Anal Mach Intell
Title: AlphaGAN: Fully Differentiable Architecture Search for Generative Adversarial Networks.
Abstract: Generative Adversarial Networks (GANs) are formulated as minimax game problems, where generators attempt to approach real data distributions through adversarial learning against discriminators, which learn to distinguish generated samples from real ones. In this work, we aim to boost model learning from the perspective of network architectures, by incorporating recent progress on automated architecture search into GANs. Specifically, we propose a fully differentiable search framework, dubbed alphaGAN, in which the search process is formalized as a bi-level minimax optimization problem. The outer-level objective seeks an optimal architecture toward a pure Nash equilibrium, conditioned on the network parameters optimized with a traditional adversarial loss at the inner level. Extensive experiments on the CIFAR-10 and STL-10 datasets show that our algorithm can obtain high-performing architectures with only 3 GPU-hours on a single GPU, in a search space comprising approximately 2×10^11 possible configurations. We further validate the method on the state-of-the-art StyleGAN2 and improve the Fréchet Inception Distance (FID) score, achieving 1.94 on CelebA, 2.86 on LSUN-church, and 2.75 on FFHQ, with relative improvements of 3% to 26% over the baseline architecture. We also provide a comprehensive analysis of the behavior of the search process and the properties of the searched architectures.
DOI: 10.1109/TPAMI.2021.3099829
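The abstract describes a bi-level scheme: an inner level that trains generator and discriminator weights adversarially under a fixed architecture, and an outer level that updates architecture parameters toward a pure Nash equilibrium. The following toy scalar sketch illustrates that alternating structure only; the loss functions, update rules, and variable names here are illustrative assumptions, not the authors' actual alphaGAN objectives or code.

```python
# Toy sketch of the alternating bi-level minimax scheme from the abstract.
# Inner level: gradient descent-ascent on "network weights" (w_g, w_d)
# under a fixed architecture parameter alpha.
# Outer level: move alpha toward the point where the inner game settles
# (a crude stand-in for the paper's Nash-equilibrium objective).

def inner_grads(w_g, w_d, alpha):
    # A simple strongly convex-concave saddle loss:
    #   L = 0.5*(w_g - alpha)**2 + (w_g - alpha)*w_d - 0.5*w_d**2
    # The generator minimizes L over w_g; the discriminator maximizes over w_d.
    g_wg = (w_g - alpha) + w_d   # dL/dw_g
    g_wd = (w_g - alpha) - w_d   # dL/dw_d
    return g_wg, g_wd

def search(alpha=0.0, lr_w=0.1, lr_a=0.05, outer_steps=100, inner_steps=20):
    w_g, w_d = 1.0, 1.0
    for _ in range(outer_steps):
        # Inner level: adversarial training of the weights, alpha frozen.
        for _ in range(inner_steps):
            g_wg, g_wd = inner_grads(w_g, w_d, alpha)
            w_g, w_d = w_g - lr_w * g_wg, w_d + lr_w * g_wd
        # Outer level: update the architecture parameter, weights frozen,
        # shrinking the gap to the inner game's equilibrium (w_g = alpha, w_d = 0).
        alpha += lr_a * (w_g - alpha)
    return alpha, w_g, w_d

if __name__ == "__main__":
    alpha, w_g, w_d = search()
    print(alpha, w_g, w_d)
```

In the toy game the inner equilibrium sits at `w_g = alpha, w_d = 0`, so after the alternating loop both the weight gap and the discriminator response shrink toward zero; in the paper, the analogous outer objective instead scores a full GAN architecture and the search space covers roughly 2×10^11 configurations.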