Robotics Seminar

  • Remote Access - Zoom
  • Virtual Presentation - ET

Jun-Yan Zhu
Assistant Professor, Robotics Institute, Carnegie Mellon University

Understanding and Rewriting GANs

What is the role of human agency and creativity in deep generative models such as Generative Adversarial Networks (GANs)? In this talk, I ask two questions: first, can the internal representation of a model be visualized and understood by humans? Second, and more interestingly, can deep generative networks be built in a completely different way: can the rules of a network be rewritten directly by humans? Direct model rewriting is challenging because it requires a user to understand the structure, behavior, and purpose of the millions of opaque parameters in a GAN. How does a GAN represent our visual world internally? How does each layer affect the model's behavior? What causes the artifacts in GAN outputs? I will first present an analytic framework for visualizing and understanding GANs at the neuron, object, and scene levels. I will then describe our ongoing research on rewriting the rules of a deep generative model with minimal user input, such as human sketches or a simple object copy-and-paste interface. Together, these methods can help everyone easily customize and create new generative models.

Jun-Yan Zhu is an Assistant Professor at the Robotics Institute in the School of Computer Science at Carnegie Mellon University. He also holds affiliated faculty appointments in the Computer Science Department and the Machine Learning Department. Prior to joining CMU, he was a Research Scientist at Adobe Research and a postdoctoral researcher at MIT CSAIL. He obtained his Ph.D. from UC Berkeley and his B.E. from Tsinghua University. He studies computer vision, computer graphics, computational photography, and machine learning. He is the recipient of the Facebook Fellowship, the ACM SIGGRAPH Outstanding Doctoral Dissertation Award, and the UC Berkeley EECS David J. Sakrison Memorial Prize for outstanding doctoral research. His co-authored work has received the NVIDIA Pioneer Research Award, the SIGGRAPH 2019 Real-Time Live! Best of Show Award and Audience Choice Award, and a place on Popular Science's list of The 100 Greatest Innovations of 2019.

Faculty Host: Deepak Pathak

Zoom Participation. See announcement.
