2019 Keynote – William T. Freeman

William T. Freeman

Thomas and Gerd Perkins Professor of Electrical Engineering
and Computer Science, Massachusetts Institute of Technology, Google Research

William T. Freeman is the Thomas and Gerd Perkins Professor of Electrical Engineering and Computer Science at MIT, and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL) there. He was the Associate Department Head from 2011 to 2014.

Dr. Freeman’s current research interests include machine learning applied to computer vision, Bayesian models of visual perception, and computational photography. He received outstanding paper awards at computer vision and machine learning conferences in 1997, 2006, 2009, and 2012, and test-of-time awards for papers from 1990, 1995, and 2005. Previous research topics include steerable filters and pyramids, orientation histograms, the generic viewpoint assumption, color constancy, computer vision for computer games, and belief propagation in networks with loops.

He is active on the program and organizing committees of computer vision, graphics, and machine learning conferences, and was program co-chair for ICCV 2005 and CVPR 2013.

To learn more about Professor Freeman and his research, please visit his website.

Visualizations of imperceptible visual signals

Saturday, May 18, 2019, 7:15 pm, Talk Room 1-2

Many useful visual stimuli are below the threshold of perception. By amplifying tiny motions and small photometric changes, we can reveal a rich world of sub-threshold imagery.

Using an image representation modeled after features of V1, we have developed a “motion microscope” that re-renders a video with its small motions amplified. I’ll show motion-magnified videos of singers, dancers, bridges, robots, and pipes, revealing properties that are otherwise hidden. Small photometric changes can also be measured and amplified, revealing the human pulse on skin or people moving in an adjacent room.
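The amplification idea can be sketched in its simplest, linear (Eulerian) form: isolate the small temporal changes in pixel intensity, scale them, and add them back. This toy version uses a plain temporal high-pass rather than the V1-inspired representation the talk describes; the function name, `alpha` parameter, and synthetic "pulse" video are all illustrative assumptions.

```python
import numpy as np

def magnify_changes(frames, alpha=20.0):
    """Amplify small temporal intensity changes in a video.

    frames: array of shape (T, H, W), floats in [0, 1].
    Subtracting the temporal mean isolates the tiny per-pixel
    changes, which are scaled by alpha and added back.
    (Toy sketch; not the talk's V1-based motion microscope.)
    """
    frames = np.asarray(frames, dtype=float)
    mean = frames.mean(axis=0, keepdims=True)   # static background
    deviations = frames - mean                  # small temporal changes
    return np.clip(frames + alpha * deviations, 0.0, 1.0)

# Synthetic example: a 4x4 video whose brightness pulses by +/-0.005,
# far too small to see directly.
t = np.arange(64)
pulse = 0.005 * np.sin(2 * np.pi * t / 16)[:, None, None]
video = 0.5 + pulse * np.ones((1, 4, 4))

amplified = magnify_changes(video, alpha=20.0)
# The 0.005 pulse becomes a +/-0.105 swing (peak-to-peak 0.21).
print(np.ptp(amplified[:, 0, 0]))
```

Real motion (rather than brightness) magnification works the same way in spirit, but applies the amplification to the phases of a multi-scale, multi-orientation filter decomposition instead of raw pixel values.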

Unseen intensity changes also occur when an occluder modulates light from a scene, creating an “accidental camera”.  I’ll describe the invisible signals caused by corners and plants, and show how they can reveal imagery that is otherwise out of view.
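The corner idea can be illustrated with a toy 1-D model: if an occluding edge means each point on the floor sees the hidden scene only up to some angle, then the observed penumbra is a cumulative sum of the scene's angular brightness, and differentiating the observation recovers it. This is a hedged sketch of the principle only, not the actual reconstruction pipeline; the `scene` values are made up.

```python
import numpy as np

# Hidden 1-D "scene": brightness as a function of angle around the corner.
scene = np.array([0.2, 0.2, 0.9, 0.9, 0.2, 0.2, 0.2, 0.5])

# A floor point at angle theta sees light integrated over angles [0, theta];
# the corner occludes everything beyond theta. So the observation is a
# cumulative sum of the scene.
observed = np.cumsum(scene)

# Differentiating the observed penumbra recovers the hidden scene.
recovered = np.diff(observed, prepend=0.0)
print(np.allclose(recovered, scene))  # True
```

In practice the penumbra variations are tiny and noisy, so recovery involves careful calibration and regularization rather than a bare derivative, but the cumulative-sum structure is the core of the "accidental camera."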

I’ll close by describing my white whale, the Earth selfie.  This is an effort to photograph the Earth from space with ground-based equipment by using the Moon as a camera.  I’ll explain why this project matters, and will summarize recent progress.