Dissertation Talk: Overcoming the Curse of Dimensionality and Mode Collapse

Lecture: Dissertation Talk (CS) | May 15 | 11 a.m.-12 p.m. | Soda Hall, 306 (HP Auditorium)

Ke Li, Ph.D. Candidate, Electrical Engineering and Computer Sciences (EECS), UC Berkeley

In this talk, I will present our work on overcoming two long-standing problems in machine learning and algorithms:

1. Curse of Dimensionality in Nearest Neighbour Search

Efficient algorithms for exact nearest neighbour search developed over the past 40 years break down in high (intrinsic) dimensions due to the curse of dimensionality. It turns out that this problem is not insurmountable: I will explain how the curse of dimensionality arises and show a simple way to overcome it, which gives rise to a new family of algorithms known as Dynamic Continuous Indexing (DCI).
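
To give a flavour of the approach: rather than partitioning space into a number of cells that grows exponentially with dimension, DCI indexes points by their projections onto a handful of random directions. The following is a simplified sketch of that idea (shortlist candidates from 1-D projections, then verify with exact distances), not the actual algorithm from the talk; all function names and parameters here are illustrative.

```python
import numpy as np

def build_index(data, num_directions=10, seed=0):
    """Index each point by its projections onto random unit directions.

    Simplified illustration of the DCI idea: true nearest neighbours
    stay close in most 1-D projections, so no exponential grid of
    cells is needed. (A real DCI index stores each column of
    projections in a sorted structure for fast lookup.)
    """
    rng = np.random.default_rng(seed)
    directions = rng.standard_normal((num_directions, data.shape[1]))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    projections = data @ directions.T  # shape: (num_points, num_directions)
    return directions, projections

def query(data, index, q, num_candidates=50):
    """Return the index of the (likely) nearest neighbour of q in data."""
    directions, projections = index
    q_proj = directions @ q
    candidates = set()
    for j in range(len(directions)):
        # Shortlist the points whose projection along direction j lies
        # closest to the query's projection along that direction.
        order = np.argsort(np.abs(projections[:, j] - q_proj[j]))
        candidates.update(order[:num_candidates].tolist())
    candidates = np.fromiter(candidates, dtype=int)
    # Verify the shortlist with exact distances.
    dists = np.linalg.norm(data[candidates] - q, axis=1)
    return candidates[np.argmin(dists)]
```

The shortlist-then-verify structure is what makes the query cost depend on how well the random projections preserve neighbourhoods; the talk explains how DCI develops this into guarantees that scale with intrinsic rather than ambient dimension.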

2. Mode Collapse in Generative Adversarial Nets (GANs)

Generative adversarial nets (GANs) are perhaps the most popular generative models in use today. Unfortunately, they suffer from the well-documented problem of mode collapse, which many successive GAN variants have failed to overcome. I will illustrate why mode collapse happens at a fundamental level and show a simple way to overcome it, which forms the basis of a new method known as Implicit Maximum Likelihood Estimation (IMLE).
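
As a taste of the idea (a minimal sketch under my own assumptions, not the exact procedure from the talk): a GAN effectively asks whether each generated sample looks like some data point, which allows the generator to get away with covering only a few modes; IMLE reverses the direction and asks that each data point have a nearby generated sample, so no mode can be ignored. The generator, optimizer, and hyperparameters below are placeholders.

```python
import torch

def imle_step(generator, optimizer, data_batch, num_samples=64, z_dim=32):
    """One training step in the spirit of IMLE (simplified sketch).

    For every real example, find its nearest generated sample and pull
    that sample toward the example. This reverses the GAN direction of
    matching, so every mode of the data exerts a pull on the generator.
    """
    z = torch.randn(num_samples, z_dim)
    samples = generator(z)                    # (num_samples, data_dim)
    dists = torch.cdist(data_batch, samples)  # (batch_size, num_samples)
    nearest = dists.argmin(dim=1)             # nearest sample per example
    loss = ((samples[nearest] - data_batch) ** 2).sum(dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Any differentiable generator works here, e.g. `generator = torch.nn.Sequential(torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 2))` with `optimizer = torch.optim.Adam(generator.parameters())`; the talk explains why this objective relates to maximum likelihood, which gives the method its name.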

Time permitting, I will briefly cover two other research directions I have pursued: Learning to Optimize and instance segmentation.

All are welcome - no background in either area will be assumed.

Open to: Faculty, graduate students, and undergraduate students (all audiences)

Contact: ke.li@eecs.berkeley.edu