Colloquium | November 30 | 4:10-5:15 p.m. | 60 Evans Hall
Computational complexity lower bounds like P != NP assert impossibility results for all possible programs of some restricted form. As there are presently enormous gaps in our knowledge of lower bounds, a central question on the minds of today's complexity theorists is: how will we find better ways to reason about all efficient programs? I argue that some progress can be made by (very deliberately) thinking algorithmically about the lower bound problem. Slightly more precisely, to prove a lower bound against some class C of programs, we can start by treating the programs of C as inputs to another (larger) process, which is intended to perform some basic analysis of programs in C. By carefully studying the algorithmic meta-analysis of programs in C, we can learn more about the limitations of the programs being analyzed. This way of thinking has led to several recent advances in complexity lower bounds where no progress had been made for many years. Indeed, in several interesting cases, the only way we presently know how to prove lower bounds against some class C is to directly design algorithms for analyzing C.