Friday, July 27, 2018
11:30 - 13:00

Room 327

Polymorphic inquiries: (Promise) constraint satisfaction, fine-grained complexity, and more

Venkat Guruswami

CMU

What underlying mathematical structure (or lack thereof) in a computational problem governs its efficient solvability (or dictates its hardness)? In the realm of constraint satisfaction problems (CSPs), we have a definitive answer: an efficient algorithm exists precisely when there are non-trivial operations called polymorphisms under which the solution space is closed. Inspired and emboldened by this, one might posit a broader polymorphic principle: if there are interesting ways to combine solutions to get more solutions, then the problem ought to be tractable (with the meanings of "interesting" and "tractable" being quite context-dependent).
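
For concreteness, here is a minimal Python sketch (an illustration, not code from the lectures) of the closure property behind this principle: the coordinate-wise majority operation is a polymorphism of 2-SAT, so applying it to any three satisfying assignments yields another satisfying assignment. The clause encoding (pairs of nonzero integers, with literal +i / -i meaning x_i / NOT x_i) is a hypothetical choice made for the example.

    from itertools import product

    def satisfies(assignment, clauses):
        # assignment: dict var -> bool; a clause holds if some literal is true
        return all(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in clauses
        )

    def majority3(a, b, c):
        # coordinate-wise majority of three Boolean assignments
        return {v: (a[v] + b[v] + c[v]) >= 2 for v in a}

    # Example 2-SAT instance: (x1 OR x2) AND (NOT x1 OR x3)
    clauses = [(1, 2), (-1, 3)]

    # Enumerate all satisfying assignments over variables 1, 2, 3
    sols = []
    for bits in product([False, True], repeat=3):
        a = dict(zip((1, 2, 3), bits))
        if satisfies(a, clauses):
            sols.append(a)

    # Closure check: the majority of every triple of solutions is a solution
    assert all(satisfies(majority3(a, b, c), clauses)
               for a in sols for b in sols for c in sols)
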

This sequence of lectures will survey some of the background on the polymorphic approach to understanding the complexity of CSPs. We will then discuss some extensions beyond CSPs where the polymorphic principle seems highly promising (yet is far from understood), such as promise CSPs (where one is allowed to satisfy a relaxed version of the constraints) and fast exponential-time algorithms for NP-hard CSPs. En route, we will have a chance to discuss some connections to optimization, such as algorithms to solve LPs over different rings (like Z[sqrt{2}], the integers with sqrt{2} adjoined), and a random-walk-based algorithm interpolating between 0-1 and linear programming. Specifically, given an n-variable LP feasible over {0,1}^n, the algorithm can find a solution in E^n, where E is a union of closed intervals containing {0,1}, in time at most (2 - measure(E))^n. (Taking E = [0,1/3] U [2/3,1], for instance, one recovers Schöning's (4/3)^n running time for 3-SAT.)
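
To make the boundary case concrete, the following Python sketch (hypothetical code, not from the lectures) implements Schöning's classical random walk for 3-SAT, whose expected (4/3)^n running time is what the E = [0,1/3] U [2/3,1] instantiation above recovers. The clause encoding matches the earlier sketch.

    import random

    def schoening_3sat(n, clauses, restarts):
        # Schöning's random walk for 3-SAT. Each restart succeeds with
        # probability roughly (3/4)^n, so about (4/3)^n restarts suffice
        # in expectation. Clauses: 3-tuples of nonzero ints, literal
        # +i / -i meaning x_i / NOT x_i (a hypothetical encoding).
        def unsat_clauses(a):
            return [c for c in clauses
                    if not any(a[abs(lit)] == (lit > 0) for lit in c)]

        for _ in range(restarts):
            # start from a uniformly random assignment in {0,1}^n
            a = {v: random.random() < 0.5 for v in range(1, n + 1)}
            for _ in range(3 * n):  # 3n local steps per restart
                bad = unsat_clauses(a)
                if not bad:
                    return a  # found a satisfying assignment
                # flip a uniformly random variable of some unsatisfied
                # clause; with probability >= 1/3 this step moves closer
                # (in Hamming distance) to a fixed satisfying assignment
                lit = random.choice(random.choice(bad))
                a[abs(lit)] = not a[abs(lit)]
        return None  # give up after the allotted restarts

    # Usage example (hypothetical instance):
    # schoening_3sat(3, [(1, 2, 3), (-1, 2, -3)], restarts=100)
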


