Michael Mitzenmacher is a computer scientist at Harvard with a blog My Biased Coin. As you might expect from the title, Michael works in the area of randomized algorithms, and even has a book on the subject. His blog is an extremely useful guide to what is happening in algorithms in CS (and what is happening in CS at Harvard, which is also quite interesting). He often provides a summary of talks given at the big theory conferences (FOCS/STOC/etc.). He just posted on this year's FOCS (here and here).

There was one talk that caught my eye, summarized by a doctoral student:

[Editor: Fourth-year grad student Justin Thaler of Harvard contributes a summary of two unrelated talks.]

Paper Title: The Cutting Plane Method is Polynomial for Perfect Matchings.

Harvard's own Karthekeyan Chandrasekaran talked about joint work with Laszlo A. Vegh and Santosh S. Vempala on cutting plane algorithms for matching problems. The cutting plane method is a popular algorithm for solving integer programs (IPs), used in commercial solvers. It works by starting with an LP relaxation of the given IP to obtain a basic optimal solution x_0, and then iteratively adding constraints that are valid for integer solutions but violated by the basic optimum. It continues until the basic optimum is integral. The goal of this paper is to take a step toward explaining the practical efficiency of cutting plane methods, by giving an efficient cutting-plane algorithm for min-cost perfect matching (MWPM). MWPM is known to be in P, but it was open (apparently for 30 years) whether there was a polynomial-time cutting-plane algorithm for this problem.

A brief summary of how they achieve this is as follows. They start with a natural, well-known LP relaxation of the MWPM problem, called the bipartite relaxation. This relaxation has the nice property that all basic optima x are half-integral, and the support of x is a disjoint union of edges and odd cycles. This makes it easy to find cuts (the cuts correspond to what are called blossom inequalities; see the paper for details). A major challenge, though, is that naively adding cuts will not preserve the half-integrality of intermediate LPs, so at each iteration they throw away some of the old cuts that were added earlier in the execution. They need to take considerable care in choosing which cuts to keep in order to guarantee half-integrality of intermediate LPs (and to ensure that their algorithm makes progress at a sufficiently high rate).
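To make the solve-check-cut loop concrete, here is a toy sketch of the cutting-plane idea on a hypothetical one-variable instance (the instance, the trivial `solve_lp`, and all names are my own illustration, not anything from the paper): maximize x subject to 2x ≤ 7 over the integers. The LP relaxation has basic optimum x = 3.5; one valid cut, x ≤ 3, makes the basic optimum integral.

```python
# Toy cutting-plane loop on: maximize x subject to 2x <= 7, x integer.
# (Hypothetical illustration of the generic method, not the paper's algorithm.)
from fractions import Fraction
from math import floor

def solve_lp(upper_bounds):
    """'Solve' the one-variable LP max x s.t. x <= b for every bound b."""
    return min(upper_bounds)

def cutting_plane(upper_bounds):
    bounds = list(upper_bounds)
    while True:
        x = solve_lp(bounds)          # basic optimum of current relaxation
        if x == floor(x):             # integral: we are done
            return x, bounds
        # Cut: x <= floor(x) holds for every integer solution but is
        # violated by the current fractional basic optimum.
        bounds.append(Fraction(floor(x)))

x_star, cuts = cutting_plane([Fraction(7, 2)])   # 2x <= 7, i.e. x <= 3.5
print(x_star)   # -> 3
```

The real method works in many variables and needs a separation routine to find violated cuts; the point here is only the shape of the loop: solve the relaxation, stop if integral, otherwise add a cut and repeat.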

This is pretty amazing. First, it is wonderful that they were able to prove polynomiality. It had bothered me that it seemed you might need an exponential number of cuts, even for something like matching. I had looked at this 25 years ago when doing my doctorate, but didn’t have any particularly insightful ideas.

But the really amazing thing is that they were able to arrange their algorithm so they never had to work with anything worse than half-integral solutions. This is astounding! A bane of cutting plane approaches is the weird fractions that keep popping up, leading to numerical stability problems. Here, they were able to keep things well under control. And, by keeping to half-integral, maybe some old ideas I had about using generalized networks (networks with multipliers) might come back into play. The approach certainly avoids the need for Gomory-Hu cut tree approaches to finding violated inequalities: violated inequalities come straight out of connected components. This also harkens back to my dissertation where I had treated matching as a generalized network with side constraints.
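A small sketch of what "violated inequalities come straight out of connected components" means (my own illustration, not the authors' code): if a half-integral basic solution has x_e = 1/2 on the edges of an odd cycle C, then x(E(C)) = |C|/2, which violates the blossom inequality x(E(S)) ≤ (|S| - 1)/2 for S = V(C). So separating cuts reduces to scanning the components of the support.

```python
# Extract violated blossom cuts from the support of a half-integral solution,
# which (per the summary above) is a disjoint union of edges and odd cycles.
# Hypothetical illustration; function and variable names are my own.
from collections import defaultdict

def blossom_cuts(support_edges):
    """Return the vertex set of each odd cycle in the support; each such
    set S gives a violated blossom inequality x(E(S)) <= (|S| - 1)/2."""
    adj = defaultdict(list)
    for u, v in support_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, cuts = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]   # collect one connected component
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u])
        seen |= comp
        n_edges = sum(1 for u, v in support_edges
                      if u in comp and v in comp)
        if n_edges == len(comp) and len(comp) % 2 == 1:   # odd cycle
            cuts.append(comp)
    return cuts

# Triangle 0-1-2 (half-integral odd cycle) plus a matched edge (3, 4):
print(blossom_cuts([(0, 1), (1, 2), (2, 0), (3, 4)]))  # -> [{0, 1, 2}]
```

No Gomory-Hu machinery needed: a single pass over the components of the support finds every violated blossom inequality of this form.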

So I checked out the paper that underlies the talk on arXiv, thinking I might try to implement some things this week and see how it works (CS theory people rarely implement: you could make an entire career out of simply implementing what CS people suggest). On the plus side, they reference my dissertation, so at least I am in the right ballpark. On the down side: it is looking a bit complicated! Looks like I will have to reel in one of the bright doctoral students around here to plow through this with me.

And I wonder what else this might be used for?