Convex cost closure and Markov Random Fields problems: Applications and fastest-possible algorithms
Seminar | March 1 | 4-5 p.m. | 1011 Evans Hall
Dorit S. Hochbaum, IEOR dept UC Berkeley
Many problems of fitting observations subject to rank order constraints arise in contexts of learning, Lipschitz regularization, and isotonic regression (with or without fused lasso). All these problems can be abstracted as a convex cost closure problem, which is to minimize the cost of deviating from the observations while satisfying the rank order constraints; any feasible solution that satisfies the rank order constraints corresponds to a closed set in a related graph. The convex cost closure problem is a special case of the Markov Random Fields (MRF) problem, which allows violations of the rank order constraints at penalties, associated with the violations, called separation costs. Applications of MRF include total variation, image segmentation, adjacency clustering, and aggregate ranking.
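As a concrete illustration of the problem family described above, the simplest special case is L2 isotonic regression on a chain order, which the classical pool adjacent violators algorithm (PAVA) solves directly. This is a minimal sketch, not the talk's algorithm; it assumes squared deviation costs and a total order on the observations.

```python
# Pool Adjacent Violators (PAVA) sketch for L2 isotonic regression:
# fit z_1 <= z_2 <= ... <= z_n minimizing sum_i (z_i - a_i)^2.
# This is the simplest instance of a convex cost closure problem,
# with a chain of rank order constraints.
def isotonic_l2(a):
    # Each block pools adjacent observations and stores (sum, count);
    # the fitted value on a block is its mean.
    blocks = []
    for x in a:
        blocks.append([x, 1])
        # Merge adjacent blocks while their means violate the order.
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] >= blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit
```

For example, the decreasing input `[3, 1, 2]` is pooled into a single block with mean 2, while an already-monotone input is returned unchanged.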
We present here efficient algorithms for these problems. For the convex cost closure problem, and for the MRF problem with "bilinear" separation costs (costs that are a linear function of the positive separation and of the negative separation), the algorithm devised is provably fastest possible. The algorithm uses a parametric cut procedure. The key properties of the algorithm and the relation of the optimum to a minimum cut partition will be discussed.
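To make the minimum cut connection concrete: the underlying (linear-cost) closure problem is classically solved by one s-t minimum cut, with infinite-capacity arcs encoding the precedence constraints. Below is a minimal sketch of that reduction, using a plain Edmonds-Karp max-flow routine rather than the parametric cut procedure of the talk; the node names and weights are illustrative only.

```python
from collections import deque

def max_weight_closure(weights, arcs):
    # weights: dict node -> weight.  arcs: list of (i, j) precedences,
    # meaning "if i is in the closure, j must be too".
    # Classical reduction: s -> i with capacity w_i for w_i > 0,
    # i -> t with capacity -w_i for w_i < 0, infinite arcs for precedences;
    # the source side of a minimum s-t cut is a maximum-weight closed set.
    INF = float('inf')
    S, T = object(), object()          # distinct sentinels for s and t
    cap, adj = {}, {}
    def add(u, v, c):
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)      # residual arc
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    for v, w in weights.items():
        if w > 0:
            add(S, v, w)
        elif w < 0:
            add(v, T, -w)
    for i, j in arcs:
        add(i, j, INF)
    def bfs():                          # shortest augmenting path
        parent = {S: None}
        q = deque([S])
        while q:
            u = q.popleft()
            for v in adj.get(u, []):
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    if v is T:
                        return parent
                    q.append(v)
        return None
    while True:                         # Edmonds-Karp max flow
        parent = bfs()
        if parent is None:
            break
        f, v = INF, T
        while parent[v] is not None:    # bottleneck capacity
            f = min(f, cap[(parent[v], v)])
            v = parent[v]
        v = T
        while parent[v] is not None:    # augment along the path
            cap[(parent[v], v)] -= f
            cap[(v, parent[v])] += f
            v = parent[v]
    reach, q = {S}, deque([S])          # source side of the min cut
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in reach and cap[(u, v)] > 0:
                reach.add(v)
                q.append(v)
    return {v for v in weights if v in reach}
```

On the toy instance with weights a: 5, b: -2, c: -10 and the single precedence a -> b, the maximum-weight closed set is {a, b}, with weight 3.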