James Saunderson

A Projection Method for Metric-Constrained Optimization

Jun 05, 2018
Nate Veldt, David Gleich, Anthony Wirth, James Saunderson

We outline a new approach for solving optimization problems that enforce triangle inequalities on output variables. We refer to this as metric-constrained optimization, and give several examples where problems of this form arise in machine learning applications and theoretical approximation algorithms for graph clustering. Although these problems are interesting from a theoretical perspective, they are challenging to solve in practice due to the high memory requirements of black-box solvers. In order to address this challenge, we first prove that the metric-constrained linear program relaxation of correlation clustering is equivalent to a special case of the metric nearness problem. We then develop a general solver for metric-constrained linear and quadratic programs by generalizing and improving a simple projection algorithm originally developed for metric nearness. We give several novel approximation guarantees for using our framework to find lower bounds for optimal solutions to several challenging graph clustering problems. We also demonstrate the power of our framework by solving optimization problems involving up to 10^{8} variables and 10^{11} constraints.
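The core primitive behind such a solver can be illustrated in a few lines. Below is a minimal sketch of l2 metric nearness via Dykstra's cyclic projection method, which visits one triangle inequality at a time and keeps a scalar dual correction per constraint; the function name, the dense enumeration of all O(n^3) triples, and the fixed sweep count are illustrative assumptions, not the authors' implementation, which exploits sparsity to reach far larger scales.

```python
import itertools
import numpy as np

def _key(i, j):
    # Canonical (upper-triangle) index for the symmetric entry (i, j).
    return (i, j) if i < j else (j, i)

def metric_nearness(D, n_sweeps=20):
    """L2 metric nearness via Dykstra's cyclic projections: find the
    symmetric matrix nearest to D (in Frobenius norm on the upper
    triangle) whose entries satisfy every triangle inequality.
    Nonnegativity and zero-diagonal constraints are omitted for brevity.
    """
    n = D.shape[0]
    M = np.triu(D.astype(float), 1)  # work on the upper triangle only
    dual = {}  # one scalar dual variable (Dykstra correction) per constraint
    constraints = [(i, j, k)
                   for i, j in itertools.combinations(range(n), 2)
                   for k in range(n) if k != i and k != j]
    for _ in range(n_sweeps):
        for (i, j, k) in constraints:
            # Constraint a^T m <= 0 with a = e_ij - e_ik - e_kj,
            # i.e. M[i,j] <= M[i,k] + M[k,j]; note ||a||^2 = 3.
            t = dual.get((i, j, k), 0.0)
            # Dykstra step: add back this constraint's old correction.
            yij = M[_key(i, j)] + t
            yik = M[_key(i, k)] - t
            ykj = M[_key(k, j)] - t
            # Project onto the halfspace and store the new correction.
            t = max(0.0, (yij - yik - ykj) / 3.0)
            dual[(i, j, k)] = t
            M[_key(i, j)] = yij - t
            M[_key(i, k)] = yik + t
            M[_key(k, j)] = ykj + t
    return M + M.T  # symmetrize the upper-triangle iterate

```

Each constraint touches three entries, so projecting spreads any violation equally across them; the stored dual variables are what distinguish Dykstra's method, which converges to the exact projection onto the intersection of halfspaces, from naive cyclic projection.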

Estimating the Spectral Density of Large Implicit Matrices

Feb 09, 2018
Ryan P. Adams, Jeffrey Pennington, Matthew J. Johnson, Jamie Smith, Yaniv Ovadia, Brian Patton, James Saunderson

Many important problems are characterized by the eigenvalues of a large matrix. For example, the difficulty of many optimization problems, such as those arising from the fitting of large models in statistics and machine learning, can be investigated via the spectrum of the Hessian of the empirical loss function. Network data can be understood via the eigenstructure of a graph Laplacian matrix using spectral graph theory. Quantum simulations and other many-body problems are often characterized via the eigenvalues of the solution space, as are various dynamical systems. However, naive eigenvalue estimation is computationally expensive even when the matrix can be represented explicitly; in many of these situations the matrix is so large as to only be available implicitly via products with vectors. Even worse, one may only have noisy estimates of such matrix-vector products. In this work, we combine several different techniques for randomized estimation and show that it is possible to construct unbiased estimators to answer a broad class of questions about the spectra of such implicit matrices, even in the presence of noise. We validate these methods on large-scale problems in which graph theory and random matrix theory provide ground truth.
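The flavor of randomized estimation described here can be sketched with the kernel polynomial method: Chebyshev moments of the spectral density are linear in traces of polynomials of the matrix, which Hutchinson-style random probes estimate without bias using only matrix-vector products. The sketch below assumes the spectrum has already been rescaled into [-1, 1] and omits the paper's treatment of noisy products; the function names and parameter defaults are illustrative assumptions.

```python
import numpy as np

def chebyshev_moments(matvec, n, order=50, n_probe=20, seed=0):
    """Estimate mu_k = tr(T_k(A)) / n for a symmetric n x n matrix A
    with spectrum in [-1, 1], given only a matvec v -> A v, using
    Hutchinson's stochastic trace estimator with Rademacher probes.
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(order + 1)
    for _ in range(n_probe):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        t_prev, t_curr = z, matvec(z)         # T_0(A) z and T_1(A) z
        mu[0] += z @ t_prev
        mu[1] += z @ t_curr
        for k in range(2, order + 1):
            # Three-term recurrence: T_k(A) z = 2 A T_{k-1}(A) z - T_{k-2}(A) z.
            t_prev, t_curr = t_curr, 2.0 * matvec(t_curr) - t_prev
            mu[k] += z @ t_curr
    return mu / (n * n_probe)

def spectral_density(mu, x):
    """Evaluate the Chebyshev expansion of the smoothed spectral density
    at points x in (-1, 1); Jackson damping, which suppresses Gibbs
    oscillations, is omitted for brevity.
    """
    theta = np.arccos(x)
    series = mu[0] * np.ones_like(x)
    for k in range(1, len(mu)):
        series += 2.0 * mu[k] * np.cos(k * theta)  # T_k(x) = cos(k arccos x)
    return series / (np.pi * np.sqrt(1.0 - x**2))
```

For an explicit symmetric matrix A, matvec = lambda v: A @ v suffices; for the implicit matrices the abstract targets, matvec would instead be, say, a Hessian-vector product computed by automatic differentiation.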