Bjoern Andres

A 4-approximation algorithm for min max correlation clustering

Oct 30, 2023
Holger Heidrich, Jannik Irmai, Bjoern Andres

We introduce a lower bounding technique for the min max correlation clustering problem and, based on this technique, a combinatorial 4-approximation algorithm for complete graphs. This improves upon the previous best known approximation guarantees of 5 (using a linear programming formulation; Kalhan et al., 2019) and 40 (for a combinatorial algorithm; Davies et al., 2023). We extend this algorithm by a greedy joining heuristic and show empirically that it improves the state of the art in solution quality and runtime on several benchmark datasets.
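
In the min max variant of correlation clustering, the quantity minimized is the largest number of disagreements incident to any single node, rather than their total. Below is a minimal sketch, assuming the standard complete-graph setting with "+" and "-" edges, of evaluating this objective for a given clustering; it illustrates the problem only, not the paper's lower bounding technique or 4-approximation algorithm.

```python
# A minimal sketch (not the paper's algorithm): evaluating the min-max
# correlation clustering objective on a complete graph.  Input: a cluster
# label per node and a set of "+" edges; every other node pair is
# implicitly a "-" edge.  The objective is the maximum, over all nodes,
# of the number of disagreements incident to that node.

from itertools import combinations

def max_disagreement(labels, positive_edges):
    """labels: dict node -> cluster id; positive_edges: set of frozensets."""
    nodes = list(labels)
    disagreements = {v: 0 for v in nodes}
    for u, v in combinations(nodes, 2):
        same_cluster = labels[u] == labels[v]
        positive = frozenset((u, v)) in positive_edges
        if positive != same_cluster:      # "+" edge cut, or "-" edge joined
            disagreements[u] += 1
            disagreements[v] += 1
    return max(disagreements.values())

# Example: nodes 0,1,2 with "+" edges {0,1} and {1,2}; putting all three
# in one cluster leaves a single disagreement (the implicit "-" edge {0,2}).
print(max_disagreement({0: "a", 1: "a", 2: "a"},
                       {frozenset((0, 1)), frozenset((1, 2))}))  # -> 1
```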

* 9 pages 

A Graph Multi-separator Problem for Image Segmentation

Jul 10, 2023
Jannik Irmai, Shengxian Zhao, Jannik Presberger, Bjoern Andres

We propose a novel abstraction of the image segmentation task in the form of a combinatorial optimization problem that we call the multi-separator problem. Feasible solutions indicate for every pixel whether it belongs to a segment or a segment separator, and, for pairs of pixels, whether or not they belong to the same segment. This is in contrast to the closely related lifted multicut problem, in which every pixel is associated with a segment and no pixel explicitly represents a separating structure. While the multi-separator problem is NP-hard, we identify two special cases for which it can be solved efficiently. Moreover, we define two local search algorithms for the general case and demonstrate their effectiveness in segmenting simulated volume images of foam cells and filaments.
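
As an illustration of the feasibility structure described above, the following sketch (my own toy encoding, not code from the paper) reads segments off a separator labeling of a 2D pixel grid as 4-connected components of the non-separator pixels; the pairwise same-segment decisions are then implied by the component ids.

```python
# A minimal illustrative sketch (my own assumptions, not the paper's model):
# given a binary mask marking separator pixels on a 2D grid, read off
# segments as 4-connected components of the remaining pixels.  Two
# non-separator pixels are "in the same segment" iff their component ids agree.

from collections import deque

def segments_from_separators(separator):            # separator: 2D list of 0/1
    h, w = len(separator), len(separator[0])
    comp = [[None] * w for _ in range(h)]
    next_id = 0
    for sy in range(h):
        for sx in range(w):
            if separator[sy][sx] or comp[sy][sx] is not None:
                continue
            comp[sy][sx] = next_id                   # start a new segment
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w \
                            and not separator[ny][nx] and comp[ny][nx] is None:
                        comp[ny][nx] = next_id
                        queue.append((ny, nx))
            next_id += 1
    return comp

# A vertical separator column splits a 3x3 image into two segments.
print(segments_from_separators([[0, 1, 0],
                                [0, 1, 0],
                                [0, 1, 0]]))
```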

* 36 pages 

Correlation Clustering of Bird Sounds

Jun 16, 2023
David Stein, Bjoern Andres

Bird sound classification is the task of relating any sound recording to those species of bird that can be heard in the recording. Here, we study bird sound clustering, the task of deciding for any pair of sound recordings whether the same species of bird can be heard in both. We address this problem by first learning, from a training set, probabilities of pairs of recordings being related in this way, and then inferring a maximally probable partition of a test set by correlation clustering. We address the following questions: How accurate is this clustering, compared to a classification of the test set? How do the clusters thus inferred relate to the clusters obtained by classification? How accurate is this clustering when applied to recordings of bird species not heard during training? How effective is this clustering in separating, from bird sounds, environmental noise not heard during training?
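
The second stage of this pipeline can be pictured as follows. A hedged sketch, assuming the learned probabilities are given as a matrix p and using a simple greedy merging heuristic as a stand-in for the correlation clustering solver actually used in the paper: one standard way to turn pairwise probabilities into costs is the log-odds reward log(p/(1-p)) for joining a pair.

```python
# A hedged sketch of the pipeline's second stage: turn pairwise probabilities
# p[u][v] ("the same species is audible in both recordings") into rewards
# log(p / (1 - p)) for joining a pair, and greedily merge clusters while the
# total reward of a merge is positive.  The greedy step is only a placeholder
# heuristic, not the inference method used in the paper.

import math

def greedy_correlation_clustering(p):
    n = len(p)
    reward = [[math.log(p[u][v] / (1.0 - p[u][v])) if u != v else 0.0
               for v in range(n)] for u in range(n)]
    clusters = [{u} for u in range(n)]
    improved = True
    while improved:
        improved = False
        best = (0.0, None, None)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                gain = sum(reward[u][v] for u in clusters[i] for v in clusters[j])
                if gain > best[0]:
                    best = (gain, i, j)
        if best[1] is not None:
            _, i, j = best
            clusters[i] |= clusters[j]
            del clusters[j]
            improved = True
    return clusters

# Three recordings: 0 and 1 very likely the same species, 2 likely distinct.
print(greedy_correlation_clustering([[0.5, 0.9, 0.2],
                                     [0.9, 0.5, 0.1],
                                     [0.2, 0.1, 0.5]]))  # -> [{0, 1}, {2}]
```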

* 13 pages 

Partial Optimality in Cubic Correlation Clustering

Feb 09, 2023
David Stein, Silvia Di Gregorio, Bjoern Andres

The higher-order correlation clustering problem is an expressive model, and recently, local search heuristics have been proposed for several applications. Certifying optimality, however, is NP-hard and, in practice, hampered already by the complexity of the problem statement. Here, we focus on establishing partial optimality conditions for the special case of complete graphs and cubic objective functions. In addition, we define and implement algorithms for testing these conditions and examine their effect numerically on two datasets.
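
For concreteness, a cubic objective on a complete graph can be written down as follows. This is a hedged sketch under one common convention (pair and triple costs are paid whenever all involved nodes end up in the same cluster) and is unrelated to the partial optimality conditions themselves.

```python
# A hedged sketch of what a cubic correlation clustering objective on a
# complete graph can look like, under one common convention: a clustering
# pays the pair cost c2[(u, v)] for every pair and the triple cost
# c3[(u, v, w)] for every triple whose elements all lie in the same cluster.

from itertools import combinations

def cubic_objective(labels, c2, c3):
    """labels: dict node -> cluster id; c2, c3: dicts keyed by sorted tuples."""
    nodes = sorted(labels)
    value = 0.0
    for u, v in combinations(nodes, 2):
        if labels[u] == labels[v]:
            value += c2.get((u, v), 0.0)
    for u, v, w in combinations(nodes, 3):
        if labels[u] == labels[v] == labels[w]:
            value += c3.get((u, v, w), 0.0)
    return value

# Joining all of {0, 1, 2} pays all three pair costs plus the (rewarding)
# triple cost.
print(cubic_objective({0: 0, 1: 0, 2: 0},
                      c2={(0, 1): 1.0, (0, 2): 1.0, (1, 2): 1.0},
                      c3={(0, 1, 2): -4.0}))   # -> -1.0
```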

* 27 pages 

A Polyhedral Study of Lifted Multicuts

Feb 16, 2022
Bjoern Andres, Silvia Di Gregorio, Jannik Irmai, Jan-Hendrik Lange

Fundamental to many applications in data analysis are the decompositions of a graph, i.e. partitions of the node set into component-inducing subsets. One way of encoding decompositions is by multicuts, the subsets of those edges that straddle distinct components. Recently, a lifting of multicuts from a graph $G = (V, E)$ to an augmented graph $\hat G = (V, E \cup F)$ has been proposed in the field of image analysis, with the goal of obtaining a more expressive characterization of graph decompositions in which it is made explicit, also for the pairs $F \subseteq \tbinom{V}{2} \setminus E$ of non-neighboring nodes, whether these are in the same or distinct components. In this work, we study in detail the polytope in $\mathbb{R}^{E \cup F}$ whose vertices are precisely the characteristic vectors of multicuts of $\hat G$ lifted from $G$, connecting it, in particular, to the rich body of prior work on the clique partitioning polytope and the multilinear polytope.
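
The lifting operation itself is easy to state in code. A hedged sketch, with data structures of my own choosing: given the cut edges of a multicut of $G$ and the additional pairs $F$, a pair in $E \cup F$ is labeled 1 exactly if its endpoints lie in distinct components of $G$ once the cut edges are removed.

```python
# A hedged sketch of the lifting (illustration only): compute the
# characteristic vector of the multicut of the augmented graph that is
# lifted from a multicut of G, given as its set of cut edges.

def lifted_multicut(nodes, edges, cut, extra_pairs):
    # Union-find over the non-cut edges of G.
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v
    for u, v in edges:
        if (u, v) not in cut:
            parent[find(u)] = find(v)
    # A pair of E u F is cut iff its endpoints lie in distinct components of G.
    return {(u, v): int(find(u) != find(v))
            for u, v in list(edges) + list(extra_pairs)}

# Path 0-1-2 with the edge (1, 2) cut; the lifted pair (0, 2) is cut as well,
# because 0 and 2 end up in distinct components.
print(lifted_multicut(nodes=[0, 1, 2],
                      edges=[(0, 1), (1, 2)],
                      cut={(1, 2)},
                      extra_pairs=[(0, 2)]))
```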

* 63 pages, 18 figures 

Inapproximability of Minimizing a Pair of DNFs or Binary Decision Trees Defining a Partial Boolean Function

Mar 03, 2021
David Stein, Bjoern Andres

The desire to apply machine learning techniques in safety-critical environments has renewed interest in the learning of partial functions for distinguishing between positive, negative and unclear observations. We contribute to the understanding of the hardness of this problem. Specifically, we consider partial Boolean functions defined by a pair of Boolean functions $f, g \colon \{0,1\}^J \to \{0,1\}$ such that $f \cdot g = 0$ and such that $f$ and $g$ are defined by disjunctive normal forms or binary decision trees. We show that minimizing the sum of the lengths or depths of these forms while separating disjoint sets $A, B$ with $A \cup B = S \subseteq \{0,1\}^J$ such that $f(A) = \{1\}$ and $g(B) = \{1\}$ is inapproximable to within a factor of $(1 - \epsilon) \ln (|S|-1)$ for any $\epsilon > 0$, unless P = NP.
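
A hedged sketch of the objects involved, in an encoding of my own choosing: a DNF is a list of terms, and the pair $(f, g)$ defines a partial Boolean function that is positive where $f = 1$, negative where $g = 1$, and unclear elsewhere; the condition $f \cdot g = 0$ excludes contradictions.

```python
# Illustrative encoding (not from the paper): a DNF over variables J is a
# list of terms, each term a dict mapping a variable to the value it
# requires.  The pair (f, g) is feasible for (A, B) if f is 1 on all of A,
# g is 1 on all of B, and f * g = 0 everywhere.

from itertools import product

def dnf(terms):
    return lambda x: any(all(x[j] == v for j, v in term.items()) for term in terms)

def separates(f, g, A, B, domain):
    """f(A) = {1}, g(B) = {1}, and f * g = 0 on the whole domain."""
    return (all(f(a) for a in A)
            and all(g(b) for b in B)
            and not any(f(x) and g(x) for x in domain))

J = (0, 1)
domain = [dict(zip(J, bits)) for bits in product((0, 1), repeat=len(J))]
f = dnf([{0: 1}])          # positive iff x_0 = 1
g = dnf([{0: 0, 1: 0}])    # negative iff x_0 = 0 and x_1 = 0
A = [x for x in domain if x[0] == 1]
B = [{0: 0, 1: 0}]
print(separates(f, g, A, B, domain))   # -> True; x = (0, 1) stays "unclear"
```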

* Typesetting of references fixed 

End-to-end Learning for Graph Decomposition

Dec 23, 2018
Jie Song, Bjoern Andres, Michael Black, Otmar Hilliges, Siyu Tang

We propose a novel end-to-end trainable framework for the graph decomposition problem. The minimum cost multicut problem is first converted to an unconstrained binary cubic formulation where cycle consistency constraints are incorporated into the objective function. The new optimization problem can be viewed as a Conditional Random Field (CRF) in which the random variables are associated with the binary edge labels of the initial graph and the hard constraints are introduced in the CRF as high-order potentials. The parameters of a standard neural network and the fully differentiable CRF are optimized in an end-to-end manner. Furthermore, our method utilizes the cycle constraints as meta-supervisory signals during the learning of the deep feature representations by taking the dependencies between the output random variables into account. We present analyses of the end-to-end learned representations, showing the impact of joint training on the task of clustering images of MNIST. We also validate the effectiveness of our approach both for the feature learning and the final clustering on the challenging task of real-world multi-person pose estimation.
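
The cycle consistency mentioned above has a simple combinatorial reading on complete graphs: an edge labeling corresponds to a decomposition exactly if no triangle has precisely one cut edge. A hedged toy check (my own illustration, not the CRF or the high-order potentials of the paper):

```python
# A toy consistency check: a 0/1 edge labeling (1 = edge is cut) of a
# complete graph K_n corresponds to a decomposition iff no triangle has
# exactly one cut edge.

from itertools import combinations

def violated_triangles(n, cut):
    """cut: dict mapping sorted node pairs of K_n to 0/1 labels."""
    bad = []
    for u, v, w in combinations(range(n), 3):
        labels = (cut[(u, v)], cut[(u, w)], cut[(v, w)])
        if sum(labels) == 1:            # exactly one edge cut: inconsistent
            bad.append((u, v, w))
    return bad

# Cutting only the edge (0, 1) of a triangle is inconsistent: 0 and 1 remain
# connected through 2, so they cannot lie in distinct components.
print(violated_triangles(3, {(0, 1): 1, (0, 2): 0, (1, 2): 0}))  # -> [(0, 1, 2)]
```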

Discrete-Continuous ADMM for Transductive Inference in Higher-Order MRFs

Apr 28, 2018
Emanuel Laude, Jan-Hendrik Lange, Jonas Schüpfer, Csaba Domokos, Laura Leal-Taixé, Frank R. Schmidt, Bjoern Andres, Daniel Cremers

This paper introduces a novel algorithm for transductive inference in higher-order MRFs, where the unary energies are parameterized by a variable classifier. The considered task is posed as a joint optimization problem in the continuous classifier parameters and the discrete label variables. In contrast to prior approaches such as convex relaxations, we propose an advantageous decoupling of the objective function into discrete and continuous subproblems and a novel, efficient optimization method related to ADMM. This approach preserves integrality of the discrete label variables and guarantees global convergence to a critical point. We demonstrate the advantages of our approach in several experiments including video object segmentation on the DAVIS data set and interactive image segmentation.
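
The overall shape of such a discrete-continuous alternation can be sketched generically. The following is a heavily hedged scaffold with placeholder subproblem solvers chosen by me; it mimics an ADMM-style loop but does not reproduce the paper's splitting, potentials, or convergence guarantees.

```python
# Generic scaffold (not the paper's method): alternate between a continuous
# block (e.g. classifier parameters) and a discrete block (labels), coupled
# through a scaled dual variable u, ADMM-style.  The subproblem solvers are
# placeholders supplied by the caller.

def admm_style_alternation(theta0, y0, solve_continuous, solve_discrete,
                           rho=1.0, iterations=20):
    theta, y = theta0, y0
    u = 0.0                                   # scaled dual variable
    for _ in range(iterations):
        theta = solve_continuous(y, u, rho)   # continuous subproblem
        y = solve_discrete(theta, u, rho)     # discrete subproblem (integral)
        u = u + (theta - y)                   # dual update on the coupling
    return theta, y

# Toy instance: pull a scalar parameter toward 0.3 while the discrete block
# picks the nearest value in {0, 1}; the growing dual variable drives theta
# to the chosen discrete value 0, and y stays integral throughout.
theta, y = admm_style_alternation(
    theta0=0.0, y0=0,
    solve_continuous=lambda y, u, rho: (0.3 + rho * (y - u)) / (1.0 + rho),
    solve_discrete=lambda theta, u, rho: round(min(max(theta + u, 0.0), 1.0)),
)
print(theta, y)
```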
