
Christian Igel

Bigger Buffer k-d Trees on Multi-Many-Core Systems

Dec 09, 2015
Fabian Gieseke, Cosmin Eugen Oancea, Ashish Mahabal, Christian Igel, Tom Heskes

Figures 1–4 for Bigger Buffer k-d Trees on Multi-Many-Core Systems

A buffer k-d tree is a k-d tree variant for massively parallel nearest neighbor search. While it provides valuable speed-ups on modern many-core devices when large numbers of both reference and query points are given, it is limited by the number of points that can fit on a single device. In this work, we show how to modify the original data structure and the associated workflow so that the overall approach can handle massive data sets. We further provide a simple yet efficient way of using the multiple devices available in a single workstation. The applicability of the modified framework is demonstrated in the context of astronomy, a field faced with huge amounts of data.
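As background for the abstract above, classical k-d tree nearest neighbor search looks as follows; this is a minimal single-query sketch in plain Python (the function names `build_kdtree` and `nearest` are illustrative, not from the paper, whose contribution is batching many such traversals through buffers for many-core execution):

```python
import math

def build_kdtree(points, depth=0):
    # Recursively build a k-d tree: split on alternating axes at the median.
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, query, best=None):
    # Depth-first descent with backtracking: visit the near side first,
    # then the far side only if the splitting plane is closer than the
    # best distance found so far.
    if node is None:
        return best
    point, axis = node["point"], node["axis"]
    d = math.dist(point, query)
    if best is None or d < best[1]:
        best = (point, d)
    diff = query[axis] - point[axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, query, best)
    if abs(diff) < best[1]:
        best = nearest(far, query, best)
    return best
```

The irregular, data-dependent backtracking in `nearest` is exactly what makes naive GPU parallelization inefficient; the buffer k-d tree reorders many queries so that threads traverse leaves together.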


Recent Results on No-Free-Lunch Theorems for Optimization

Mar 31, 2003
Christian Igel, Marc Toussaint

Figures 1–3 for Recent Results on No-Free-Lunch Theorems for Optimization

The sharpened No-Free-Lunch theorem (NFL theorem) states that, averaged over any finite set F of functions, the performance of all optimization algorithms is equal if and only if F is closed under permutation (c.u.p.) and each target function in F is equally likely. In this paper, we first summarize some recently proven consequences of this theorem: the average number of evaluations needed to find a desirable (e.g., optimal) solution can be calculated; the number of subsets that are c.u.p. is negligible compared to the overall number of possible subsets; and problem classes relevant in practice are not likely to be c.u.p. Second, as the main result, the NFL theorem is extended: necessary and sufficient conditions for NFL results to hold are given for arbitrary, non-uniform distributions of target functions. This yields the most general NFL theorem for optimization presented so far.

* 10 pages, LaTeX, see http://www.neuroinformatik.rub.de/PROJECTS/SONN/ 
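The sharpened NFL theorem summarized in the abstract is commonly stated in Wolpert–Macready notation; a sketch under that assumed notation, where d_m^y denotes the sequence of the first m objective values an algorithm observes:

```latex
% For a uniform distribution over a finite set F of functions f : X -> Y,
% the following holds for all sample sizes m, all value traces d_m^y, and
% all (non-repeating, deterministic) search algorithms a and b
\sum_{f \in F} P\bigl(d_m^{y} \mid f, m, a\bigr)
  \;=\;
\sum_{f \in F} P\bigl(d_m^{y} \mid f, m, b\bigr)
% if and only if F is closed under permutation of X.
```

The paper's extension replaces the uniform average over F by an arbitrary distribution p over target functions and characterizes exactly when the analogous identity still holds.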

Neutrality: A Necessity for Self-Adaptation

Apr 16, 2002
Marc Toussaint, Christian Igel

Figures 1–3 for Neutrality: A Necessity for Self-Adaptation

Self-adaptation is used in all main paradigms of evolutionary computation to increase efficiency. We claim that the basis of self-adaptation is the use of neutrality: in the absence of external control, neutrality allows the search distribution to vary without the risk of fitness loss.

* Proceedings of the Congress on Evolutionary Computation (CEC 2002), 1354-1359.  
* 6 pages, 3 figures, LaTeX 
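The claim can be illustrated with the textbook example of self-adaptation, a (1,λ) evolution strategy whose mutation strength is encoded in the genome. In this minimal sketch (an illustrative setup, not the paper's experiments), `sigma` is neutral in the sense that it never enters the fitness function, yet mutating it reshapes the offspring search distribution:

```python
import math
import random

def self_adaptive_es(f, x0, sigma0=1.0, lam=10, iters=200, seed=0):
    # (1, lambda)-ES with self-adapted step size. The strategy parameter
    # sigma is a neutral trait: it does not affect f directly, but it
    # controls the width of the Gaussian search distribution.
    rng = random.Random(seed)
    x, sigma = list(x0), sigma0
    tau = 1.0 / math.sqrt(len(x0))          # learning rate for log-normal sigma mutation
    best_seen = (f(x), list(x))
    for _ in range(iters):
        offspring = []
        for _ in range(lam):
            s = sigma * rng.lognormvariate(0.0, tau)        # mutate the neutral trait
            child = [xi + s * rng.gauss(0, 1) for xi in x]  # mutate the solution
            offspring.append((f(child), child, s))
        fit, x, sigma = min(offspring)      # comma selection: best of lambda offspring
        if fit < best_seen[0]:
            best_seen = (fit, list(x))
    return best_seen
```

Selection acts only on fitness, but offspring carrying a well-sized `sigma` tend to win, so the neutral trait is adapted indirectly.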

On Classes of Functions for which No Free Lunch Results Hold

Aug 21, 2001
Christian Igel, Marc Toussaint

Figure 1 for On Classes of Functions for which No Free Lunch Results Hold

In a recent paper, it was shown that No Free Lunch results hold for any subset F of the set of all possible functions from a finite set X to a finite set Y iff F is closed under permutation of X. In this article, we prove that the number of those subsets can be neglected compared to the overall number of possible subsets. Further, we present arguments for why problem classes relevant in practice are not likely to be closed under permutation.

* 8 pages, 1 figure, see http://www.neuroinformatik.ruhr-uni-bochum.de/ 
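The counting claim can be checked by brute force on a tiny instance: a subset of Y^X is closed under permutation exactly when it is a union of orbits of the permutation action on X, so the number of such subsets is 2 raised to the number of orbits, which is vanishingly small next to the 2^|Y^X| subsets overall. A sketch for |X| = 3 and |Y| = 2 (variable names are illustrative):

```python
from itertools import permutations, product

X = range(3)   # finite search space
Y = (0, 1)     # finite value set
funcs = list(product(Y, repeat=len(X)))  # each tuple f encodes f: X -> Y

def orbit(f):
    # All functions obtained from f by permuting its inputs.
    return frozenset(tuple(f[p[x]] for x in X) for p in permutations(X))

# A subset is closed under permutation iff it is a union of orbits,
# so the count of such subsets is 2 ** (number of orbits).
orbits = {orbit(f) for f in funcs}
num_cup = 2 ** len(orbits)   # 2**4  = 16
num_all = 2 ** len(funcs)    # 2**8  = 256
print(num_cup, num_all)
```

Already for this toy case only 16 of 256 subsets are closed under permutation, and the ratio 2^(#orbits) / 2^(|Y|^|X|) vanishes rapidly as X and Y grow, which is the asymptotic statement the paper proves.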