Lehr- und Forschungseinheit für Datenbanksysteme

Accepted paper at ICDM 2021 Workshop HDM 2021

Implicit Hough Transform Neural Networks for Subspace Clustering

27.09.2021

Authors

Julian Busch, Maximilian Hünemörder, Janis Held, Peer Kröger, Thomas Seidl


The 9th ICDM Workshop on High Dimensional Data Mining (HDM 2021)
in conjunction with the 21st IEEE International Conference on Data Mining (ICDM 2021),
07–10 December 2021, Auckland, New Zealand

Abstract

Subspace clustering constitutes a fundamental task in data mining and unsupervised machine learning with myriad applications. We present a novel approach to subspace clustering that detects affine hyperplanes in a given arbitrary-dimensional dataset by explicitly parametrizing them and optimizing their parameters using gradient updates w.r.t. a differentiable loss function.
The explicit parametrization allows our model to avoid the exponential search space incurred by models that rely on an explicit Hough transform, i.e., that detect subspaces by searching for high-density points in parameter space. Compared to existing approaches, our method is highly scalable, can be trained very efficiently on a GPU, is applicable to out-of-sample data, and is amenable to anytime scenarios, since training can be stopped at any time and convergence is usually fast.
The model can further be viewed as a linear neural network layer and trained end-to-end with an autoencoder to detect arbitrary non-linear correlations.
We provide empirical results on a wide array of synthetic datasets with different characteristics, following a rigorous evaluation protocol. Our results demonstrate the advantageous properties of our model and additionally reveal that it is particularly robust to jitter and noise in the data.
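
To make the core idea more concrete: the abstract describes detecting affine hyperplanes by explicitly parametrizing them (for example as unit normals and offsets) and optimizing these parameters with gradient updates on a differentiable loss. The following is a minimal sketch of that general approach, not the paper's actual model; the function name fit_hyperplanes, the soft-min loss with temperature tau, and the use of the Adam optimizer are illustrative assumptions.

```python
# Hedged sketch of gradient-based hyperplane fitting (illustrative, not the paper's loss).
import torch

def fit_hyperplanes(X, num_planes=3, steps=500, lr=0.05, tau=0.1, seed=0):
    """Fit `num_planes` affine hyperplanes n_k . x = b_k to the rows of X by
    minimizing a soft minimum of point-to-plane distances."""
    torch.manual_seed(seed)
    _, d = X.shape
    normals = torch.randn(num_planes, d, requires_grad=True)  # plane normals (unnormalized)
    offsets = torch.zeros(num_planes, requires_grad=True)     # plane offsets b_k
    opt = torch.optim.Adam([normals, offsets], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        unit = normals / normals.norm(dim=1, keepdim=True)    # keep normals on the unit sphere
        dist = (X @ unit.T - offsets).abs()                    # (n_points, num_planes)
        # differentiable soft-min over planes instead of a hard assignment
        loss = (-tau * torch.logsumexp(-dist ** 2 / tau, dim=1)).mean()
        loss.backward()
        opt.step()

    with torch.no_grad():
        unit = normals / normals.norm(dim=1, keepdim=True)
        labels = (X @ unit.T - offsets).abs().argmin(dim=1)    # nearest plane = cluster label
    return unit, offsets.detach(), labels

# Toy usage: two noisy lines (1-D hyperplanes) in 2-D
t = torch.rand(200, 1)
X = torch.cat([
    torch.cat([t, 0.5 * t + 0.1 * torch.randn_like(t)], dim=1),
    torch.cat([t, 1.0 - t + 0.1 * torch.randn_like(t)], dim=1),
])
planes, offsets, labels = fit_hyperplanes(X, num_planes=2)
```

Because the whole procedure is a single differentiable computation over all points, it runs naturally on a GPU, can be stopped at any step with a usable intermediate solution, and can assign out-of-sample points by simply computing their distances to the fitted planes, which is consistent with the scalability, anytime, and out-of-sample properties claimed in the abstract.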
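The abstract also notes that the model can be viewed as a linear neural network layer and trained end-to-end with an autoencoder to capture non-linear correlations. Below is a hedged sketch of what such a combination could look like; the class names, layer sizes, and combined loss are illustrative assumptions, not the architecture from the paper.

```python
# Hedged sketch: hyperplane fitting as a layer on top of an autoencoder's latent space.
import torch
import torch.nn as nn

class HyperplaneLayer(nn.Module):
    """Holds hyperplane normals and offsets as weights; outputs point-to-plane distances."""
    def __init__(self, dim, num_planes):
        super().__init__()
        self.normals = nn.Parameter(torch.randn(num_planes, dim))
        self.offsets = nn.Parameter(torch.zeros(num_planes))

    def forward(self, z):
        unit = self.normals / self.normals.norm(dim=1, keepdim=True)
        return (z @ unit.T - self.offsets).abs()               # (batch, num_planes)

class SubspaceAutoencoder(nn.Module):
    """Autoencoder whose latent codes are pushed towards the fitted hyperplanes."""
    def __init__(self, dim, latent_dim=2, num_planes=3, tau=0.1):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, dim))
        self.planes = HyperplaneLayer(latent_dim, num_planes)
        self.tau = tau

    def forward(self, x):
        z = self.encoder(x)
        recon = self.decoder(z)
        dist = self.planes(z)
        # joint objective: reconstruct the input and keep each latent code near some plane
        cluster_loss = (-self.tau * torch.logsumexp(-dist ** 2 / self.tau, dim=1)).mean()
        return ((recon - x) ** 2).mean() + cluster_loss
```

Training such a model end-to-end with a standard optimizer would let the encoder map non-linear correlations in the input to approximately linear structures in the latent space, where the hyperplane layer can detect them.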