This paper introduces a Finsler manifold learning pipeline to capture asymmetric information in high-dimensional data, addressing the limitations of traditional symmetric Riemannian methods. The authors construct asymmetric dissimilarities and embed the data in a Finsler space, generalising existing embedding techniques such as t-SNE and UMAP to handle asymmetry. Experiments on synthetic and real datasets demonstrate that this approach reveals valuable information, such as density hierarchies, and produces higher-quality embeddings than Euclidean methods.
Unlock hidden structure in your data: Finsler manifold learning harnesses asymmetry to reveal density hierarchies and improve embeddings beyond what symmetric methods can achieve.
Manifold learning is a fundamental task at the core of data analysis and visualisation. It aims to capture the simple underlying structure of complex high-dimensional data by preserving pairwise dissimilarities in low-dimensional embeddings. Traditional methods rely on symmetric Riemannian geometry, thus forcing symmetric dissimilarities and embedding spaces, e.g. Euclidean. However, this in practice discards valuable asymmetric information inherent to the non-uniformity of data samples. We suggest harnessing this asymmetry by switching to Finsler geometry, an asymmetric generalisation of Riemannian geometry, and propose a Finsler manifold learning pipeline that constructs asymmetric dissimilarities and embeds data in a Finsler space. This greatly broadens the applicability of existing asymmetric embedders beyond traditionally directed data to any data. We also modernise asymmetric embedders by generalising current reference methods to asymmetry, yielding Finsler t-SNE and Finsler UMAP. On controlled synthetic and large real datasets, we show that our asymmetric pipeline reveals valuable information lost in the traditional pipeline, e.g. density hierarchies, and consistently provides higher-quality embeddings than their Euclidean counterparts.
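To make the idea of asymmetric dissimilarities concrete, here is a minimal illustrative sketch (not the authors' actual construction): rescaling each point's distances by its own local scale, as in self-tuning kernels, yields a dissimilarity matrix where D[i, j] ≠ D[j, i] whenever local densities differ — exactly the kind of density-driven asymmetry the abstract refers to.

```python
import numpy as np

def asymmetric_dissimilarities(X, k=2):
    """Toy asymmetric dissimilarity matrix (hypothetical construction).

    Each point's distances are divided by its own local scale, taken
    as the distance to its k-th nearest neighbour. Points in dense
    regions have small scales, so they 'see' far-away points as very
    dissimilar, while sparse points see them as closer: D[i, j] != D[j, i].
    """
    # Pairwise Euclidean distances.
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # Local scale sigma_i: distance to the k-th nearest neighbour
    # (row index 0 after sorting is the point itself, at distance 0).
    sigma = np.sort(dist, axis=1)[:, k]
    # Row-wise rescaling breaks the symmetry of the distance matrix.
    return dist / sigma[:, None]

# Dense cluster on the left, sparse points on the right.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.0],
              [5.0, 0.0], [7.0, 0.0], [9.0, 0.0]])
D = asymmetric_dissimilarities(X, k=2)
```

Here the dense point at 0.0 judges the sparse point at 5.0 as far (D[0, 3] = 5 / 0.2 = 25), while the reverse direction is mild (D[3, 0] = 5 / 4 = 1.25); a symmetric pipeline would average away this density signal.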