Paper by Erik D. Demaine

Reference:
Erik Demaine, Adam Hesterberg, Frederic Koehler, Jayson Lynch, and John Urschel, “Multidimensional Scaling: Approximation and Complexity”, in Proceedings of the 38th International Conference on Machine Learning (ICML 2021), edited by Marina Meila and Tong Zhang, Proceedings of Machine Learning Research, volume 139, July 18–24, 2021, pages 2568–2578.

Abstract:
Metric multidimensional scaling (MDS) is a classical method for generating meaningful (non-linear) low-dimensional embeddings of high-dimensional data. MDS has a long history in the statistics, machine learning, and graph drawing communities. In particular, the Kamada-Kawai force-directed graph drawing method is equivalent to MDS and is one of the most popular ways in practice to embed graphs into low dimensions. Despite its ubiquity, our theoretical understanding of MDS remains limited, as its objective function is highly non-convex. In this paper, we prove that minimizing the Kamada-Kawai objective is NP-hard and give a provable approximation algorithm for optimizing it, which in particular is a PTAS on low-diameter graphs. We supplement this result with experiments suggesting possible connections between our greedy approximation algorithm and gradient-based methods.
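
For concreteness, the Kamada-Kawai objective is (up to constants) a weighted stress: a sum over vertex pairs of ((||x_i - x_j|| - d_ij) / d_ij)^2, where d_ij is the shortest-path distance between vertices i and j in the graph. The Python snippet below is a minimal sketch of evaluating this objective for a given embedding; it is not the paper's supplementary code, and the function name and use of NumPy are illustrative assumptions.

import numpy as np

def kamada_kawai_stress(X, D):
    """Weighted stress of an embedding X (n x k) against graph distances D (n x n).

    Sketch only: sums ((||x_i - x_j|| - d_ij) / d_ij)^2 over all pairs i < j.
    """
    n = len(X)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            gap = np.linalg.norm(X[i] - X[j]) - D[i, j]
            total += (gap / D[i, j]) ** 2
    return total

# Example: a 4-cycle embedded as a unit square; only the two diagonal
# pairs (graph distance 2, Euclidean distance sqrt(2)) contribute stress.
D = np.array([[0, 1, 2, 1],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [1, 2, 1, 0]], dtype=float)
X = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
print(kamada_kawai_stress(X, D))  # approximately 0.172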

Comments:
The long version of the paper (22 pages) and the code are included as supplementary materials.

This paper and supplementary materials are also available from PMLR.

The full paper is also available as arXiv:2109.11505.

Length:
The paper is 11 pages.

Availability:
The paper is available in PDF (2384 KB).

