Researchers have proposed using non-Euclidean metrics for clustering data points. A metric for clustering should recognize that two points in the same cluster are close, even if their Euclidean distance is large. Several such metrics have been proposed, including the edge-squared metric (an example of a shortest-path metric on a graph) and the nearest-neighbor metric (a continuous, geodesic metric).
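As a minimal sketch of the discrete side: the edge-squared metric between sample points can be computed as ordinary shortest paths on the complete graph whose edge weights are squared Euclidean distances. The code below (function and variable names are illustrative, not from the paper) uses Floyd-Warshall for clarity on a small sample.

```python
def edge_squared_metric(points):
    """All-pairs edge-squared distances via Floyd-Warshall.

    The edge-squared distance between two samples is the minimum, over
    all paths through the sample set, of the sum of squared Euclidean
    lengths of the path's edges.
    """
    n = len(points)
    # Start from the complete graph with squared Euclidean edge weights.
    d = [[sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
          for j in range(n)] for i in range(n)]
    # Relax paths through every intermediate point (Floyd-Warshall).
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
d = edge_squared_metric(pts)
# Direct squared distance from (0,0) to (2,0) is 4, but routing through
# (1,0) costs 1 + 1 = 2: the metric rewards paths through dense regions.
```

Because squaring penalizes long hops superlinearly, points connected by a chain of close neighbors end up near each other in this metric even when their straight-line distance is large, which is exactly the clustering behavior described above.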

In this talk, we prove that the edge-squared and nearest-neighbor metrics are equivalent. The previous best result showed that the edge-squared metric is a 3-approximation of the nearest-neighbor metric. This paper gives one of the first proofs known to us equating a continuous metric with a discrete metric using entirely discrete methods. Our proof makes use of the Kirszbraun theorem, a notable theorem in embedding theory and computational geometry.

The results of our paper, combined with those of Hwang, Damelin, and Hero, show that the nearest-neighbor distance on i.i.d. samples is a constant-factor approximation of a natural density-based distance function.