The Effect of Scale on Quality Metrics for Dimensionality Reduction

Despite its widespread use as a quality metric for dimensionality reduction techniques, normalized stress is sensitive to uniform scaling (stretching or shrinking) of the embedding. Though used less often, KL divergence suffers from the same drawback. By exploring how normalized stress behaves on a variety of simple datasets, we can construct a scale-invariant variant: scale-normalized stress.
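
To make the problem concrete, here is a minimal sketch (in Python with NumPy and SciPy, not the authors' implementation) of normalized stress, i.e., the sum of squared pairwise-distance errors divided by the sum of squared input distances. Uniformly scaling the embedding changes the score even though the layout itself is unchanged:

```python
import numpy as np
from scipy.spatial.distance import pdist

def normalized_stress(high_d, low_d):
    """Normalized stress between a dataset and its low-dimensional embedding."""
    d_high = pdist(high_d)  # pairwise distances in the original space
    d_low = pdist(low_d)    # pairwise distances in the embedding
    # Sum of squared errors, normalized by the squared input distances.
    return np.sum((d_high - d_low) ** 2) / np.sum(d_high ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # toy high-dimensional data
Y = X[:, :2]                    # stand-in 2D "embedding"

print(normalized_stress(X, Y))      # one score for the layout...
print(normalized_stress(X, 2 * Y))  # ...a different score, same layout
```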

Description & Acknowledgement

This web application contains several figures plotting the "normalized" stress and KL divergence curves of MDS, t-SNE, and Random (points are placed randomly) embeddings over a range of scalars for various well-known datasets. The minimum of each curve is marked with a point, which denotes the scale-normalized metric score.
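
The curves can be reproduced with a short sketch along the following lines (again Python with NumPy/SciPy; the sweep range and resolution are assumptions, not the exact values used in the figures). Scale-normalized stress is the minimum of normalized stress over all uniform scalings of the embedding:

```python
import numpy as np
from scipy.spatial.distance import pdist

def normalized_stress(high_d, low_d):
    d_high, d_low = pdist(high_d), pdist(low_d)
    return np.sum((d_high - d_low) ** 2) / np.sum(d_high ** 2)

def scale_normalized_stress(high_d, low_d, scalars=None):
    """Sweep scalars applied to the embedding and return the minimizing
    scalar together with the minimum (scale-normalized) stress."""
    if scalars is None:
        scalars = np.linspace(0.1, 10.0, 200)  # assumed sweep range
    curve = [normalized_stress(high_d, s * low_d) for s in scalars]
    best = int(np.argmin(curve))
    return scalars[best], curve[best]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # toy high-dimensional data
Y = X[:, :2]                    # stand-in 2D "embedding"
print(scale_normalized_stress(X, Y))
```

Because normalized stress is quadratic in the scalar, the sweep is not strictly necessary: a standard least-squares argument gives the optimal scalar in closed form as the ratio of the sum of products of input and embedding distances to the sum of squared embedding distances (see reference 1 below).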

The code and instructions needed to reproduce the experiments behind this graphical interface, including the implementations of scale-normalized stress and KL divergence, are available on GitHub.

Published on July 7, 2024 (updated April 15, 2025). Created by Kiran Smelser to demonstrate the issues with using normalized stress as a quality metric. Thanks to Professor Kobourov for pointing out that normalized stress is scale-sensitive.

References

  1. Smelser, K., Miller, J., & Kobourov, S. (2024, October). “Normalized Stress” is Not Normalized: How to Interpret Stress Correctly. In 2024 IEEE Evaluation and Beyond – Methodological Approaches for Visualization (BELIV) (pp. 41–50). IEEE.
  2. Smelser, K., Gunaratne, K., Miller, J., & Kobourov, S. (2025, April). The Effect of Scale on Quality Metrics for Dimensionality Reduction. (In Review)

Updates and Corrections

If you see a mistake or want to suggest a change, please create an issue on GitHub.