What is it about?
We introduce novel randomized dimensionality reduction techniques for approximate nearest neighbor (ANN) search among points in high-dimensional Euclidean spaces. Our methods avoid the curse of dimensionality and use close-to-optimal storage.
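The general idea behind such randomized dimensionality reduction can be sketched as follows: project the points through a Gaussian random matrix (in the spirit of the Johnson-Lindenstrauss lemma) into a much lower-dimensional space, then search for the nearest neighbor there. This is a minimal illustrative sketch, not the paper's specific embedding; the function names, the target dimension, and the brute-force search in the reduced space are all our assumptions for illustration.

```python
import numpy as np

def build_index(points, target_dim, seed=0):
    """Project the dataset through a shared Gaussian random matrix.

    The matrix is scaled by 1/sqrt(target_dim) so that squared Euclidean
    distances are preserved in expectation (Johnson-Lindenstrauss style).
    This is an illustrative sketch, not the embedding from the paper.
    """
    rng = np.random.default_rng(seed)
    d = points.shape[1]
    G = rng.standard_normal((d, target_dim)) / np.sqrt(target_dim)
    return G, points @ G

def query(G, projected, q):
    """Return the index of the approximate nearest neighbor of q.

    The query is projected with the same matrix G, then a brute-force
    scan runs in the low-dimensional space (hypothetical search step).
    """
    qp = q @ G
    return int(np.argmin(np.linalg.norm(projected - qp, axis=1)))
```

A short usage example: build the index once over the dataset, then answer queries against the projected points; the cost per query scales with the reduced dimension rather than the original one.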
Why is it important?
Our methods have slightly worse query complexity than state-of-the-art methods such as Locality Sensitive Hashing (LSH), but they use less memory than LSH. Moreover, they are quite easy to program, as illustrated by our publicly available implementation.
Read the Original
This page is a summary of: Randomized Embeddings with Slack and High-Dimensional Approximate Nearest Neighbor, ACM Transactions on Algorithms, June 2018, ACM (Association for Computing Machinery),
DOI: 10.1145/3178540.