Abstract
Hashing is a basic tool for dimensionality reduction employed in several aspects of machine learning. However, the performance analysis is often carried out under the abstract assumption that a truly random unit-cost hash function is used, without concern for which concrete hash function is employed. A concrete hash function may work fine on sufficiently random input; the question is whether it can be trusted in the real world, where it may be faced with more structured input. In this paper we focus on two prominent applications of hashing, namely similarity estimation with the one permutation hashing (OPH) scheme of Li et al. [NIPS'12] and feature hashing (FH) of Weinberger et al. [ICML'09], both of which have found numerous applications, e.g. in approximate near-neighbour search with LSH and large-scale classification with SVM. We consider the recent mixed tabulation hash function of Dahlgaard et al. [FOCS'15], which was proved theoretically to perform like a truly random hash function in many applications, including OPH. Here we first show improved concentration bounds for FH with truly random hashing and then argue that mixed tabulation performs similarly when the input vectors are not too dense. Our main contribution, however, is an experimental comparison of different hashing schemes when used inside FH, OPH, and LSH. We find that mixed tabulation hashing is almost as fast as the classic multiply-mod-prime scheme (ax + b) mod p. Multiply-mod-prime is guaranteed to work well on sufficiently random data, but here we demonstrate that in the above applications it can lead to bias and poor concentration on both real-world and synthetic data. We also compare with the very popular MurmurHash3, which has no proven guarantees. Mixed tabulation and MurmurHash3 both perform similarly to truly random hashing in our experiments, but mixed tabulation was 40% faster than MurmurHash3, and it has the proven guarantee of good performance (as if fully random) on all possible inputs, making it more reliable.
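To make two of the ingredients named above concrete, here is a minimal Python sketch of feature hashing driven by the multiply-mod-prime scheme (ax + b) mod p. The names (`multiply_mod_prime`, `feature_hash`, `MERSENNE_P`) are illustrative and not taken from the paper's code, and the paper's point is precisely that a production implementation on structured data should prefer mixed tabulation over this simple 2-independent scheme.

```python
import random

# Hypothetical sketch of feature hashing (Weinberger et al., ICML'09)
# seeded with the classic multiply-mod-prime scheme h(x) = (a*x + b) mod p.
MERSENNE_P = (1 << 61) - 1  # a Mersenne prime, so "mod p" is cheap in practice

def multiply_mod_prime(a, b, p=MERSENNE_P):
    """Return the 2-independent hash function x -> (a*x + b) mod p."""
    return lambda x: (a * x + b) % p

def feature_hash(features, m, h, s):
    """Hash a sparse vector {feature index: value} down to m buckets.

    h picks the bucket; s picks a sign in {-1, +1}, which keeps the
    sketch's inner-product estimates unbiased.
    """
    out = [0.0] * m
    for i, v in features.items():
        sign = 1.0 if s(i) & 1 else -1.0
        out[h(i) % m] += sign * v
    return out

# Usage: draw the random seeds once, then apply them to every input vector.
rng = random.Random(1)
h = multiply_mod_prime(rng.randrange(1, MERSENNE_P), rng.randrange(MERSENNE_P))
s = multiply_mod_prime(rng.randrange(1, MERSENNE_P), rng.randrange(MERSENNE_P))
x = {7: 1.0, 123_456: -2.5, 999_999_937: 0.5}
print(feature_hash(x, m=64, h=h, s=s))
```

Swapping in a stronger hash function (e.g. mixed tabulation) only changes how `h` and `s` are constructed; the sketching loop itself is unchanged, which is why the choice of concrete hash function is easy to overlook in practice.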
Original language | English |
---|---|
Title of host publication | Neural Information Processing Systems 2017 |
Editors | I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, R. Garnett |
Number of pages | 11 |
Publisher | NIPS Proceedings |
Publication date | 2017 |
Publication status | Published - 2017 |
Event | 31st Annual Conference on Neural Information Processing Systems, Long Beach, United States. Duration: 4 Dec 2017 → 9 Dec 2017. Conference number: 31
Conference
Conference | 31st Annual Conference on Neural Information Processing Systems |
---|---|
Number | 31 |
Country/Territory | United States |
City | Long Beach |
Period | 04/12/2017 → 09/12/2017 |
Series | Advances in Neural Information Processing Systems |
---|---|
Volume | 30 |
ISSN | 1049-5258 |