Hashing, SVM and Shor’s algorithm

Hasan Mustafa
Mar 20, 2021

You might wonder how three topics from three different fields made it into one title. Here are some loosely connected thoughts on how they relate.

To illustrate, consider hash tables. They rely on pre-hashing, which maps keys to integer indices of the table, and on those keys being distributed uniformly across the indices; generating uniform randomness happens to be something quantum computing is very good at.

A hash function maps many strings to a small range of integers and relies on there being only a vanishingly small probability that most strings land on just a few indices.
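As a minimal sketch of this idea (the table size and keys are illustrative, and Python's built-in `hash()` stands in for the pre-hash function):

```python
# Pre-hashing: map string keys to integer indices of a small table.
# Note: Python randomizes string hashing per process (PYTHONHASHSEED),
# so bucket assignments differ between runs, but the mechanics hold.

TABLE_SIZE = 8

def bucket(key: str) -> int:
    """Map a string key to one of TABLE_SIZE buckets."""
    return hash(key) % TABLE_SIZE

keys = ["apple", "banana", "cherry", "date", "elderberry", "fig"]
table = {i: [] for i in range(TABLE_SIZE)}
for k in keys:
    table[bucket(k)].append(k)  # colliding keys chain in the same bucket

for i, chain in table.items():
    print(i, chain)
```

If the hash distributes keys uniformly, the chains stay short; a hash that clumped everything into a few buckets would degrade lookups toward linear time.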

A quantum computer can be used to implement the hash function. Naively implementing such a scheme shows no obvious benefit in reducing collisions or shrinking the hash table, but it might help protect hashing against collision attacks or preimage attacks.

I will explore the possibilities in future articles as this one is only to explore correlations.

Let’s see what Shor’s algorithm has done: it makes RSA encryption trivially breakable once sufficiently many qubits are available. In the NISQ era, however, Shor isn’t expected to make an impact, since enough low-noise qubits won’t be available.
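For intuition, the classical skeleton of Shor’s algorithm can be sketched with toy numbers. The order-finding step below is done by brute force, which is exactly the part a quantum computer would speed up exponentially:

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r = 1 (mod n). Found by brute force here;
    this is the step Shor's algorithm accelerates exponentially."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> int:
    """Recover a nontrivial factor of n from the order of a mod n."""
    assert gcd(a, n) == 1
    r = order(a, n)
    if r % 2 == 1:
        raise ValueError("odd order; pick another a")
    f = gcd(pow(a, r // 2, n) - 1, n)
    if f in (1, n):
        raise ValueError("trivial factor; pick another a")
    return f

print(shor_classical(15, 7))  # → 3, since 7 has order 4 mod 15
```

Breaking RSA reduces to exactly this: factoring the public modulus.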

It is surprising to note, however, that 1,099,551,473,989 was factored using Variational Quantum Factoring, which relies on QAOA (the Quantum Approximate Optimization Algorithm).

All of these techniques make the role of hashing in cryptography ever more important, as there is currently no known quantum algorithm that breaks hash functions (Grover’s search offers at best a quadratic speedup for finding preimages). More security measures will come to rely on hashing rather than encryption.

On a different note:

Support Vector Machines, also known as large-margin classifiers, are a class of classification algorithms in machine learning. They are widely used because of the “kernel trick” they support.

Let’s look closely at a support vector machine. Essentially, it tries to separate samples into two categories with a hyperplane (a line in two dimensions).

However, when the data is not linearly separable, the kernel trick comes into the picture. It lets us compute the parameters as if the data were mapped to a higher-dimensional space, without ever constructing that mapping explicitly.
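A small sketch makes the trick concrete. For the degree-2 polynomial kernel on 2-D inputs, the kernel value equals the inner product of an explicit feature map into 3-D, yet the kernel never builds the higher-dimensional vectors:

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for 2-D input: maps to 3-D."""
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2])

def k(x, y):
    """Kernel trick: the same inner product, with no explicit mapping."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])
print(np.dot(phi(x), phi(y)))  # → 121.0
print(k(x, y))                 # → 121.0, same value, cheaper to compute
```

An SVM only ever needs these inner products, which is why it can separate data in a space it never explicitly visits.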

Several methods have been proposed to implement the kernel function; one approach is to use quantum computing. One embeds the data points into a Hilbert space as quantum states and computes the kernel from state overlaps. Kernels that are difficult to compute on classical computers may be implementable on quantum computers. However, as of now, quantum kernels don’t offer any proven speedup over their classical counterparts.
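As a toy illustration (an assumption on my part: a single-qubit angle encoding, simulated classically rather than run on quantum hardware), a data point x can be embedded as the state |φ(x)⟩ = [cos(x/2), sin(x/2)], and the kernel taken as the state overlap |⟨φ(x)|φ(y)⟩|², which a real quantum device would estimate by repeated measurement:

```python
import numpy as np

def embed(x: float) -> np.ndarray:
    """Angle-encode a scalar into a single-qubit state (classical sim)."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x: float, y: float) -> float:
    """Kernel as squared overlap of the embedded states."""
    return abs(np.dot(embed(x), embed(y))) ** 2

print(quantum_kernel(0.3, 0.3))    # ≈ 1.0: identical points overlap fully
print(quantum_kernel(0.0, np.pi))  # ≈ 0.0: orthogonal states, no overlap
```

The resulting kernel matrix can be handed to any classical SVM solver; the hoped-for quantum advantage would come from feature maps whose overlaps are hard to simulate classically.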

Another interesting idea that has been proposed is to use hash functions to implement the kernel. This we reserve for another day.

Signing off.

I’m experimenting right now with a method of studying. Computer science, machine learning, and quantum computing aren’t all that different; each one often borrows concepts from the others. At a moment when computing is on the verge of the quantum revolution and big data is taking over, it is imperative to study the three in a syncretic fashion.
