Should you be thinking of Quantum Computing?

Alex Barrera
5 min readNov 6, 2017

This story was first published at The Aleph Report. If you want to read the latest reports, please subscribe to our newsletter and our Twitter.

The short answer is: it depends. If your organization deals with Deep Learning, Machine Learning, complex simulations or optimization, you should care. Quantum computing is one of those technologies we get hyped about, look into, frown at in disappointment and then dismiss. The truth, though, is that you shouldn't dismiss it. Not now.

In theory, Quantum Computing enables companies to run hard (exponential-time) problems orders of magnitude faster than current technologies. I say in theory because, in most cases, the mathematical algorithms aren't there yet. That said, this is changing, and fast.

When I say fast, I mean exponentially fast. Some weeks ago, Microsoft announced its Quantum Development Kit. IBM released something similar last year, the IBM Quantum Experience (IBM Q), becoming the first company to offer universal Quantum Computing in the cloud.

The news caught my attention. It surprised me that more and more technology companies are releasing Quantum simulators. Isn't the technology still far from being useful? The truth is, it both is and isn't. So let me separate two things.

Quantum Computers

On one side you have the Quantum Computer itself, the hardware. The speed of innovation on the hardware side is impressive. Right now there might be close to nine or ten different approaches to building a Quantum Computer. Some are very recent, like the Flip-Flop Qubit proposed by the University of New South Wales in Australia. Others are improvements over current techniques, like the Loop-Based approach from the University of Tokyo.

Hardware is still evolving. It reminds me of the early days of digital computers, with each company trying to outdo the others' architectures. The significant difference, in this case, is the speed of innovation. The acceleration of the space will bring forward a viable (as in 1,000–4,000 qubits) Universal Quantum Computer within the next few years, not more.

It's easy to dismiss the technology while it still underperforms traditional computing. There is an ongoing debate about how much faster Quantum Computers can operate. A debate that, so far, Quantum has been losing. I don't expect that to be the case for long, though.

Image: IBM’s Quantum x2000 chip

Quantum Algorithms

On the other hand, you’ve got Quantum Algorithms. This is the software abstraction that runs on top of the Quantum Computers.

Writing Quantum Algorithms is nothing like current programming. It's the comeback of assembly language, but on steroids; a trip down Universal Turing Machine memory lane.

Quantum Computing requires a complete rewrite of the underlying math of any classical algorithm. Not all algorithms are suitable to run on Quantum hardware. Quantum developers need to devise new mathematical tools to make them workable. And when I say Quantum developers, I mean hardcore mathematicians and physicists.
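To get a feel for why this is "the underlying math" rather than ordinary programming: a quantum program is, at bottom, linear algebra. The toy sketch below (pure NumPy, not any vendor's toolkit) represents a single qubit as a complex state vector and a gate as a unitary matrix, which is the style of reasoning Quantum languages expose.

```python
import numpy as np

# A single qubit is a 2-component complex state vector; a gate is a
# 2x2 unitary matrix. This is an illustrative sketch, not a real
# quantum programming framework.

ket0 = np.array([1.0, 0.0], dtype=complex)   # the |0> state

# Hadamard gate: sends |0> into an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0               # apply the gate: matrix-vector product
probs = np.abs(state) ** 2     # Born rule: measurement probabilities

print(probs)  # -> [0.5 0.5]: equal chance of measuring 0 or 1
```

Even this trivial "program" is a statement about vectors and unitaries, which is why writing real Quantum algorithms demands mathematical rather than conventional software skills.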

It all comes down to developing the right Quantum algorithm, something that isn’t easy or achievable by many. Here though is where the exciting space lies. Most technology leaders are investing in building their own Quantum Computers. Meanwhile, startups are focusing on developing the right algorithms for potential customers. One example of this is the Vancouver-based 1QBit.

In 2014, two Singularity University alumni, Landon Downs (President) and Andrew Fursman (CEO), co-founded 1QBit. Their goal? To build the right Quantum algorithms to solve intractable problems. Their clients? Financial institutions like Dow Jones, Pharma companies, technology giants like Fujitsu, AI-heavy companies, etc.

Their focus is on developing the Quantum algorithms to solve expensive computational problems. Developing these takes time and effort, which is why it’s so important to start doing it now.

There is a reason both IBM and Microsoft are encouraging developers to play with their Quantum languages: there aren't enough people qualified to be Quantum developers, and the need is becoming very real.

Quantum for what?

Three critical spaces are driving the field. The obvious one is cryptography. Our current infrastructure's security relies on public-key cryptography. Behind it lies one of the toughest mathematical problems: factoring large numbers into their prime factors.

In 1994, Peter Shor, an American professor of Applied Mathematics at MIT, developed a new algorithm to factor integers, now known as Shor's algorithm. The algorithm took advantage of the way Quantum Computing works, achieving an exponential speedup over the best known classical methods. It wasn't until 2001 that someone ran it on a real Quantum Computer. Fast-forward to 2014, and scientists had already factored a six-digit number.
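The trick behind Shor's algorithm is that factoring N reduces to finding the period r of f(x) = aˣ mod N, and that period-finding step is what a quantum computer does exponentially faster. The sketch below walks through the textbook example (N = 15), finding the period by classical brute force purely to show how the answer is turned into factors.

```python
from math import gcd

# Shor's algorithm reduces factoring N to finding the period r of
# f(x) = a^x mod N. The quantum speedup lives entirely in finding r;
# here we brute-force it, just to illustrate the classical wrap-up.

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N), by exhaustive search."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7               # the textbook example
r = find_period(a, N)      # r = 4; the quantum computer finds this fast
assert r % 2 == 0          # Shor needs an even period to proceed
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(p, q)  # -> 3 5
```

For a six-digit N this loop is still instant, but for the 600-digit numbers used in real public-key cryptography, the classical search is hopeless, which is exactly where the quantum period-finding step matters.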

While a practical break is still some years away, everyone is expecting a breakthrough soon. Such is the pace that the National Institute of Standards and Technology (NIST), the organization in charge of validating our most used cryptographic algorithms, is already talking about post-quantum cryptography.

But crypto, while important, is just the tip of the iceberg. Artificial Intelligence, and more specifically Machine Learning and Deep Learning algorithms, are becoming ubiquitous too. These algorithms need not only massive amounts of data but tremendous computational speed. Such is the need that the industry is fine-tuning its chip designs to offer ever faster training capacity to customers.

It’s not about who uses AI or not anymore. It’s about who can re-train their models faster.

The quest for fast Deep Learning training is pushing the investment in Quantum Computers too.

So far, inroads into Quantum Deep Learning have been limited. The underlying mathematics behind most Artificial Neural Networks doesn't play well with Quantum Computation. This is changing though, and quickly.

Last but not least, optimization problems, for example in the logistics and operations industries, will also benefit. Calculating the best route to transport goods at the least cost is still an expensive problem for classical computers. Traditional optimizations exist, but they're sub-optimal. As more companies move into e-commerce or ride-sharing services, being able to slash logistics costs is becoming critical.
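To see why routing is so expensive classically: the exact answer requires checking every ordering of stops, and the number of orderings grows factorially. The toy sketch below (with made-up distances) finds the cheapest round trip over four stops by exhaustive search; at four stops that is 24 permutations, at twenty stops it is already more than 10¹⁸.

```python
from itertools import permutations

# Exhaustive route search over made-up distances. The cost of this
# approach grows factorially with the number of stops, which is why
# logistics optimization is a target for quantum approaches.

dist = {
    ('A', 'B'): 4, ('A', 'C'): 2, ('A', 'D'): 7,
    ('B', 'C'): 3, ('B', 'D'): 5, ('C', 'D'): 6,
}

def d(x, y):
    # distances are symmetric; look up either ordering of the pair
    return dist.get((x, y)) or dist[(y, x)]

def route_cost(route):
    # total distance of a round trip: each leg plus the return home
    legs = zip(route, route[1:] + route[:1])
    return sum(d(x, y) for x, y in legs)

stops = ('A', 'B', 'C', 'D')
best = min(permutations(stops), key=route_cost)
print(best, route_cost(best))  # cheapest round trip costs 17
```

Real logistics systems use clever heuristics instead of this brute force, but those heuristics are exactly the "traditional, sub-optimal" optimizations mentioned above.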

If we add Autonomous Vehicles (AVs) on top of this, the picture starts to become clear. AVs require both faster Deep Learning algorithms and better-optimized routes, two problems Quantum Computers should be able to help with within a few years.


Quantum Computing isn't for everyone. It's only suitable for certain classes of mathematical problems. For those, it will allow faster and more powerful computations. While the hardware isn't there yet, it's evolving at an exponential rate. The bottleneck isn't the hardware per se, but the capacity to develop the right Quantum Algorithms. Developing such algorithms isn't trivial and requires extensive mathematical knowledge, something that isn't common.

Those organizations that start training their people in this space and start focusing on their own Industry Quantum Algorithms will gain a massive competitive advantage during the next five to ten years.

As a side note, I wonder whether current Deep Learning models could be applied to the task of developing new Quantum Algorithms. Just a final thought to get your mind reeling.

If you enjoyed this post, please share. And don’t forget to subscribe to our weekly newsletter and to follow us on Twitter!



Alex Barrera

Chief Editor at The Aleph Report (@thealeph_report), CEO at, Cofounder & associated editor @tech_eu, former editor @KernelMag.