
Quantum Cloud Computing (QC) and Killer Apps built on it can help create products with life-changing feature experiences. So, what's happening?



I have always wanted to write an article explaining why I chose the Q-bits background image for my LinkedIn profile. Those who follow quantum computing and know its intricate evolution path will readily see the disruptions a quantum cloud compute environment is all set to produce in the near future. Others, mainly the skeptics, the curious, and the uninitiated, please read on. I may end up changing your world view on Quantum Computing and what it can do for us, homo sapiens, and the planet we inhabit.

Of course, I am not saying we have the first Quantum Computer available for sale on an e-commerce site, or that software vendors are shipping applications that run on Quantum Computers. There is still the issue of the non-availability of a well-defined instruction set needed to actually program Quantum Compute (QC) machines, even if the hardware could be prototyped. But to think that articulating a quantum processor instruction set architecture is such a novel idea that its commercialization is too far off in the future may well be a significant under-estimation. Let's not forget that a similar under-estimation happened when Nvidia released its first Graphics Processing Unit (GPU), the GeForce 256, in 1999. People thought Nvidia would struggle to come up with a programming model that could unleash the power of its GPUs, but it did and prevailed: Nvidia created CUDA to program GPUs in a massively parallel processing (MPP) cluster. CUDA-type instruction set languages can be developed for QC processors as well, with some tweaks, of course, to fit the physics of Quantum Computing. For more information on CUDA, see Nvidia's CUDA page.

However, those tweaks for Quantum Processing Units (QPUs) require a re-thinking of algorithms, instruction sets, and the actual physical signals needed to program the Q-bits. While classical computers easily handle deterministic Boolean logic, problems that mimic the complexity of naturally occurring physical or chemical phenomena require quantum logic that is best processed on a QPU, using a programmer-friendly language that can manipulate the QPUs. Such problems are probabilistic by design, so algorithms have to be re-written for varying probabilistic quantum states: the answers will be different every time a process is run through the QPU.
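
To make that last point concrete, here is a minimal Python sketch (my own illustration, not any vendor's toolkit) of what "the answers will be different every time" means: it simulates measuring a single Q-bit held in an equal superposition, and while the statistics are repeatable, the individual outcomes change on every run.

    import numpy as np

    # State of one Q-bit in equal superposition: amplitudes for |0> and |1>.
    # Measurement probabilities are the squared magnitudes of the amplitudes.
    state = np.array([1, 1]) / np.sqrt(2)
    probabilities = np.abs(state) ** 2          # [0.5, 0.5]

    # "Running" the program many times: each measurement collapses the
    # superposition to a definite 0 or 1, chosen probabilistically.
    rng = np.random.default_rng()
    shots = rng.choice([0, 1], size=1000, p=probabilities)

    print("first 10 outcomes:", shots[:10])     # differs on every run
    print("fraction of 1s   :", shots.mean())   # converges to ~0.5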

To expand on the programming of a Quantum Computer, the beginnings of a language that uses radio frequency signals as the calibrating mechanism are already being articulated: an instruction set, still under development, intended to efficiently program and scale MPP QPU clusters. For those who want to deep-dive into what a practical Quantum Instruction Set Architecture may look like, Robert S. Smith, Michael J. Curtis, and William J. Zeng of Rigetti Computing in Berkeley, CA, have published an academic paper introducing an abstract machine architecture for classical/quantum computations and compilation, along with a quantum instruction language called Quil. Just as the transistor revolution, followed by Intel's 4004, its first commercial-grade general-purpose processor, ushered in a new set of highly productive and entertaining killer apps, a low-cost quantum compute resource programmable through a user-friendly and open language is sure to usher in an era of quantum-scale killer apps. Such apps would, at their core, take advantage of compute power that is exponentially greater (as is witnessed in simulated computations of natural phenomena such as particle formation during supernova explosions, human gene expression, and many more). More expansively, in everyday terms, all our "needs and wants" products that rely on natural physical or chemical processes for synthesis and production could use Quantum Computing based algorithms to simulate the incorporation of hitherto unimaginable or unsolvable feature sets.
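
To give a flavor of what such an instruction language abstracts away, here is a small NumPy sketch of my own (not Rigetti's actual toolchain) that applies the gate sequence a Quil-style program would write as "H 0" followed by "CNOT 0 1" to a two-Q-bit state vector, producing an entangled state in which both Q-bits always measure the same value.

    import numpy as np

    # Two-Q-bit register starts in |00>: a vector of 2^2 = 4 complex amplitudes.
    state = np.array([1, 0, 0, 0], dtype=complex)

    # Gate matrices that an instruction set would name symbolically.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # "H 0" then "CNOT 0 1" in a Quil-like program, expressed as matrix products.
    state = np.kron(H, I) @ state
    state = CNOT @ state

    print(state)   # amplitudes ~0.707 on |00> and |11>: an entangled Bell state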

Before going further, a bit more technical explanation of how radically different the compute power of a quantum computer can be is needed. A register of n classical bits can hold only one of its 2^n possible states at any instant, whereas a register of n Q-bits exists in a superposition of all 2^n basis states at once (thanks to quantum properties like spin), so its state is described by 2^n amplitudes that the hardware manipulates in a single operation. Every Q-bit added doubles that state space, an exponential scaling that classical machines can only imitate at enormous cost. Just like classical bits, Q-bits should follow their own Moore's law and rapidly multiply toward a threshold number of Q-bits (i.e. stabilized, isolatable, measurable quantum states), at which time QC compute power is sure to pass classical compute power and then diverge exponentially away from it. Classical compute power, meanwhile, faces hard limits: a semiconductor transistor cannot shrink below a certain number of atoms and still prevent electron leakage while achieving its gating effect. Therefore, at some point in the future, when machines have grown to tens of stable, measurable, isolatable Q-bits, Quantum Computing will have gained the compute power to solve complex stochastic problems in bursts of micro-seconds, unlike present CPU, GPU, and TPU (Tensor Processing Unit) based MPP compute clusters, which still take days or weeks to solve hyperscale, exponential, stochastic-outcome problems whose complexity mimics the stochastic outcomes of naturally occurring phenomena.
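
As a back-of-the-envelope way to see that scaling, the short script below (my own illustration) prints how many amplitudes a classical machine would have to track to fully describe an n Q-bit register, assuming 16 bytes per complex amplitude; the numbers explode long before n reaches triple digits.

    # Memory needed to represent an n Q-bit state vector classically,
    # assuming one 16-byte complex number per amplitude.
    BYTES_PER_AMPLITUDE = 16

    for n in (10, 20, 30, 40, 50):
        amplitudes = 2 ** n
        gibibytes = amplitudes * BYTES_PER_AMPLITUDE / 2**30
        print(f"{n:2d} Q-bits -> {amplitudes:>20,d} amplitudes "
              f"(~{gibibytes:,.1f} GiB to store)")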

Further, given that we are now in an age of exploding connected devices, software code, end use cases, APIs to connect them all, and endpoints that collect a huge volume and variety of data (structured, unstructured, semi-structured, probabilistic, noise-filled, etc.) at rapid velocities (real time, bursts, streams, etc.), quantum computing may become a necessity as a faster, more efficient, cheaper way to solve stochastic-outcome problems at super-hyper scale. Even less efficiently written algorithms on a Quantum Computer might solve Big-Data stochastic problems faster than the most efficient algorithms written to solve similar problems on classical CPU, GPU, or TPU MPP clusters.

An ideal state, therefore, in my opinion, would be one where the CPUs -> GPUs -> TPUs -> QPUs continuum of evolution leads to each processor type being identified as optimal for a specific class of complex problems, solving it at an efficiency, scale, and unit cost the others cannot match. It follows that combining these processing units into super-hyper-scale, hybrid compute-cloud orchestrations may achieve optimal solution efficiency at the lowest possible cost of service.
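
Purely as a hypothetical sketch of that orchestration idea (the workload categories and routing rules below are mine, not any cloud provider's scheduler), a dispatcher would route each class of problem to the processor type that handles it most economically.

    # Hypothetical routing table for a heterogeneous CPU/GPU/TPU/QPU cluster.
    # The categories and assignments are illustrative, not a vendor's scheduler.
    ROUTING = {
        "sequential_logic":      "CPU",   # branchy, deterministic business logic
        "dense_linear_algebra":  "GPU",   # graphics, simulation, general math
        "tensor_ml_training":    "TPU",   # matrix-multiply-heavy ML training
        "stochastic_simulation": "QPU",   # nature-mimicking, probabilistic problems
    }

    def dispatch(workload_type: str) -> str:
        """Pick the processor class for a workload, defaulting to CPU."""
        return ROUTING.get(workload_type, "CPU")

    for job in ("sequential_logic", "tensor_ml_training", "stochastic_simulation"):
        print(f"{job:24s} -> {dispatch(job)}")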

One drawback of a Quantum Computer is that it cannot store results. By necessity, therefore, a Quantum Computer requires integration with classical cloud infrastructure for storage and networking. This pairing is now being called Classical-Quantum hybrid computing, and efforts are underway to simulate hybrid data centers that mimic the power of the QC cloud. There is already cloud virtual machine infrastructure that simulates the equivalent of a 30 Q-bit quantum computing cluster. The simulator gives engineers an environment to write, test, and productize code that takes advantage of the virtues of a 30 Q-bit QC, even though actual commercial quantum hardware is years away from being even prototyped. But why wait? Simulate away.
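
A hedged sketch of what that hybrid division of labor could look like in code: the run_quantum_job function below is a hypothetical stand-in for whatever a simulator or future QPU service exposes, while the classical side persists and summarizes the transient measurement results the quantum side cannot keep.

    import json
    import random

    def run_quantum_job(circuit: str, shots: int) -> list:
        """Placeholder for a call to a QC simulator or future QPU service.
        Here it just returns random 0/1 measurements to keep the sketch runnable."""
        return [random.randint(0, 1) for _ in range(shots)]

    # Classical side: submit the job, then persist and summarize the results,
    # because the quantum processor itself keeps nothing between runs.
    results = run_quantum_job(circuit="H 0; MEASURE 0", shots=1000)

    summary = {"shots": len(results), "ones_fraction": sum(results) / len(results)}
    with open("quantum_job_results.json", "w") as f:
        json.dump({"raw": results, "summary": summary}, f)

    print(summary)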

Another drawback of a Quantum Computer is that its hardware requires specialized cooling systems to eliminate naturally occurring atomic thermal vibrations, or at least reduce them to a minimum by bringing the system close to absolute zero (0 kelvin). Achieving temperatures that low requires tremendous investment in cooling systems, effectively putting a multi-million dollar price tag on an off-the-shelf Quantum Computer. It naturally follows that unit economics can be achieved at scale only via global-scale data centers housing clusters of Quantum Computers, much like the unit-cost economics achieved by today's hyper-scale, globally distributed classical cloud data centers. Fortunately, the movement toward cloud computing, with its ultra-efficient unit-cost economics and pay-for-use SaaS model, is getting tremendous traction with Fortune 500 companies; according to IDC, a significant share (>50%) of Fortune 500 companies' workloads may be on the cloud by 2020. A cloud data center, therefore, would be an ideal environment to house QC hardware and clusters, with hybridized QC and classical compute clusters enabling the optimal orchestrations and abstractions needed to solve quantum-scale problems.

Who will be the next generation of start-ups that take advantage of Quantum-Classical hybrid compute environments to build complex applications and hardware-software products that change our lives in ways unimaginable or unsolvable today, and when will they arrive? I don't know the 'who', but I can fathom that the 'when' will come soon, not just because I am an eternal optimist, but also based on facts, data, and momentum that demonstrate an evolution toward commercialization as pieces of the Quantum Computing puzzle fall into place (which also means more government and privately funded research money to incubate ideas considered crazy today).

How and when do you see Quantum Computing evolving? How do you think QC will impact your life? What kinds of killer apps could we see taking advantage of QC? If you have thoughts on any of these, please share them in the comments section below.

