The Next Big Disruption in Waiting: Quantum Computing
If you ask me what is going to be the next big disruption in the way we work, my answer will definitely be quantum computing. It is going to change the way we compute, the way we live and the way we work. All the leading hardware and software giants, labs and research institutes are working around the clock in this area, and those who win the race will rule. Sadly, India has only just entered it. We will come back to that later, but first let us explore what we mean by quantum computing and what challenges it poses.
Did you know that nature -- including molecules like caffeine -- follows the laws of quantum mechanics, the branch of physics that explores how the physical world works at the most fundamental level? At this level, particles behave in strange ways, taking on more than one state at the same time and interacting with other particles that are very far away. Quantum computing harnesses these quantum phenomena to process information in a novel and promising way.
The computers we use today are known as classical computers. They have been a driving force in the world for decades -- advancing everything from healthcare to the way we live.
But there are certain problems that classical computers will simply never be able to solve. Consider the caffeine molecule in a cup of coffee. Surprisingly, it is complex enough that no classical computer that exists or could be built would be capable of modeling caffeine and fully understanding its detailed structure and properties. This is the type of challenge quantum computing has the potential to tackle.
Classical computers encode information in bits. Each bit can take the value of 1 or 0. These 1s and 0s act as on/off switches that ultimately drive computer functions. Quantum computers, on the other hand, are based on qubits, which operate according to two key principles of quantum physics: superposition and entanglement. Superposition means that each qubit can represent both a 1 and a 0 at the same time. Entanglement means that qubits in a superposition can be correlated with each other; that is, the state of one (whether it is a 1 or a 0) can depend on the state of another. Using these two principles, qubits act as more sophisticated switches, enabling quantum computers to solve difficult problems that are intractable for today's computers.
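To make these two ideas a little more concrete, here is a minimal sketch of my own (not from any vendor's material) that represents qubit states as plain NumPy vectors: a single qubit in an equal superposition of 0 and 1, and a two-qubit "Bell state" whose measurement outcomes are perfectly correlated. The variable names are purely illustrative.

```python
import numpy as np

# Basis states |0> and |1> as 2-dimensional vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Superposition: an equal mix of |0> and |1>.
# Squaring each amplitude gives the probability of measuring that value.
plus = (ket0 + ket1) / np.sqrt(2)
print("P(0), P(1) for a superposed qubit:", np.abs(plus) ** 2)   # [0.5 0.5]

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2).
# The four amplitudes are ordered |00>, |01>, |10>, |11>.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print("Probabilities over 00, 01, 10, 11:", np.abs(bell) ** 2)   # [0.5 0. 0. 0.5]
# Only 00 and 11 ever occur, so learning one qubit's outcome
# immediately tells you the other's.
```

The point of the Bell state is that neither qubit has a state of its own; the correlation lives only in the joint state, and that is exactly what a classical pair of bits cannot do.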
Quantum systems may untangle the complexity of molecular and chemical interactions leading to the discovery of new medicines and materials. They may enable ultra-efficient logistics and supply chains, such as optimizing fleet operations for deliveries during the holiday season. They may help us find new ways to model financial data and isolate key global risk factors to make better investments. And they may make facets of artificial intelligence such as machine learning much more powerful.
Many top players are in the race, but IBM Q is an industry-first initiative to build commercially available universal quantum computing systems. As part of this effort, the IBM Q Experience lets anyone connect at no cost to one of IBM's quantum processors via the IBM Cloud, run algorithms and experiments, and collaboratively explore what might be possible with quantum computing. Check out its user guides and interactive demos to learn more about quantum principles, or dive right in and create and run algorithms on real quantum hardware using the Quantum Composer and the QISKit software development kit.
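As a rough illustration of what "running an algorithm" looks like in practice, here is a small sketch using the open-source Qiskit SDK mentioned above. It assumes the qiskit and qiskit-aer packages are installed, and the exact API does vary between Qiskit versions; the circuit is the same Bell-state construction as before, sampled on a local simulator rather than on IBM's cloud hardware.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Two qubits, two classical bits to hold the measurement results.
qc = QuantumCircuit(2, 2)
qc.h(0)          # Hadamard gate: puts qubit 0 into superposition
qc.cx(0, 1)      # CNOT gate: entangles qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

# Sample the circuit 1,000 times on a local simulator instead of real hardware.
result = AerSimulator().run(qc, shots=1000).result()
print(result.get_counts())   # roughly {'00': ~500, '11': ~500}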
Before the dream of a quantum computer took shape in the 1980s, most computer scientists took for granted that classical computing was all there was. The field’s pioneers had convincingly argued that classical computers—epitomized by the mathematical abstraction known as a Turing machine—should be able to compute everything that is computable in the physical universe, from basic arithmetic to stock trades to black hole collisions.
Classical machines couldn’t necessarily do all these computations efficiently, though. Let’s say you wanted to understand something like the chemical behavior of a molecule. This behavior depends on the behavior of the electrons in the molecule, which exist in a superposition of many classical states. Making things messier, the quantum state of each electron depends on the states of all the others—due to the quantum-mechanical phenomenon known as entanglement. Classically calculating these entangled states in even very simple molecules can become a nightmare of exponentially increasing complexity.
A quantum computer, by contrast, can deal with the intertwined fates of the electrons under study by superposing and entangling its own quantum bits. This enables the computer to process extraordinary amounts of information. Each single qubit you add doubles the number of states the system can simultaneously store: two qubits can store four states, three qubits can store eight states, and so on. Thus, you might need just 50 entangled qubits to model quantum states that would require exponentially many classical bits, about 1.1 quadrillion (2^50), to encode.
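The doubling is easy to check directly. The short sketch below (my own, purely illustrative) prints how many complex amplitudes a full classical description of an n-qubit state needs.

```python
# Number of amplitudes needed to describe n entangled qubits exactly.
for n in (1, 2, 3, 10, 50):
    print(f"{n:>2} qubit(s): {2 ** n:,} amplitudes")

# Output:
#  1 qubit(s): 2 amplitudes
#  2 qubit(s): 4 amplitudes
#  3 qubit(s): 8 amplitudes
# 10 qubit(s): 1,024 amplitudes
# 50 qubit(s): 1,125,899,906,842,624 amplitudes
```

Storing each of those 2^50 amplitudes as a 16-byte complex number would already take roughly 18 petabytes of memory, which is why 50 well-controlled qubits is often quoted as the point where brute-force classical simulation becomes impractical.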
A quantum machine could therefore make the classically intractable problem of simulating large quantum-mechanical systems tractable, or so it appeared. “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical,” the physicist Richard Feynman famously quipped in 1981. “And by golly it’s a wonderful problem, because it doesn’t look so easy.”
It wasn’t, of course.
Even before anyone began tinkering with quantum hardware, theorists struggled to come up with suitable software. Early on, Feynman and David Deutsch, a physicist at the University of Oxford, learned that they could control quantum information with mathematical operations borrowed from linear algebra, which they called gates. As analogues to classical logic gates, quantum gates manipulate qubits in all sorts of ways—guiding them into a succession of superpositions and entanglements and then measuring their output. By mixing and matching gates to form circuits, the theorists could easily assemble quantum algorithms.
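To give a feel for what "gates as linear algebra" means, here is a sketch of my own (not taken from the article) in which the Hadamard and CNOT gates are literally unitary matrices, and a two-gate circuit is just a matrix product applied to a state vector. It reconstructs the same Bell state as the earlier examples.

```python
import numpy as np

# Single-qubit Hadamard gate and two-qubit CNOT gate as unitary matrices.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# A two-qubit circuit: Hadamard on the first qubit, then CNOT.
# "Running" the circuit is just multiplying its matrices onto the state vector.
circuit = CNOT @ np.kron(H, I2)

state_00 = np.array([1, 0, 0, 0])   # both qubits start in |0>
final = circuit @ state_00
print(final)                        # approx [0.707 0 0 0.707]: the Bell state again
```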
Conceiving algorithms that promised clear computational benefits proved more difficult. By the early 2000s, mathematicians had come up with only a few good candidates. Most famously, in 1994, a young staffer at Bell Laboratories named Peter Shor proposed a quantum algorithm that factors integers exponentially faster than any known classical algorithm, an efficiency that could allow it to crack many popular encryption schemes. Two years later, Shor's Bell Labs colleague Lov Grover devised an algorithm that speeds up the classically tedious process of searching through unsorted databases. Richard Jozsa, a quantum information scientist at the University of Cambridge, explains that there are a variety of examples that indicate quantum computing power should be greater than classical.
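To make Grover's speedup slightly less abstract, here is a toy sketch of my own (not the article's) of Grover's search on the smallest interesting case: four items, one of them marked. Classically you would expect to check about half of the items on average; a single Grover iteration on two qubits finds the marked item with certainty.

```python
import numpy as np

N = 4                                   # search space of 4 items = 2 qubits
marked = 3                              # the item we are looking for (|11>)

# Start in a uniform superposition over all four items.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect every amplitude about the average amplitude.
uniform = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(uniform, uniform) - np.eye(N)

# One Grover iteration (oracle, then diffusion) is enough when N = 4.
state = diffusion @ (oracle @ state)
print(np.abs(state) ** 2)               # approx [0, 0, 0, 1]: item 3 found with certainty
```

For N items the same pattern needs only on the order of sqrt(N) iterations, which is where the quadratic speedup over the roughly N/2 classical checks comes from.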
But Jozsa, along with other researchers, would also discover a variety of examples that indicated just the opposite. "It turns out that many beautiful quantum processes look like they should be complicated," and therefore hard to simulate on a classical computer, Jozsa says, yet with clever, subtle mathematical techniques it is possible to figure out what they will do. He and his colleagues found that they could use these techniques to efficiently simulate, or "de-quantize" as the mathematician Cristian Calude would put it, a surprising number of quantum circuits. Circuits that omit entanglement fall into this trap, for instance, as do those that entangle only a limited number of qubits or use only certain kinds of entangling gates.
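As a toy illustration of why entanglement-free circuits are easy to "de-quantize" (my own sketch, not the article's): if n qubits are never entangled, their joint state is just a list of n independent single-qubit states, so a classical simulator only has to track 2n amplitudes instead of 2^n.

```python
import numpy as np

n = 50

# Unentangled (product) state: each qubit keeps its own 2-vector,
# so the whole register costs only 2 * n numbers to store and update.
product_state = [np.array([1.0, 0.0]) for _ in range(n)]
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
product_state = [H @ q for q in product_state]   # apply H to every qubit: cheap

print(f"amplitudes tracked without entanglement: {2 * n:,}")   # 100
print(f"amplitudes needed for a general state:   {2 ** n:,}")  # 1,125,899,906,842,624
```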
What, then, guarantees that an algorithm like Shor's is uniquely powerful? Jozsa answers that this is very much an open question: "We never really succeeded in understanding why some [algorithms] are easy to simulate classically and others are not. Clearly entanglement is important, but it's not the end of the story."
There is no dearth of experts, however, who have begun to wonder whether many of the quantum algorithms believed to be superior may turn out to be only ordinary.
Until recently, the pursuit of quantum power was largely an abstract one. "We weren't really concerned with implementing our algorithms, because nobody believed that in the reasonable future we'd have a quantum computer to do it," Jozsa says. Running Shor's algorithm for integers large enough to unlock a standard 128-bit encryption key, for instance, would require thousands of qubits, plus probably many thousands more to correct for errors. Experimentalists, meanwhile, were fumbling while trying to control more than a handful.
By 2011, things were starting to look up. That fall, at a conference in Brussels, John Preskill, a theoretical physicist at the California Institute of Technology, speculated that the day when well-controlled quantum systems can perform tasks surpassing what can be done in the classical world might not be far off.
Recent laboratory results, Preskill noted, could soon lead to quantum machines on the order of 100 qubits. Getting them to pull off some "super-classical" feat maybe wasn't out of the question.
Although D-Wave Systems’ commercial quantum processors could by then wrangle 128 qubits and now boast more than 2,000, they tackle only specific optimization problems; many experts doubt they can outperform classical computers.
Preskill says he was just trying to emphasize that the field is getting close, that it might finally reach a real milestone in human civilization, where quantum technology becomes the most powerful information technology the world has so far seen. He called this milestone "quantum supremacy." The name, and the optimism, stuck. "It took off to an extent I didn't suspect."
The buzz about quantum supremacy reflected a growing excitement in the field—over experimental progress, yes, but perhaps more so over a series of theoretical breakthroughs that began with a 2004 paper by the IBM physicists Barbara Terhal and David DiVincenzo. In their effort to understand quantum assets, the pair had turned their attention to rudimentary quantum puzzles known as sampling problems. In time, this class of problems would become experimentalists’ greatest hope for demonstrating an unambiguous speedup on early quantum machines.
The Government of India has only just woken up to start its quest in this field. While the physics departments at the Indian Institute of Science, Bangalore, and the Harish Chandra Research Institute, Allahabad, have so far forayed only into the theoretical aspects of quantum computing, a DST official said that "the time has come to build one."
Experts from across the country are expected to gather this month in Allahabad for a workshop to develop such a computer. Internationally, Canada's D-Wave Systems is a pioneer in developing quantum computers and has sold machines to Lockheed Martin and Google.
Experts, however, say that ‘true quantum computers’ are still years away, and existing systems use principles of quantum computing to solve very limited problems.