Friday, 3 January 2025

What is quantum computing?

Quantum computing is an emerging field of computer science that harnesses the unique qualities of quantum mechanics to solve problems beyond the reach of even the most powerful classical computers.

The field of quantum computing spans a range of disciplines, including quantum hardware and quantum algorithms. While still in development, quantum technology may one day solve complex problems that supercomputers can’t solve, or can’t solve fast enough.

By taking advantage of quantum physics, fully realized quantum computers would be able to process massively complicated problems orders of magnitude faster than modern machines. Challenges that might take a classical computer thousands of years to complete could be reduced to a matter of minutes on a quantum computer.

Quantum mechanics, the study of subatomic particles, reveals unique and fundamental natural principles. Quantum computers harness these phenomena to compute probabilistically.

Four key principles of quantum mechanics

Understanding quantum computing requires understanding these four key principles of quantum mechanics:
  • Superposition: Superposition is the state in which a quantum particle or system can represent not just one possibility, but a combination of multiple possibilities.
  • Entanglement: Entanglement is the process in which multiple quantum particles become correlated more strongly than classical probability allows.
  • Decoherence: Decoherence is the process in which quantum particles and systems can decay, collapse or change, converting into single states measurable by classical physics.
  • Interference: Interference is the phenomenon in which entangled quantum states interact, making some measurement outcomes more likely and others less likely.

Qubits

While classical computers rely on binary bits (zeros and ones) to store and process data, quantum computers can encode even more data at once using quantum bits, or qubits, in superposition.

A qubit can behave like a bit and store either a zero or a one, but it can also be a weighted combination of zero and one at the same time. When combined, qubits in superposition scale exponentially: two qubits can represent four states at once, three can represent eight, and four can represent sixteen.

However, each qubit can only output a single bit of information at the end of the computation. Quantum algorithms work by storing and manipulating information in a way inaccessible to classical computers, which can provide speedups for certain problems.
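This exponential scaling can be sketched in plain Python: an n-qubit register is described by 2**n amplitudes, yet a measurement still returns only n classical bits. This is an illustrative sketch, not tied to any particular hardware or library.

```python
import math

def uniform_state(n_qubits):
    """Equal superposition over all 2**n basis states of an n-qubit register."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)          # squared magnitudes sum to 1
    return [amp] * dim

for n in (2, 3, 4):
    state = uniform_state(n)
    print(n, "qubits ->", len(state), "amplitudes")

# A measurement still yields only n classical bits: one basis state,
# chosen with probability |amplitude|**2.
```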

As silicon chip and superconductor development has scaled over the years, it is possible that we will soon reach a material limit on the computing power of classical computers. Quantum computing could provide a path forward for certain important problems.

With leading institutions such as IBM, Microsoft, Google and Amazon joining eager startups such as Rigetti and IonQ in investing heavily in this technology, quantum computing is estimated to become a USD 1.3 trillion industry by 2035.

What are qubits?

Generally, qubits are created by manipulating and measuring quantum particles (the smallest known building blocks of the physical universe), such as photons, electrons, trapped ions and atoms. Qubits can also be built from engineered systems that behave like a quantum particle, as in superconducting circuits.

Because such particles are highly sensitive, qubits must be kept extremely cold to minimize noise and prevent inaccurate results or errors caused by unintended decoherence.

There are many different types of qubits used in quantum computing today, with some better suited for different types of tasks.

Key principles of quantum computing

When discussing quantum computers, it is important to understand that quantum mechanics is not like traditional physics. The behaviors of quantum particles often appear to be bizarre, counterintuitive or even impossible. Yet the laws of quantum mechanics dictate the order of the natural world.

Describing the behaviors of quantum particles presents a unique challenge. Most common-sense paradigms for the natural world lack the vocabulary to communicate the surprising behaviors of quantum particles.

To understand quantum computing, it is important to understand a few key terms:

  • Superposition
  • Entanglement
  • Decoherence
  • Interference

Superposition

A qubit itself isn't very useful. But it can place the quantum information it holds into a state of superposition, which represents a combination of all possible configurations of the qubit. Groups of qubits in superposition can create complex, multidimensional computational spaces. Complex problems can be represented in new ways in these spaces.

This superposition of qubits gives quantum computers their inherent parallelism, allowing them to process many inputs simultaneously.
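As a rough illustration, here is a minimal plain-Python sketch of superposition: applying a Hadamard gate (a standard single-qubit operation) to the state |0> yields equal amplitudes on 0 and 1, so a measurement returns either outcome with probability 1/2.

```python
import math

h = 1 / math.sqrt(2)
H = [[h, h],
     [h, -h]]                 # the Hadamard gate as a 2x2 matrix

def apply(gate, state):
    """Multiply a 2x2 gate matrix into a 2-amplitude state vector."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

zero = [1.0, 0.0]             # the definite state |0>
plus = apply(H, zero)         # superposition: equal weight on 0 and 1
probs = [a * a for a in plus]
print(probs)                  # each probability is ~0.5
```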

Entanglement

Entanglement is the ability of qubits to correlate their state with other qubits. Entangled systems are so intrinsically linked that when quantum processors measure a single entangled qubit, they can immediately determine information about other qubits in the entangled system.

When a quantum system is measured, its state collapses from a superposition of possibilities into a binary state, which can be registered like binary code as either a zero or a one.
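Entanglement and measurement collapse can be illustrated with a toy simulation of the Bell state (|00> + |11>)/√2: each sampled measurement collapses the pair to either 00 or 11, so the two bits always agree even though each bit on its own looks like a fair coin. This is a plain-Python sketch, not real hardware.

```python
import random

# Amplitudes over the basis states 00, 01, 10, 11 for the Bell state.
amps = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]
probs = [a * a for a in amps]          # -> [0.5, 0.0, 0.0, 0.5]

def measure():
    """Collapse the entangled pair to one classical two-bit outcome."""
    return random.choices(["00", "01", "10", "11"], weights=probs)[0]

samples = [measure() for _ in range(1000)]
# The bits are perfectly correlated: only "00" and "11" ever occur.
print(set(samples))
```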

Decoherence

Decoherence is the process by which a system in a quantum state collapses into a nonquantum state. It can be triggered intentionally, by measuring a quantum system, or unintentionally, by environmental factors. Controlled decoherence is what allows quantum computers to provide measurements and interact with classical computers.

Interference

An environment of entangled qubits placed into a state of collective superposition structures information in a way that looks like waves, with amplitudes associated with each outcome. These amplitudes become the probabilities of the outcomes of a measurement of the system. These waves can build on each other when many of them peak at a particular outcome, or cancel each other out when peaks and troughs interact. Amplifying a probability or canceling out others are both forms of interference.
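Interference can be seen in a two-line sketch: applying a Hadamard gate twice to |0> makes the amplitude paths leading to |1> cancel (destructive interference) while the paths to |0> reinforce, so the measurement outcome is 0 with certainty.

```python
import math

h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]         # Hadamard gate

def apply(gate, state):
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

# First H creates superposition; second H makes the paths interfere.
state = apply(H, apply(H, [1.0, 0.0]))
print(state)                  # ~[1.0, 0.0]: the |1> amplitude cancels out
```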

How the principles work together

To better understand quantum computing, consider that two counterintuitive ideas can both be true. The first is that objects that can be measured—qubits in superposition with defined probability amplitudes—behave randomly. The second is that objects too distant to influence each other—entangled qubits—can still behave in ways that, though individually random, are somehow strongly correlated.

A computation on a quantum computer works by preparing a superposition of computational states. A quantum circuit, prepared by the user, uses operations to generate entanglement, leading to interference between these different states, as governed by an algorithm. Many possible outcomes are cancelled out through interference, while others are amplified. The amplified outcomes are the solutions to the computation.
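This prepare, interfere, amplify pattern can be sketched with one round of Grover-style amplitude amplification on two qubits, a standard textbook example rather than anything specific to this article: starting from a uniform superposition, a phase flip marks one state, and a diffusion step cancels the unmarked amplitudes while amplifying the marked one.

```python
marked = 3                          # the basis state |11>, chosen for the demo
state = [0.5, 0.5, 0.5, 0.5]        # uniform superposition of 4 basis states

state[marked] = -state[marked]      # oracle: phase-flip the marked state

mean = sum(state) / len(state)      # diffusion: reflect amplitudes about the mean
state = [2 * mean - a for a in state]

print(state)  # -> [0.0, 0.0, 0.0, 1.0]: interference leaves only the solution
```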

Classical computing versus quantum computing

Quantum computing is built on the principles of quantum mechanics, which describe how subatomic particles behave differently from macrolevel physics. But because quantum mechanics provides the foundational laws for our entire universe, on a subatomic level, every system is a quantum system.

For this reason, we can say that while conventional computers are also built on top of quantum systems, they fail to take full advantage of the quantum mechanical properties during their calculations. Quantum computers take better advantage of quantum mechanics to conduct calculations that even high-performance computers cannot.

What is a classical computer?

From antiquated punch-card adders to modern supercomputers, traditional (or classical) computers essentially function in the same way. These machines generally perform calculations sequentially, storing data by using binary bits of information. Each bit represents either a 0 or 1.

When combined into binary code and manipulated by using logic operations, we can use computers to create everything from simple operating systems to the most advanced supercomputing calculations.
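As a miniature example of such bit manipulation, a half adder combines two bits into a sum bit and a carry bit using logic gates; all classical computation ultimately reduces to compositions of deterministic operations like these.

```python
def half_adder(a, b):
    """Add two bits with logic gates: XOR gives the sum, AND gives the carry."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```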

What is a quantum computer?

Quantum computers function similarly to classical computers, but instead of bits, quantum computing uses qubits. These qubits are special systems that act like subatomic particles; they are made of atoms, superconducting electric circuits or other physical systems, and they encode data in a set of amplitudes applied to both 0 and 1, rather than in just one of two states (0 or 1). This quantum mechanical concept is called superposition. Through a process called quantum entanglement, those amplitudes can apply to multiple qubits simultaneously.

The difference between quantum and classical computing

Classical computing
  • Used by common, multipurpose computers and devices.
  • Stores information in bits with a discrete number of possible states, 0 or 1.
  • Processes data logically and sequentially.

Quantum computing
  • Used by specialized and experimental quantum mechanics-based quantum hardware.
  • Stores information in qubits as 0, 1 or a superposition of 0 and 1.
  • Processes data with quantum logic at parallel instances, relying on interference.

Quantum processors do not perform calculations the same way classical computers do. Unlike classical computers, which must compute every step of a complicated calculation, quantum circuits made from logical qubits can process enormous datasets simultaneously with different operations, improving efficiency by many orders of magnitude for certain problems.

Quantum computers have this capability because they are probabilistic, finding the most likely solution to a problem, while traditional computers are deterministic, requiring laborious computations to determine a specific, singular outcome for any given input.

While traditional computers commonly provide singular answers, probabilistic quantum machines typically provide ranges of possible answers. This range might make quantum seem less precise than traditional computation; however, for the kinds of incredibly complex problems quantum computers might one day solve, this way of computing could potentially save hundreds of thousands of years of traditional computations.

While fully realized quantum computers would be far superior to classical computers for certain kinds of problems, such as those involving large datasets or tasks like advanced prime factoring, quantum computing is not ideal for every problem, or even most problems.

More at: https://www.ibm.com/

IBM announces 50-fold quantum speed improvement

IBM launched its most advanced quantum computer yet last week at its inaugural quantum developer conference. It features nearly twice the gates of last year’s quantum utility demonstration – and a 50-fold speed increase.

Last year, in a paper published in Nature, IBM announced a breakthrough demonstration of quantum computing that can produce accurate results beyond those of classical computers. IBM calls this “utility scale.”

“We’re specifically referring to how quantum computers can now serve as scientific tools to explore new classes of problems in chemistry, physics, materials, and other fields that are beyond the reach of brute-force classical computing techniques,” says Tushar Mittal, head of product for quantum services at IBM. “Put simply, quantum computers are now better at running quantum circuits than a classical supercomputer is at exactly simulating them.”

That computer, Eagle, had a total of 127 superconducting qubits and 2,880 two-qubit gates, and took 112 hours to complete the quantum utility experiment. Today, IBM’s newest quantum chip, the 156-qubit Heron quantum processor, can handle circuits of up to 5,000 gates, and the same experiment was completed in 2.2 hours.

“This circuit itself is mainly used for benchmarking right now, but could be used for calculating expectation values for materials science problems,” Mittal says.

And there’s another improvement. Last year’s experiment used custom circuits and software. Now, IBM customers can run the same experiments using IBM’s quantum computing software development kit, Qiskit.

Up until now, the users were computational scientists exploring how these quantum circuits can be used for specific scientific domains, Mittal says. That’s starting to change. “At the 5,000-gate operations scale, we are also starting to see the emergence of quantum working in line with classical computing to calculate the properties of systems that are relevant to chemistry,” he says.

Today, researchers, scientists, and quantum developers are beginning to leverage quantum computing to help solve complex problems. For example, Cleveland Clinic is exploring this technology to simulate molecular bonds, which is key to solving pharmaceutical problems.

“We are pushing through traditional scientific boundaries using cutting-edge technology such as Qiskit to advance research and find new treatments for patients around the globe,” says Lara Jehi, chief research information officer at Cleveland Clinic, in a statement.

“The work with Cleveland Clinic is already beginning to yield results,” says Mittal. The secret sauce, he says, is that the Cleveland Clinic combined classical and quantum computing in one workflow, which produced results not possible with quantum alone.

“Enterprises can use our utility-scale systems now,” he says. “However, our ultimate goal is that developers now use these existing quantum computers to search for heuristic quantum advantages, much like the early days of GPUs being employed to find speedups in high-performance computing.”

But quantum advantage – where quantum computers are cheaper, faster, or more accurate than traditional computers – is still a few years away, he says.

IBM also demonstrated its generative AI-powered Qiskit Code Assistant, first announced a month ago, which is now in private preview. The assistant, which is built on top of IBM’s Granite gen AI models, helps users build quantum circuits, or migrate old quantum code to the latest version of Qiskit.

The latest announcement is important because it couples progress on the hardware side with that of the software, says Heather West, research manager in the infrastructure systems, platforms, and technology group at IDC.

“Not only has IBM introduced a method for efficiently scaling their systems in a modular fashion, they are also introducing the software that is needed to help optimize the circuits that will run on the hardware,” she says.

But we’re not at the end goal yet, she adds. “Like all other quantum hardware vendors, IBM is still trying to solve the error correction issues that plague the systems,” she says.

These issues are preventing quantum computers from being able to solve some of the most complex problems. “Once this issue is resolved, enterprises will be able to use the technology for more than just small scale experimentation,” she says.

https://www.networkworld.com