It’s time to start putting those qubits to work, global experts at an IBM-sponsored virtual roundtable said.
Although quantum computing is still relatively nascent, IBM and its partners are strongly urging organizations to start identifying the problems quantum could solve and to consider investing in the technology now.
Already, industries, universities, and governments around the world are making important contributions to scientific research toward practical quantum computing applications, according to scientists from IBM, Maastricht University, the European Organization for Nuclear Research (CERN), and quantum startup Qu & Co., during an IBM-sponsored virtual roundtable on Thursday.
The speakers provided insight into their research efforts and the current quantum computing landscape. They also discussed scientific challenges, such as Qu & Co.’s quantum chemistry application research; the IBM-Maastricht University-CERN collaboration to address the computational needs of the future Einstein Telescope; and the LHCb detector at the High-Luminosity Large Hadron Collider at CERN.
Quantum is entering the mainstream of the computing world because “today’s powerful computers are not enough,” said Heike Riel, IBM Fellow and lead of IBM Research Quantum Europe & Africa.
Quantum computing can help solve some of the world’s biggest problems, including research and development in agriculture (feeding people), climate change (dealing with the CO2 problem), transportation (developing zero emissions), and healthcare (understanding certain diseases and finding the right treatments), Riel said.
Classical computers cannot run the simulations needed to tackle these issues, she said. Quantum computers, which compute with quantum bits (qubits) according to the laws of quantum physics, are needed to solve these challenges, Riel said.
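As a rough illustration of why qubits behave differently from classical bits, here is a minimal, hypothetical sketch (plain Python, not IBM’s hardware or software): a single qubit can be modeled as two complex amplitudes, and a gate such as the Hadamard puts it into a superposition where both measurement outcomes are equally likely.

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>;
# measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

# Start in |0>, apply a Hadamard gate: an equal superposition of 0 and 1.
state = hadamard((1 + 0j, 0 + 0j))
p0, p1 = abs(state[0]) ** 2, abs(state[1]) ** 2
print(round(p0, 3), round(p1, 3))  # prints 0.5 0.5
```

Classical simulation like this requires amplitudes for every possible state, so the cost doubles with each added qubit, which is why large quantum systems quickly outgrow classical hardware.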
The road ahead
IBM’s quantum roadmap has progressed from a 27-qubit chip in 2019 to 65 qubits in 2020. Next year’s goal is to break the 100-qubit barrier and demonstrate a 127-qubit chip called Eagle, Riel said. In 2022, the expectation is a 433-qubit chip called Osprey, along with further miniaturization and integration of components, she said.
By the end of 2023, the goal is to break the 1,000 qubit barrier to allow for more applications, and “lead us to a path of one million qubits, which will require new infrastructure and quantum error correction,” Riel said.
Gideon Koekoek, an assistant professor at Maastricht University, said the challenge his team is working on with IBM is how to speed up the search for the right gravitational-wave signals in the enormous amounts of data expected from the coming Einstein Telescope, “which, in theory, will make it possible to explore the universe as far back as the Big Bang,” according to the university. “We’re going to have much more data that won’t be able to scale up using traditional techniques,” Koekoek said.
Storing the data isn’t the problem; filtering it will take a lot of time, he added.
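Gravitational-wave searches of this kind typically rely on matched filtering: cross-correlating detector data against banks of candidate waveform templates to find where a signal is buried. The toy sketch below (a hypothetical illustration, not the Einstein Telescope pipeline) slides one template across noisy data and reports the offset with the highest correlation; real searches repeat this for huge template banks, which is where the cost explodes.

```python
import math

# Toy matched filter: slide a template across the data and return the
# offset where the correlation (inner product) peaks.
def matched_filter(data, template):
    n, m = len(data), len(template)
    scores = [
        sum(data[offset + i] * template[i] for i in range(m))
        for offset in range(n - m + 1)
    ]
    return max(range(len(scores)), key=scores.__getitem__)

template = [math.sin(0.5 * i) for i in range(20)]
# Bury the "signal" at offset 30 in small off-frequency pseudo-noise.
data = [0.1 * math.sin(1.7 * i) for i in range(100)]
for i, t in enumerate(template):
    data[30 + i] += t

print(matched_filter(data, template))  # peak correlation at offset 30
```

Each offset costs a full inner product, and a real search multiplies that by thousands of templates and continuous data streams, which is the filtering bottleneck Koekoek describes.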
Benno Broer, founder and CEO of the Netherlands-based Qu & Co., said the company is working on developing novel quantum algorithms and quantum software for applications in chemistry and the life sciences.
Panelist Alberto Di Meglio, the openlab head at CERN, said simulations are necessary for “high-energy physicists studying the mysteries of the universe,” work that is very compute intensive.
Quantum computers can help accelerate that and extract information from data “in a way that would take classical hardware an enormous amount of time,” Di Meglio said. Simulating certain types of quantum mechanics is “where we’ll see the famous quantum advantage. No matter how much hardware you throw at classical hardware, it cannot solve [those] challenges. This is a completely new realm of understanding of the physics of how the universe works.”
One of the end goals is full quantum error correction, which will require even bigger quantum computers, Riel said. Also in development is “how to correct noisy qubits.”
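The idea behind error correction can be sketched with its simplest classical analogue, the three-bit repetition code (a toy illustration only; real quantum codes such as the surface code must also protect superpositions and cannot copy qubits directly): encode one logical bit redundantly, then recover it by majority vote even after noise flips one copy.

```python
# Classical analogue of the quantum 3-qubit bit-flip code:
# redundant encoding plus majority vote tolerates any single flip.
def encode(bit):
    return [bit, bit, bit]

def apply_noise(codeword, flip_index):
    noisy = list(codeword)
    noisy[flip_index] ^= 1  # a single bit-flip error
    return noisy

def decode(codeword):
    return 1 if sum(codeword) >= 2 else 0  # majority vote

noisy = apply_noise(encode(1), 2)  # flip one of the three copies
print(noisy, "->", decode(noisy))  # prints [1, 1, 0] -> 1
```

Quantum versions need extra qubits to measure error syndromes without disturbing the encoded state, which is why fault tolerance drives the push toward much larger machines.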
Even though today’s quantum processors are not ready to deliver quantum benefits, “that doesn’t mean you can’t do valuable things today,” said Broer. “Today, the field is more than ready for clients to develop software to solve complex and intractable problems. It starts with mathematics.”
Most of Qu & Co.’s clients “are convinced they actually need to hurry up and they can’t wait until the technology develops,” because quantum requires a completely new way of thinking about how to approach problems, Broer added.
In terms of security, there has been discussion about quantum breaking encryption, Riel said. Research is underway to test for a new standard for encrypting data, she said. “I don’t see this as a current high risk.”
What organizations should do
To prepare for quantum, organizations need to investigate the problems they would like to apply the technology to, Broer said. “What you see quite often is companies start by identifying the most relevant use cases … where they’re trying to use high-performance computing-type solutions that are falling short or expect to.”
The second action is for people to start sharing their domain knowledge, he recommended. “Get that dialogue going because if we don’t share domain expertise with someone who could potentially solve [a problem] with quantum, the solution won’t be developed.”
Because it will take time to develop those methods, people should also collaborate on quantum projects and test them on new hardware, he said.
It’s also important for organizations to educate their workforces “at a minimum level to use quantum solutions” that will eventually be available so they can help advance the field, Broer said.
But Di Meglio cautioned that “just because someone knows physics doesn’t mean they can use quantum out of the box.” CERN is addressing this by offering a weekly introductory course on quantum computing. “It took us by surprise that the first lecture was followed by almost 2,000 people around the world,” he noted.
There are many resources available for learning quantum and what can be done with it, Di Meglio said. “It requires some attention and investment, but it’s really going to lead to a revolution in how we use computing in the next five to 10 years. It’s worth the investment.”