The race to build a quantum computer is slightly overstated in Ravitej Uppu’s view. Of course, downplaying it is easy if you’ve taken a huge step ahead of the rest of the field, which he and fellow researchers at the University of Copenhagen just did.
Going Up Against Google?
Uppu is the lead author of a recent University of Copenhagen study describing a nanochip for a photonic quantum computer. It’s significant news, but he nevertheless explained that a working system is still a long way off.
“Neither Google nor any of us have a quantum computer to make a blueprint for one. So it’s a complex process that all of us appreciate,” he said, talking to 6GWorld about reports in 2019 that Google’s quantum computer had performed a calculation beyond the capabilities of a classical machine, a so-called “quantum advantage.”
To clarify, Google’s “quantum computer” isn’t a typical, readily available computer system. For Uppu, the term needs qualifying.
“When someone says quantum computer, I typically assign the term ‘universal’ in front of it, because I would guess that’s what you would consider… an add-on to an existing classical computer to do similar sorts of tasks,” said Uppu, prefacing the explanation by saying a working system is a big “ask” right now.
Even so, there’s no diminishing the significance of the Danish group’s findings, produced in collaboration with Ruhr University Bochum. The nanochip developed by Professor Peter Lodahl’s group, of which Uppu is a member, is less than one tenth the thickness of a human hair. It’s projected to be a key component of a quantum simulator, a more affordable way to predict how qubits (quantum bits) will behave than building a full-fledged quantum computer.
Bringing Photons into the Mix
The Niels Bohr Institute (at the University of Copenhagen) research showed the simulator in question is capable of achieving a quantum advantage over classical systems. Central to that result is the role photons played.
Photons are one possible quantum building block, like atoms or electrons. They were chosen because they don’t interact easily with one another and therefore stay “coherent” for a long time, which makes them better candidates for manipulation, according to Uppu. That’s not all.
“You have this added benefit that, if you can somehow put information onto photons, you can also transmit them to a remote node and do some other processing there,” he said. “They’re natural carriers of information, robustly coherent, and easy to manipulate and control.”
Hence the difference between what Google and the Niels Bohr Institute accomplished. Considering the benefits of photons, it’s no surprise other groups are working with them in quantum computing, too. Mere days before the findings were publicised, Chinese physicists also “challenged Google’s quantum advantage” over classical computers in a separate experiment using photons.
“What Google demonstrated […] is the capability of having 50 qubits and performing operations using these qubits that […] would actually be faster than a classical computer… What we were trying to do is build up the photonic counterpart of this […] with photons to demonstrate the quantum advantage,” Uppu clarified, with the Danish group successfully benchmarking its research against what he called one of the biggest classical supercomputers in existence, IBM’s Summit.
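The classical hardness behind the photonic demonstration Uppu describes stems from boson sampling, where simulating the experiment on a classical machine reduces to computing matrix permanents, a task whose cost explodes with photon number. As a loose illustration only (this is not code from the paper; the function name and test matrices are ours), here is Ryser’s formula, the fastest known general method, which still takes on the order of 2^n · n² operations for an n×n matrix:

```python
from itertools import combinations

def permanent(a):
    """Compute the matrix permanent via Ryser's formula.

    Unlike the determinant, no polynomial-time algorithm is known;
    this inclusion-exclusion sum runs in O(2^n * n^2) time.
    """
    n = len(a)
    total = 0
    # Sum over every non-empty subset of columns.
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# Tiny sanity checks: perm([[1,2],[3,4]]) = 1*4 + 2*3 = 10,
# and the permanent of an all-ones 3x3 matrix is 3! = 6.
print(permanent([[1, 2], [3, 4]]))           # → 10
print(permanent([[1, 1, 1]] * 3))            # → 6
```

At roughly 50 photons, the scale the Google comparison suggests, each permanent involves about 2^50 subset terms, which is why even a supercomputer like Summit becomes the relevant benchmark.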
Photonic vs. Quantum Computing
Uppu explained that because a quantum machine can outperform a classical one in certain situations, new problems can start being solved. It’s a notable development, even if initially only on simulators. Uppu used drug design as an example of an application area. As an illustration, pharmaceutical multinational Roche recently partnered with start-up Cambridge Quantum Computing to discover treatments for several diseases.
Meanwhile, advances such as these potentially have ramifications outside quantum computing as well. When asked, Uppu said they may help advance photonic computing, even though the two fields are distinct from one another. Nick Harris, CEO of Lightmatter, talked separately to 6GWorld and described the difference between quantum computing and the raison d’être of his photonic computing company.
“There are some really challenging pieces that you need to attach onto that optical processor if you want to build a quantum computer,” he said. “One is the source of generating single particles of light and the other one is the ability to detect single particles of light… So, the processor itself has applications in AI when you don’t use those single-photon sources and detectors and that’s what we’re doing at Lightmatter.
“We’re leveraging those standard lasers, standard detectors, all the normal telecommunications stuff in conjunction with this photonic-processor technology to build a computer chip that processes AI models very fast and very efficiently.”
When Harris said “efficiently,” he meant greater computing speed at lower energy use than today’s computers. He said the applications for Lightmatter’s computing platform are near-limitless.
“We’re targeting data centres, [Amazon Web Services], Google Cloud, Azure. They have public clouds where people can basically buy compute time and they can run their websites… We can power the AI that underlies all of the cloud-computing space,” he said. “If you want to do language translation, if you want to do text summary […] if you want to do image recognition, if you want to play games… we can play across the whole space. It’s really a general-purpose computer.”
Within the Quantum Realm of Possibilities
Uppu agreed with Harris’ assessment of the difficulties in building quantum-computing hardware. He broke it down, explaining there’s a photon generated in one chip, processed in another, and detected in yet another. There are challenges combining all these heterogeneous components and getting them to work at a high efficiency. It gets even more difficult when you look past the hardware. He listed several questions currently being asked within the quantum computing community, including the nature of error-correction mechanisms that would need to be incorporated.
“The benchmark that we considered in the paper doesn’t look at an error-correction stage, because it’s not a computational problem. If you want to go computational, where you want a specific end result, then you would have an error-correction component, which adds overhead both on the number of qubits that you would require and the way these qubits have to interact, to the framework of the network,” he said.
“From the photonic side, [our] demonstration was about this single emitter generating photons and that would just not be sufficient. You would have to have multiple emitters… These are some of the aspects that we are also exploring.”
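The overhead Uppu describes can be glimpsed through the simplest classical analogue, a repetition code: protecting even one logical bit already triples the physical-bit count, and genuine quantum codes (surface codes, for instance) are far more demanding. A toy sketch, with names of our own invention rather than anything from the research:

```python
import random

def encode(bit, n=3):
    """Encode one logical bit into n physical copies (repetition code)."""
    return [bit] * n

def noisy_channel(bits, flip_prob=0.1):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit if fewer than half flipped."""
    return int(sum(bits) > len(bits) // 2)

# A single flipped copy is corrected; the cost is 3x the qubit count.
print(decode([1, 0, 1]))   # → 1
print(decode(noisy_channel(encode(0))))
```

The point is the trade-off, not the mechanism: every logical qubit a computation needs must be backed by many interacting physical qubits, which is the “overhead” on qubit count and connectivity that Uppu mentions.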
Nevertheless, it’s still a huge step in the right direction. While mastery of quantum computing remains well out of reach for now, there is nonetheless a race of a sort taking place. In a separate interview with 6GWorld, Independent Policy Consultant Paul Ulrich went so far as to call it an arms race.
Aside from drug design, quantum computing has potential in military applications as well as in security, the latter discussed in depth at a recent Federal Communications Commission forum on the topic. According to Ulrich, quantum computing would be used not just to encrypt, but also to break encryption. The stakes are incredibly high.
“It will make current systems of encryption obsolete overnight, whoever develops the first workable quantum computing system to do that,” Ulrich said. “It has huge national security implications and undoubtedly whoever develops one first isn’t going to let the others know they have it. In other words, it’s a very formidable weapon.”
Feature image courtesy of plotplot (via Shutterstock).