Intel to invest $50 million in quantum computer research – ExtremeTech
Intel CEO Brian Krzanich released an open letter today, pledging to dedicate $50 million to long-term research of quantum computing. The CPU giant is partnering with TU Delft, the largest and oldest Dutch public technical university, and will work with QuTech, TU Delft’s quantum research institute. Intel is also pledging to dedicate its own resources and engineers to solving the problems of quantum computing.
It might seem odd to see Intel pumping so much money into quantum computing research, given that D-Wave’s systems have been tested and largely verified to be quantum computers. D-Wave’s devices, however, have some significant limitations. The number of qubits has grown fairly quickly, but the total number of connections between the qubits hasn’t scaled at the same rate, and it’s the connections between qubits that dictate the complexity and nature of the problems the computer can actually solve. D-Wave systems are sparsely connected, which vastly simplifies routing and construction but also limits the real-world use cases of the computer.
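The gap between sparse and full connectivity is easy to see with some back-of-the-envelope arithmetic. The sketch below (function names are ours, and the degree-6 figure is a rough approximation of the fixed per-qubit coupler count in D-Wave's Chimera-style layouts) compares the couplers needed for all-to-all connectivity against a bounded-degree layout:

```python
def complete_graph_couplers(n):
    """Couplers needed to connect every qubit to every other qubit."""
    return n * (n - 1) // 2

def bounded_degree_couplers(n, degree=6):
    """Upper bound on couplers when each qubit connects to at most
    `degree` neighbors (roughly the Chimera-style situation)."""
    return n * degree // 2

for n in (128, 512, 1024):
    print(n, complete_graph_couplers(n), bounded_degree_couplers(n))
```

At 512 qubits, full connectivity would need 130,816 couplers versus a few thousand for a degree-bounded layout, and the gap widens quadratically, which is why sparse connectivity is so much easier to build and so much more limiting.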
D-Wave’s devices are one type of quantum computer, called an annealer, but it’s not the only type that might theoretically be constructed, nor universally the best for every kind of potential task. The challenges of building these devices, however, are considerable. Because quantum states are extremely easy to disrupt, D-Wave cools its hardware to near absolute zero with helium-based dilution refrigeration. Intel hasn’t stated which kind of devices it wants to investigate, but room-temperature quantum computing isn’t possible (at least, not as far as we know).
These types of computers, then, aren’t the kind of hardware that slots into a smartphone or that you’re likely to have sitting on your desk. In some ways, a functional quantum computer would resemble the hardware of the 1950s and 60s: huge installations with enormous power needs, fixed locations, and high operating costs. The reason Intel and other manufacturers are so interested in building them anyway is that quantum computers can solve certain problems so fiendishly difficult that answering them accurately with traditional transistors and cutting-edge equipment would take billions or trillions of years.
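The "billions or trillions of years" figure is not hyperbole for exponential-time problems. As a hedged illustration (the guess rate is a hypothetical round number, not a measured figure for any real machine), here is what exhaustively enumerating a key space looks like for a classical machine:

```python
SECONDS_PER_YEAR = 3.154e7  # about 365.25 days

def brute_force_years(key_bits, guesses_per_second=1e12):
    """Years to enumerate a space of 2**key_bits candidates at a
    hypothetical rate of guesses per second."""
    return 2 ** key_bits / guesses_per_second / SECONDS_PER_YEAR

for bits in (64, 128, 256):
    print(bits, brute_force_years(bits))
```

Even at a trillion guesses per second, a 128-bit space takes on the order of 10^19 years, vastly longer than the age of the universe, which is the kind of wall that motivates entirely different computational models.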
Even if you think Moore’s law will pick up steam again at some point, the time scales involved make conventional transistors ill-suited to the task. As the Intel-provided infographic above points out, there are a number of other specialized applications for quantum computing as well, such as theoretically unbreakable cryptography (with the side effect that today’s widely deployed public-key cryptographic schemes could be broken by a full-scale quantum computer).
As early quantum computers come online, we’re beginning to get a basic sense of how quickly they can operate and what types of problems they solve best. Ars Technica recently covered ongoing efforts to benchmark D-Wave systems, work that illustrates how understanding the way a quantum computer operates, and what kinds of answers it can provide, significantly changes the way we benchmark and test such systems. Ongoing research into the practical systems we can build today will guide further work on the blue-sky projects of tomorrow. As Krzanich notes, “Fifty years ago, when Gordon Moore first published his famous paper, postulating that the number of transistors on a chip would double every year (later amending to every two years), nobody thought we’d ever be putting more than 8 billion of them on a single piece of silicon. This would have been unimaginable in 1965, and yet, 50 years later, we at Intel do this every day.”
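The arithmetic behind Krzanich's quote is worth spelling out. A quick sketch of the amended (two-year) doubling cadence over the 50 years since Moore's 1965 paper:

```python
def doublings(years, period_years=2):
    """Number of complete doubling periods in the given span."""
    return years // period_years

def growth_factor(years, period_years=2):
    """Transistor-count multiplier after `years` of doubling
    once every `period_years` years."""
    return 2 ** doublings(years, period_years)

print(doublings(50))      # 25 doublings
print(growth_factor(50))  # 33554432, a ~33-million-fold increase
```

Twenty-five doublings multiply the transistor count by roughly 33 million, which is how a 1965-era chip's handful of transistors becomes today's billions.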
The physics of cryogenic cooling make it unlikely that we’ll have quantum smartphones 50 years from now, but that doesn’t mean quantum hardware won’t be pushing the frontiers of human knowledge and our understanding of the universe.