Unlocking opportunities in quantum computing

12th December 2024
Caitlin Gittins

Quantum computing is entering an era of more focused investment, more commercial contracts, and more competition. Many start-ups are transitioning to scale-ups with growing teams, ambitions, and revenue streams. As quantum computing capabilities have become more sophisticated, so has engagement from research institutions and supercomputing centers, which are investing tens of millions for on-premises installations of the latest machines.

By Dr. Tess Skyrme, Senior Technology Analyst, IDTechEx

Yet, although breakthrough achievements abound across the ecosystem, the industry is still at a relatively early stage. Uncertainty remains as to which hardware approaches will be the most successful. Roadmaps suggest a turning point for advantage and value creation should only be a few years away, but significant technology challenges lie ahead to truly achieve the scalability and revolutionary promise of quantum computing.

So, as pressure mounts on quantum computing to keep showing it is on track to deliver a return on the investment made by governments and private ventures alike – what are the crucial next steps? IDTechEx outlines some key trends below, taken from their recently released report, 'Quantum Computing Market 2025-2045: Technology, Trends, Players, Forecasts'.

More error correction

Awareness of the importance of reducing errors in quantum computers has grown significantly. Individual physical qubits are notoriously vulnerable to decoherence from a variety of noise sources – from temperature and electromagnetic radiation to crosstalk. Decoherence is catastrophic for quantum advantage: affected qubits no longer simultaneously represent 1s and 0s, but quite classically 1s or 0s.

One method of overcoming the impact of noise and decoherence is quantum error correction (QEC). In simple terms, this means creating abstracted, error-free, logical qubits from a collection of noisy physical qubits: by comparing the properties of the group, enough information about the noise can be extracted to correct it – analogous to playing a game of broken telephone enough times to decode the original message. The exact mathematical approaches to large-scale error correction remain a highly active area of research – particularly by the likes of experts at Riverlane. Yet the conclusion is clear: the number of logical qubits per system is becoming a more important benchmark of quantum computer hardware's long-term potential for success.
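The 'broken telephone' intuition can be illustrated with a classical analogy. The sketch below uses a 3-bit repetition code with majority voting – a deliberate oversimplification, since real QEC schemes such as surface codes measure stabilisers without reading qubits directly – but it shows how redundancy plus comparison suppresses errors. All figures and names here are illustrative, not drawn from any vendor's scheme.

```python
import random

def encode(bit, copies=3):
    """Encode one logical bit as several physical copies (repetition code)."""
    return [bit] * copies

def apply_noise(physical_bits, flip_prob):
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in physical_bits]

def decode(physical_bits):
    """Recover the logical bit by majority vote."""
    return int(sum(physical_bits) > len(physical_bits) / 2)

random.seed(0)
trials = 10_000
flip_prob = 0.1  # illustrative physical error rate

# Error rate of a single unprotected bit vs. a majority-voted triple.
raw_errors = sum(apply_noise([0], flip_prob)[0] for _ in range(trials))
corrected_errors = sum(decode(apply_noise(encode(0), flip_prob)) for _ in range(trials))
print(raw_errors / trials)        # ~0.10 without correction
print(corrected_errors / trials)  # ~0.028 with correction (3p^2 - 2p^3 < p)
```

The corrected logical error rate falls below the physical rate whenever the physical rate is under 50% – the classical counterpart of the 'threshold' behaviour that makes fault-tolerant quantum computing possible.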

Strikingly, it is apparent that the required ratio of physical to logical qubits varies dramatically between qubit modalities. Evidence suggests that for photonic, it could be as low as 2:1, for neutral atom and trapped ion nearer 10:1 – while superconducting could require more than 1000:1. To some extent, this has temporarily leveled the playing field in the quantum computing market, seeing challengers such as QuEra catch up with, if not overtake, giants like IBM and Google in the race for high numbers of logical qubits.
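The consequence of these differing ratios is easy to quantify. The sketch below multiplies out the indicative ratios quoted above for a hypothetical target of 100 logical qubits; real overheads depend on physical error rates, code distance, and target fidelity, so treat these as illustrative orders of magnitude only.

```python
# Indicative physical-to-logical ratios quoted in the text above.
RATIOS = {
    "photonic": 2,
    "neutral atom": 10,
    "trapped ion": 10,
    "superconducting": 1000,
}

def physical_qubits_needed(logical_target, ratio):
    """Physical qubits required to reach a target logical qubit count."""
    return logical_target * ratio

target = 100  # hypothetical target of 100 logical qubits
for modality, ratio in RATIOS.items():
    print(f"{modality}: {physical_qubits_needed(target, ratio):,} physical qubits")
```

At a 1000:1 overhead, 100 logical qubits already implies a 100,000-physical-qubit machine – which is why the ratio, not the raw qubit count, is becoming the benchmark to watch.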

Overall, the need to now transition into a 'logical era' is clear. This is well evidenced by the focus on this benchmark in the latest roadmaps by multiple players across the industry. Yet, unfortunately, solely optimising system design towards reducing errors won't be enough to secure long-term success; the impact on overall size and power consumption must also be considered.

Less demanding infrastructure

Overcoming the infrastructure limitations associated with scaling quantum computer hardware is no easy task. Almost all systems today require cooling, whether it be using cryostats or lasers. It is often the cooling system that can be the most demanding on space. However, as efforts to increase logical qubit numbers intensify, the space within each cooling system to house them is running out.

As a result, today, many hardware roadmaps show a modular approach with multiple systems connected. On the one hand, quantum computing is designed for high-value problems to be solved over the Cloud, so requiring a large footprint within a data center isn't necessarily a huge barrier to adoption. However, in some instances, the associated power demand of this approach for a large-scale fault-tolerant (LSFT) machine is calculated to be in the megawatts – enough to warrant its own small modular reactor. To truly follow the trend of classical computing from vacuum tube to smartphone, it's time to start making components smaller before capabilities can get bigger.

One key aspect impacting infrastructure demand is qubit density, or the physical size of qubits. Some modalities claim to have a significant advantage in this area over others. For example, it is currently estimated that superconducting and photonic designs could integrate thousands of qubits per chip, trapped-ion tens of thousands, and silicon-spin billions. This is partly limited by the dimensions of the quantum state utilised as well as the manufacturing methods available to produce them. The size advantage offered by silicon-spin is largely a result of leveraging the highly optimised techniques already adopted by the semiconductor industry for transistor and CMOS manufacture. Notably, Microsoft is also working towards hardware-protected Majorana qubits, microns in scale, specifically stating the advantage of enabling a 'single module machine of practical size'. That being said, given the impact of crosstalk and other noise sources, how the required spacing between qubits will change at scale across all modalities remains uncertain.
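Qubit density feeds directly into how modular a machine must be. Using the per-chip estimates quoted above (illustrative figures, not vendor specifications), a back-of-envelope calculation shows how many chips a hypothetical million-physical-qubit machine would need under each modality:

```python
# Per-chip qubit density estimates quoted in the text above (illustrative).
QUBITS_PER_CHIP = {
    "superconducting": 1_000,
    "photonic": 1_000,
    "trapped ion": 10_000,
    "silicon spin": 1_000_000_000,
}

def chips_needed(total_qubits, per_chip):
    """Minimum number of chips (modules) to host total_qubits, rounding up."""
    return -(-total_qubits // per_chip)  # ceiling division

total = 1_000_000  # hypothetical million-physical-qubit machine
for modality, per_chip in QUBITS_PER_CHIP.items():
    print(f"{modality}: {chips_needed(total, per_chip)} chip(s)")
```

A thousand interconnected superconducting chips versus a single silicon-spin die is a stark contrast – and it is exactly this gap that drives the interest in semiconductor-style manufacturing.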

Furthermore, it can’t be overlooked that, as well as the qubits themselves, often the most space is needed for manipulation and readout systems. For example, moving from hundreds to thousands of qubits can lead to unfeasible requirements for microwave cabling, interconnects, lasers, and more. As a result, many players are now also developing more optimised approaches for scalable manipulation and control. SEEQC have created a digital, on-chip alternative to analog control for superconducting qubits, which is now of growing interest to other modalities in the ecosystem. Similarly, Oxford Ionics have recently patented an ‘electronic qubit control’, an on-chip interface for trapped-ion modalities. In fact, overcoming ‘the wiring challenge’ is an almost ubiquitous focus of research among start-ups and established players alike. Looking ahead, remaining agile across the quantum stack will offer an advantage over vertical integration in this regard.

Market outlook

In this increasingly competitive industry, the coming years will illuminate which strategies hold the greatest promise for securing a lasting quantum commercial advantage. This task will be an uphill balancing act between reducing errors and scaling up logical qubit numbers while also optimising for resource efficiency. This is without even considering gate speed, algorithm development, and many other crucial factors. The enormity of the task will likely see many players fail to survive until the end of the decade. Yet with market consolidation and convergence of talent should come increased clarity as to where and when quantum advantage could be offered first – serving only to increase end-user confidence and engagement. Despite the headwinds, the world-changing potential of quantum computers within finance, healthcare, sustainability, and security will remain a tantalising enough carrot for not only individual companies but entire nations to chase.

© Copyright 2024 Electronic Specifier