A Netherlands company’s demonstration of a production-ready method to reduce errors in a quantum computer is a first for the European quantum industry


April 2, 2026 – Enschede, The Netherlands – QuiX Quantum, a leading provider of photonic quantum computing hardware, today announced that it has demonstrated “below threshold” error mitigation for the first time on a photonic quantum computer, suppressing physical qubit errors to a level compatible with scalable, fault‑tolerant quantum computing.
The achievement marks the first time a European company has demonstrated a production-ready method of error reduction, underscoring the scalability of QuiX Quantum’s photonics-based quantum computing platform. The project was conducted on the QuiX Bia™ Cloud Quantum Computing Service in collaboration with NASA’s Quantum Artificial Intelligence Laboratory, the University of Twente, and Freie Universität Berlin.
Quantum information is fragile, and without error correction, computations of any user-relevant size are impossible. For this reason, the ability to control errors in the quantum state is seen as a key milestone for any of the competing computing platforms, and experts increasingly regard it as the crucial differentiator between technological approaches.
For such a protocol to be meaningful, it must meet two conditions: it must remove more errors than it introduces, and it must not impede the operation of the rest of the computer. QuiX Quantum is the first in photonics to demonstrate a protocol that meets both requirements simultaneously. The findings are described in a paper, currently undergoing peer review, available at https://arxiv.org/abs/2601.05947.
“Below-threshold, physical error mitigation has never been implemented in a photonic quantum computer. This achievement marks a significant milestone and places QuiX Quantum at the forefront of progress toward fault-tolerant photonic quantum computing,” said Stefan Hengesbach, CEO of QuiX Quantum. “We believe the most resource-efficient strategy is to reduce errors early rather than correct them at great expense — and by demonstrating net positive error mitigation on real hardware, we’ve taken a foundational step that showcases European leadership in accelerating quantum technologies toward powerful, large-scale systems.”
“This paper represents an important jump forward towards large-scale photonic quantum computing,” said David DiVincenzo, director of the Institute of Theoretical Nanoelectronics at the Peter Grünberg Institute at Forschungszentrum Jülich.
“By using a multimode optical Fourier transform, the authors have experimentally established an elegant photon distillation scheme that would significantly cut the resource costs of future photonic quantum processors. This work takes a big step forward on one of the most stubborn bottlenecks in creating indistinguishable photons, hinting at a scalable path towards quantum fault tolerance.”
Photonic quantum computers use photons – particles of light – as their information carriers. The photons propagate through an optical chip and entangle with each other through their quantum particle statistics. However, the sources producing these particles are imperfect, and any which-path information carried by the particles destroys the entanglement, resulting in distinguishability errors.
Photon distillation is a hardware level, coherent technique for error reduction that improves the quality of single photons before computation. Using quantum interference among multiple imperfect photons, the method creates a cleaner, more indistinguishable photon without heavy qubit redundancy or classical post-processing.
Using a programmable 20‑mode photonic processor, the team demonstrated a photon distillation gate that makes photons measurably more alike, reducing photon distinguishability error by a factor of 2.2. Even accounting for the additional noise introduced by the gate itself, the device delivered a 1.2× net reduction in total error, demonstrating net‑gain mitigation.
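To make the net-gain arithmetic concrete, the sketch below shows how a 2.2× reduction in one error channel can translate into a smaller net reduction in total error once gate noise is included. Only the 2.2× and 1.2× headline factors come from this announcement; the specific error budgets in the example are hypothetical, chosen purely for illustration, and do not reflect the error model in the paper.

```python
# Illustrative arithmetic for net-gain error mitigation.
# Hypothetical error budgets; only the 2.2x distillation factor
# and the ~1.2x net figure come from the announcement.

def net_reduction(distinguishability_error: float,
                  other_error: float,
                  distill_factor: float,
                  added_gate_noise: float) -> float:
    """Ratio of total error before vs. after distillation."""
    before = distinguishability_error + other_error
    after = (distinguishability_error / distill_factor
             + other_error
             + added_gate_noise)
    return before / after

# Made-up budgets: distillation divides one error channel by 2.2,
# but the gate adds its own noise, so the net gain is more modest.
ratio = net_reduction(
    distinguishability_error=0.06,
    other_error=0.02,
    distill_factor=2.2,
    added_gate_noise=0.019,
)
print(f"net total-error reduction: {ratio:.2f}x")  # roughly 1.2x
```

The point of the sketch is simply that "below threshold" requires the first term to shrink faster than the last term grows; with these toy numbers the net gain lands near the reported 1.2×.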
The research also shows that combining photon distillation with quantum error correction may significantly reduce system-level resource demands. Modeling based on current photon source performance and photonic architectures suggests the approach could reduce the number of photon sources required per logical qubit by up to a factor of four, lowering system complexity and cost.
“For any quantum computer modality to scale, you have to prove you can remove more error than you add while the computer is still able to run, and that’s what we’ve shown here,” said Jelmar Renema, Chief Scientist at QuiX. “Our photon distillation gate is compatible with running real computations and delivers net gain error mitigation once all gate noise is included. That’s why this is a major achievement for photonics and quantum computing in general.”
The project was partially funded by the Netherlands Ministry of Defense's Purple NECtar Quantum Challenges initiative.
About QuiX Quantum
QuiX Quantum is a leading provider of photonic quantum computing hardware, driving innovation across Europe through the development of its Universal Quantum Computer. The first system, already sold and contracted for delivery, underscores the impact of QuiX Quantum’s market-leading hardware and renowned quality. Following its expansion across Europe and the UK, QuiX Quantum continues to push the boundaries of quantum technology and industry, strengthening Europe’s international competitiveness while leveraging a wide network of partners and serving a growing global customer base.