WERQSHOP Talk

Noise vs quantum algorithms

Yihui Quek ⊗ MIT ⊗ [Slides]

I am a postdoctoral fellow at MIT, working with Peter Shor and Aram Harrow. Previously I was a long-term visitor at the Simons Institute for the Theory of Computing, a Harvard Quantum Initiative research fellow at Harvard University, and an Alexander von Humboldt fellow at the Free University of Berlin. Before that, I completed my PhD at Stanford University and my Bachelor's at MIT. I would love to live in a world with large, error-corrected quantum computers. But right now we only have noisy, small ones. To illuminate paths or dead ends toward that goal, much of my work applies information and complexity theory to open up new classical-quantum gaps or to study the limitations of near-term quantum devices. I strongly believe that scrutinizing theoretical physics through a computational lens will lead to its next paradigm shift.

Abstract

What can we compute in the presence of noise? Noise limits our ability to error-mitigate, a term that refers to near-term schemes where errors that arise in a quantum computation are dealt with in classical post-processing. I present a unifying framework for error mitigation and an analysis that strongly limits the degree to which quantum noise can be effectively "undone" at larger system sizes, showing that current error-mitigation schemes are more or less as good as they can be. I will then switch gears and describe techniques to classically simulate expectation values on random circuits in the presence of non-unital noise.
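To make the idea of classical post-processing concrete, here is a minimal sketch (not from the talk) of one well-known error-mitigation scheme, zero-noise extrapolation: the expectation value is measured at deliberately amplified noise levels and extrapolated back to zero noise. The decay model, the ideal value `E0`, and the noise rate `lam` are all hypothetical stand-ins for an actual device.

```python
# Illustrative sketch: zero-noise extrapolation (Richardson extrapolation).
# We model a noisy expectation value as E(s) = E0 * exp(-s * lam), where s is
# a noise-amplification factor, then extrapolate to s = 0. All numbers below
# are hypothetical placeholders, not data from the talk.
import math

E0 = 0.8    # hypothetical ideal (noiseless) expectation value
lam = 0.1   # hypothetical base noise rate

def noisy_expectation(scale):
    """Expectation value with the noise amplified by `scale`."""
    return E0 * math.exp(-scale * lam)

# Two-point Richardson extrapolation: cancels the O(lam) error term.
e1 = noisy_expectation(1.0)   # measured at base noise
e2 = noisy_expectation(2.0)   # measured at doubled noise
mitigated = 2 * e1 - e2       # linear extrapolation to zero noise

print(f"noisy:     {e1:.4f}")
print(f"mitigated: {mitigated:.4f} (ideal: {E0})")
```

The catch, which the talk's limits speak to, is that the variance of such estimators grows as noise levels are amplified, so the mitigation overhead scales badly with system size.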
