NetIAS Lectures Series "Machine Learning Approaches for Debugging a Quantum Computer"
As part of the European NetIAS Lectures Series, Violeta Ivanova-Rohling presented a talk entitled "Machine Learning Approaches for Debugging a Quantum Computer".
The European NetIAS Lecture Series is organized jointly by the institutes participating in NetIAS. In the winter semester 2021/22, the New Europe College is responsible for organizing the current series of lectures.
Researchers from different fields and from various European centres invite us, on the last Thursday of every month, to reflect on knowledge in a digital age. In December, due to the winter holidays, the lecture will be given on the second Thursday of the month.
This semester promises some fascinating forays into South-East European history (Constantin Ardeleanu, New Europe College), computer science (Violeta Ivanova-Rohling, Zukunftskolleg - University of Konstanz), history of science and technology (Lino Camprubi, Institut d'Etudes Avancées d'Aix-Marseille Université) and radiology (Ruben Pauwels, Aarhus Institute of Advanced Studies).
Abstract:
In the past decades, mounting evidence that quantum algorithms can solve specific tasks with efficiency beyond the capability of state-of-the-art classical computers has attracted tremendous interest in the field. A turning point was Shor’s algorithm for prime factorization, a polynomial-time quantum algorithm for a problem that is hard for classical computers. A fully functioning all-purpose quantum device would have an enormous impact on our lives, with applications in science, drug discovery, disaster preparedness, space exploration, and environmental sustainability, among many others. As a consequence, an increasing number of countries and companies are investing billions of dollars in a race to produce and commercialize the quantum computer. Various physical systems for quantum computation have already been developed, and hybrid quantum algorithms, which aim to solve optimization problems more efficiently, can run on existing noisy intermediate-scale quantum devices. However, a full-size general-purpose quantum computer is still out of reach.

One of the difficulties in developing such a device is that, as the size and complexity of a quantum computer grow, more sophisticated techniques for calibrating and evaluating its performance are required in order to build fault-tolerant devices. Quantum state tomography (QST) is a prominent technique for the verification of a quantum computer: it allows a given quantum state to be reconstructed from measurement data. Because it provides comprehensive information about a given quantum state, QST is known as the “gold standard” for the verification of a quantum device; however, its computational cost makes it infeasible for systems larger than a few qubits. Moreover, it can be time-consuming even for small systems, i.e. the building blocks of a quantum computer of only one or two qubits. Efficient QST would therefore be an important step toward making a general-purpose quantum device possible.

One aspect of the efficiency of the QST procedure is the choice of measurement scheme, which determines how many measurements must be performed to carry out QST. Finding a measurement scheme that minimizes the number of required measurements can be formulated as an optimization problem. My work focuses on applying and developing optimization and machine learning methods to find such measurement schemes. By using prior knowledge about the landscape of potential solutions, such as particular symmetries and invariances, one can improve the exploration of the search space and find optimal measurement schemes.
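To make the idea of QST concrete, below is a minimal illustrative sketch (not taken from the talk) of single-qubit tomography by linear inversion, written in Python with NumPy: the state is reconstructed from estimated expectation values of the Pauli observables X, Y, and Z. The function names and the simulated-measurement setup are hypothetical and chosen only for illustration; the realistic measurement schemes and their optimization discussed in the abstract go well beyond this example.

```python
import numpy as np

# Pauli matrices: the standard single-qubit measurement observables.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def simulate_expectations(rho, shots=10_000, rng=None):
    """Estimate <X>, <Y>, <Z> of the state rho from simulated projective measurements."""
    rng = np.random.default_rng() if rng is None else rng
    estimates = []
    for pauli in (X, Y, Z):
        eigvals, eigvecs = np.linalg.eigh(pauli)  # measurement outcomes are the eigenvalues +1 / -1
        probs = np.real([eigvecs[:, k].conj() @ rho @ eigvecs[:, k] for k in range(2)])
        probs = np.clip(probs, 0, None)
        probs = probs / probs.sum()               # guard against rounding noise
        outcomes = rng.choice(eigvals.real, size=shots, p=probs)
        estimates.append(outcomes.mean())
    return estimates

def linear_inversion(ex, ey, ez):
    """Reconstruct rho = (I + <X> X + <Y> Y + <Z> Z) / 2 from the three expectation values."""
    return (I2 + ex * X + ey * Y + ez * Z) / 2

# Example: tomography of the |+> state, whose true density matrix is [[0.5, 0.5], [0.5, 0.5]].
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_true = np.outer(plus, plus.conj())
rho_est = linear_inversion(*simulate_expectations(rho_true, rng=np.random.default_rng(0)))
print(np.round(rho_est, 3))
```

For a single qubit, the three Pauli observables already form a complete measurement scheme, so there is little to optimize; for larger registers the number of measurement settings in such straightforward schemes grows exponentially with the number of qubits, which is precisely what motivates searching for measurement schemes that minimize the number of required measurements.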