Development of quantum algorithms and quantum computational architectures, assessment of quantum information resources
Quantum computing is on the verge of becoming a production technology rather than a scientific curiosity. Multiple providers offer access to their NISQ (Noisy Intermediate-Scale Quantum) computers, containing 50–100 quantum bits, mostly realised with superconducting circuits. A technological leap is needed to increase the number of quantum bits further; in the meantime, computational performance can be enhanced by error detection and correction rather than by extending the hardware. Another key problem is quantum computer benchmarking. Within the framework of the Quantum Information National Laboratory, we aim at a multidisciplinary effort to study the different types of errors and inherent noise, including errors correlated over space and time. To assess gate errors, we develop general-purpose randomised benchmarking schemes built on unitary designs, which go beyond the popular Clifford gates and can therefore be applied to other types of gates. Another prospective benchmarking tool, which we developed earlier, is based on the nonlinear dynamics of iterated, measurement-corrected quantum protocols. These protocols can serve certain quantum computing tasks, as their dynamics is extremely sensitive to noise in some regimes. Atoms trapped in optical lattices, to be realised in the National Laboratory, are prospective candidates for implementing such protocols.
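The idea behind randomised benchmarking can be sketched numerically. The toy model below is our own illustration, not the Laboratory's actual scheme: Haar-random single-qubit unitaries stand in for gates drawn from a unitary design, gate noise is modelled by a depolarising channel with error probability p, and each random sequence is undone by its exact inverse. The average probability of returning to the initial state then decays exponentially with sequence length, and the decay rate estimates the average gate error.

```python
# Minimal single-qubit randomised-benchmarking sketch (hypothetical
# illustration): Haar-random unitaries play the role of a unitary design,
# and a depolarising channel models the per-gate noise.
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary():
    """Sample a 2x2 Haar-random unitary via QR decomposition."""
    z = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix the phases so the distribution is Haar

def depolarise(rho, p):
    """Depolarising channel: with probability p, replace the state by I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

def survival(seq_len, p=0.01, sequences=50):
    """Average probability of returning to |0> after seq_len noisy random
    gates followed by the (also noisy) inverse of the whole sequence."""
    probs = []
    for _ in range(sequences):
        rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|
        total = np.eye(2, dtype=complex)
        for _ in range(seq_len):
            u = haar_unitary()
            total = u @ total
            rho = depolarise(u @ rho @ u.conj().T, p)
        inv = total.conj().T  # exact inverse of the accumulated sequence
        rho = depolarise(inv @ rho @ inv.conj().T, p)
        probs.append(rho[0, 0].real)
    return float(np.mean(probs))

# For depolarising noise the survival is 0.5 * (1 + (1 - p)**(m + 1)),
# i.e. approximately 0.990, 0.976, 0.921 for m = 1, 4, 16 at p = 0.01.
decay = [survival(m) for m in (1, 4, 16)]
```

In a real randomised benchmarking experiment the exponential decay is fitted to extract the average gate fidelity; because the depolarising channel commutes with the unitaries here, the decay can also be checked against the closed-form expression in the comment.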
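The noise sensitivity of iterated, measurement-corrected protocols can also be illustrated with a toy model. We assume, for illustration only, the simplest member of this family: a post-selected two-copy scheme whose effect on the qubit amplitude ratio z = β/α is the complex quadratic map z ↦ z², which is chaotic on the unit circle. A tiny noise-induced perturbation of the initial state then grows exponentially with the number of iterations:

```python
import numpy as np

def iterate_protocol(z, steps):
    """Iterate the measurement-induced nonlinear map z -> z^2 on the
    amplitude ratio z = beta/alpha of a qubit state (toy model)."""
    for _ in range(steps):
        z = z ** 2
    return z

# Two states on the unit circle (the chaotic set of the map), differing by
# a tiny phase perturbation that stands in for residual noise.
z_ideal = np.exp(1j * 1.0)
z_noisy = np.exp(1j * (1.0 + 1e-6))

d_initial = abs(z_noisy - z_ideal)  # ~1e-6
d_final = abs(iterate_protocol(z_noisy, 20) - iterate_protocol(z_ideal, 20))
# The angular separation doubles at each step, so after 20 iterations it has
# grown by a factor 2**20 -- from 1e-6 to order unity.
```

This exponential separation of nearby trajectories is what makes such protocols attractive as noise probes: in the chaotic regions of state space, even very weak noise produces a macroscopically different outcome after a modest number of iterations.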