Randomized Benchmarking doesn't measure what you (probably) think it does
New Journal of Physics
While adiabatic quantum computing (AQC) has some robustness to noise and decoherence, it is widely believed that encoding, error suppression and error correction will be required to scale AQC to large problem sizes. Previous works have established at least two different techniques for error suppression in AQC. In this paper we derive a model for describing the dynamics of encoded AQC and show that previous constructions for error suppression can be unified with this dynamical model. In addition, the model clarifies the mechanisms of error suppression and allows the identification of its weaknesses. In the second half of the paper, we utilize our description of non-equilibrium dynamics in encoded AQC to construct methods for error correction in AQC by cooling local degrees of freedom (qubits). While this is shown to be possible in principle, we also identify the key challenge to this approach: the requirement of high-weight Hamiltonians. Finally, we use our dynamical model to perform a simplified thermal stability analysis of concatenated-stabilizer-code encoded many-body systems for AQC or quantum memories. This work is a companion paper to 'Error suppression and error correction in adiabatic quantum computation: techniques and challenges (2013 Phys. Rev. X 3 041013)', which provides a quantum information perspective on the techniques and limitations of error suppression and correction in AQC. In this paper we couch the same results within a dynamical framework, which allows for a detailed analysis of the non-equilibrium dynamics of error suppression and correction in encoded AQC. © IOP Publishing and Deutsche Physikalische Gesellschaft.
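The energy-penalty mechanism behind error suppression can be illustrated with a small, self-contained sketch. The following Python snippet is a minimal toy example, not code from the paper: it assumes the 3-qubit bit-flip repetition code and an illustrative penalty strength E_P, builds the stabilizer penalty Hamiltonian, and checks that any single bit-flip error costs 2*E_P in energy relative to the code space, which is the sense in which the penalty suppresses (but does not correct) local errors.

import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron(*ops):
    return reduce(np.kron, ops)

E_P = 5.0  # illustrative penalty strength
# Stabilizers of the 3-qubit bit-flip code: Z1Z2 and Z2Z3.
H_pen = -E_P * (kron(Z, Z, I2) + kron(I2, Z, Z))

# The code space (|000>, |111>) is the ground space of H_pen.
evals = np.sort(np.linalg.eigvalsh(H_pen))
print("ground energy:", evals[0], "  first excited:", evals[2])  # gap = 2*E_P

# A single bit flip violates one stabilizer and is lifted 2*E_P above the code space.
psi = np.zeros(8)
psi[0] = 1.0                      # |000>
err = kron(X, I2, I2) @ psi       # bit flip on the first qubit
print("energy of corrupted state:", err @ H_pen @ err)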
Physical Review Letters
We present an approach to the simulation of quantum systems driven by classical stochastic processes that is based on the polynomial chaos expansion, a well-known technique in the field of uncertainty quantification. The polynomial chaos technique represents the density matrix as an expansion in orthogonal polynomials over the principal components of the stochastic process and yields a sparsely coupled hierarchy of linear differential equations. We provide practical heuristics for truncating this expansion based on results from time-dependent perturbation theory and demonstrate, via an experimentally relevant one-qubit numerical example, that our technique can be significantly more computationally efficient than Monte Carlo simulation. © 2013 American Physical Society.
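As a concrete, greatly simplified illustration of this expansion (not the paper's own example or code), the sketch below treats a single qubit dephasing under one static Gaussian noise variable, i.e. a single principal component; the Hamiltonian, noise strength, and truncation order are illustrative assumptions. Expanding the density matrix in orthonormal Hermite polynomials of the noise variable turns the stochastic von Neumann equation into the tridiagonal hierarchy integrated below, and the k = 0 coefficient reproduces the noise-averaged Gaussian decay of coherence.

import numpy as np

sz = np.diag([1.0, -1.0]).astype(complex)
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

omega0, sigma = 1.0, 0.2   # nominal qubit frequency and noise strength (illustrative)
K = 16                     # truncation order of the polynomial chaos expansion
dt, T = 0.01, 10.0

def comm(a, b):
    return a @ b - b @ a

def rhs(rhos):
    """Hierarchy d rho_k/dt = -(i/2)[sz, omega0*rho_k + sigma*(sqrt(k)*rho_{k-1} + sqrt(k+1)*rho_{k+1})]."""
    out = np.zeros_like(rhos)
    for k in range(K):
        drive = omega0 * rhos[k]
        if k > 0:
            drive += sigma * np.sqrt(k) * rhos[k - 1]
        if k + 1 < K:
            drive += sigma * np.sqrt(k + 1) * rhos[k + 1]
        out[k] = -0.5j * comm(sz, drive)
    return out

# Initial state |+><+|; only the k = 0 (mean) coefficient is populated.
rhos = np.zeros((K, 2, 2), dtype=complex)
rhos[0] = 0.5 * np.ones((2, 2), dtype=complex)

for _ in range(int(T / dt)):             # fixed-step 4th-order Runge-Kutta
    k1 = rhs(rhos)
    k2 = rhs(rhos + 0.5 * dt * k1)
    k3 = rhs(rhos + 0.5 * dt * k2)
    k4 = rhs(rhos + dt * k3)
    rhos = rhos + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# The noise-averaged density matrix is the k = 0 coefficient.
print("polynomial chaos <sx>:", np.real(np.trace(rhos[0] @ sx)))
print("exact average        :", np.cos(omega0 * T) * np.exp(-0.5 * (sigma * T) ** 2))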
In this report we describe the construction and characterization of a small quantum processor based on trapped ions. This processor could ultimately be used to perform analogue quantum simulations with an engineered, computationally cold bath that increases the system's robustness to noise. We outline the requirements for building such a simulator, including individual addressing, distinguishable detection, and low crosstalk between operations, and describe our methods for implementing and characterizing these requirements. To characterize crosstalk errors specifically, we introduce a new method, simultaneous gate set tomography.
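The short sketch below is only a toy illustration, not the simultaneous gate set tomography protocol from this report: it assumes that an X(pi/2) pulse applied to one qubit also imparts a small, hypothetical rotation of angle eps on a neighbouring idle qubit, and shows how repeating the simultaneous layer amplifies that crosstalk error until it is visible in the idle qubit's excitation probability.

import numpy as np

def rx(theta):
    """Single-qubit rotation about X by angle theta."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

eps = 0.02                                # hypothetical crosstalk rotation per layer (rad)
layer = np.kron(rx(np.pi / 2), rx(eps))   # intended gate on qubit 0, leakage onto qubit 1

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                              # start in |00>
for reps in (1, 16, 64):
    state = np.linalg.matrix_power(layer, reps) @ psi
    # Probability that the nominally idle qubit 1 has flipped to |1>.
    p1 = np.sum(np.abs(state[1::2]) ** 2)
    print(f"{reps:3d} layers: P(q1=1) = {p1:.4f}  "
          f"(expected sin^2(reps*eps/2) = {np.sin(reps * eps / 2) ** 2:.4f})")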