Quantum Error Correction

Resilience through entanglement.

Quantum Error Correction (QEC) is a technique that uses redundancy and entanglement to detect and correct errors in quantum bits (qubits) caused by decoherence and noise, preserving quantum information for reliable computation.

```mermaid
graph LR
  Center["Quantum Error Correction"]:::main
  Pre_qubit["qubit"]:::pre --> Center
  click Pre_qubit "/terms/qubit"
  Pre_decoherence["decoherence"]:::pre --> Center
  click Pre_decoherence "/terms/decoherence"
  Rel_topological_quantum_computation["topological-quantum-computation"]:::related -.-> Center
  click Rel_topological_quantum_computation "/terms/topological-quantum-computation"
  Rel_algorithmic_stablecoin["algorithmic-stablecoin"]:::related -.-> Center
  click Rel_algorithmic_stablecoin "/terms/algorithmic-stablecoin"
  Rel_standardization["standardization"]:::related -.-> Center
  click Rel_standardization "/terms/standardization"
  classDef main fill:#7c3aed,stroke:#8b5cf6,stroke-width:2px,color:white,font-weight:bold,rx:5,ry:5;
  classDef pre fill:#0f172a,stroke:#3b82f6,color:#94a3b8,rx:5,ry:5;
  classDef child fill:#0f172a,stroke:#10b981,color:#94a3b8,rx:5,ry:5;
  classDef related fill:#0f172a,stroke:#8b5cf6,stroke-dasharray: 5 5,color:#94a3b8,rx:5,ry:5;
  linkStyle default stroke:#4b5563,stroke-width:2px;
```

🧒 Explain like I'm five

🛡️ Protecting fragile quantum information by spreading it across many physical bits so that no single error can destroy the data.

🤓 Expert Deep Dive

Quantum Error Correction (QEC) is essential for fault-tolerant quantum computing, as qubits are highly susceptible to environmental noise and decoherence, leading to errors in quantum states. QEC codes, such as the Shor code, Steane code, and surface code, are designed to protect quantum information by encoding a logical qubit into multiple physical qubits. These codes operate by detecting and correcting errors without measuring the encoded quantum state directly, which would collapse it.

A common approach involves stabilizer measurements. For instance, in a simple repetition code (analogous to classical error correction), a logical $|0\rangle$ might be encoded as $|000\rangle$ and a logical $|1\rangle$ as $|111\rangle$. An error on one qubit can be detected by measuring parity checks. For a 3-qubit code, we can define stabilizer operators like $S_1 = Z_1Z_2$ and $S_2 = Z_2Z_3$. Measuring these stabilizers reveals information about the errors without collapsing the encoded state. If both $S_1$ and $S_2$ measure $-1$, it indicates a bit-flip error on the second qubit, the only qubit shared by both checks. The measured syndrome then dictates the appropriate correction operation (e.g., an X gate).
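The syndrome extraction above can be sketched with a small statevector simulation. This is an illustrative toy (the helper names `x_on` and `zz_expectation` are mine, and a real device would measure the stabilizers with ancilla qubits rather than compute expectation values directly):

```python
import numpy as np

def x_on(state, qubit, n=3):
    """Apply a Pauli-X (bit flip) to the given qubit of an n-qubit statevector."""
    out = np.empty_like(state)
    mask = 1 << (n - 1 - qubit)
    for b in range(2**n):
        out[b ^ mask] = state[b]
    return out

def zz_expectation(state, i, j, n=3):
    """Expectation value of Z_i Z_j; +-1 on stabilizer eigenstates."""
    total = 0.0
    for b in range(2**n):
        bi = (b >> (n - 1 - i)) & 1
        bj = (b >> (n - 1 - j)) & 1
        total += ((-1) ** (bi + bj)) * abs(state[b]) ** 2
    return total

# Encode a logical qubit a|000> + b|111> (3-qubit bit-flip repetition code).
a, b = 0.6, 0.8
state = np.zeros(8, dtype=complex)
state[0b000] = a
state[0b111] = b

# Introduce a single bit-flip error on the second qubit (index 1).
state = x_on(state, 1)

# Syndrome extraction: S1 = Z0 Z1 and S2 = Z1 Z2.
s1 = round(zz_expectation(state, 0, 1))
s2 = round(zz_expectation(state, 1, 2))
print(s1, s2)  # -1 -1: both checks flipped, so the shared (middle) qubit erred

# Syndrome table: which qubit to flip back for each (S1, S2) outcome.
correction = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}
qubit = correction[(s1, s2)]
if qubit is not None:
    state = x_on(state, qubit)  # apply the correcting X gate

print(round(zz_expectation(state, 0, 1)), round(zz_expectation(state, 1, 2)))  # 1 1
```

Note that the syndrome identifies the error location without ever measuring the encoded amplitudes $a$ and $b$, which is why the logical state survives.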

More sophisticated codes like the surface code leverage a 2D lattice of qubits. Errors are detected by measuring plaquette stabilizers (products of $Z$ operators) and star stabilizers (products of $X$ operators). Violated stabilizers show up as a pattern of syndrome defects on the lattice, and decoding algorithms (e.g., minimum-weight perfect matching) are used to infer the most likely error locations and apply corrections. The threshold theorem states that if the physical error rate is below a certain threshold, arbitrarily long quantum computations can be performed reliably by increasing the code distance, at the cost of more physical qubits per logical qubit.

🔗 Related terms

Prerequisites:

📚 Resources