1. Qubit Calibration and Selection
Begin by selecting three physical qubits from the ibm_sherbrooke backend using the calibration data provided by IBM. The calibration metrics are:
T1: energy relaxation time (us)
T2: dephasing time (us)
ϵ_(√X): error rate of the √X (or sx) gate
Construct the following set of tuples:
Data = {(q_i, T_1^(i), T_2^(i), ϵ_i) : i ∈ [0, 126]}
Sort the qubits to minimize ϵ_i and maximize T_1^(i) and T_2^(i), and select the top 3 qubits (a selection sketch follows the list below):
q_0: State to teleport (Teleportee)
q_1: Alice's qubit (half of the entangled Bell pair)
q_2: Bob's qubit (recipient of the teleported state)
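For illustration, a minimal sketch of this selection step, mirroring the selection function in the full code at the bottom and assuming the column names of the IBM calibration CSV export ('√x (sx) error', 'T1 (us)', 'T2 (us)', 'Qubit'):
# Rank qubits by ascending √X error, then by descending T1 and T2
import pandas as pd

df = pd.read_csv('ibm_sherbrooke_calibrations.csv')   # illustrative file name
df.columns = df.columns.str.strip()
ranked = df.sort_values(by=["√x (sx) error", "T1 (us)", "T2 (us)"],
                        ascending=[True, False, False])
best_three = ranked["Qubit"].head(3).tolist()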
2. Register Initialization
Define:
A QuantumRegister of 3 qubits:
Q = {q_0, q_1, q_2}
A ClassicalRegister of 2 bits (for Alice's measurements):
C={c_0, c_1}
Initialize a quantum circuit QC using these registers.
3. Initialization of the State to be Teleported
Prepare the qubit q_0 in the state to be teleported. In general this can be an arbitrary state; here, initialize it in the Hadamard state:
∣ψ⟩ = H ∣0⟩ = 1/√2 (∣0⟩ + ∣1⟩)
4. Bell Pair Creation Between Alice and Bob
Entangle q_1 and q_2 using the standard Bell state construction:
H(q_1) ⇒ 1/√2 (∣0⟩ + ∣1⟩)
CX(q_1, q_2) ⇒ 1/√2 (∣00⟩ + ∣11⟩)
So now, q_1 and q_2 are in a maximally entangled Bell state.
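A minimal Qiskit sketch of steps 2-4 (register setup, preparing ∣+⟩ on q_0, and the Bell pair on q_1 and q_2), matching the circuit construction in the full code below:
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

qr = QuantumRegister(3, 'q')    # q0: teleportee, q1: Alice, q2: Bob
cr = ClassicalRegister(2, 'c')  # c0, c1 hold Alice's measurement results
qc = QuantumCircuit(qr, cr)
qc.h(qr[0])                     # |ψ⟩ = H|0⟩ = |+⟩, the state to teleport
qc.h(qr[1])                     # Bell pair: Hadamard on Alice's qubit ...
qc.cx(qr[1], qr[2])             # ... then CX entangles Alice with Bob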
5. Twistor-Inspired Encoding
Apply a twistor-based encoding to the entangled qubits {q_1, q_2}. This encoding is based on structured geometric rotations and entanglements:
For each qubit q_i in the encoding list of N qubits:
θ_i = (π/4)(i + 1)
RY(θ_i) on q_i
CX(q_i, q_(i+1)) if i < N − 1
This introduces geodesic entanglement and holomorphic phase constraints, consistent with the geometry of twistor space, a complex projective space introduced by Roger Penrose.
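For the two-qubit encoding list {q_1, q_2}, the angles are θ_0 = π/4 and θ_1 = π/2, so the encoding applies one RY per qubit and a single CX(q_1, q_2). A minimal sketch of the encoding loop (the same routine appears in the full code below):
import numpy as np

def twistor_encoding(qc, qubits):
    # RY(θ_i) with θ_i = (π/4)(i + 1), then chain a CX to the next qubit in the list
    for i, q in enumerate(qubits):
        qc.ry(np.pi / 4 * (i + 1), q)
        if i < len(qubits) - 1:
            qc.cx(q, qubits[i + 1])
    qc.barrier()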
6. Teleportation Protocol
Apply the standard teleportation steps:
Entangle the teleportee with Alice’s qubit:
CX(q_0, q_1)
Apply Hadamard to teleportee:
H(q_0)
This prepares the system for measurement in the Bell basis, enabling state transfer through classical communication.
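In the ideal, noiseless case (setting aside the twistor encoding and the injected errors), this step realizes the standard teleportation identity. Writing ∣ψ⟩ = α∣0⟩ + β∣1⟩, after CX(q_0, q_1) and H(q_0) the full state decomposes as
(α∣0⟩ + β∣1⟩) ⊗ 1/√2 (∣00⟩ + ∣11⟩) → 1/2 [ ∣00⟩(α∣0⟩ + β∣1⟩) + ∣01⟩(α∣1⟩ + β∣0⟩) + ∣10⟩(α∣0⟩ − β∣1⟩) + ∣11⟩(α∣1⟩ − β∣0⟩) ]
where the first two qubits of each term are ∣c_0 c_1⟩ on (q_0, q_1) and the last factor is Bob's qubit q_2. Bob therefore holds X^(c_1) Z^(c_0) ∣ψ⟩, and the correction he would apply is Z^(c_0) X^(c_1). For the ∣+⟩ input used here, X∣+⟩ = ∣+⟩, so only the Z^(c_0) correction matters: Bob ends in ∣+⟩ when c_0 = 0 and in ∣−⟩ when c_0 = 1.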
7. Deliberate Error Injection
To simulate a noisy environment, inject errors after the entanglement but before measurement:
Apply a Pauli-X gate:
X(q_1)
Apply a controlled-Z gate:
CZ(q_0, q_1)
These represent a bit-flip error and a correlated phase error, respectively.
8. Measurement of Alice's Qubits
Alice measures her two qubits in the computational basis:
Measure q_0 -> c_0, q_1 -> c_1
This projects the entangled state and yields a classical bitstring c_0c_1 ∈ {00, 01, 10, 11}
9. Conditional Correction (In Post-Processing)
Ordinarily, Bob would apply Pauli gates conditioned on Alice's result. Since SamplerV2 does not support classically conditioned gates, correction is instead deferred to post-processing by filtering the measured bitstrings and computing fidelity manually (code at the bottom; a correction-mapping sketch follows below).
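As a sketch of what that post-processing looks like (equivalent to the fidelity script at the bottom, which simply counts the '00' and '10' outcomes), assuming Qiskit's usual counts-key ordering in which the leftmost character is the highest classical bit, c_1:
import json

# Deferred correction for a |+⟩ input: Bob holds X^c1 Z^c0 |+⟩, and X leaves |+⟩
# unchanged, so c0 alone decides whether Bob ends in |+⟩ or |−⟩.
with open('Twistor_QTeleportation_Results_0.json') as f:   # path is illustrative
    counts = json.load(f)['raw_counts']

def bob_state_is_plus(key):
    c1, c0 = int(key[0]), int(key[1])   # key = c1 c0 under Qiskit's ordering
    return c0 == 0

fidelity = sum(v for k, v in counts.items() if bob_state_is_plus(k)) / sum(counts.values())
print(f"Post-processed fidelity estimate: {fidelity:.5f}")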
10. Transpilation and Execution
The circuit is transpiled with optimization level 3 for the ibm_sherbrooke backend. The final transpiled circuit is submitted using:
SamplerV2.run([QC], shots=8192)
11. Results Storage and Visualization
The measured counts are stored in a JSON file. The histogram of classical outcomes is plotted, showing the distribution of measurement outcomes.
Code:
# Main Circuit
# Imports
import numpy as np
import json
import logging
import pandas as pd
from qiskit import QuantumCircuit, transpile, QuantumRegister, ClassicalRegister
from qiskit_ibm_runtime import QiskitRuntimeService, Session, SamplerV2
from qiskit.circuit.library import RYGate, CXGate, XGate, CZGate
from qiskit.visualization import plot_histogram
import matplotlib.pyplot as plt
# Setup logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
# Load IBMQ account
service = QiskitRuntimeService(
    channel='ibm_quantum',
    instance='ibm-q/open/main',
    token='YOUR_IBMQ_KEY'
)
backend_name = 'ibm_sherbrooke'
backend = service.backend(backend_name)
# Load calibration data and select best qubits
def load_calibration_data(file_path):
    logger.info("Loading calibration data from %s", file_path)
    calibration_data = pd.read_csv(file_path)
    calibration_data.columns = calibration_data.columns.str.strip()
    logger.info("Calibration data loaded successfully")
    return calibration_data

def select_best_qubits(calibration_data, n_qubits):
    logger.info("Selecting best qubits based on T1, T2, and √X error")
    qubits_sorted = calibration_data.sort_values(
        by=["√x (sx) error", "T1 (us)", "T2 (us)"],
        ascending=[True, False, False]
    )
    best_qubits = qubits_sorted["Qubit"].head(n_qubits).tolist()
    logger.info("Selected qubits: %s", best_qubits)
    return best_qubits
# Load calibration
calibration_file = '/Users/steventippeconnic/Downloads/ibm_sherbrooke_calibrations_2025-04-03T00_36_16Z.csv'
calibration_data = load_calibration_data(calibration_file)
selected_qubits = select_best_qubits(calibration_data, 3)
# Initialize registers
qr = QuantumRegister(3, 'q') # q0: teleportee, q1: Alice, q2: Bob
cr = ClassicalRegister(2, 'c') # For Alice's measurement
teleport_circuit = QuantumCircuit(qr, cr)
# Twistor-inspired encoding
def twistor_encoding(qc, qubit_indices):
    for i, q in enumerate(qubit_indices):
        theta = np.pi / 4 * (i + 1)
        qc.append(RYGate(theta), [q])
        if i < len(qubit_indices) - 1:
            qc.append(CXGate(), [q, qubit_indices[i + 1]])
    qc.barrier()
# Prepare state to teleport
teleport_circuit.h(qr[0])
teleport_circuit.barrier()
# Create entanglement between Alice and Bob
teleport_circuit.h(qr[1])
teleport_circuit.cx(qr[1], qr[2])
teleport_circuit.barrier()
# Apply twistor encoding to entangled qubits (q1, q2)
twistor_encoding(teleport_circuit, [qr[1], qr[2]])
# Apply teleportation gates
teleport_circuit.cx(qr[0], qr[1])
teleport_circuit.h(qr[0])
teleport_circuit.barrier()
# Inject deliberate errors
teleport_circuit.append(XGate(), [qr[1]]) # An X error
teleport_circuit.append(CZGate(), [qr[0], qr[1]]) # A correlated error
teleport_circuit.barrier()
# Measure Alice's qubits
teleport_circuit.measure(qr[0], cr[0])
teleport_circuit.measure(qr[1], cr[1])
teleport_circuit.barrier()
# Transpile circuit
transpiled_qc = transpile(teleport_circuit, backend=backend, optimization_level=3)
# Execute with SamplerV2
with Session(service=service, backend=backend) as session:
    sampler = SamplerV2(session=session)
    job = sampler.run([transpiled_qc], shots=8192)
    job_result = job.result()
# Extract result
data_bin = job_result._pub_results[0]['__value__']['data']
classical_register = transpiled_qc.cregs[0].name
counts = data_bin[classical_register].get_counts() if classical_register in data_bin else {}
# Save results to JSON
results_data = {
    "experiment_name": "Twistor-Based Quantum Teleportation with Error Correction",
    "raw_counts": counts
}
file_path = '/Users/steventippeconnic/Documents/QC/Twistor_QTeleportation_Results_0.json'
with open(file_path, 'w') as f:
    json.dump(results_data, f, indent=4)
# Visualization
plot_histogram(counts)
plt.title("Quantum Teleportation with Twistor Error Correction")
plt.show()
# End
////////////////////////////////////////////////////
Code to Calculate Fidelity from Run Result
# Imports
import json
# Experiment Results
file_path = '/Users/steventippeconnic/Documents/QC/Twistor_QTeleportation_Results_0.json'
with open(file_path, 'r') as f:
    data = json.load(f)
counts = data['raw_counts']
total_shots = sum(counts.values())
# Fidelity is 1 if Bob state matches |+⟩, 0 if |−⟩
# 00 and 10 → Bob should be in |+⟩
correct_matches = counts.get('00', 0) + counts.get('10', 0)
fidelity = correct_matches / total_shots
print(f" Teleportation Fidelity: {fidelity:.5f}")
# End
//////////////////////////////////////////////////
# Code For All Visualizations Using Run Data
import json
import matplotlib.pyplot as plt
import numpy as np
from math import log2
from scipy.stats import entropy
import seaborn as sns
from collections import defaultdict
from itertools import product
# Load results
file_path = '/Users/steventippeconnic/Documents/QC/Twistor_QTeleportation_Results_0.json'
with open(file_path, 'r') as f:
    data = json.load(f)
counts = data['raw_counts']
total_shots = sum(counts.values())
bitstrings = list(counts.keys())
total = sum(counts.values())
# Measurement Outcome Histogram
plt.figure(figsize=(8, 5))
plt.bar(counts.keys(), counts.values(), color='navy')
plt.title('Twistor-Based Quantum Teleportation Measurement Outcomes')
plt.xlabel('Measurement (c0c1)')
plt.ylabel('Counts')
plt.grid(True)
plt.tight_layout()
plt.show()
# Hamming Distance from Ideal Outcome |00⟩
def hamming_dist(a, b):
    return sum(c1 != c2 for c1, c2 in zip(a, b))
distances = {key: hamming_dist(key, '00') for key in counts}
hdist_values = [distances[k] for k in counts]
plt.figure(figsize=(8, 5))
plt.bar(distances.keys(), hdist_values, color='crimson')
plt.title('Hamming Distance from Ideal Outcome |00⟩')
plt.xlabel('Bitstring')
plt.ylabel('Hamming Distance')
plt.grid(True)
plt.tight_layout()
plt.show()
# Entropy Contribution by Bitstring
probs = np.array([v / total_shots for v in counts.values()])
shannon_entropy = -np.sum(probs * np.log2(probs))
bit_entropy = {k: -(v / total_shots) * log2(v / total_shots) for k, v in counts.items()}
plt.figure(figsize=(8, 5))
plt.bar(bit_entropy.keys(), bit_entropy.values(), color='darkgreen')
plt.title(f'Entropy Contribution per Bitstring (Total: {shannon_entropy:.4f} bits)')
plt.xlabel('Bitstring')
plt.ylabel('Entropy Contribution')
plt.grid(True)
plt.tight_layout()
plt.show()
# Geodesic Shell Projection (Cluster View)
# Treat each bitstring as a 2D point: x = int(c0), y = int(c1)
points = np.array([[int(s[0]), int(s[1])] for s in counts.keys()])
weights = np.array([v for v in counts.values()])
plt.figure(figsize=(6, 6))
sns.kdeplot(
x=points[:, 0], y=points[:, 1],
weights=weights, fill=True, cmap='magma',
bw_adjust=0.5, levels=100, thresh=0.01
)
plt.title('Twistor Geodesic Shell Density Projection')
plt.xlabel('Bit q0')
plt.ylabel('Bit q1')
plt.grid(True)
plt.tight_layout()
plt.show()
# Mutual Information Between c0 and c1
joint_probs = np.zeros((2, 2))
for bitstr, freq in counts.items():
    i = int(bitstr[0])  # c0
    j = int(bitstr[1])  # c1
    joint_probs[i][j] += freq / total_shots
# Marginals
p_c0 = np.sum(joint_probs, axis=1)
p_c1 = np.sum(joint_probs, axis=0)
# Mutual Information
mi = 0
for i in [0, 1]:
    for j in [0, 1]:
        if joint_probs[i][j] > 0:
            mi += joint_probs[i][j] * log2(joint_probs[i][j] / (p_c0[i] * p_c1[j]))
plt.figure(figsize=(6, 5))
sns.heatmap(joint_probs, annot=True, fmt=".3f", cmap="Blues", xticklabels=["c1=0", "c1=1"], yticklabels=["c0=0", "c0=1"])
plt.title(f"Mutual Information Heatmap (MI = {mi:.4f} bits)")
plt.xlabel("c₁")
plt.ylabel("c₀")
plt.tight_layout()
plt.show()
# Frequency by Parity
# Even parity: 00 and 11, Odd: 01 and 10
parity_counts = {"Even": 0, "Odd": 0}
for bit, val in counts.items():
    ones = bit.count('1')
    if ones % 2 == 0:
        parity_counts["Even"] += val
    else:
        parity_counts["Odd"] += val
plt.figure(figsize=(6, 5))
plt.bar(parity_counts.keys(), parity_counts.values(), color='purple')
plt.title("Bitstring Counts by Parity (Even vs Odd)")
plt.ylabel("Counts")
plt.tight_layout()
plt.grid(True)
plt.show()
# Cumulative Probability Distribution (CDF)
sorted_counts = dict(sorted(counts.items(), key=lambda item: item[1], reverse=True))
labels = list(sorted_counts.keys())
probs = np.array([v / total_shots for v in sorted_counts.values()])
cdf = np.cumsum(probs)
plt.figure(figsize=(7, 5))
plt.plot(labels, cdf, marker='o', color='darkorange')
plt.title("Cumulative Distribution Function (CDF)")
plt.xlabel("Bitstring (Sorted by Frequency)")
plt.ylabel("Cumulative Probability")
plt.grid(True)
plt.tight_layout()
plt.show()
# Twistor-Sector Projection (Complex Plane Encoding)
# Encode each bitstring as z = c0 + i*c1, then plot polar radius/angle
angles = []
radii = []
weights = []
for bit, freq in counts.items():
    x = int(bit[0])
    y = int(bit[1])
    z = complex(x, y)
    angles.append(np.angle(z))
    radii.append(np.abs(z))
    weights.append(freq)
plt.figure(figsize=(6, 6))
plt.subplot(projection='polar')
plt.scatter(angles, radii, s=np.array(weights) * 0.05, c=weights, cmap='plasma', alpha=0.8)
plt.title("Twistor-Sector Projection in Complex Plane")
plt.tight_layout()
plt.show()
# Autocorrelation Between c0 and c1
c0_bits = []
c1_bits = []
for bit, freq in counts.items():
    c0_bits.extend([int(bit[0])] * freq)
    c1_bits.extend([int(bit[1])] * freq)
c0_arr = np.array(c0_bits)
c1_arr = np.array(c1_bits)
correlation = np.corrcoef(c0_arr, c1_arr)[0, 1]
plt.figure(figsize=(6, 5))
plt.scatter(c0_arr, c1_arr, alpha=0.05)
plt.title(f'Autocorrelation Scatter of c₀ and c₁ (ρ = {correlation:.4f})')
plt.xlabel('c₀')
plt.ylabel('c₁')
plt.grid(True)
plt.tight_layout()
plt.show()
# Bit Transition Matrix
keys = list(counts.keys())
transition_matrix = defaultdict(lambda: defaultdict(int))
# Weighted by frequency
sequence = []
for k, v in counts.items():
    sequence.extend([k] * v)
for i in range(len(sequence) - 1):
    transition_matrix[sequence[i]][sequence[i + 1]] += 1
states = sorted(counts.keys())
matrix = np.zeros((4, 4))
state_idx = {k: i for i, k in enumerate(states)}
for src in states:
    row_sum = sum(transition_matrix[src].values())
    if row_sum > 0:
        for tgt in states:
            matrix[state_idx[src]][state_idx[tgt]] = transition_matrix[src][tgt] / row_sum
plt.figure(figsize=(7, 6))
sns.heatmap(matrix, xticklabels=states, yticklabels=states, cmap="YlGnBu", annot=True, fmt=".2f")
plt.title("Bitstring Transition Probability Matrix (Markov View)")
plt.xlabel("To State")
plt.ylabel("From State")
plt.tight_layout()
plt.show()
# Hamming Ring Structure
bitlist = list(counts.keys())
hamming_graph = defaultdict(int)
for a, b in product(bitlist, repeat=2):
    dist = sum(x != y for x, y in zip(a, b))
    weight = counts[a] * counts[b]
    hamming_graph[dist] += weight
distances = sorted(hamming_graph.keys())
values = [hamming_graph[d] for d in distances]
plt.figure(figsize=(7, 5))
plt.bar(distances, values, color='darkred')
plt.title("Hamming Distance Ring Structure Among Bitstrings")
plt.xlabel("Hamming Distance")
plt.ylabel("Weighted Co-occurrence")
plt.xticks(distances)
plt.grid(True)
plt.tight_layout()
plt.show()
# Twistor-Tessellation Lattice Heatmap (Geodesic Normalized)
# Normalize by radius in L2 space from origin (0,0)
lattice = np.zeros((2, 2))
normalizer = np.zeros((2, 2))
for bit, count in counts.items():
    x, y = int(bit[0]), int(bit[1])
    r = np.sqrt(x**2 + y**2)
    if r == 0:
        norm_count = count
    else:
        norm_count = count / r
    lattice[y][x] += norm_count
    normalizer[y][x] += 1
# Avoid divide-by-zero
lattice[normalizer > 0] /= normalizer[normalizer > 0]
plt.figure(figsize=(6, 5))
sns.heatmap(lattice, annot=True, fmt=".1f", cmap="magma", xticklabels=["0", "1"], yticklabels=["1", "0"])
plt.title("Twistor-Tessellation Lattice (Radius-Normalized Density)")
plt.xlabel("c₀ (X-axis)")
plt.ylabel("c₁ (Y-axis)")
plt.tight_layout()
plt.show()