1. Qubit Calibration and Best-Qubit Selection
This circuit begins by extracting qubit-specific calibration data from the backend. The data includes each qubit's T₁ (energy relaxation time), T₂ (dephasing time), and √X (sx) error rate.
The calibration dataset is denoted as:
Data = {(q_i, T1_i, T2_i, ϵ_i) ∣ i ∈ [0, 126]}
The qubits are then ranked by:
Minimizing error ϵ_i,
Maximizing coherence times T1_i, T2_i.
The 127 ranked qubits are assigned roles:
Logical qubits: q_0, q_1, …, q_6 (the seven best)
Physical qubits: q_7, …, q_126
These selected qubits become the computational substrate for the annealing experiment.
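The ranking rule can be sketched with pandas on a toy calibration table (the column names mirror the IBM calibration CSV headers used in the code below; the numeric values here are made up for illustration):

```python
import pandas as pd

# Toy calibration table; real data comes from the backend's calibration CSV.
calib = pd.DataFrame({
    "Qubit": [0, 1, 2, 3],
    "T1 (us)": [120.0, 90.0, 150.0, 110.0],
    "T2 (us)": [80.0, 60.0, 100.0, 70.0],
    "√x (sx) error": [2e-4, 5e-4, 1e-4, 2e-4],
})

# Sort by sx error ascending, breaking ties by longer T1, then longer T2.
ranked = calib.sort_values(
    by=["√x (sx) error", "T1 (us)", "T2 (us)"],
    ascending=[True, False, False],
)
best = ranked["Qubit"].head(2).tolist()
print(best)
```

Qubit 2 has the lowest error; qubits 0 and 3 tie on error, so the longer T1 of qubit 0 ranks it next.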
2. Register Initialization
Two quantum registers are created:
A QuantumRegister of 127 qubits:
Q = {q_0, q_1, …, q_126}
A ClassicalRegister of 7 bits:
C = {c_0, c_1, …, c_6}
A quantum circuit QC is then initialized:
QC = QuantumCircuit(Q, C)
3. Twistor-Inspired Initialization
Define a custom twistor-inspired encoding that reflects a geometric rotation and entanglement pattern across qubits. For each qubit q_i ∈ Q, apply:
RY(θ_i), where θ_i = (π/4)(i + 1)
Followed by an entanglement step:
CX(q_i, q_(i + 1)) for i = 0 to 125
This sequence maps the initial quantum state into a geometrically rich twistor space, embedding complex inter-qubit correlations.
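The encoding above can be checked on a small register with a plain NumPy statevector (a sketch; the big-endian bit convention here is chosen for readability, not Qiskit's little-endian order). Because the CX cascade only permutes basis states and maps |000⟩ to itself, the amplitude of |000⟩ after encoding is the product of the cos(θ_i/2) factors:

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0  # |000>

# Apply RY((pi/4)(i+1)) to qubit i (qubit 0 = leftmost bit in this sketch).
for i in range(n):
    ops = [np.eye(2)] * n
    ops[i] = ry(np.pi / 4 * (i + 1))
    U = ops[0]
    for op in ops[1:]:
        U = np.kron(U, op)
    state = U @ state

# CX chain: CX(q_i, q_{i+1}) flips bit i+1 whenever bit i is 1.
for i in range(n - 1):
    new_state = np.zeros_like(state)
    for idx in range(2 ** n):
        bits = [(idx >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[i] == 1:
            bits[i + 1] ^= 1
        new_idx = sum(b << (n - 1 - k) for k, b in enumerate(bits))
        new_state[new_idx] = state[idx]
    state = new_state

# |000> is a fixed point of the CX chain, so its amplitude is untouched.
expected = np.prod([np.cos(np.pi / 8 * (i + 1)) for i in range(n)])
print(state[0], expected)
```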
4. Time-Dependent Annealing Hamiltonian via RY Gates
To simulate quantum annealing, gradually evolve the quantum state from an initial “easy” Hamiltonian to a “problem” Hamiltonian using a time-dependent schedule. Mimic this evolution through a sequence of RY rotations across T = 10 discrete steps.
At each annealing step t ∈ {0, 1, …, T − 1}, compute:
α_t = t/T
θ_i^(t) = α_t · (π/4) · (i + 1), then apply RY(θ_i^(t)) to each q_i ∈ Q
These intermediate rotations control the interpolation between energy landscapes, encouraging the system to settle into a low energy solution.
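The schedule can be tabulated directly; this minimal sketch mirrors the interpolation formula above (T = 10 steps, with α_t = t/T as in the code below):

```python
import numpy as np

T = 10  # number of annealing steps

# alpha_t ramps linearly from 0 toward 1 over the schedule.
alphas = [t / T for t in range(T)]

def theta(t, i):
    # Interpolated rotation angle for encoding position i at step t:
    # theta_i^(t) = alpha_t * (pi/4) * (i + 1)
    return alphas[t] * np.pi / 4 * (i + 1)

# A few sample points of the schedule.
print(theta(0, 0), theta(5, 2), theta(9, 0))
```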
5. Final Entangling Layer
To reinforce correlations before measurement, we apply another cascade of entangling gates across the entire selected qubit chain:
CX (q_i, q_(i + 1)) for i = 0 to 125
This step geometrically reorients the state to emphasize alignment with the twistor topology.
6. Measurement of Logical Qubits
Measure only the logical qubits:
Measure q_0 -> c_0, q_1 -> c_1, …, q_6 -> c_6
This collapses the wavefunction and allows the circuit to sample from the final probability distribution over candidate solutions.
7. Transpilation and Gate Management
The entire circuit is transpiled with optimization level 3. To avoid exceeding the hardware gate limit (5000), the circuit is split into subcircuits:
Each subcircuit maintains logical continuity.
Classical and quantum registers are preserved.
subcircuits = {QC^(1), QC^(2), …, QC^(n)}
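The splitting rule reduces to greedy packing by cost; this toy sketch (costs standing in for each instruction's qubit count, the budget for max_gates) mirrors the split_circuit function in the code below:

```python
# Greedily pack instructions into subcircuits, starting a new one when
# adding the next instruction's cost would exceed the budget.
def split_by_cost(costs, max_cost):
    chunks, current, running = [], [], 0
    for c in costs:
        if running + c > max_cost:
            chunks.append(current)
            current, running = [], 0
        current.append(c)
        running += c
    chunks.append(current)
    return chunks

print(split_by_cost([2, 2, 1, 2, 2], 4))
```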
8. Execution
Each subcircuit is executed using SamplerV2. For each run:
counts_i = SamplerV2(QC^(i), shots=8192)
The measurement results are extracted.
All counts are aggregated:
AllCounts = Σ_i counts_i
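The key-wise aggregation is exactly what summing `collections.Counter` objects does; a minimal sketch with hypothetical counts from two runs:

```python
from collections import Counter

# Hypothetical counts from two subcircuit runs (8192 shots each).
counts_1 = {"0000000": 5000, "0000001": 3192}
counts_2 = {"0000000": 6000, "0000011": 2192}

# Counter addition merges dictionaries key-wise: AllCounts = sum_i counts_i.
all_counts = Counter(counts_1) + Counter(counts_2)
print(dict(all_counts))
```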
9. Result Storage and Visualization
The distribution of measured bitstrings is visualized using a histogram, giving a probability profile of solution candidates.
Code:
# Main Circuit
# imports
import numpy as np
import json
import logging
import pandas as pd
from qiskit import QuantumCircuit, transpile, QuantumRegister, ClassicalRegister
from qiskit_ibm_runtime import QiskitRuntimeService, Session, SamplerV2
from qiskit.circuit.library import RYGate, CXGate
from qiskit.visualization import plot_histogram
import matplotlib.pyplot as plt
# Setup logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
# Load IBMQ account
service = QiskitRuntimeService(
channel='ibm_quantum',
instance='ibm-q/open/main',
token='YOUR_IBMQ_API_KEY'
)
backend_name = 'ibm_sherbrooke'
backend = service.backend(backend_name)
# Load calibration data and select best qubits
def load_calibration_data(file_path):
    logger.info("Loading calibration data from %s", file_path)
    calibration_data = pd.read_csv(file_path)
    calibration_data.columns = calibration_data.columns.str.strip()
    logger.info("Calibration data loaded successfully")
    return calibration_data
def select_best_qubits(calibration_data, n_qubits):
    logger.info("Selecting the best qubits based on T1, T2, and error rates")
    qubits_sorted = calibration_data.sort_values(by=["√x (sx) error", "T1 (us)", "T2 (us)"], ascending=[True, False, False])
    best_qubits = qubits_sorted["Qubit"].head(n_qubits).tolist()
    logger.info("Selected qubits: %s", best_qubits)
return best_qubits
# Load calibration data
calibration_file = '/Users/Downloads/ibm_sherbrooke_calibrations_2025-04-01T18_25_39Z.csv'
calibration_data = load_calibration_data(calibration_file)
# Select qubits
num_logical_qubits = 7
num_physical_qubits = 127
best_qubits_full = select_best_qubits(calibration_data, num_physical_qubits)
logical_qubits = best_qubits_full[:num_logical_qubits]
physical_qubits = best_qubits_full[num_logical_qubits:num_physical_qubits]
encode_qubits = logical_qubits + physical_qubits
# Initialize registers
qr = QuantumRegister(num_physical_qubits)
cr = ClassicalRegister(num_logical_qubits)
qc = QuantumCircuit(qr, cr)
# Twistor encoding
def twistor_encoding(qc, qubits):
for i, q in enumerate(qubits):
theta = np.pi / 4 * (i + 1)
qc.append(RYGate(theta), [q])
if i < len(qubits) - 1:
qc.append(CXGate(), [q, qubits[i + 1]])
qc.barrier()
twistor_encoding(qc, encode_qubits)
# Annealing schedule
num_anneal_steps = 10
for step in range(num_anneal_steps):
    alpha = step / num_anneal_steps
    for idx, q in enumerate(encode_qubits):
        # Interpolate toward the full twistor angle for this encoding position
        # (enumerate matches the (i + 1) indexing used in twistor_encoding).
        angle = alpha * np.pi / 4 * (idx + 1)
        qc.ry(angle, q)
    qc.barrier()
# Final entanglement
for i in range(len(encode_qubits) - 1):
    qc.cx(encode_qubits[i], encode_qubits[i + 1])
qc.barrier()
# Measurement
for idx, lq in enumerate(logical_qubits):
qc.measure(lq, idx)
# Transpile
transpiled_qc = transpile(qc, backend=backend, optimization_level=3)
# Split circuit
max_gates = 5000
def split_circuit(qc, max_gates):
subcircuits = []
current_qc = QuantumCircuit(qc.qregs[0], qc.cregs[0])
gate_count = 0
    for inst in qc.data:
        if gate_count + len(inst.qubits) > max_gates:
subcircuits.append(current_qc)
current_qc = QuantumCircuit(qc.qregs[0], qc.cregs[0])
gate_count = 0
current_qc.append(inst.operation, inst.qubits, inst.clbits)
gate_count += len(inst.qubits)
subcircuits.append(current_qc)
return subcircuits
subcircuits = split_circuit(transpiled_qc, max_gates)
# Execute
all_counts = {}
with Session(service=service, backend=backend) as session:
sampler = SamplerV2(session=session)
for idx, sub_qc in enumerate(subcircuits):
        job = sampler.run([sub_qc], shots=8192)
job_result = job.result()
        # Extract counts via the public SamplerV2 result interface
        data_bin = job_result[0].data
        classical_register = sub_qc.cregs[0].name
        counts = getattr(data_bin, classical_register).get_counts() if hasattr(data_bin, classical_register) else {}
for key, val in counts.items():
if key in all_counts:
all_counts[key] += val
else:
all_counts[key] = val
# Save results
results_data = {
"experiment_name": "Twistor-Inspired Quantum Annealing",
"raw_counts": all_counts
}
file_path = '/Users/Documents/Twistor_Quantum_Annealing_0.json'
with open(file_path, 'w') as f:
json.dump(results_data, f, indent=4)
# Visualize
plot_histogram(all_counts)
plt.title("Twistor-Inspired Quantum Annealing with Qubit Selection")
plt.show()
# End.
# ////////////////////////////////////////////////////////////////////////////////
# Code for all visualizations from Run Data
import json
import matplotlib.pyplot as plt
from qiskit.visualization import plot_histogram
import numpy as np
import seaborn as sns
from collections import Counter
from scipy.stats import entropy
from scipy.fft import fft
from collections import defaultdict
from sklearn.decomposition import PCA
# Load results
with open('/Users/Documents/Twistor_Quantum_Annealing_0.json', 'r') as f:
data = json.load(f)
counts = data["raw_counts"]
shots_total = sum(counts.values())
states = list(counts.keys())
# Helper functions
def hamming_distance(a, b):
return sum(c1 != c2 for c1, c2 in zip(a, b))
def shannon_entropy(dist):
probs = np.array(list(dist.values())) / sum(dist.values())
return entropy(probs, base=2)
# Reference state
reference = "0000000"
n_qubits = len(reference)
# Sort counts
sorted_counts = dict(sorted(counts.items(), key=lambda item: item[1], reverse=True))
# Histogram of Top 20 States
plot_histogram(dict(list(sorted_counts.items())[:20]), title="Top 20 Measurement Outcomes")
plt.show()
# Hamming Distance Heatmap (from |0000000>)
distances = [hamming_distance(reference, state) for state in counts]
distance_counts = Counter(distances)
plt.figure(figsize=(10,5))
sns.barplot(x=list(distance_counts.keys()), y=list(distance_counts.values()))
plt.xlabel("Hamming Distance from |0000000⟩")
plt.ylabel("Number of States")
plt.title("Error Landscape from Ground State")
plt.show()
# Population vs Bit-Flip Magnitude
bitflip_magnitudes = []
for state, count in counts.items():
bitflip_magnitudes.extend([state.count('1')] * count)
plt.figure(figsize=(10,5))
sns.histplot(bitflip_magnitudes, bins=range(9), discrete=True)
plt.title("Distribution of Bit-Flip Magnitudes")
plt.xlabel("Number of Flipped Qubits (|1⟩s)")
plt.ylabel("Frequency")
plt.show()
# State Clustering by Hamming Distance from Top-5 States
top_states = list(sorted_counts.keys())[:5]
cluster_distances = {s: [] for s in top_states}
for state in counts:
for top in top_states:
d = hamming_distance(state, top)
cluster_distances[top].append(d)
plt.figure(figsize=(12,6))
for top, dist_list in cluster_distances.items():
sns.histplot(dist_list, label=f'From {top}', bins=range(9), kde=True)
plt.title("State Clustering Around Top-5 States")
plt.xlabel("Hamming Distance")
plt.ylabel("Frequency")
plt.legend()
plt.show()
# Shannon Entropy Distribution per Hamming Distance
hd_groups = {}
for state in counts:
d = hamming_distance(reference, state)
hd_groups.setdefault(d, {})[state] = counts[state]
hd_entropy = {d: shannon_entropy(group) for d, group in hd_groups.items()}
plt.figure(figsize=(10,5))
sns.barplot(x=list(hd_entropy.keys()), y=list(hd_entropy.values()))
plt.title("Shannon Entropy per Hamming Distance from |0000000⟩")
plt.xlabel("Hamming Distance")
plt.ylabel("Entropy (bits)")
plt.show()
# State Frequency vs Hamming Weight
bitflips = [state.count("1") for state in counts for _ in range(counts[state])]
plt.figure(figsize=(10,5))
sns.histplot(bitflips, bins=range(0,8), discrete=True)
plt.title("State Frequency vs Hamming Weight")
plt.xlabel("Hamming Weight (number of 1s)")
plt.ylabel("Frequency")
plt.show()
# Twistor Flow Projection
transition_matrix = np.zeros((7, 7))
for state, freq in counts.items():
flipped = [i for i, bit in enumerate(state) if bit == '1']
for i in flipped:
for j in flipped:
if i != j:
transition_matrix[i][j] += freq
plt.figure(figsize=(8, 6))
sns.heatmap(transition_matrix, annot=False, cmap="coolwarm", square=True, xticklabels=True, yticklabels=True)
plt.title("Twistor Flow Projection: Qubit-Qubit Co-Flips")
plt.xlabel("Qubit j")
plt.ylabel("Qubit i")
plt.show()
# Fourier Spectrum of Population Distribution
# Convert count values to frequency vector over sorted state space
sorted_states = sorted(counts.keys())
freq_vector = np.array([counts[state] for state in sorted_states])
freq_vector = freq_vector / np.linalg.norm(freq_vector) # Normalize
# Apply FFT
spectrum = np.abs(fft(freq_vector))**2
freqs = np.fft.fftfreq(len(freq_vector))
plt.figure(figsize=(10,5))
plt.plot(freqs[:len(freqs)//2], spectrum[:len(spectrum)//2])
plt.title("Fourier Spectrum of Measurement Outcome Frequencies")
plt.xlabel("Frequency")
plt.ylabel("Power")
plt.grid(True)
plt.show()
# Geodesic Map of Hamming Paths
hamming_paths = defaultdict(int)
for state, count in counts.items():
    for i, bit in enumerate(state):
        if bit != reference[i]:
            hamming_paths[i] += count
plt.figure(figsize=(10, 5))
plt.bar(range(n_qubits), [hamming_paths[q] for q in range(n_qubits)])
plt.xlabel("Qubit Index (Deviation Axis)")
plt.ylabel("Total Error Weight (Geodesic Intensity)")
plt.title("Geodesic Map of Hamming Path Deviation from |0000000⟩")
plt.grid(True)
plt.show()
# Mutual Information Between Qubits
qubit_probs = np.zeros((n_qubits, 2)) # P(q_i = 0), P(q_i = 1)
joint_probs = np.zeros((n_qubits, n_qubits, 2, 2))
for state, count in counts.items():
bits = list(map(int, list(state)))
for i in range(n_qubits):
qubit_probs[i][bits[i]] += count
for j in range(n_qubits):
joint_probs[i][j][bits[i]][bits[j]] += count
# Normalize
qubit_probs /= shots_total
joint_probs /= shots_total
# Compute Mutual Information
mi_matrix = np.zeros((n_qubits, n_qubits))
for i in range(n_qubits):
for j in range(n_qubits):
for xi in [0,1]:
for xj in [0,1]:
pij = joint_probs[i][j][xi][xj]
if pij > 0:
mi_matrix[i][j] += pij * np.log2(pij / (qubit_probs[i][xi] * qubit_probs[j][xj]))
plt.figure(figsize=(8,6))
sns.heatmap(mi_matrix, annot=True, cmap="viridis")
plt.title("Mutual Information Between Qubits")
plt.xlabel("Qubit j")
plt.ylabel("Qubit i")
plt.show()
# Topological Shell Volume
shell_volume = defaultdict(int)
for state in counts:
d = sum(c1 != c2 for c1, c2 in zip(state, reference))
shell_volume[d] += 1
shells = sorted(shell_volume.keys())
volumes = [shell_volume[d] for d in shells]
plt.figure(figsize=(10,5))
plt.plot(shells, volumes, marker='o')
plt.title("Topological Shell Volume Around |0000000⟩")
plt.xlabel("Hamming Distance from |0000000⟩")
plt.ylabel("Number of Distinct Bitstrings")
plt.grid(True)
plt.show()
# Bitstring Embedding via PCA (2D)
bitstring_matrix = []
weights = []
for state, count in counts.items():
bitvec = np.array(list(map(int, list(state))))
bitstring_matrix.append(bitvec)
weights.append(count)
bitstring_matrix = np.array(bitstring_matrix)
weights = np.array(weights)
# PCA
pca = PCA(n_components=2)
coords = pca.fit_transform(bitstring_matrix)
plt.figure(figsize=(10,6))
plt.scatter(coords[:,0], coords[:,1], s=weights/100, alpha=0.7, c=weights, cmap="plasma")
plt.colorbar(label="Frequency")
plt.title("2D Projection of Quantum State Space via PCA")
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.grid(True)
plt.show()