How might generative AI and human-centered design open Quantum Inspire to everyone? A design research program at TU Delft.
Quantum computing has reached a turning point. The hardware works. The algorithms exist. But the quantum ecosystem faces a critical skills gap — and it's not a physics problem. It's a design problem.
Premier design venues — CHI, DIS, UIST — have barely engaged with quantum technologies. The field has evolved almost entirely without input from designers, UX researchers, or interaction specialists.
Research shows that people with computing backgrounds can grasp quantum concepts through the right metaphors and interactions — without studying physics first. What's been missing isn't intelligence or preparation. It's design.
Millions of software developers already have the computational thinking skills to work with quantum systems. They just need interfaces that speak their language.
Well-designed metaphors score 4.0/5 on explainability but only 2.6/5 on actionability. The design opportunity: make quantum concepts not just understandable, but usable.
Generative AI can now translate natural language into working quantum circuits — collapsing the expertise barrier and opening quantum to anyone who can describe what they want to compute.
PhD research at TU Delft proposes a shift: instead of teaching quantum through physics, ground it in computational thinking — the skills that developers already have.
Key insight: Start with higher-complexity concepts like algorithms and gates that connect to existing computing knowledge, rather than the traditional physics-first approach starting from superposition.
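The computing-first framing above can be made concrete with a minimal sketch (assuming only numpy, not part of the original): a quantum gate is just a matrix, and applying it to a qubit state is a matrix-vector multiply, the same linear algebra developers already know.

```python
import numpy as np

# A single-qubit state is a 2-element vector: |0> = [1, 0], |1> = [0, 1]
zero = np.array([1.0, 0.0])

# The X gate is the quantum NOT: it swaps the two amplitudes
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# The Hadamard gate maps a basis state to an equal superposition
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

print(X @ zero)  # amplitudes [0, 1], i.e. |1>, just like classical NOT
print(H @ zero)  # equal amplitudes (about 0.707 each): superposition
```

Framed this way, gates connect to existing knowledge (boolean logic, matrices) before superposition is ever mentioned as physics.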
Structured metaphors grounded in computing, not physics, each drawing on tangible features from everyday experience.
The actionability gap is the design opportunity: metaphors help people understand quantum, but don't show them what they can do. This is what interface design can solve.
Of 26 works citing the 2019 CHI call-to-action for HCI in quantum, only 5 were published at design venues. The field is wide open.
Quantum scientists use "design" to mean engineering. Designers use it to mean human-centered experience. Bridging this creates a powerful new collaboration space.
What made computing universal wasn't faster chips — it was GUIs, mice, and touchscreens. Quantum needs its equivalent interaction paradigm.
Generative AI collapses the expertise barrier. Natural language to quantum circuits means anyone can explore quantum computing — no physics degree required.
The quantum computational thinking (QCT) thesis asked: how do we make quantum concepts actionable? Generative AI provides one answer: skip the syntax entirely. Describe what you want to compute in natural language, and AI translates it into working quantum circuits running on real hardware.
AI writes Qiskit and cQASM circuits from natural language descriptions
LLM-generated solutions tested against quantum computing problem sets
Circuits generated by AI run on Quantum Inspire backends (Starmon-7, Tuna-5, Tuna-9)
AI generates hybrid algorithms with execute() and finalize() hooks for QI platform
Three.js quantum state visualizations built through AI-human collaboration
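The hybrid-algorithm point above can be sketched in code. The `execute()` and `finalize()` hook names come from the Quantum Inspire platform description; the cQASM circuit string and the `FakeBackend` driver are hypothetical stand-ins added only so the sketch is self-contained and runnable, not the platform's actual API.

```python
class FakeBackend:
    """Hypothetical stand-in for a quantum backend, for local testing only."""
    def run(self, circuit: str, shots: int) -> dict:
        # An ideal Bell state yields only correlated outcomes
        return {"00": shots // 2, "11": shots - shots // 2}

def execute(backend, shots: int = 1024) -> dict:
    """Quantum step: build and run a Bell-state circuit, return raw counts."""
    circuit = """
    version 3.0
    qubit[2] q
    bit[2] b
    H q[0]
    CNOT q[0], q[1]
    b = measure q
    """
    return backend.run(circuit, shots)

def finalize(results: dict) -> dict:
    """Classical step: post-process raw counts into the quantity of interest."""
    correlated = results.get("00", 0) + results.get("11", 0)
    return {"correlation": correlated / sum(results.values())}

print(finalize(execute(FakeBackend())))  # {'correlation': 1.0}
```

The split mirrors the hybrid pattern: `execute()` is the quantum workload, `finalize()` is the classical post-processing the platform runs afterwards.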
Describe what you want to compute. AI translates to quantum circuits and explains results.
# Human prompt: "Create a Bell state and measure it"
# AI generates and executes:
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                     # Superposition
qc.cx(0, 1)                 # Entanglement
qc.measure([0, 1], [0, 1])

counts = AerSimulator().run(qc, shots=1024).result().get_counts()
# ~ {'00': 512, '11': 512}: the qubits are perfectly correlated

This entire website (interactive quantum visualizations, real hardware experiments, paper replications) was built through vibecoding: human intent translated by AI into working code. 349 prompts, 445 sessions, zero lines of code written by hand.