Research

Technical reports, experiments, and prototypes. Published as work matures.

Quantum Diffusion Models: First Experiments

Experiment · Mar 2026 · Released

We trained a quantum denoising circuit on systems from 4 to 16 qubits, characterised the quantum noise process across all scales, and ran the denoiser at 10 qubits. The report presents full empirical data: noise schedule characterisation via OTOCs and entanglement entropy, barren plateau analysis, and generalisation-gap measurements, along with a path toward a working generative model. Preliminary results show an over 90% reduction in required training data versus classical baselines.
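As a minimal sketch of one of the characterisation metrics, the bipartite entanglement entropy of a pure statevector can be computed from its Schmidt coefficients. The function name, qubit split, and Bell-pair example below are illustrative, not taken from the report.

```python
import numpy as np

def entanglement_entropy(state, n_qubits, cut):
    """Von Neumann entropy (in bits) of the reduced state on the
    first `cut` qubits of a normalised `2**n_qubits` statevector."""
    # Reshape into a (2**cut, 2**(n_qubits - cut)) bipartite matrix.
    psi = np.asarray(state).reshape(2**cut, 2**(n_qubits - cut))
    # Singular values are the Schmidt coefficients; their squares are
    # the eigenvalues of the reduced density matrix.
    s = np.linalg.svd(psi, compute_uv=False)
    p = s**2
    p = p[p > 1e-12]  # drop numerical zeros before taking logs
    return float(-np.sum(p * np.log2(p)))

# A Bell pair carries exactly 1 bit of entanglement across the middle cut.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(entanglement_entropy(bell, n_qubits=2, cut=1))  # → 1.0
```

Tracking this quantity (alongside OTOCs) across the noise schedule is one standard way to see how quickly structure in the state is scrambled.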

LayerSkip for Mixture of Experts (MoE) Architecture

Technical Report · May 2025 · Released

We integrate Meta Research's LayerSkip early-exit framework into a Mixture-of-Experts architecture, combining width-wise sparsity (MoE expert routing) with depth-wise sparsity (layer dropout and early exit). We trained a 12-layer, 8-expert model on WikiText-2. Preliminary results show 25–35% inference-time reductions while maintaining comparable perplexity. Analysis of exit-layer patterns reveals that tokens requiring complex reasoning (e.g. proper nouns) exit at deeper layers (10–11), while common words and repeated phrases exit early (layers 5–7). Co-authored with Nicholas Papciak at Georgia Tech.
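The depth-wise early-exit idea can be sketched as a per-layer confidence check against a shared LM head: a token stops propagating through the stack as soon as the head's prediction clears a threshold. This is a simplified illustration under assumed names and a made-up threshold, not the report's implementation (LayerSkip also involves layer dropout during training and self-speculative decoding).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def forward_with_early_exit(layers, lm_head, h, threshold=0.9):
    """Run one token's hidden state through the layer stack, exiting
    early once the shared LM head is confident enough.

    layers:  list of callables h -> h (e.g. MoE transformer blocks)
    lm_head: callable h -> vocabulary logits
    Returns (predicted_token_id, exit_layer_index).
    """
    for i, layer in enumerate(layers):
        h = layer(h)
        probs = softmax(lm_head(h))
        # Easy tokens (common words, repeats) clear the threshold at
        # shallow layers; harder tokens fall through to deeper ones.
        if probs.max() >= threshold:
            return int(probs.argmax()), i
    return int(probs.argmax()), len(layers) - 1
```

With toy layers that steadily sharpen the logits, the exit index directly measures how much depth a token needed, which is the quantity the exit-layer analysis above is built on.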