Neural Surrogates
Learning to emulate computationally expensive simulations with deep temporal models, achieving a 125,000x speedup at 99.8% accuracy for epidemiological agent-based models used in WHO pandemic preparedness.
I am a PhD researcher and software engineer at Imperial College London, specialising in foundational machine learning and applied mathematics for scientific computing: sequential neural surrogates, state-space models, and simulation-based inference. I also work as a full-stack machine learning research software engineer supporting WHO national malaria control programmes. Previously at Cambridge and DFKI; my work spans NeurIPS, Nature, and The Lancet.
Key contributions in neural surrogates and state-space architectures.
I make expensive computations fast and tractable. I build neural surrogates that replace costly simulations with learned emulators, design state-space architectures for efficient language and time-series modelling, and optimise LLM training through multi-fidelity Bayesian methods.
Neural surrogates for computationally expensive simulations, built with deep temporal models. Achieves a 125,000x speedup at 99.8% accuracy for the epidemiological agent-based models used in WHO pandemic preparedness.
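The core idea of a neural surrogate can be sketched in a few lines: run the expensive simulator offline to collect input/output pairs, then train a small network to emulate it. The sketch below is a toy illustration, not the WHO model; `expensive_simulator` is a hypothetical stand-in, and the one-hidden-layer MLP with hand-written backprop keeps the example self-contained in numpy.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulator(x):
    # Stand-in for a costly simulation; here just a cheap nonlinear map.
    return np.sin(3 * x) + 0.5 * x

# Build a training set by running the simulator offline.
X = rng.uniform(-1, 1, size=(256, 1))
Y = expensive_simulator(X)

# One-hidden-layer tanh MLP surrogate, trained with full-batch gradient descent.
W1 = rng.normal(0, 1, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 1)); b2 = np.zeros(1)
lr = 0.2

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
mse0 = float(np.mean((pred0 - Y) ** 2))  # error before training

for _ in range(4000):
    H, pred = forward(X)
    err = pred - Y                          # gradient of 0.5*MSE w.r.t. pred
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)        # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - Y) ** 2))
```

Once trained, the surrogate answers queries in microseconds instead of invoking the simulator, which is where the speedup comes from.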
Efficient state-space architectures (Mamba-2) in JAX for causal language modelling and time-series forecasting. Contributions merged into Google's jax-ml/bonsai model zoo with up to 28x inference speedup through state-space caching.
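The caching trick behind that speedup is that a linear state-space recurrence summarises the whole prefix in a fixed-size state, so decoding a new token is O(1) rather than O(T). A minimal numpy sketch (illustrative dimensions and names, not the Mamba-2 implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d_state, d_in = 8, 4

# Diagonal state matrix with entries in (0, 1) keeps the recurrence stable.
A = np.diag(rng.uniform(0.5, 0.95, d_state))
B = rng.normal(0, 0.3, (d_state, d_in))
C = rng.normal(0, 0.3, (1, d_state))

def full_scan(xs):
    """Recompute the whole sequence from scratch (O(T) per new token)."""
    h = np.zeros(d_state)
    ys = []
    for x in xs:
        h = A @ h + B @ x
        ys.append(C @ h)
    return np.array(ys), h

def cached_step(h, x):
    """O(1) incremental step: carry the cached state instead of re-scanning."""
    h = A @ h + B @ x
    return C @ h, h

xs = rng.normal(size=(16, d_in))

# Incremental decoding with a state cache...
h = np.zeros(d_state)
ys_cached = []
for x in xs:
    y, h = cached_step(h, x)
    ys_cached.append(y)
ys_cached = np.array(ys_cached)

# ...produces exactly the same outputs as re-scanning the full prefix.
ys_full, _ = full_scan(xs)
assert np.allclose(ys_cached, ys_full)
```

The cached and full-scan paths are mathematically identical; the win is purely in avoiding redundant computation at inference time.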
Methods for likelihood-free inference in complex stochastic systems, with publications in Nature and The Lancet and a NeurIPS 2025 workshop contribution on tokenised flow matching for hierarchical simulation-based inference.
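Likelihood-free inference in its simplest form is rejection ABC: sample parameters from the prior, simulate, and keep only parameters whose simulated summaries land near the observed data. This is a toy Gaussian example of the general idea, not the flow-matching method above; `simulator`, the tolerance, and the summary statistic are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    # Stochastic simulator whose likelihood we pretend is intractable.
    return rng.normal(theta, 1.0, n)

true_theta = 1.5
observed = simulator(true_theta)
obs_summary = observed.mean()  # summary statistic of the observed data

# Rejection ABC: draw from the prior, keep parameters whose simulated
# summary falls within a tolerance of the observed summary.
prior_draws = rng.uniform(-5, 5, 20000)
tolerance = 0.1
accepted = [t for t in prior_draws
            if abs(simulator(t).mean() - obs_summary) < tolerance]

posterior_mean = float(np.mean(accepted))
```

The accepted samples approximate the posterior without ever evaluating a likelihood; modern SBI methods replace this brute-force rejection with learned density or flow estimators, but the problem setup is the same.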
Methods for learning sequentially without catastrophic forgetting. Enabling models to adapt over time as new data arrives while retaining previously acquired knowledge.
Sample-efficient multi-fidelity optimisation for expensive black-box functions. Applied to LLM data mixture discovery, jointly optimising over model scale and training duration via learning curve extrapolation.
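Learning curve extrapolation can be sketched with a simple power-law fit: observe the early part of a training curve cheaply, fit loss(t) = a·t^(−b) in log-log space, and extrapolate to the full budget to rank configurations without complete runs. A noiseless toy example (the true multi-fidelity machinery involves Bayesian models over many curves; `a_true`, `b_true`, and the step counts here are illustrative):

```python
import numpy as np

# Synthetic learning curve following a power law: loss(t) = a * t**(-b).
a_true, b_true = 4.0, 0.35
steps = np.arange(1, 201)
loss = a_true * steps.astype(float) ** (-b_true)

# Fit the power law on the first 50 steps only (a cheap, low-fidelity run),
# using ordinary least squares in log-log space.
t_fit, l_fit = steps[:50], loss[:50]
slope, intercept = np.polyfit(np.log(t_fit), np.log(l_fit), 1)
a_hat, b_hat = np.exp(intercept), -slope

# Extrapolate to step 200 to estimate final loss without the full run.
pred_200 = a_hat * 200.0 ** (-b_hat)
```

With noisy real curves the fit needs uncertainty estimates, which is where the Bayesian treatment earns its keep, but the extrapolation principle is the same.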
Frameworks: JAX, PyTorch, Flax, CUDA.
Focus: State-space models, temporal emulation, Bayesian inference, DuckDB analytics.
Recognition: SAGE Award from UK Chief Scientific Officers.
Research goal:
make simulations learnable,
make inference tractable,
make science faster.
Always open to interesting problems and people.