About Me
I am a recent graduate of UC San Diego, where I earned a bachelor's degree in Computational Physics with a minor in Mathematics. I have a strong interest in the intersection of physics, mathematics, and machine learning. My goal is to develop efficient machine learning models rooted in the underlying physical and mathematical principles that govern the world and the task at hand.
I am currently taking a gap year in Zurich, Switzerland, studying Data Science at ETH Zurich before starting my PhD the following year at Yale University in Electrical & Computer Engineering.
I am interested in model compression, efficient deep learning, and the application of machine learning to scientific problems. I work on quantization-aware training, pruning, and knowledge distillation to create efficient deep learning models.
Notable Papers
Neural Architecture Codesign for Fast Physics Applications
Published in Machine Learning: Science and Technology, 2025
This paper presents a new hardware-aware neural architecture search pipeline for low-latency deployment.
arXiv here
wa-hls4ml: Benchmark and Surrogate Models for hls4ml Resource Estimation
Published in ACM Transactions on Reconfigurable Technology and Systems (TRETS), 2025
A benchmark and dataset of over 680,000 synthesized neural networks used to evaluate GNN- and transformer-based surrogate models that predict FPGA resource usage and latency.
arXiv here
