Education
Academic background and degrees

PhD in Machine Learning, 4.0 GPA
2024 — 2028 (expected)
- Advisors: Dr. Pan Li and Dr. Victor Fung
- Research Focus: Deep Learning for Scientific Applications
- NSF Graduate Research Fellowship Honorable Mention (2024)
- CSGF Fellowship Alternate List (2025)

M.S. in Computer Science (ML Focus), 4.0 GPA
2020 — 2021
- Advisor: Dr. Hyesoon Kim
- "Thank a Teacher" Award, Georgia Tech Center for Teaching and Learning (2020, 2021)

B.S. in Computer Science (ML Focus), Magna Cum Laude
2015 — 2019
- ACM SIGBED Student Research Competition Bronze Medal (2019)
- Zell Miller Scholarship Recipient (2015 — 2019)

International Baccalaureate Diploma, 4.0 GPA
Druid Hills High School
2011 — 2015
Work Experience
My professional and research history

Research Scientist Intern, AI for Science
May 2025 — Present
- Developed foundational spatio-temporal models for protein dynamics, producing realistic conformational ensembles and long-horizon trajectories to support drug-discovery use cases (e.g., mechanism insight and pathway exploration).
- Adapted video diffusion models with history-aware temporal attention and noise-aware training, improving long-horizon stability and robustness; achieved state-of-the-art quality and diversity/coverage on key benchmarks.
- Built an end-to-end pipeline: protein-dynamics generation → physics-based relaxation (simulator) → quality/diversity evaluation; operated at scale with distributed multi-GPU training and backtesting-style analysis.

Graduate Research Assistant
Aug 2024 — Present
- Working with Dr. Pan Li and Dr. Victor Fung on robust fine-tuning strategies for large-scale pre-trained GNN models.
- Developed parameter-efficient fine-tuning strategies for machine learning interatomic potential models trained on the Materials Project dataset, achieving near-SOTA performance on the MatBench Discovery benchmark. (Digital Discovery 2025*)

Machine Learning Intern
June 2024 — Aug 2024
- Worked with Dr. Kamran Paynabar to develop novel pre-trained transformer models for manufacturing process data.
- Developed transformer models pre-trained on approximately 500,000 time-series data points from manufacturing processes to predict process outcomes and detect anomalies.
- Fine-tuned the models on real-world manufacturing datasets, improving accuracy relative to previous production models.

Temporary Research Staff
Dec 2023 — May 2024
- Worked with Dr. Hyesoon Kim and Dr. Stefano Petrangeli on efficient inference strategies for pre-trained image diffusion models, with a focus on generating diverse, high-quality images.
- Developed an efficient sampling method for Denoising Diffusion Probabilistic Models (DDPMs) that leverages the structure of the latent space to guide sampling, reducing the number of samples needed for high-quality image generation. (Digital Discovery 2025)

AI Resident, FAIR Chemistry Team
Aug 2021 — Aug 2023
- Worked with Dr. Larry Zitnick, Dr. Abhishek Das, and Dr. Brandon Wood on the Open Catalyst Project, focusing on atomic property prediction and catalyst discovery using large-scale pre-trained models.
- Developed a large foundation model for atomic property prediction, pre-trained on data from diverse chemical domains, and fine-tuned it to achieve state-of-the-art results on 35 of 41 downstream tasks. (ICLR 2024*)
- Benchmarked state-of-the-art machine learning interatomic potential models on the Open Catalyst 2022 dataset, one of the largest datasets for automatic catalyst discovery. (ACS Catalysis 2023)
- Co-authored a paper discussing the challenges and potential of developing generalizable machine learning models for catalyst discovery, highlighting the importance of large-scale datasets like the Open Catalyst 2020 dataset (OC20). (ACS Catalysis 2022)
- Contributed to the development of a transfer learning approach using Graph Neural Networks to generalize models across domains in molecular and catalyst discovery, reducing the need for large, domain-specific datasets. (J Chem Phys 2022)

Graduate Research Assistant
May 2019 — May 2021
- Developed software-level and hardware-level techniques for accelerating deep learning training and inference under the guidance of advisors Dr. Hyesoon Kim and Dr. Moinuddin Qureshi.
- Introduced SmaQ, a quantization scheme that leverages the normal distribution of neural network data structures to efficiently quantize them, addressing the memory bottleneck in single-machine training of deep networks. (IEEE CAL 2021*)
- Developed NNW-BDI, a neural network weight compression scheme that reduces memory usage by up to 85% without sacrificing inference accuracy on an MNIST classification task. (MemSys 2020*)
- Demonstrated the feasibility of running ORB-SLAM2 in real-time on the Raspberry Pi 3B+ for embedded robots through optimizations that achieved a 5x speedup with minor impact on accuracy. (SRC ESWEEK 2019, 3rd Place*)
- Co-authored a paper on a context-aware task handling technique for resource-constrained mobile robots, enabling concurrent execution of critical tasks with improved real-time performance. (IEEE Edge 2023)
- Contributed to a study that formalized the subsystems of autonomous drones and quantified the complex tradeoffs in their design space to enable optimized solutions for diverse applications. (ASPLOS 2021)
- Collaborated on the development of Pisces, a power-aware SLAM implementation that consumes 2.5x less power and executes 7.4x faster than the state of the art by customizing efficient sparse algebra on FPGAs. (DAC 2020)
- Participated in an in-depth analysis of the hardware and software components of autonomous drones, characterizing the performance of the ArduCopter flight stack and providing insights to optimize flight controllers and increase drone range. (ISPASS 2020)

Graduate Teaching Assistant
Aug 2020 — May 2021
- Led weekly recitations, graded assignments, and held office hours to help students understand course material for CS 4510: Automata and Complexity, a senior-level undergraduate course on the theory of computation. Served as TA for the course in Fall 2020 with Dr. Merrick Furst and in Spring 2021 with Dr. Zvi Galil.
- Received two "Thank a Teacher" awards from the Georgia Tech Center for Teaching and Learning in recognition of outstanding contributions and positive impact as a teaching assistant. (2020, 2021)

Research Assistant
Jan 2020 — Aug 2020
- Developed Graph Neural Network (GNN) based machine learning models to analyze social media data for detecting incoming cyber attacks, under the guidance of advisor Dr. Maria Konte.
- Used GNNs to capture the complex relationships and patterns within social media networks, enabling early detection and prevention of potential cyber threats.

Software Engineering Intern
May 2017 — Aug 2018
- Developed software to interface with network devices and maintained CI/CD pipelines for build processes.
- Collaborated with cross-functional teams to ensure smooth integration of software components and timely delivery of projects.
- Gained experience in software development best practices, version control, and agile methodologies.