Hi, welcome to my site!

My name is Zeyu Dong; I am a PhD candidate in Applied Math & Statistics @ Stony Brook University. See the bio page for my official bio.

I am devoted to building embodied AI that can perceive the world, learn through interaction, and plan to accomplish complicated tasks. My research therefore lies at the intersection of the following fields:

  1. Perception: multi-sensor fusion and representation learning.
  2. Learning: reinforcement learning (RL) in partially observed environments.
  3. Planning: large vision-language models for open-world task planning.
  4. Inference: real-time inference with hybrid architectures.

Here are some research projects I have been working on:

Research Projects

Generalization of End-to-end Autonomous Driving with LLM

Designed a hybrid architecture that combines VLMs with end-to-end driving models, leveraging pre-trained VLMs for generalization and achieving a ~50% reduction in failure rate in real-world deployments.
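
A minimal sketch of the idea (my illustrative assumptions, not the exact architecture): a slow VLM produces a high-level command embedding at low frequency, while a fast end-to-end controller consumes camera features together with the most recent cached command at every control tick.

```python
import torch
import torch.nn as nn

class HybridDrivingPolicy(nn.Module):
    """Toy hybrid policy: fast controller + cached slow-VLM command."""

    def __init__(self, img_dim=512, cmd_dim=64, act_dim=2):
        super().__init__()
        # fast branch: image features + latest command -> (steer, throttle)
        self.controller = nn.Sequential(
            nn.Linear(img_dim + cmd_dim, 256), nn.ReLU(),
            nn.Linear(256, act_dim),
        )
        # cache for the most recent VLM output (zeros until the first update)
        self.register_buffer("latest_cmd", torch.zeros(cmd_dim))

    def update_command(self, cmd_embedding: torch.Tensor) -> None:
        # called at low frequency, whenever the (hypothetical) VLM finishes
        self.latest_cmd = cmd_embedding.detach()

    def forward(self, img_feat: torch.Tensor) -> torch.Tensor:
        # called every control tick; never blocks on the VLM
        cmd = self.latest_cmd.expand(img_feat.shape[0], -1)
        return self.controller(torch.cat([img_feat, cmd], dim=-1))

policy = HybridDrivingPolicy()
action = policy(torch.randn(1, 512))  # fast path runs even before any VLM call
```

Decoupling the two clocks this way is what lets a large pre-trained model contribute generalization without dragging the control loop down to VLM latency.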

Training Models to Assist Legacy Devices

Developed Learning to Help, a hybrid framework that jointly optimizes cloud and edge models, improving image classification performance by 20% with minimal server interaction.
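
As a rough sketch of such a joint objective (toy dimensions and a made-up deferral cost, not the paper's exact formulation): a small edge classifier and a gate are trained together with the cloud model, where the gate learns when a server query is worth its cost.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

edge = nn.Linear(32, 10)    # small on-device classifier (toy sizes)
cloud = nn.Linear(32, 10)   # stand-in for the large server model
gate = nn.Linear(32, 1)     # predicts the probability of deferring to the cloud
opt = torch.optim.Adam(
    [*edge.parameters(), *cloud.parameters(), *gate.parameters()], lr=1e-3
)
defer_cost = 0.1            # assumed price per server interaction

x = torch.randn(64, 32)           # fake feature batch
y = torch.randint(0, 10, (64,))   # fake labels

p_defer = torch.sigmoid(gate(x)).squeeze(-1)   # soft deferral decision
loss_edge = F.cross_entropy(edge(x), y, reduction="none")
loss_cloud = F.cross_entropy(cloud(x), y, reduction="none")
# expected loss: edge prediction when kept local, cloud prediction plus a
# communication cost when deferred; the cost keeps server queries rare
loss = ((1 - p_defer) * loss_edge + p_defer * (loss_cloud + defer_cost)).mean()

opt.zero_grad()
loss.backward()
opt.step()
```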

Sim2Real for End-to-end Autonomous Driving

Developed a training approach that transfers expert driving knowledge from simulation to the real world, addressing data scarcity with transformers and domain-randomized pre-training, and achieving a ~60% reduction in failure rate on unseen real-world tasks.
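
A minimal sketch of the domain-randomization side (the ranges and the `simulate_episode`/`update_policy` helpers are hypothetical): each pre-training episode draws a fresh simulator configuration, so the policy cannot latch onto any single appearance or dynamics.

```python
import random

def sample_domain() -> dict:
    # draw a random environment configuration for each episode
    return {
        "friction": random.uniform(0.6, 1.2),
        "camera_pitch_deg": random.uniform(-3.0, 3.0),
        "lighting": random.choice(["noon", "dusk", "overcast"]),
        "texture_seed": random.randrange(10_000),
    }

for episode in range(1000):
    domain = sample_domain()
    # placeholders for the actual simulator rollout and imitation update:
    # trajectory = simulate_episode(domain)
    # update_policy(trajectory)
```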

Intelligent Control for Electron Orbit Feedback System

Applied deep RL to control a high-dimensional system at NSLS-II, developing model-based policy gradient algorithms for adaptive control, deploying them on an FPGA with <100 ns latency, and improving electron orbit stability by ~30%.
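
A minimal sketch of a model-based policy gradient for orbit correction, under the standard linearized assumption that the orbit shift equals the response matrix times the corrector kicks (dimensions and the quadratic cost are illustrative, not the NSLS-II configuration):

```python
import torch
import torch.nn as nn

n_bpm, n_corr = 16, 8                 # toy counts of position monitors / correctors
R = torch.randn(n_bpm, n_corr) * 0.1  # assumed known orbit response matrix
policy = nn.Linear(n_bpm, n_corr)     # maps a measured orbit to corrector kicks
opt = torch.optim.Adam(policy.parameters(), lr=1e-2)

for step in range(200):
    orbit = torch.randn(64, n_bpm)    # batch of disturbed orbits
    kicks = policy(orbit)
    corrected = orbit + kicks @ R.T   # differentiable one-step model rollout
    # minimize residual orbit error plus a small penalty on actuator effort
    loss = corrected.pow(2).mean() + 1e-3 * kicks.pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the dynamics model is differentiable, the gradient flows through the simulated correction step directly into the policy, which is what a model-based policy gradient buys over score-function estimators.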