Shutong Wu

PhD Student in Computer Science

I'm a second-year PhD student at the University of Rochester, advised by Prof. Zhen Bai in the Interplay Lab. My research focuses on Mixed Reality and 3D Generation, with particular interest in how users create 3D worlds and the systems we can build to facilitate immersive, interactive experiences.

Recent Updates

  • Oct 2025
Unity-MCP was accepted to the SIGGRAPH Asia 2025 Technical Communications track.
  • Aug 2025
    Volunteered at SIGGRAPH 2025.
  • Jul 2025
Became a maintainer and major contributor to the open-source Unity-MCP repository. GitHub
  • May 2025
    AGen received Best LBW Paper Nomination at AIED 2025. Paper
  • Sep 2024
    Started PhD at University of Rochester, advised by Prof. Zhen Bai in Interplay Lab.

Research Interests

Mixed Reality

Exploring immersive AR/VR experiences, interaction techniques, and novel applications that enhance user engagement and productivity in mixed reality environments.

3D Generation

Investigating generative AI approaches for 3D content creation, with focus on enabling users to create and manipulate 3D worlds intuitively.

Human-Computer Interaction

Understanding how users interact with complex 3D systems, evaluating user experiences, and designing interfaces that facilitate creative workflows.

Analogy & Analogical Learning

Investigating how people generate and understand analogies, leveraging analogical reasoning to develop innovative educational tools and exploring novel computational approaches to analogical thinking.

Publications

AGen: Personalized Analogy Generation through LLM

AIED 2025 (Best LBW Paper Nomination)
Published

An analogy generation platform that creates tailored analogies based on user profiles and education levels, enhancing personalized learning experiences.


EmbodiedCreate: In-Situ 3D Authoring for Analogical Learning

Submitted to CHI 2026
Under Review

An in-situ 3D authoring toolkit that empowers learners and educators to reshape virtual analogical learning environments in real time, supporting democratized content creation and personalized learning.

EmbodiedCreate 2.0: Enhanced System for Immersive 3D Content Creation

In Preparation
Draft

An extension of the EmbodiedCreate system that improves accessibility and usability and enables richer embodied design interactions for greater pedagogical benefit.

Current Projects

Unity-MCP

Maintainer and main contributor to Unity-MCP, a project that lets MCP (Model Context Protocol) clients perform in-Editor actions in Unity, bridging AI language models with Unity development workflows.

Unity C# MCP AI Integration
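Under the hood, MCP is built on JSON-RPC 2.0, so a client invokes a server-side tool with a `tools/call` request. The sketch below shows the general shape of such a message; the tool name `create_gameobject` and its arguments are hypothetical for illustration and do not necessarily match Unity-MCP's actual tool schema.

```python
import json

# Minimal sketch of the JSON-RPC 2.0 message an MCP client sends to
# invoke a server-side tool. The tool name and arguments below are
# hypothetical; real Unity-MCP tools and parameters may differ.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_gameobject",  # hypothetical tool name
        "arguments": {"name": "Player", "position": [0, 1, 0]},
    },
}

# Serialize for transport (MCP messages are JSON over stdio or HTTP).
payload = json.dumps(request)
print(payload)
```

A Unity-side MCP server would dispatch this request to the matching Editor action and return a JSON-RPC response with the tool's result.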

VR-MCP (In Development)

Extending MCP to VR environments, enabling AI language models to integrate seamlessly with virtual reality development workflows and interaction paradigms.

Unity XR VR MCP C#

VR Vision Testing (Penn Medicine)

Designed and implemented VR vision tests on Quest 2 platform, providing accessible virtual alternatives to physical vision tests for low-vision patients. Secured two patents for novel VR-based vision testing methodologies.

Unity XR Quest 2 Medical VR Shaders

Diminished Reality for Enhanced Focus

An adaptive Mixed Reality technique that uses semantic scene understanding to visually remove distracting everyday objects, improving user focus and productivity in AR environments.

Mixed Reality Computer Vision Unity XR Machine Learning

Experience

University of Rochester

PhD Student, Computer Science
September 2024 - Present (Expected 2028)
  • Research focus on Mixed Reality, 3D Generation, and Human-Computer Interaction
  • Advised by Prof. Zhen Bai in the Interplay Lab
  • GPA: 4.0/4.0
  • Teaching Assistant for AR/VR Interaction, Intro to AI, and Computer Algorithms

Penn Medicine Ophthalmology

VR Software Developer
December 2022 - April 2024
  • Designed and implemented VR vision tests on Quest 2 platform for low-vision patients
  • Secured two patents for novel VR-based vision testing methodologies
  • Developed end-to-end VR software solution within 9 months
  • Utilized Unity XR, custom shaders, and post-processing techniques

Penn CG Lab

Research Assistant
December 2022 - April 2024
  • Collaborated with Prof. Lingjie Liu on NeRF-based research project
  • Created Unity animation infrastructure in C# for volumetric scene generation
  • Developed C++ plugins to convert SMPL files to FBX animations

ByteDance

Platform Engineer Intern
October 2021 - April 2022
  • Developed efficiency tools including Overdraw and Mipmap Collector
  • Reduced average frame time by 10ms through graphics optimization
  • Collaborated with game studios on performance analysis and optimization

NetEase Games

Game Developer
January 2021 - October 2021
  • Developed mobile game features and mechanics
  • Optimized game performance and user experience
  • Collaborated with cross-functional teams on game development projects

Technical Skills

Programming Languages

C++ C# Python Java Swift Kotlin

Graphics & VR

Unity XR OpenXR Unreal Engine OpenGL Vulkan CUDA

AI & ML

PyTorch LLaMA OpenAI

Tools & Frameworks

Git React Docker Linux MCP

Contact

I'm always interested in discussing research collaborations, new opportunities, or just connecting with fellow researchers in AR/VR and 3D generation.

📧 Email 💻 GitHub 💼 LinkedIn 📄 CV

Office: Wegmans Hall, University of Rochester, Rochester, NY
Lab: Interplay Lab