Qiyang Yan

I'm on the job market. Please reach out through email if you have good opportunities!

I am currently working as a research intern at Peking University and AgiBot, advised by Dr. Hao Dong.

Previously, I obtained my Master’s degree in 2024 and my Bachelor's degree in 2023 in Electrical and Electronic Engineering from Imperial College London, advised by Dr. Adam J. Spiers at the Manipulation and Touch Lab (MTL).

Outside of research, I’m passionate about climbing, with a best sport climbing grade of 7b (onsight) and indoor bouldering grade of V10 (7C+).

Email  /  CV  /  LinkedIn


News

[Jan, 2025] One paper accepted to ICRA 2025!

[Jun, 2024] I completed my Master's at Imperial College London!

[Jun, 2023] I completed my Bachelor's at Imperial College London!

Publications

ClutterDexGrasp: A Sim-to-Real System for General Dexterous Grasping in Cluttered Scenes
Zeyuan Chen*, Qiyang Yan*, Yuanpei Chen*, Tianhao Wu, Jiyao Zhang, Zihan Ding, Jinzhou Li, Yaodong Yang, Hao Dong
Website / Paper (Coming Soon)

Introducing the first zero-shot sim-to-real closed-loop system for target-oriented dexterous grasping in cluttered scenes, demonstrating robust performance across diverse objects and layouts.

TwinAligner: Visual and Physical Real2Sim2Real All-in-One for Robotic Manipulation
Hongwei Fan, Hang Dai, Jiyao Zhang, Jinzhou Li, Qiyang Yan, Yujie Zhao, Xuanyu Lai, Hao Tang, Hao Dong
Website / Paper (Coming Soon)

Introducing a unified Real2Sim2Real framework: TwinRecon replicates visually and geometrically accurate scenes via 3D Gaussian Splatting and 6D pose estimation; TwinRigid jointly optimizes robot-object dynamics with limited human-in-the-loop data collection.

Variable-Friction In-Hand Manipulation for Arbitrary Objects via Diffusion-Based Imitation Learning
Qiyang Yan, Zihan Ding, Xin Zhou, Adam J. Spiers
International Conference on Robotics and Automation (ICRA) 2025
Website / Paper

Introducing an end-to-end, data-efficient learning framework for the variable-friction hand, enabling the gripper to precisely manipulate arbitrary objects to any target pose for the first time, trained on real hardware within 2 hours and achieving errors of around 3 mm and 3°.

Research Experiences

Sensor-Agnostic Pattern Recognition Framework for Multi-Modal Tactile Sensing
Qiyang Yan (Research Assistant)
Manipulation and Touch Lab, Imperial College London, 2024

Responsible for dataset preparation supporting the development of generalisable learning-based approaches that bridge the gap between various types of tactile sensors.

Projects

Variable-Friction In-Hand Manipulation for Polygons via Reinforcement Learning with Sim2Real Transfer
Qiyang Yan (Master Thesis)
Manipulation and Touch Lab, Imperial College London
Report

Developed the first learning-based framework for the variable-friction gripper to manipulate irregular polygons on a real robot, achieving a 95% success rate with average errors of around 6 mm and 6°.

A Pick-Manipulate-Insert System with Variable-Friction Gripper for Cube
Qiyang Yan (Leader)
Manipulation and Touch Lab, Imperial College London, May-June 2023
Website / Code

Developed a vision-based closed-loop pick-manipulate-insert system with a variable-friction gripper and a UR5e robotic arm using ROS. Built a model-based in-hand manipulation (IHM) planner and a UR5e arm trajectory planner, achieving a 92% success rate on the cube task with positional accuracy of around 3 mm.

Academic Service

Conference Reviewer: CoRL'25, IROS'25