Haoran Zhang

Hi! I'm a second-year MS student in the Electrical and Computer Engineering Department at Carnegie Mellon University (CMU), where I work with Prof. Carlee Joe-Wong on federated learning. I also collaborate with Dr. Marie Siew and Prof. Rachid El-Azouzi on my current research project.

Previously, I obtained my bachelor's degree in Automation from Huazhong University of Science and Technology. During my undergraduate studies, I worked on AI for healthcare with Prof. Hao Chen at HKUST and with Dr. Zhongliang Jiang at TUM.

I will graduate from CMU in May 2025 and begin my PhD at the University of Texas at Austin this fall.

Email / CV / Scholar / GitHub / LinkedIn

Research

My current project is on multi-model federated learning (MMFL), in which local clients train multiple unrelated FL models concurrently. For example, a smartphone user can generate data for both keyboard-prediction and speech-recognition models. Instead of building two single-model FL systems, we build one MMFL system and solve system-level optimization problems to make it more efficient.
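
To make the setting concrete, here is a minimal sketch of one MMFL round, assuming a toy server loop with per-model FedAvg averaging; local_train, the model names, and the assignment plan are hypothetical stand-ins, not our system's actual interface.

```python
import numpy as np

def local_train(weights, data, lr=0.1):
    # Hypothetical stand-in for local SGD: nudge weights toward the data mean.
    return weights - lr * (weights - data.mean(axis=0))

def mmfl_round(global_models, client_data, plan):
    """One MMFL round: each client trains only the models it is assigned,
    and the server averages the returned weights per model (FedAvg)."""
    updates = {m: [] for m in global_models}
    for cid, models in plan.items():
        for m in models:
            updates[m].append(local_train(global_models[m], client_data[cid]))
    for m, ups in updates.items():
        if ups:  # a model may receive no clients this round
            global_models[m] = np.mean(ups, axis=0)
    return global_models

models = {"keyboard": np.zeros(4), "speech": np.zeros(4)}
data = {c: np.random.default_rng(c).normal(size=(8, 4)) for c in range(3)}
plan = {0: ["keyboard"], 1: ["speech"], 2: ["keyboard", "speech"]}
models = mmfl_round(models, data, plan)  # one concurrent round for both models
```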

My research at CMU: Federated Learning and Optimization

[Ongoing project] Client sampling in a more heterogeneous MMFL scenario: how should the sampling distribution be optimized?

Group-based Client Sampling in Multi-Model Federated Learning
Zejun Gong*, Haoran Zhang*, Marie Siew, Carlee Joe-Wong, Rachid El-Azouzi
Under Review, 2024. *Equal contribution

Poster: Optimal Variance-Reduced Client Sampling for Multiple Models Federated Learning
Haoran Zhang, Zekai Li, Zejun Gong, Marie Siew, Carlee Joe-Wong, Rachid El-Azouzi
ICDCS, 2024 [Best Poster Award] [Poster]
Supplementary material

This paper explores an MMFL scenario where each client is restricted to training only one model per global round due to limited computational resources. We propose a client sampling and task assignment method that theoretically minimizes the variance of the global updates, achieving 30% higher accuracy than the baseline.
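
As a rough illustration of the variance-reduction idea (the classic importance-sampling recipe, not the exact optimization in the paper): pick clients with probability proportional to their update magnitudes, then reweight by inverse probability so the aggregate remains an unbiased estimate of full participation. All names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_clients(update_norms, k):
    """Selection probabilities proportional to per-client update norms."""
    p = update_norms / update_norms.sum()
    idx = rng.choice(len(p), size=k, replace=True, p=p)  # with replacement
    return idx, p

def aggregate(updates, idx, p):
    """Inverse-probability weighting: the expectation of this estimate
    equals the full-participation average (1/N) * sum_i updates[i]."""
    n = len(updates)
    return sum(updates[i] / p[i] for i in idx) / (len(idx) * n)

updates = [np.full(3, i, dtype=float) for i in range(1, 6)]  # 5 toy clients
norms = np.array([np.linalg.norm(u) for u in updates])
idx, p = sample_clients(norms, k=2)
print(aggregate(updates, idx, p))  # unbiased estimate of the mean [3. 3. 3.]
```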

Fair Concurrent Training of Multiple Models in Federated Learning
Marie Siew, Haoran Zhang, Jong-Ik Park, Yuezhou Liu, Yichen Ruan, Lili Su, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong
arXiv, 2024
Supplementary material

In real-world scenarios, clients may contribute to the training of multiple federated learning models at the same time. For example, a company may update its keyboard-prediction model, speech-recognition model, and other FL models on your phone. In this paper, we propose a client-task assignment strategy that preserves fairness across multiple training tasks.
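
A toy version of the fairness intuition, written as a greedy max-min-style heuristic (not the paper's actual assignment strategy): each incoming client goes to whichever task is currently lagging, discounted by the clients that task has already received this round.

```python
def fair_assignment(num_clients, task_losses):
    """task_losses: dict task -> current global loss (higher = further behind).
    Returns assignment[i] = task given to client i."""
    counts = {t: 0 for t in task_losses}
    assignment = []
    for _ in range(num_clients):
        # Priority = loss discounted by clients already granted to the task.
        t = max(task_losses, key=lambda task: task_losses[task] / (1 + counts[task]))
        assignment.append(t)
        counts[t] += 1
    return assignment

print(fair_assignment(5, {"keyboard": 0.9, "speech": 0.3}))
# ['keyboard', 'keyboard', 'keyboard', 'speech', 'keyboard']
```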

My research at HKUST: Medical AI

Efficient 3D Transformer with cluster-based Domain-Adversarial Learning for 3D Medical Image Segmentation
Haoran Zhang, Hao Chen
ISBI (a leading conference in biomedical imaging), 2023

We proposed a pyramidally downsampled 3D Transformer that boosts efficiency while maintaining high segmentation accuracy. The paper also addresses the domain generalization problem, proposing a cluster-based domain-adversarial learning method that exploits the available domains at a fine-grained level.
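
The adversarial part builds on the standard gradient-reversal layer from DANN (Ganin & Lempitsky, 2015). A minimal PyTorch version is sketched below; treating cluster IDs as the adversary's labels is my shorthand here, and the cluster-based refinement itself is the paper's contribution and is not reproduced.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; scales gradients by -lam on the backward
    pass, so the encoder learns features that fool the domain classifier."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

# Typical wiring (encoder and domain_head are hypothetical modules):
#   feats = encoder(volume)
#   cluster_logits = domain_head(grad_reverse(feats, lam))
#   adv_loss = torch.nn.functional.cross_entropy(cluster_logits, cluster_ids)
```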

Some Presentations

Topics on Multi-Model Federated Learning (about our ICDCS poster)
CMU LIONS group seminar, May 17, 2024
Trade-off between client sampling and training stability
CMU LIONS group seminar, Sep 27, 2024

Teaching and Mentoring

Teaching Assistant
18786 Introduction to Deep Learning (Spring 2025)
ECE Peer Mentor
Mentees (Spring 2025): Mary Tonwe, Fanyun Xu, Han Zhou

About Me

I was born in Xinxiang, China. My family name is 张 (Zhang), one of the most common family names in China. My given name is 皓然 (Haoran): 皓 (Hao) means "the color of the moon," while 然 (Ran) has no specific meaning on its own; maybe it means "yeah, uh-huh." As the first in my family to attend college, I am deeply grateful for the support of my parents, mentors, and friends.


Template stolen from Jon Barron.