Haoran Zhang (张皓然)

Hi! I'm currently a second-year MS student in the Electrical and Computer Engineering Department at Carnegie Mellon University (CMU). I am working with Prof. Carlee Joe-Wong on federated learning, and I also collaborate with Dr. Marie Siew and Prof. Rachid El-Azouzi on my current research project.

Previously, I obtained my bachelor's degree in Automation from Huazhong University of Science and Technology (GPA: 3.9/4.0). During my undergraduate studies, I worked on AI for healthcare with Prof. Hao Chen at HKUST and Dr. Zhongliang Jiang at TUM.

I will graduate from CMU in May 2025, and I am looking for a Ph.D. position in federated learning / distributed learning / optimization starting in Fall 2025. If you are interested in my research, please feel free to contact me!

Email / CV / Scholar / GitHub / LinkedIn


Research

I did research in AI for healthcare during my undergraduate studies, and I now work on federated learning. My current project is on multi-model federated learning (MMFL), where clients train multiple FL models at the same time. In this setting, how to select clients and assign them different training tasks is key to improving the performance of the whole system.

Poster: Optimal Variance-Reduced Client Sampling for Multiple Models Federated Learning
Haoran Zhang, Zekai Li, Zejun Gong, Marie Siew, Carlee Joe-Wong, Rachid El-Azouzi
Accepted at ICDCS, 2024
Supplementary material

This paper explores an MMFL scenario where each client is restricted to training only one model per global round due to limited computational resources. We propose a client sampling and task assignment method that theoretically minimizes the variance of the global updates. The method achieves 30% higher accuracy than the baseline.
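To give a flavor of the variance-reduction idea, here is a minimal illustrative sketch (not the exact scheme from the poster): clients with larger updates are sampled more often, and inverse-probability weights keep the aggregated update unbiased. The proportional-to-norm rule and all names below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_clients(update_norms, budget):
    """Sample `budget` clients (with replacement) with probability proportional
    to their estimated update norms; importance sampling of this form reduces
    the variance of the unbiased estimate of the summed client update."""
    probs = update_norms / update_norms.sum()
    chosen = rng.choice(len(update_norms), size=budget, replace=True, p=probs)
    # Inverse-probability weights: (1/budget) * sum_k u_k / p_k is an unbiased
    # estimator of the full sum of client updates.
    weights = 1.0 / (budget * probs[chosen])
    return chosen, weights

# Toy example: 10 clients, a per-round participation budget of 3.
norms = rng.uniform(0.1, 2.0, size=10)
clients, weights = sample_clients(norms, budget=3)
print(clients, weights.round(2))
```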

Fair Concurrent Training of Multiple Models in Federated Learning
Marie Siew, Haoran Zhang, Yuezhou Liu, Yichen Ruan, Lili Su, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong
arXiv, 2024
Supplementary material

In real-world scenarios, clients may contribute to the training of multiple federated learning models at the same time. For example, a company may update its keyboard prediction model, its speech recognition model, and other federated learning models on your phone simultaneously. In this paper, we propose a client-task assignment strategy that preserves fairness across the multiple training tasks.
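As a rough illustration of fairness-aware allocation (a minimal sketch under my own simplifying assumptions, not the algorithm from the paper): tasks that currently lag in accuracy receive a larger share of the available clients in the next round.

```python
import numpy as np

rng = np.random.default_rng(1)

def assign_clients(task_accuracies, num_clients, temperature=0.1):
    """Illustrative allocation rule: softmax over each task's accuracy gap to
    the best-performing task, so lagging tasks get more clients this round."""
    acc = np.asarray(task_accuracies, dtype=float)
    gap = acc.max() - acc                 # how far each task trails the leader
    prefs = np.exp(gap / temperature)
    prefs /= prefs.sum()
    # Each client is independently assigned one task for this round.
    return rng.choice(len(acc), size=num_clients, p=prefs)

# Toy example: 3 concurrent FL tasks, 12 available clients.
assignment = assign_clients([0.81, 0.62, 0.74], num_clients=12)
print(np.bincount(assignment, minlength=3))   # number of clients per task
```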

Efficient 3D Transformer with cluster-based Domain-Adversarial Learning for 3D Medical Image Segmentation
Haoran Zhang, Hao Chen
IEEE International Symposium on Biomedical Imaging (ISBI), 2023

We propose a pyramidally downsampled 3D Transformer that improves efficiency while maintaining high segmentation accuracy, together with a cluster-based domain-adversarial learning method that exploits the available domains at a fine-grained level.
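For readers unfamiliar with domain-adversarial learning, here is a minimal sketch of the standard gradient-reversal construction it builds on (the classic DANN idea, not the cluster-based variant proposed in the paper; the module names are made up for the example).

```python
import torch
from torch import nn
from torch.autograd import Function

class GradReverse(Function):
    """Identity in the forward pass; flips (and scales) gradients in the
    backward pass, so the feature extractor learns to confuse the domain
    classifier and produce domain-invariant features."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class DomainHead(nn.Module):
    def __init__(self, feat_dim, num_domains, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        self.classifier = nn.Linear(feat_dim, num_domains)

    def forward(self, features):
        reversed_feats = GradReverse.apply(features, self.lambd)
        return self.classifier(reversed_feats)

# Toy usage: 4 feature vectors, 2 domains (e.g., two scanners or sites).
head = DomainHead(feat_dim=64, num_domains=2)
logits = head(torch.randn(4, 64))
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1, 0, 1]))
loss.backward()
```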

Education

Some Presentations

Topics on Multi-Model Federated Learning (about our ICDCS poster)
CMU LIONS group seminar, May 17, 2024

About Me

I was born in Xinxiang, China. The meaning of "Xin-Xiang" is "New-York". My family name is 张 (Zhang), which is one of the most common family names in China. My given name is 皓然 (Haoran): 皓 (Hao) means "the color of the moon", and 然 (Ran) doesn't have a specific meaning on its own; maybe it means "yeah, uh-huh".

I like watching movies and football (soccer). I support Bayern Munich and Manchester City!


Template stolen from Jon Barron.
Last updated: May 7, 2024