Zhenheng Tang
coolzhtang at gmail dot com
I’m currently a Postdoctoral Researcher at The Hong Kong University of Science and Technology, supervised by Prof. Bo Li and Prof. Xiaowen Chu. Previously, I obtained my Ph.D. in Computer Science from Hong Kong Baptist University, advised by Prof. Xiaowen Chu and Prof. Amelie Chi Zhou and co-supervised by Prof. Bo Han. My Ph.D. thesis focuses on distributed and federated learning, improving training efficiency and addressing data heterogeneity. My prior experience lies in machine learning systems and in understanding how data influences training dynamics. I received a B.E. in Telecommunications Engineering from Huazhong University of Science and Technology in 2018.
Research Interests
My current research focuses on efficient LLM/VLM agents, deploying LLMs/VLMs in real-world applications, and efficient LLM/VLM training and inference. Specifically, I’m studying:
- LLM Agents: Agents with external tools, memory, and environments, deployed in real-world applications
- Efficient Agentic LLMs: Memory- and computation-efficient agentic LLMs
- Data Selection/Synthesis: Methods to improve training efficiency through data optimization
- Model Structures: Training paradigms and model structures that improve data exploitation efficiency
I’m open to academic collaborations. If you are interested, please feel free to contact me.
Education
- 2020.09 - 2024.08: Ph.D. in Computer Science, Hong Kong Baptist University
- 2014.09 - 2018.06: B.E. in Telecommunications Engineering, Huazhong University of Science and Technology
Work & Research Experience
- 09/2024-present: Postdoctoral Researcher, The Hong Kong University of Science and Technology, advised by Prof. Bo Li and Prof. Xiaowen Chu
- 09/2023-08/2024: Visiting Researcher, HKUST (Guangzhou), advised by Prof. Xiaowen Chu
- 02/2023-05/2023: Visiting Researcher, National University of Singapore, advised by Prof. Bingsheng He
- 06/2022-10/2022: Research Intern, FedML Inc., advised by Dr. Chaoyang He
- 10/2018-09/2020: Research Assistant, Hong Kong Baptist University, advised by Prof. Xiaowen Chu
News
See all news for more details.
- 2025.11: GitTaskBench accepted at AAAI 2026 (Oral)
- 2025.09: ChunkKV accepted at NeurIPS 2025
- 2025.07: One paper accepted at COLM 2025
- 2025.06: One paper accepted at ACL 2025 Findings
- 2025.05: One paper accepted at ICML 2025; four papers accepted at ICML Workshops
- 2025.01: Two papers accepted at ICLR 2025; two ICLR Blogposts selected
- 2024.11: Top Reviewer, NeurIPS 2024 (Main & D&B Tracks)
- 2024.10: Outstanding Student Paper Award, FL@FM Workshop @ NeurIPS 2024
- 2024.09: FuseFL accepted at NeurIPS 2024 (Spotlight); Started as Postdoc at HKUST
Selected Publications
$\star$ denotes equal contribution; 📧 denotes the corresponding author.
- Z. Tang$\star$, Z. Tang$\star$, J. Huang, X. Pan, R. Yan, Y. Wang, A. C. Zhou, S. Shi📧, X. Chu📧, B. Li📧. DreamDDP: Accelerating Data Parallel Distributed LLM Training with Layer-wise Scheduled Partial Synchronization. MLSys 2026.
- Z. Tang$\star$, Z. Tang$\star$📧, G. Pan, B. Liu, K. Lai, X. Chu📧, B. Li📧. Ghost in the Cloud: Your Geo-Distributed Large Language Models Training is Easily Manipulated. ICLR 2026.
- F. Wei$\star$, Z. Tang$\star$, R. Zeng, T. Liu, C. Zhang, X. Chu, B. Han📧. JailbreakLoRA: Your Downloaded LoRA from Sharing Platforms might be Unsafe. ICLR 2026.
- Q. Li, J. Wu, X. Liu, Y. Wang, Z. Li, Y. Chen, S. Shi, Z. Tang📧, X. Chu📧. Reasoning Language Model Inference Serving Unveiled: An Empirical Study. ICLR 2026.
- X. Liu$\star$, Z. Tang$\star$, P. Dong, Z. Li, B. Li, X. Hu, X. Chu📧. ChunkKV: Semantic-Preserving KV Cache Compression for Efficient Long-Context LLM Inference. NeurIPS 2025.
- P. Dong$\star$, Z. Tang$\star$📧, X. Liu, L. Li, X. Chu📧, B. Li. Can Compressed LLMs Truly Act? An Empirical Evaluation of Agentic Capabilities in LLM Compression. ICML 2025.
- Z. Tang, X. Liu, Q. Wang, P. Dong, B. He, X. Chu📧, B. Li📧. The Lottery LLM Hypothesis, Rethinking What Abilities Should LLM Compression Preserve? ICLR 2025 Blogpost.
- P. Dong, L. Li, Z. Tang, X. Liu, Z. Wei, Q. Wang, X. Chu📧. ParZC: Parametric Zero-Cost Proxies for Efficient NAS. AAAI 2025 (Oral Presentation).
- Y. Zhu$\star$, Z. Tang$\star$, X. Liu, A. Li, B. Li, X. Chu, B. Han📧. OracleKV: Oracle Guidance for Question-Independent KV Cache Eviction. ICML 2025 Workshop LCFM (Oral).
- Z. Tang, Y. Zhang, P. Dong, Y. Cheung, A. C. Zhou, B. Han, X. Chu📧. FuseFL: One-Shot Federated Learning through the Lens of Causality with Progressive Model Fusion. NeurIPS 2024 (Spotlight).
- L. Shen$\star$, Z. Tang$\star$, L. Wu, Y. Zhang, X. Chu, T. Qin, B. Han📧. Hot-pluggable Federated Learning: Bridging General and Personalized FL via Dynamic Selection. ICLR 2025. Federated Foundation Models@NeurIPS 2024 Workshop (Oral, Outstanding Student Paper Award).
- Z. Tang, Y. Zhang, S. Shi, X. Tian, T. Liu, B. Han, X. Chu📧. FedImpro: Measuring and Improving Client Update in Federated Learning. ICLR 2024.
- Z. Tang, J. Huang, R. Yan, Y. Wang, Z. Tang📧, S. Shi, A. C. Zhou, X. Chu📧. Bandwidth-Aware and Overlap-Weighted Compression for Communication-Efficient Federated Learning. ICPP 2024.
- Z. Tang, S. Shi, B. Li, X. Chu📧. GossipFL: A Decentralized Federated Learning Framework with Sparsified and Adaptive Communication. IEEE TPDS 2022.
- Z. Tang$\star$, Y. Zhang$\star$, S. Shi, X. He, B. Han, X. Chu📧. Virtual Homogeneity Learning: Defending against Data Heterogeneity in Federated Learning. ICML 2022.
Professional Activities
Invited Area Chair:
- NeurIPS 2025
- ACL ARR October 2025
- ICML 2026
Invited Program Committee Member (Reviewer):
- Machine Learning: KDD’23-26, ICML’22-26, NeurIPS’22-24, ICLR’23-26, AAAI’23-25, AISTATS’23-25, UAI’22, IJCAI’22, COLM’25, ACL’25-26, EMNLP’25
- Networking & Systems: HPCC’21, ICDCS’22-23, ICPADS’22, IWQOS’23-24
Invited Journal Reviewer:
- IEEE TPAMI, TMLR, TNNLS, JAIR, Neural Networks, Machine Learning, TACO, TPDS, JSAC, ToN, TNSE, TIST, IEEE Network, JPDC, ACM Computing Surveys
Honors and Awards
- 2024: NeurIPS Top Reviewer
- 2024: Outstanding Student Paper Award, FL@FM-NeurIPS’24
- 2024: NeurIPS Scholar Award
- 2024: ICLR Scholar Award
- 2022-2024: Multiple Research Performance Awards, HKBU CS Department
- 2020: Scholarship for Nominees of Hong Kong PhD Fellowship Scheme
- 2018: Outstanding Graduate, HUST
Teaching
Teaching Assistant at HKBU:
- 2023 Spring: COMP7940 Cloud Computing
- 2022 Fall: COMP7015 Artificial Intelligence
- 2022 Spring: COMP7550 IT Project Management
- 2021 Fall: COMP7015 Artificial Intelligence
- 2021 Spring: COMP7930 Big Data Analytics
Projects
- FedCV - Federated Learning Framework
- FedML Parrot - Scalable FL System
- FedImpro - Federated Learning Client Update
- Pruner-Zero - LLM Pruning
- Virtual Homogeneity Learning - Federated Learning
- GossipFL - Decentralized FL