Material Simulation

Gao Ze-Feng, Lecturer

Address: Room 307B, Physics Building

E-mail: zfgao@ruc.edu.cn

Phone: 010-82503678

Fax: 010-82503678

Webpage: www.gaozefeng.tech

Personal Profile

Young Talent of Renmin University of China and Lecturer at the School of Physics. His research covers numerical methods in quantum physics, compression of pre-trained models, and AI-assisted discovery and generation of functional crystal materials. Building on the matrix product operator (MPO) representation, he has developed theoretical methods in three areas: compressing neural networks, model compression and miniaturization for speech enhancement, and lightweight fine-tuning and model expansion for pre-trained models. He also applies AI methods to the discovery of new functional materials. He has published more than twenty papers in academic journals and conferences in these fields, including major AI conferences such as ACL, NeurIPS, EMNLP, and COLING, and SCI journals such as Phys. Rev. Research and IEEE TASLP. His work on the over-parameterization of pre-trained models based on matrix product operators was nominated for the Best Paper Award at ACL 2023. His results have been cited by researchers at institutions including the University of Cambridge, Stanford University, and Meta. Over the past three years, he has led six research grants: one National Natural Science Foundation of China (NSFC) Youth Program project, one NSFC General Program project, one sub-topic of an NSFC Key Project, and three industry-funded (horizontal) projects.
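The MPO-based compression mentioned above factorizes a layer's weight matrix into a chain of small core tensors, with a truncated bond dimension controlling the parameter count. The following NumPy sketch illustrates the general idea only; the function names, index ordering, and the plain sequential-SVD construction are illustrative assumptions, not Dr. Gao's actual implementation.

```python
import numpy as np

def mpo_decompose(W, out_dims, in_dims, rank):
    """Factor a weight matrix W into a chain of MPO core tensors via
    sequential truncated SVD (tensor-train style construction).

    out_dims/in_dims: factorizations of W's shape, e.g. 16x16 -> (4,4),(4,4).
    rank: maximum bond dimension kept at each cut (controls compression).
    """
    n = len(out_dims)
    assert W.shape == (int(np.prod(out_dims)), int(np.prod(in_dims)))
    # Reshape to (o1, ..., on, i1, ..., in), then interleave as (o1, i1, o2, i2, ...)
    T = W.reshape(*out_dims, *in_dims)
    perm = [ax for pair in zip(range(n), range(n, 2 * n)) for ax in pair]
    T = T.transpose(perm)
    cores, bond = [], 1
    for k in range(n - 1):
        # Split off one (out_k, in_k) pair, truncate the SVD to `rank`
        T = T.reshape(bond * out_dims[k] * in_dims[k], -1)
        U, S, Vt = np.linalg.svd(T, full_matrices=False)
        r = min(rank, len(S))
        cores.append(U[:, :r].reshape(bond, out_dims[k], in_dims[k], r))
        T = S[:r, None] * Vt[:r]
        bond = r
    cores.append(T.reshape(bond, out_dims[-1], in_dims[-1], 1))
    return cores

def mpo_reconstruct(cores, out_dims, in_dims):
    """Contract the MPO cores back into a dense weight matrix."""
    T = cores[0]
    for c in cores[1:]:
        T = np.tensordot(T, c, axes=([-1], [0]))
    # T axes are (1, o1, i1, o2, i2, ..., 1); drop the trivial boundary bonds
    T = np.squeeze(T, axis=(0, T.ndim - 1))
    n = len(out_dims)
    T = T.transpose(list(range(0, 2 * n, 2)) + list(range(1, 2 * n, 2)))
    return T.reshape(int(np.prod(out_dims)), int(np.prod(in_dims)))
```

With a small bond dimension the cores hold far fewer parameters than the original matrix; at full bond dimension the reconstruction is exact, which is the starting point for lightweight fine-tuning schemes that update only the small auxiliary tensors.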



Educational Background

2016-2021, Renmin University of China, School of Physics, Ph.D.

2012-2016, Renmin University of China, School of Physics, B.S.


Work Experience

2021.7-2024.6: Postdoctoral Fellow, Gaoling School of Artificial Intelligence, Renmin University of China, collaborating with Professor Ji-Rong Wen.


Research Interest

Phys for AI:

Large Language Models (LLMs); Model Compression; Model Expansion


AI for Phys:

Scientific Intelligence; AI-Driven Discovery of New Functional Materials; AI Theory and Methods for Physics Information


Talent Cultivation

Recruiting PhD students, Master's students, and interns (including undergraduates) year-round


Requirements:

1. Passion for research work

2. Diligent and eager to learn

3. Solid theoretical foundation


Teaching Courses

Spring 2022-2023: Introduction to Artificial Intelligence (Undergraduate Course)

Fall 2022-2023: Artificial Intelligence and Physics (Undergraduate Course)

Fall 2024-2025: Artificial Intelligence and Physics (Undergraduate Course)


Scientific Research Projects


Principal Investigator (PI) Projects:

1. National Natural Science Foundation of China, General Program, Research on Efficient Compression Methods for Large-Scale Pre-trained Language Models (No. 62476278), Principal Investigator

2. National Natural Science Foundation of China, Youth Program, Research on Lightweight Fine-Tuning and Model Expansion Methods for Large-Scale Pre-trained Language Models (No. 62206299), Principal Investigator

3. National Natural Science Foundation of China, Key Project Sub-topic, First-Principles Many-Body Computational Methods for Ultrafast Dynamics of Strongly Correlated Electronic Materials (No. 12434009), Principal Investigator

4. Beijing Academy of Artificial Intelligence, Horizontal Project, Lightweight Fine-Tuning of Multimodal Models Based on Matrix Product Operators, Principal Investigator

5. CCF-Zhipu Large Model Project, Horizontal Project, Tensor Decomposition-Based Compression Strategies for Pre-trained Models, Principal Investigator

6. University of Chinese Academy of Sciences Collaborative Project, Horizontal Project, Theoretical and Technical Research on Physics-Inspired Deep Graph Learning for Nonlinear Dynamical Systems Modeling, Principal Investigator


Participant Projects:

1. National Natural Science Foundation of China, General Program, Scientific Computing Theory and Algorithms for Complex Spatiotemporal Systems Based on Physics-Embedded Deep Graph Learning (No. 62276269)

2. National Natural Science Foundation of China, General Program, Integration, Development, and Application of Tensor Networks and Neural Networks in Strongly Correlated Systems (No. 11874421)

3. National Natural Science Foundation of China, General Program, Research on Several Issues in Quantum Impurity Systems Based on Natural Orbital Renormalization Group (No. 11774422)

4. National Natural Science Foundation of China, General Program, Research on Several Issues in Many-Body Localization (No. 11874421)

5. Beijing Natural Science Foundation, Key Research Topic, Research on Ion Trap Quantum Computing Technology and Algorithms




Academic Papers


For more papers, please refer to my personal website: http://www.gaozefeng.tech

1. Gao Ze-Feng; Zhou Kun; Liu Peiyu; Zhao Wayne Xin; Wen Ji-Rong; Small Pre-trained Language Models Can be Fine-tuned as Large Models via Over-Parameterization. Annual Meeting of the Association for Computational Linguistics (ACL 2023). (Nominated for Best Paper Award)

2. Gao Ze-Feng; Liu Peiyu; Zhao Wayne Xin; Wen Ji-Rong; Parameter-Efficient Mixture-of-Experts Architecture for Pre-trained Language Models. International Conference on Computational Linguistics (COLING 2022).

3. Gao Ze-Feng; Cheng Song; He Rong-Qiang; Xie Zhi-Yuan; Zhao Hui-Hai; Lu Zhong-Yi; Xiang Tao; Compressing Deep Neural Networks by Matrix Product Operators. Physical Review Research 2 (2), 023300, 2020.

4. Liu Peiyu; Gao Ze-Feng; Zhao Wayne Xin; Ma Yipeng; Wang Tao; Wen Ji-Rong; Unlocking Data-free Low-bit Quantization with Matrix Decomposition for KV Cache Compression. Annual Meeting of the Association for Computational Linguistics (ACL 2024).

5. Liu Peiyu#; Gao Ze-Feng#; Zhao Wayne Xin; Xie Zhi-Yuan; Lu Zhong-Yi; Wen Ji-Rong; Enabling Lightweight Fine-tuning for Pre-trained Language Model Compression based on Matrix Product Operators. Annual Meeting of the Association for Computational Linguistics (ACL 2021).

6. Liu Peiyu#; Gao Ze-Feng#; Zhang Xiao; Zhao Wayne Xin; Wen Ji-Rong; Enhancing Parameter-efficient Fine-tuning with Simple Calibration based on Stable Rank. International Conference on Computational Linguistics (COLING 2024).

7. Liu Peiyu; Liu Zikang; Gao Ze-Feng; Gao Dawei; Zhao Wayne Xin; Li Yaliang; Ding Bolin; Wen Ji-Rong; Do Emergent Abilities Exist in Quantized Large Language Models: An Empirical Study. International Conference on Computational Linguistics (COLING 2024).

8. Liu Peiyu#; Gao Ze-Feng#; Chen Yushuo; Zhao Wayne Xin; Wen Ji-Rong; Enhancing Scalability of Pre-trained Language Models via Efficient Parameter Sharing. Conference on Empirical Methods in Natural Language Processing (EMNLP 2023).

9. Sun Xingwei#; Gao Ze-Feng#; Lu Zhong-Yi; Li Junfeng; Yan Yonghong; A Model Compression Method With Matrix Product Operators for Speech Enhancement. IEEE/ACM Transactions on Audio, Speech, and Language Processing (IEEE TASLP).

10. Wang Jiaqi; Gao Ze-Feng*; Li Yongfeng; Wang Lu; Numerical Research on Optimizing FermiNet Using the Self-Attention Mechanism. Journal of Tianjin Normal University (Natural Science Edition), 2022.

11. Gao Ze-Feng; Liu Peiyu; Zhao Wayne Xin; Xie Zhi-Yuan; Wen Ji-Rong; Lu Zhong-Yi; Compression Image Dataset Based on Multiple Matrix Product States. Future of Information and Communication Conference (FICC 2024).

12. Liu Peiyu; Yao Bowen; Gao Ze-Feng*; Zhao Wayne Xin; Matrix Product Operator based Sequential Recommendation Model. China Conference on Information Retrieval (CCIR 2023).

# Co-first authors    * Corresponding authors




Awards and Honors

ACL 2023 Best Paper Nomination, May 2023

National Scholarship for Doctoral Students, September 2020


Professional Affiliations

Program Committee Member, CCIR

Program Committee Member, WSDM

Reviewer for EMNLP, ACL, ACCV, WSDM