Ze-Feng Gao / 高泽峰

I am a postdoctoral researcher at the Gaoling School of Artificial Intelligence at Renmin University of China, working with Prof. Ji-Rong Wen.

Currently, my primary research interests include model compression, data compression, and pre-trained language models based on tensor networks.

I received my Ph.D. in Science from the Department of Physics, Renmin University of China, advised by Prof. Zhong-Yi Lu.

Email: zfgao@ruc.edu.cn

GitHub  /  Google Scholar  /  DBLP  /  Zhihu


Education

  • Ph.D. in Science, Renmin University of China, 2016-2021
  • B.Sc. in Physics, Renmin University of China, 2012-2016

Publications

    2021


    Enabling Lightweight Fine-tuning for Pre-trained Language Model Compression based on Matrix Product Operators


    Peiyu Liu*, Ze-Feng Gao*, Wayne Xin Zhao#, Z.Y. Xie, Zhong-Yi Lu#, Ji-Rong Wen
    ACL 2021 (main conference)
    paper / code / slides

    This paper presents a novel compression approach for pre-trained language models (PLMs) based on the matrix product operator (MPO) from quantum many-body physics.




    2020


    Compressing LSTM Networks by Matrix Product Operators


    Ze-Feng Gao*, Xingwei Sun*, Lan Gao, Junfeng Li#, Zhong-Yi Lu#
    Preprint, 2020
    arXiv

    We propose an alternative LSTM model that significantly reduces the number of parameters by representing the weight matrices with matrix product operators (MPOs), which are used in physics to characterize local correlations in quantum states.
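As a back-of-the-envelope illustration of why the MPO form saves parameters (the sizes below are hypothetical, not those used in the paper): a dense 1024 x 1024 weight matrix holds about a million entries, while a 4-site MPO with factor shapes (4, 8, 8, 4) x (4, 8, 8, 4) and bond dimension 16 only stores the small local tensors.

```python
# Hypothetical example: parameter count of a dense 1024 x 1024 weight
# matrix vs. its 4-site MPO form with bond dimension 16.
in_dims, out_dims, D = (4, 8, 8, 4), (4, 8, 8, 4), 16  # assumed sizes
bonds = [1, D, D, D, 1]                                 # boundary bonds are 1
dense_params = 1024 * 1024
mpo_params = sum(bonds[k] * in_dims[k] * out_dims[k] * bonds[k + 1]
                 for k in range(4))
print(dense_params, mpo_params)  # 1048576 vs. 33280, roughly a 31x reduction
```

The actual savings depend on the chosen factorization and bond dimensions; larger bonds trade compression for accuracy.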


    A Model Compression Method With Matrix Product Operators for Speech Enhancement


    Xingwei Sun*, Ze-Feng Gao*, Zhong-Yi Lu#, Junfeng Li#, Yonghong Yan
    IEEE/ACM Transactions on Audio, Speech, and Language Processing 28, 2837-2847, 2020
    paper

    In this paper, we propose a model compression method based on matrix product operators (MPO) to substantially reduce the number of parameters in DNN models for speech enhancement.


    Compressing deep neural networks by matrix product operators


    Ze-Feng Gao*, Song Cheng*, Rong-Qiang He, Zhi-Yuan Xie#, Hui-Hai Zhao#, Zhong-Yi Lu#, Tao Xiang#
    Physical Review Research 2 (2), 023300, 2020
    paper / code

    In this paper, we show that deep neural networks can be effectively compressed by representing their linear transformations with matrix product operators (MPOs), a tensor-network form originally proposed in physics to characterize the short-range entanglement in one-dimensional quantum states.

    * Equal contribution

    # Corresponding author
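The MPO representation shared by the papers above can be sketched in a few lines of NumPy. The factorization below is the standard exact (untruncated) tensor-train-style decomposition via sequential SVDs; the papers obtain compression by additionally truncating small singular values, which this sketch omits. Function names and factor shapes are illustrative, not taken from the papers' code.

```python
import numpy as np

def mpo_decompose(W, in_dims, out_dims):
    """Factor a prod(in_dims) x prod(out_dims) matrix W into MPO local
    tensors of shape (r_left, i_k, j_k, r_right) via sequential SVDs."""
    n = len(in_dims)
    # Reshape to (i1..in, j1..jn), then interleave as (i1, j1, i2, j2, ...).
    T = W.reshape(list(in_dims) + list(out_dims))
    T = T.transpose([ax for k in range(n) for ax in (k, n + k)])
    cores, r = [], 1
    for k in range(n - 1):
        T = T.reshape(r * in_dims[k] * out_dims[k], -1)
        U, S, Vh = np.linalg.svd(T, full_matrices=False)
        r_new = len(S)  # exact; truncating S here is what compresses the model
        cores.append(U.reshape(r, in_dims[k], out_dims[k], r_new))
        T = np.diag(S) @ Vh
        r = r_new
    cores.append(T.reshape(r, in_dims[-1], out_dims[-1], 1))
    return cores

def mpo_reconstruct(cores, in_dims, out_dims):
    """Contract the MPO cores back into the full weight matrix."""
    T = cores[0]
    for c in cores[1:]:
        T = np.tensordot(T, c, axes=([-1], [0]))  # contract the shared bond
    T = T.reshape(T.shape[1:-1])  # drop the trivial boundary bonds
    n = len(in_dims)
    # Undo the interleaving: (i1, j1, ..., in, jn) -> (i1..in, j1..jn).
    T = T.transpose([2 * k for k in range(n)] + [2 * k + 1 for k in range(n)])
    return T.reshape(int(np.prod(in_dims)), int(np.prod(out_dims)))

# Round-trip check on a small hypothetical weight matrix.
W = np.random.default_rng(0).standard_normal((16, 16))
cores = mpo_decompose(W, (4, 4), (4, 4))
print(np.allclose(W, mpo_reconstruct(cores, (4, 4), (4, 4))))  # True
```

Truncating the singular values S to a fixed bond dimension turns this exact rewrite into the lossy compression the papers apply to DNN, LSTM, and PLM weight matrices.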




Selected Awards and Honors

  • Outstanding Graduate, Renmin University of China, 2021
  • National Scholarship for Graduate Students, Ministry of Education of P.R. China, 2020
  • First-Class Academic Scholarship, Renmin University of China, 2020
  • Social Volunteer Service Scholarship, Renmin University of China, 2019
  • First-Class Academic Scholarship, Renmin University of China, 2019
  • First-Class Academic Scholarship, Renmin University of China, 2018
  • First-Class Academic Scholarship, Renmin University of China, 2016
  • Innovation Experiment Plan for College Students, national level, 2013