Hi there!

Welcome to Xinyin Ma (马欣尹)'s website :laughing:! I have been a Ph.D. student in the Learning and Vision Lab @ NUS since August 2022, advised by Prof. Xinchao Wang. Previously, I obtained my master's degree in computer science from Zhejiang University, where I was advised by Prof. Weiming Lu. I received my bachelor's degree in software engineering, also from Zhejiang University, with an honors degree from Chu Kochen Honors College.

I'm currently conducting research on efficient learning, including:
🌲 Efficiency of large language models, pre-trained language models, and diffusion models
🌱 Training acceleration: dataset distillation and coreset selection
🌿 Compression in low-resource settings, e.g., data-free distillation

I have published several papers at NeurIPS, EMNLP, IJCAI, and CVPR. You can find more about my publications on Google Scholar.

🔥 News

  • 2024.02: DeepCache is accepted by CVPR'24!
  • 2023.12: 🌟 Our new work, DeepCache, accelerates diffusion models for FREE! Check our paper and code!
  • 2023.09: Two papers are accepted by NeurIPS'23.
  • 2023.06: 🎉🎉 Released LLM-Pruner 🐏, the first structural pruning work for LLMs. See our paper and code!
  • 2023.02: One paper, ‘DepGraph: Towards Any Structural Pruning’, accepted by CVPR'23.
  • 2022.08: ⛵ Started my Ph.D. journey at NUS!
  • 2022.04: One paper, ‘Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt’, accepted by IJCAI'22.
  • 2022.04: Received my master's degree from ZJU! Thanks to my supervisor and all my friends at ZJU!

📝 Publications

CVPR 2024

(🎈 NEW) DeepCache: Accelerating Diffusion Models for Free

Xinyin Ma, Gongfan Fang, Xinchao Wang

  • A training-free paradigm that accelerates diffusion models
  • Exploits the U-Net's structure: high-level features change little across adjacent denoising steps, so they are cached and reused while low-level features are updated (see the sketch below)
  • Achieves a 2.3× speedup for Stable Diffusion v1.5 and a 4.1× speedup for LDM-4-G, built on top of DDIM/PLMS
[paper] [code] [Project Page] [abstract]
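
Below is a minimal, self-contained sketch of the caching pattern behind DeepCache. The toy module and all names here are illustrative assumptions, not the actual implementation, which operates on the diffusion U-Net's skip-connected blocks: the expensive deep blocks are recomputed only every few steps, and their output is reused in between.

```python
import torch
import torch.nn as nn

# Toy stand-in for a U-Net: cheap shallow layers plus an expensive deep stack.
class ToyUNet(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.shallow_in = nn.Linear(dim, dim)          # cheap, always recomputed
        self.deep = nn.Sequential(                     # expensive, cached
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.shallow_out = nn.Linear(2 * dim, dim)

    def forward(self, x, cached_deep=None):
        h = self.shallow_in(x)
        # Reuse the cached high-level features unless a refresh is requested.
        deep = self.deep(h) if cached_deep is None else cached_deep
        # The skip connection joins shallow and (possibly cached) deep features.
        return self.shallow_out(torch.cat([h, deep], dim=-1)), deep

unet, x, cache = ToyUNet(), torch.randn(1, 64), None
for step in range(10):                                 # toy denoising loop
    refresh = step % 3 == 0                            # full pass every 3 steps
    x, deep = unet(x, None if refresh else cache)
    if refresh:
        cache = deep                                   # refresh cached features
```
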
NeurIPS 2023

LLM-Pruner: On the Structural Pruning of Large Language Models

Xinyin Ma, Gongfan Fang, Xinchao Wang

  • Task-agnostic Compression: The compressed LLM retains its multi-task ability.
  • Less Training Corpus: We use only 50k samples to post-train the LLM.
  • Efficient Compression: 3 minutes for pruning and 3 hours for post-training.
  • Automatic Structural Pruning: Prunes new LLMs with minimal human effort (a toy illustration follows below).
[paper] [code] [abstract]
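
For intuition, here is a minimal, self-contained sketch of the core constraint in structural pruning (illustrative only, not the LLM-Pruner API or its importance criterion): layers coupled by a shared hidden dimension must drop the same channels, so channels are scored jointly and removed from every dependent weight at once.

```python
import torch
import torch.nn as nn

def prune_coupled_linears(fc1: nn.Linear, fc2: nn.Linear, keep_ratio=0.5):
    """Remove the same hidden channels from fc1's outputs and fc2's inputs."""
    # Hypothetical importance score: L2 norm of each channel's weights in
    # both layers (LLM-Pruner itself estimates importance from gradients).
    score = fc1.weight.norm(dim=1) + fc2.weight.norm(dim=0)
    k = max(1, int(keep_ratio * fc1.out_features))
    keep = score.topk(k).indices.sort().values
    new_fc1 = nn.Linear(fc1.in_features, k)
    new_fc2 = nn.Linear(k, fc2.out_features)
    new_fc1.weight.data = fc1.weight.data[keep]        # prune rows (outputs)
    new_fc1.bias.data = fc1.bias.data[keep]
    new_fc2.weight.data = fc2.weight.data[:, keep]     # prune columns (inputs)
    new_fc2.bias.data = fc2.bias.data.clone()
    return new_fc1, new_fc2

fc1, fc2 = nn.Linear(16, 32), nn.Linear(32, 8)         # toy coupled pair
fc1, fc2 = prune_coupled_linears(fc1, fc2)             # hidden dim 32 -> 16
```

After pruning, a short post-training phase (LoRA on ~50k samples, as in the paper) recovers most of the lost performance.
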
  • DepGraph: Towards Any Structural Pruning. Gongfan Fang, Xinyin Ma, Mingli Song, Michael Bi Mi, Xinchao Wang. CVPR 2023.
    [paper] [code] [abstract]
  • Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt. Xinyin Ma, Xinchao Wang, Gongfan Fang, Yongliang Shen, Weiming Lu. IJCAI 2022.
    [paper] [abstract]
  • MuVER: Improving First-Stage Entity Retrieval with Multi-View Entity Representations. Xinyin Ma, Yong Jiang, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Weiming Lu. EMNLP 2021 (short).
    [paper] [code] [abstract]
  • Adversarial Self-Supervised Data-Free Distillation for Text Classification. Xinyin Ma, Xinchao Wang, Gongfan Fang, Yongliang Shen, Weiming Lu. EMNLP 2020.
    [paper] [video] [abstract]
  • Structural Pruning for Diffusion Models. Gongfan Fang, Xinyin Ma, Xinchao Wang. NeurIPS 2023.
    [paper] [code] [abstract]
  • Locate and Label: A Two-stage Identifier for Nested Named Entity Recognition. Yongliang Shen, Xinyin Ma, Zeqi Tan, Shuai Zhang, Wen Wang, Weiming Lu. ACL 2021.
    [paper] [code] [abstract]
  • A Trigger-Sense Memory Flow Framework for Joint Entity and Relation Extraction. Yongliang Shen, Xinyin Ma, Yechun Tang, Weiming Lu. WWW 2021.
    [paper] [code] [abstract]

🎖 Honors and Awards

  • 2019-2022 (M.Eng.): Outstanding Graduate (2022), Tencent Scholarship (2021), CETC28 Scholarship (2021), Huawei Elite Scholarship (2020), Shenzhen Stock Exchange Scholarship (2020), Award of Honor for Graduate (2021, 2020)
  • 2015-2019 (B.Eng.): Outstanding Engineer Scholarship (2018), Outstanding Student of Zhejiang University (2018, 2017, 2016), Second-Class Academic Scholarship of Zhejiang University (2017, 2016), Second-Class Scholarship of National Talent Training Base (2017), CASC Second-Class Scholarship (2016)

📖 Education

  • 2022.08 - present, Ph.D. Student in Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore
  • 2019.08 - 2022.04, M.Eng. in Computer Science, College of Computer Science and Technology, Zhejiang University
  • 2015.09 - 2019.06, B.Eng. in Software Engineering, Chu Kochen Honors College, Zhejiang University

📋 Academic Service

ICML’24, IJCAI’24, ICLR’24, NAACL’24(ARR 2023 Dec), NeurIPS’23, EMNLP’23, ICML’23, ACL’23, ACL’22, EMNLP’22, ACL’21, EMNLP’21 and several ARRs

💻 Internships

  • 2020.12 - 2021.06, Alibaba DAMO Academy, Research Intern. Mentor: Yong Jiang.
  • 2018.07 - 2018.11, Netease Thunderfire UX, Data Analyst Intern. Mentor: Lei Xia.