Hi there!

Welcome to Xinyin Ma (马欣尹)'s website! I have been a Ph.D. candidate in the Learning and Vision Lab @ NUS since August 2022, advised by Prof. Xinchao Wang. Previously, I obtained my master's degree in computer science from Zhejiang University, where I was advised by Prof. Weiming Lu. I obtained my bachelor's degree in software engineering, also at Zhejiang University, and received an honors degree from Chu Kochen Honors College.

My research focuses on efficient learning, including:
🌲 Efficiency of language models and diffusion models.
🌱 Acceleration of training: dataset distillation and coreset selection.
🌿 Compression with synthetic data, e.g., data-free distillation.

I have published several papers at NeurIPS, CVPR, EMNLP, and IJCAI. You can find more information about my publications on Google Scholar.

I'm actively seeking internship and visiting opportunities. If you have any available, I would greatly appreciate it if you could reach out to me. Thank you 😎!

🔥 News

  • 2024.09:  Four papers (Learning-to-Cache, AsyncDiff, SlimSAM, and Remix-DiT) accepted by NeurIPS’24!
  • 2024.07:  One co-authored paper accepted by ECCV’24!
  • 2024.07:  ⛵Passed my qualifying exam!
  • 2024.06:  One co-authored paper accepted by Interspeech’24!
  • 2024.02:  DeepCache accepted by CVPR’24!
  • 2023.12:  🌟Our new work, DeepCache, accelerates Diffusion Models for FREE! Check our paper and code!
  • 2023.09:  Two papers accepted by NeurIPS’23.
  • 2023.06:  🎉🎉 Release LLM-Pruner🐏, the first structural pruning work for LLMs. See our paper and code!
  • 2023.02:  One paper ‘DepGraph: Towards Any Structural Pruning’ accepted by CVPR’23.
  • 2022.08:  ⛵Started my Ph.D. journey at NUS!
  • 2022.04:  One paper ‘Prompting to distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt’ accepted by IJCAI’22.
  • 2022.04:  Got my master’s degree from ZJU! Thanks to my supervisor and all my friends at ZJU!

📝 Publications

NeurIPS 2024

Learning-to-Cache: Accelerating Diffusion Transformer via Layer Caching

Xinyin Ma, Gongfan Fang, Michael Bi Mi, Xinchao Wang

  • A novel scheme that learns to conduct caching in a dynamic manner for diffusion transformers.
  • A large proportion of layers in the diffusion transformer can be removed without updating the model parameters.
  • Learning-to-Cache largely outperforms samplers such as DDIM and DPM-Solver (see the sketch below).
[paper] [code] [abstract]
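
To make the idea concrete, here is a minimal PyTorch sketch of learned layer caching. This is not the paper's implementation: the block, the router, and all names (CachedTransformerBlock, LearnedCacheRouter) are illustrative assumptions; only the core idea follows the summary above, a learnable per-step, per-layer gate deciding whether to recompute a layer or replay its cached output.

```python
import torch
import torch.nn as nn

class CachedTransformerBlock(nn.Module):
    """A transformer block whose output can be replayed from a cache."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.cache = None  # output saved at the last non-cached step

    def forward(self, x, reuse: bool):
        if reuse and self.cache is not None:
            return self.cache                       # skip all computation
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        x = x + self.mlp(self.norm2(x))
        self.cache = x                              # refresh the cache
        return x

class LearnedCacheRouter(nn.Module):
    """Learnable per-(step, layer) logits: which layers to recompute when."""
    def __init__(self, num_steps: int, num_layers: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_steps, num_layers))

    def reuse_mask(self, step: int, threshold: float = 0.5):
        # After training, the soft gates are binarized into a static
        # schedule, so routing adds no overhead at inference time.
        return torch.sigmoid(self.logits[step]) > threshold

# Toy denoising loop: each step, each layer either recomputes or replays.
blocks = nn.ModuleList(CachedTransformerBlock(64) for _ in range(6))
router = LearnedCacheRouter(num_steps=10, num_layers=6)
x = torch.randn(2, 16, 64)                          # (batch, tokens, dim)
for step in range(10):
    mask = router.reuse_mask(step)
    for i, block in enumerate(blocks):
        x = block(x, reuse=bool(mask[i]))
```

Because the binarized mask is fixed after training, it amounts to a static skip schedule: the model parameters themselves never change, matching the second bullet above.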
CVPR 2024

DeepCache: Accelerating Diffusion Models for Free

Xinyin Ma, Gongfan Fang, Xinchao Wang

  • A training-free paradigm that accelerates diffusion models.
  • Utilizes the U-Net’s properties to efficiently reuse high-level features while updating low-level features.
  • A 2.3× speedup for Stable Diffusion v1.5 and a 4.1× speedup for LDM-4-G, built upon DDIM/PLMS (see the sketch below).
[paper] [code] [Project Page] [abstract]
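
Below is a minimal sketch of the caching idea on a toy U-Net (TinyUNet is an illustrative stand-in, not the paper's code): the expensive deep stack is run in full only every N steps and its high-level features are cached, while the cheap shallow path is recomputed every step and joined to the cached features through the skip connection.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """A toy U-Net illustrating feature reuse across denoising steps."""
    def __init__(self, ch: int = 32):
        super().__init__()
        self.down = nn.Conv2d(3, ch, 3, stride=2, padding=1)     # cheap, shallow
        self.deep = nn.Sequential(                               # expensive, deep
            nn.Conv2d(ch, ch, 3, padding=1), nn.SiLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.SiLU())
        self.up = nn.ConvTranspose2d(2 * ch, 3, 4, stride=2, padding=1)
        self.deep_cache = None

    def forward(self, x, full: bool):
        h = self.down(x)                     # low-level features: updated every step
        if full or self.deep_cache is None:
            self.deep_cache = self.deep(h)   # high-level features: refreshed rarely
        # The skip connection joins fresh shallow features with cached deep ones.
        return self.up(torch.cat([h, self.deep_cache], dim=1))

# Uniform 1-in-N caching: full forward every N steps, partial otherwise.
unet, N = TinyUNet(), 3
x = torch.randn(1, 3, 32, 32)
for step in range(10):
    x = unet(x, full=(step % N == 0))
```

The sketch relies only on the observation in the bullets above: adjacent denoising steps produce similar high-level features, so skipping their recomputation requires no retraining.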
NeurIPS 2023

LLM-Pruner: On the Structural Pruning of Large Language Models

Xinyin Ma, Gongfan Fang, Xinchao Wang

  • Task-agnostic Compression: The compressed LLM retains its multi-task ability.
  • Less Training Corpus: We use only 50k samples to post-train the LLM.
  • Efficient Compression: 3 minutes for pruning and 3 hours for post-training.
  • Automatic Structural Pruning: Pruning new LLMs with minimal human effort (see the sketch below).
[paper] [code] [abstract]
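
A minimal sketch of the structural-pruning idea on a single MLP (illustrative only; LLM-Pruner itself discovers such dependency groups automatically across a whole transformer, and uses a gradient-based importance estimate rather than the magnitude score below): hidden unit i couples row i of fc1 with column i of fc2, so both must be removed together.

```python
import torch
import torch.nn as nn

def prune_mlp_hidden(fc1: nn.Linear, fc2: nn.Linear, keep_ratio: float = 0.75):
    """Structurally prune the hidden units of an MLP (fc1 -> act -> fc2).

    Hidden unit i couples row i of fc1 with column i of fc2, so both are
    removed together -- the dependency idea behind structural pruning.
    """
    # Group importance: simple weight magnitude, a stand-in for a real score.
    score = fc1.weight.abs().sum(dim=1) + fc2.weight.abs().sum(dim=0)
    n_keep = max(1, int(keep_ratio * score.numel()))
    keep = score.topk(n_keep).indices.sort().values

    # Rebuild smaller layers, copying over the surviving rows/columns.
    new_fc1 = nn.Linear(fc1.in_features, n_keep)
    new_fc2 = nn.Linear(n_keep, fc2.out_features)
    with torch.no_grad():
        new_fc1.weight.copy_(fc1.weight[keep])
        new_fc1.bias.copy_(fc1.bias[keep])
        new_fc2.weight.copy_(fc2.weight[:, keep])
        new_fc2.bias.copy_(fc2.bias)
    return new_fc1, new_fc2

fc1, fc2 = nn.Linear(64, 256), nn.Linear(256, 64)
fc1, fc2 = prune_mlp_hidden(fc1, fc2)      # drops 25% of the hidden units
print(fc1.weight.shape, fc2.weight.shape)  # (192, 64) and (64, 192)
```

Because pruning only deletes coupled rows and columns, the remaining weights are untouched; a brief post-training pass (50k samples in the paper) then recovers most of the lost quality.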
  • Prompting to distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt. Xinyin Ma, Xinchao Wang, Gongfan Fang, Yongliang Shen, Weiming Lu. IJCAI 2022.
    [paper] [abstract]
  • MuVER: Improving First-Stage Entity Retrieval with Multi-View Entity Representations. Xinyin Ma, Yong Jiang, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Weiming Lu. EMNLP 2021 (short).
    [paper] [code] [abstract]
  • Adversarial Self-Supervised Data-Free Distillation for Text Classification. Xinyin Ma, Yongliang Shen, Gongfan Fang, Chen Chen, Chenghao Jia, Weiming Lu. EMNLP 2020.
    [paper] [video] [abstract]
  • AsyncDiff: Parallelizing Diffusion Models by Asynchronous Denoising. Zigeng Chen, Xinyin Ma, Gongfan Fang, Zhenxiong Tan, Xinchao Wang. NeurIPS 2024.
    [paper] [code] [abstract]
  • SlimSAM: 0.1% Data Makes Segment Anything Slim. Zigeng Chen, Gongfan Fang, Xinyin Ma, Xinchao Wang. NeurIPS 2024.
    [paper] [code] [abstract]
  • Remix-DiT: Mixing Diffusion Transformers for Multi-Expert Denoising. Gongfan Fang, Xinyin Ma, Xinchao Wang. NeurIPS 2024.
  • Isomorphic Pruning for Vision Models. Gongfan Fang, Xinyin Ma, Michael Bi Mi, Xinchao Wang. ECCV 2024.
    [paper] [code] [abstract]
  • LiteFocus: Accelerated Diffusion Inference for Long Audio Synthesis. Zhenxiong Tan, Xinyin Ma, Gongfan Fang, Xinchao Wang. Interspeech 2024.
    [paper] [code] [abstract]
  • DepGraph: Towards Any Structural Pruning. Gongfan Fang, Xinyin Ma, Mingli Song, Michael Bi Mi, Xinchao Wang. CVPR 2023.
    [paper] [code] [abstract]
  • Structural Pruning for Diffusion Models. Gongfan Fang, Xinyin Ma, Xinchao Wang. NeurIPS 2023.
    [paper] [code] [abstract]
  • Locate and Label: A Two-stage Identifier for Nested Named Entity Recognition. Yongliang Shen, Xinyin Ma, Zeqi Tan, Shuai Zhang, Wen Wang, Weiming Lu. ACL 2021.
    [paper] [code] [abstract]
  • A Trigger-Sense Memory Flow Framework for Joint Entity and Relation Extraction. Yongliang Shen, Xinyin Ma, Yechun Tang, Weiming Lu. WWW 2021.
    [paper] [code] [abstract]

🎖 Honors and Awards

  • 2019-2022 (M.Eng.): Outstanding Graduate (2022), Tencent Scholarship (2021), CETC28 Scholarship (2021), Huawei Elite Scholarship (2020), Shenzhen Stock Exchange Scholarship (2020), Award of Honor for Graduate (2021, 2020)
  • 2015-2019 (B.Eng.): Outstanding Engineer Scholarship (2018), Outstanding Student of Zhejiang University (2018, 2017, 2016), Second-Class Academic Scholarship of Zhejiang University (2017, 2016), Second-Class Scholarship of National Talent Training Base (2017), CASC Second-Class Scholarship (2016)

📖 Education

  • 2022.08 - present, Ph.D. Student in Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore
  • 2019.08 - 2022.04, M.Eng. in Computer Science, College of Computer Science and Technology, Zhejiang University
  • 2015.09 - 2019.06, B.Eng. in Software Engineering, Chu Kochen Honors College, Zhejiang University

📋 Academic Service

  • Conference: AAAI’24, NeurIPS’24, EMNLP’24(ARR’24 June), ECCV’24, ACL’24(ARR’24 Feb), ICML’24, IJCAI’24, ICLR’24, NAACL’24(ARR’23 Dec), NeurIPS’23, EMNLP’23, ICML’23, ACL’23, EMNLP’22, ACL’22, EMNLP’21, ACL’21 and several ARRs
  • Journal: JVCI

🔭 Teaching Experience

  • Fall 2024, Fall 2023, Spring 2023. TA for EE2211, Introduction to Machine Learning, NUS.

💻 Internships

  • 2020.12 - 2021.06, Alibaba DAMO Academy, Research Intern. Mentor: Yong Jiang.
  • 2018.07 - 2018.11, Netease Thunderfire UX, Data Analyst Intern. Mentor: Lei Xia.