Wei He (何为)
Postgraduate @ Fudan University | Coder & NLPer | hewei2001

就我而言,我一无所知,但满眼的繁星让我入梦。

For my part I know nothing with any certainty, but the sight of the stars makes me dream.

Education

M.S. in Computer Science and Technology @ Fudan University, 2023 - 2026 (expected)

B.E. in Computer Science and Technology @ Harbin Institute of Technology (Shenzhen), 2019 - 2023

Research Interests

As an early-stage researcher, I am enthusiastic about a wide range of fields and eager to take on new knowledge and challenges.

I am currently working on Language-Model-as-a-Service (LMaaS), especially applying large language models as intelligent agents: enabling models to play a variety of roles and accomplish a variety of tasks through prompt engineering.

I am also interested in NLP for software engineering, such as vulnerability detection and code generation.

Selected Publications

*: Equal contribution. See here for all publications.

Self-Demos: Eliciting Out-of-Demonstration Generalizability in Large Language Models

  • Wei He, Shichun Liu, Jun Zhao, Yiwen Ding, Yi Lu, Zhiheng Xi, Tao Gui, Qi Zhang, Xuanjing Huang
  • ✨ Accepted to NAACL 2024 (Findings; CCF-B). [Paper] / [Code]

LongHeads: Multi-Head Attention is Secretly a Long Context Processor

  • Yi Lu, Xin Zhou, Wei He, Jun Zhao, Tao Ji, Tao Gui, Qi Zhang, Xuanjing Huang
  • 📃 Preprint; under review at EMNLP 2024. [Paper] / [Code]

AgentGym: Evolving Large Language Model-based Agents across Diverse Environments

  • Zhiheng Xi, Yiwen Ding, Wenxiang Chen, Boyang Hong, Honglin Guo, Junzhe Wang, Dingwen Yang, Chenyang Liao, Xin Guo, Wei He, Songyang Gao, Lu Chen, Rui Zheng, Yicheng Zou, Tao Gui, Qi Zhang, Xipeng Qiu et al.
  • 📃 Preprint (⭐ 200+ stars); under review at NeurIPS 2024. [Paper] / [Code] / [Project Page] / [Report]

Training Large Language Models for Reasoning through Reverse Curriculum Reinforcement Learning

  • Zhiheng Xi, Wenxiang Chen, Boyang Hong, Senjie Jin, Rui Zheng, Wei He, Yiwen Ding, Shichun Liu, Xin Guo, Junzhe Wang, Honglin Guo, Wei Shen, Xiaoran Fan, Yuhao Zhou, Shihan Dou, Xiao Wang, Xinbo Zhang et al.
  • 🚀 Accepted to ICML 2024 (CCF-A). [Paper] / [Code] / [Report]

The Rise and Potential of Large Language Model Based Agents: A Survey

  • Zhiheng Xi*, Wenxiang Chen*, Xin Guo*, Wei He*, Yiwen Ding*, Boyang Hong, Ming Zhang, Junzhe Wang, Senjie Jin, Enyu Zhou, Rui Zheng, Xiaoran Fan, Xiao Wang, Limao Xiong, Yuhao Zhou, Weiran Wang, Changhao Jiang et al.
  • 📃 Preprint (🔥 300+ citations); under review at Artificial Intelligence, 2024. [Paper] / [Paperlist] / [Report]

TopicAns: Topic-Informed Architecture for Answer Recommendation on Technical Q&A Site

  • Yuanhang Yang, Wei He, Cuiyun Gao, Zenglin Xu, Xin Xia, Chuanyi Liu
  • 🚀 Accepted to TOSEM 2023 (CCF-A). [Paper] / [Code]

Experience

AI Application Group @ Futu Network Technology (Shenzhen), Futu Holdings

STAR Lab @ Harbin Institute of Technology (Shenzhen)

  • NLP Research Intern, advised by Prof. Cuiyun Gao, 2021.11 - 2023.02
  • One paper on NLP for software engineering accepted.

Selected Honors

[2023] Outstanding Academic Scholarship (First Prize) @ Fudan University

[2023]🎓Outstanding Graduate @ Harbin Institute of Technology

[2023]🏆The 14th LanQiao Cup C/C++ Programming Contest (Provincial First Prize)

[2023] Huawei Scholarship @ Huawei Technologies Co., Ltd.

[2022] PACT518 Scholarship @ Surfilter Network Technology Co., Ltd.

[2021]🏆National Mathematical Contest in Modeling (Honorable Mention)

[2020] Gongjin Scholarship @ Shenzhen Gongjin Electronics Co., Ltd.

[2020]🏫National Scholarship @ Ministry of Education of China

[2020]🏆Contemporary Undergraduate Mathematical Contest in Modeling (National First Prize)

About Blog

Recording, producing, and creating, including but not limited to:

  • Learning Notes (学习笔记) & Research Notes (科研札记);
  • Project Experience (项目经历) & Contest Reviews (比赛回顾);
  • Personal Essays (心情随笔).

Built on Hexo + GitHub with the Fluid theme, continuously optimized and updated.

Click here to visit the mirror site (访问镜像站点), which may load faster from within China.