Qinyuan Ye


Email: qinyuany [at] usc.edu


[CV] [Twitter] [Github] [INK Lab]
[Google Scholar] [Semantic Scholar]

Hello! My name is Qinyuan Ye (叶沁媛). I am a fifth-year CS Ph.D. student at the University of Southern California, advised by Prof. Xiang Ren. I am part of INK Lab and the USC NLP community. Prior to USC, I was an undergraduate student at Tsinghua University, majoring in Automation.

I am interested in enabling NLP and AI systems to learn efficiently and proactively. Towards this goal, I develop methods that draw on fields such as meta-learning, in-context learning, and instruction tuning.

Recent News

[May 2024] PE2 was accepted to ACL 2024 (Findings)!

[Apr 2024] I presented my thesis proposal. I will also present a poster at the NAACL 2024 Student Research Workshop (thesis proposal track).

[Nov 2023] I gave a talk on “Learning from Observations of Large Language Model Capabilities” at Fudan NLP Group, hosted by Dr. Siyuan Wang and Prof. Tao Gui. Check out the slides and the papers [1,2] covered in the talk!


Publications

Prompt Engineering a Prompt Engineer
Qinyuan Ye, Maxamed Axmed, Reid Pryzant, Fereshte Khani
Accepted to ACL 2024 (Findings).
[Preprint] [Poster] [Tweet]

Cross-Task Generalization Abilities of Large Language Models
Qinyuan Ye
In Proceedings of NAACL 2024 Student Research Workshop (Thesis Proposal Track).
[Paper] [Poster]

How Predictable Are Large Language Model Capabilities? A Case Study on BIG-bench
Qinyuan Ye, Harvey Yiyun Fu, Xiang Ren, Robin Jia
In Proceedings of EMNLP 2023 (Findings).
[Paper] [Github] [Video] [Slides] [Poster] [Tweet]

Estimating Large Language Model Capabilities without Labeled Test Data
Harvey Yiyun Fu, Qinyuan Ye, Albert Xu, Xiang Ren, Robin Jia
In Proceedings of EMNLP 2023 (Findings).
[Paper] [Github]

FiD-ICL: A Fusion-in-Decoder Approach for Efficient In-Context Learning
Qinyuan Ye, Iz Beltagy, Matthew E. Peters, Xiang Ren, Hannaneh Hajishirzi
In Proceedings of ACL 2023.
[Paper] [Github] [Video] [Slides] [Poster]

Eliciting and Understanding Cross-Task Skills with Task-Level Mixture-of-Experts
Qinyuan Ye, Juan Zha, Xiang Ren
In Proceedings of EMNLP 2022 (Findings).
[Paper] [Github] [Slides] [Poster]

Sparse Distillation: Speeding Up Text Classification by Using Bigger Student Models
Qinyuan Ye, Madian Khabsa, Mike Lewis, Sinong Wang, Xiang Ren, Aaron Jaech
In Proceedings of NAACL 2022 (Oral Presentation).
[Paper] [Github] [Video] [Slides]

Refining Language Models with Compositional Explanations
Huihan Yao, Ying Chen, Qinyuan Ye, Xisen Jin, Xiang Ren
In Proceedings of NeurIPS 2021 (Spotlight Presentation).
[Paper] [Github] [Blog Post]

CrossFit: A Few-shot Learning Challenge for Cross-task Generalization in NLP
Qinyuan Ye, Bill Yuchen Lin, Xiang Ren
In Proceedings of EMNLP 2021.
[Paper] [Github] [Video] [Slides (Qual Exam Version)]

On the Influence of Masking Policies in Intermediate Pre-training
Qinyuan Ye, Belinda Z. Li, Sinong Wang, Benjamin Bolte, Hao Ma, Wen-tau Yih, Xiang Ren, Madian Khabsa
In Proceedings of EMNLP 2021.
[Paper] [Video] [Slides]

Learning to Generate Task-specific Adapters from Task Description
Qinyuan Ye, Xiang Ren
In Proceedings of ACL-IJCNLP 2021 (Short Paper).
[Paper] [Github] [Video] [Slides] [Poster]

Semi-Automated Protocol Disambiguation and Code Generation
Jane Yen, Tamás Lévai, Qinyuan Ye, Xiang Ren, Ramesh Govindan, Barath Raghavan
In Proceedings of SIGCOMM 2021.
[Paper] [Github]

Teaching Machine Comprehension with Compositional Explanations
Qinyuan Ye, Xiao Huang, Elizabeth Boschee, Xiang Ren
In Proceedings of EMNLP 2020 (Findings).
[Paper] [Project Homepage]

LEAN-LIFE: A Label-Efficient Annotation Framework Towards Learning from Explanation
Dong-Ho Lee*, Rahul Khanna*, Bill Yuchen Lin, Jamin Chen, Seyeon Lee, Qinyuan Ye, Elizabeth Boschee, Leonardo Neves, Xiang Ren
In Proceedings of ACL 2020 (Demo Track).
[Paper] [Project Homepage]

Learning from Explanations with Neural Execution Tree
Ziqi Wang*, Yujia Qin*, Wenxuan Zhou, Jun Yan, Qinyuan Ye, Leonardo Neves, Zhiyuan Liu, Xiang Ren
In Proceedings of ICLR 2020.
[Paper] [Project Homepage]

Looking Beyond Label Noise: Shifted Label Distribution Matters in Distantly Supervised Relation Extraction
Qinyuan Ye*, Liyuan Liu*, Maosen Zhang, Xiang Ren
In Proceedings of EMNLP-IJCNLP 2019 (Oral Presentation).
[Paper] [Github] [Video] [Tweet]


Invited Talks

Learning from Observations of Large Language Model Capabilities
Nov 2023, Fudan NLP Group

Acquiring and Understanding Cross-task Generalization from Diverse NLP Tasks
Oct 2022, USC ISI Natural Language Seminar

Acquiring Cross-task Generalization from Diverse NLP Tasks
Feb 2022, NEC Labs Europe


Teaching

I was fortunate to serve as a teaching assistant for these amazing classes:

Fall 2022, CSCI 699, Data-Centric NLP
Instructor: Prof. Swabha Swayamdipta

Spring 2023, CSCI 467, Introduction to Machine Learning
Instructor: Prof. Robin Jia

In Fall 2023, I gave guest lectures on Prompting and Instruction Tuning for CSCI 499, Language Models in NLP [Slides], and CSCI 662, Advanced Natural Language Processing [Slides].


Experience

May 2023 - Aug 2023, Research Intern,
@ Office of Applied Research, Microsoft, Redmond, WA, U.S.

May 2022 - Aug 2022, Research Intern,
@ AllenNLP, Allen Institute for AI, Seattle, WA, U.S.

May 2021 - Aug 2021, Research Intern (Remote),
@ AI Integrity, Facebook, Seattle, WA, U.S.

May 2020 - Aug 2020, Research Intern (Remote),
@ AI Integrity, Facebook, Seattle, WA, U.S.

Mar 2019 - Jul 2019, Software Engineering Intern in Machine Learning,
@ TensorFlow Lite, Google, Beijing, China.

Awards and Honors

Mar 2022, Finalist for Two Sigma Diversity PhD Fellowship.
Nov 2021, WiSE Qualcomm Top-Off Fellowship, University of Southern California.
Feb 2019, Annenberg Fellowship, University of Southern California.
Oct 2017 and Oct 2018, China National Scholarship.
Oct 2016, POSCO Asia Fellowship.
Jul 2014, Bronze Medal, China National Olympiad in Informatics.


Miscellaneous

I am from Chongming Island, Shanghai, China. We have delicious crabs and sticky rice cakes in Chongming!
I played volleyball at Tsinghua, where I also hold the record for the 2 kg solid-ball throw.
I am a big fan of the TV show Super Vocal, and watching it turned me into a fan of live performances (operas, musicals, concerts, …)!
I support Wigs For Kids with my hair!