
Yu Bai

PhD Student, Beijing Institute of Technology

Interests

  • Large language models

Education

  • PhD Student in Computer Science, Beijing Institute of Technology

Biography

I previously studied cross-lingual summarization, investigating how to generate better text under low-resource conditions. As large language models (LLMs) continue to gain popularity, I am curious about why they are competent at so many challenging tasks. Hence, I have begun to treat LLMs as natural objects of study. By examining their working mechanisms and characteristics, I aim to understand these models more deeply and use those insights to improve their generation quality, controllability, and safety.

Publications

Fundamental Capabilities of Large Language Models and their Applications in Domain Scenarios: A Survey

Jiawei Li, Yizhe Yang, Yu Bai, Xiaofeng Zhou, Yinghao Li, Huashan Sun, Yuhang Liu, Xingpeng Si, Yuhao Ye, Yixiao Wu, Yiguan Lin, Bin Xu, Bowen Ren, Chong Feng, Heyan Huang, Yang Gao
ACL 2024
June, 2024

Stage-wise Stylistic Headline Generation: Style Generation and Summarized Content Insertion

Jiaao Zhan, Yang Gao, Yu Bai, Qianhui Liu
IJCAI
June, 2024

PSP: Pre-trained Soft Prompts for Few-Shot Abstractive Summarization

Xiaochen Liu, Yang Gao, Yu Bai, Jiawei Li, Yinan Hu, Heyan Huang, Boxing Chen
COLING 2022
April, 2022

Cross-Lingual Abstractive Summarization with Limited Parallel Resources

Yu Bai, Yang Gao, Heyan Huang
ACL 2021
May, 2021

© NLP Group BIT, 2024 · Partially powered by the Academic theme for Hugo.