Yang Gao, Doctoral Supervisor, works primarily on large language model training, automatic text generation, and the practical application and transfer of these technologies. She has published more than 60 papers in leading international journals and conferences, including ACL, AAAI, WWW, IJCAI, EMNLP, and TKDE. She has served as an area chair for text generation at conferences such as EMNLP and COLING, as an editorial board member for journals such as Web Intelligence and the Natural Language Processing Journal, as a program committee member for international conferences including AAAI, ACL, EMNLP, NAACL, and ICDM, and as a reviewer for journals including TNNLS and Computing Surveys. She has led the development of the Mingde foundation large language model (MindLLM), trained from scratch, as well as the construction of large models built on the domestic technology ecosystem.