Cheng Yang
Verified email at mails.tsinghua.edu.cn
Title · Cited by · Year
AgentBoard: An Analytical Evaluation Board of Multi-turn LLM Agents
C Ma*, J Zhang*, Z Zhu*, C Yang*, Y Yang, Y Jin, Z Lan, L Kong, J He
arXiv preprint arXiv:2401.13178, 2024
Cited by 16 · 2024
Specialist or generalist? Instruction tuning for specific NLP tasks
C Shi, Y Su, C Yang, Y Yang, D Cai
EMNLP 2023, 2023
Cited by 13* · 2023
AutoConv: Automatically generating information-seeking conversations with large language models
S Li*, C Yang*, Y Yin, X Zhu, Z Cheng, L Shang, X Jiang, Q Liu, Y Yang
ACL 2023, 2023
Cited by 9 · 2023
Question answering as programming for solving time-sensitive questions
X Zhu, C Yang, B Chen, S Li, JG Lou, Y Yang
EMNLP 2023, 2023
Cited by 5 · 2023
IIGroup submissions for WMT22 word-level autocompletion task
C Yang, S Li, C Shi, Y Yang
WMT 2022, 2022
Cited by 5 · 2022
NewsDialogues: Towards proactive news grounded conversation
S Li, Y Yin, C Yang, W Jiang, Y Li, Z Cheng, L Shang, X Jiang, Q Liu, ...
Findings of ACL 2023, 2023
Cited by 3 · 2023
Acronym Extraction with Hybrid Strategies
S Li*, C Yang*, T Liang*, X Zhu, C Yu, Y Yang
SDU@AAAI 2022, 2022
Cited by 2 · 2022
ChartMimic: Evaluating LMM's Cross-Modal Reasoning Capability via Chart-to-Code Generation
C Shi*, C Yang*, Y Liu*, B Shui*, J Wang*, M Jing, L Xu, X Zhu, S Li, ...
arXiv preprint arXiv:2406.09961, 2024
Cited by 1 · 2024
An energy-based model for word-level autocompletion in computer-aided translation
C Yang, G Huang, M Yu, Z Zhang, S Li, M Yang, S Shi, Y Yang, L Liu
TACL 2024, 2024
Cited by 1 · 2024
Enhancing Dialogue Generation with Conversational Concept Flows
S Li, W Jiang, P Si, C Yang, Q Yao, J Zhang, J Zhou, Y Yang
Findings of EACL 2023, 2023
Cited by 1 · 2023
Multilingual acronym disambiguation with multichoice classification
X Zhu, C Yu, S Li, T Liang, C Yang, Y Yang
SDU@AAAI 2022, 2022
Cited by 1 · 2022
ToolBeHonest: A Multi-level Hallucination Diagnostic Benchmark for Tool-Augmented Large Language Models
Y Zhang, J Chen, J Wang, Y Liu, C Yang, C Shi, X Zhu, Z Lin, H Wan, ...
arXiv preprint arXiv:2406.20015, 2024
2024
HoLLMwood: Unleashing the Creativity of Large Language Models in Screenwriting via Role Playing
J Chen, X Zhu, C Yang, C Shi, Y Xi, Y Zhang, J Wang, J Pu, R Zhang, ...
arXiv preprint arXiv:2406.11683, 2024
2024
Unchosen Experts Can Contribute Too: Unleashing MoE Models' Power by Self-Contrast
C Shi*, C Yang*, X Zhu*, J Wang*, T Wu, S Li, D Cai, Y Yang, Y Meng
arXiv preprint arXiv:2405.14507, 2024
2024