| Title | Authors | Venue | Citations | Year |
|---|---|---|---|---|
| Reducing the Dilution: An Analysis of the Information Sensitiveness of Capsule Network With a Practical Solution | Z Yang, X Wang | arXiv 1903, 2019 | 18* | 2019 |
| Improving Event Duration Prediction via Time-aware Pre-training | Z Yang, X Du, A Rush, C Cardie | EMNLP 2020 (Findings), 2020 | 16 | 2020 |
| A Survey on Semantic Processing Techniques | Z Yang*, R Mao*, K He*, X Zhang*, G Chen*, J Ni*, E Cambria | Information Fusion, 2023 | 13* | 2023 |
| End-to-end Case-Based Reasoning for Commonsense Knowledge Base Completion | Z Yang, X Du, E Cambria, C Cardie | EACL 2023 (Oral), 2023 | 5 | 2023 |
| Language Models as Inductive Reasoners | Z Yang, L Dong, X Du, H Cheng, E Cambria, X Liu, J Gao, F Wei | EACL 2024, 2022 | 5 | 2022 |
| Finding the Pillars of Strength for Multi-Head Attention | J Ni, R Mao, Z Yang, H Lei, E Cambria | ACL 2023, 2023 | 3 | 2023 |
| Logical Reasoning over Natural Language as Knowledge Representation: A Survey | Z Yang, X Du, R Mao, J Ni, E Cambria | ACL 2023 NLRSE workshop (non-archival), 2023 | 3 | 2023 |
| Task-Aware Self-Supervised Framework for Dialogue Discourse Parsing | W Li, L Zhu, W Shao, Z Yang, E Cambria | EMNLP 2023 (Findings), 2023 | | 2023 |
| Large Language Models for Automated Open-domain Scientific Hypotheses Discovery | Z Yang, X Du, J Li, J Zheng, S Poria, E Cambria | arXiv preprint arXiv:2309.02726, 2023 | | 2023 |