GLM-130B: An open bilingual pre-trained model A Zeng, X Liu, Z Du, Z Wang, H Lai, M Ding, Z Yang, Y Xu, W Zheng, X Xia, ... arXiv preprint arXiv:2210.02414, 2022 | 270 | 2022 |
CodeGeeX: A pre-trained model for code generation with multilingual evaluations on HumanEval-X Q Zheng, X Xia, X Zou, Y Dong, S Wang, Y Xue, Z Wang, L Shen, A Wang, ... arXiv preprint arXiv:2303.17568, 2023 | 129 | 2023 |
CogAgent: A visual language model for GUI agents W Hong, W Wang, Q Lv, J Xu, W Yu, J Ji, Y Wang, Z Wang, Y Dong, ... arXiv preprint arXiv:2312.08914, 2023 | 38 | 2023 |
CodeGeeX: A pre-trained model for code generation with multilingual benchmarking on HumanEval-X Q Zheng, X Xia, X Zou, Y Dong, S Wang, Y Xue, L Shen, Z Wang, A Wang, ... Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and …, 2023 | 28 | 2023 |
SciGLM: Training scientific language models with self-reflective instruction annotation and tuning D Zhang, Z Hu, S Zhoubian, Z Du, K Yang, Z Wang, Y Yue, Y Dong, ... arXiv preprint arXiv:2401.07950, 2024 | 3 | 2024 |
Rethinking the setting of semi-supervised learning on graphs Z Li, M Ding, W Li, Z Wang, Z Zeng, Y Cen, J Tang arXiv preprint arXiv:2205.14403, 2022 | | 2022 |