Raphael Tang
Title · Cited by · Year
DocBERT: BERT for Document Classification
A Adhikari, A Ram, R Tang, J Lin
arXiv preprint arXiv:1904.08398, 2019
Cited by 459 · 2019
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
R Tang*, Y Lu*, L Liu*, L Mou, O Vechtomova, J Lin
arXiv preprint arXiv:1903.12136, 2019
Cited by 457 · 2019
DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference
J Xin, R Tang, J Lee, Y Yu, J Lin
arXiv preprint arXiv:2004.12993, 2020
Cited by 339 · 2020
Deep Residual Learning for Small-Footprint Keyword Spotting
R Tang, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
Cited by 272 · 2018
What Would Elsa Do? Freezing Layers During Transformer Fine-Tuning
J Lee, R Tang, J Lin
arXiv preprint arXiv:1911.03090, 2019
Cited by 134 · 2019
Rethinking Complex Neural Network Architectures for Document Classification
A Adhikari*, A Ram*, R Tang, J Lin
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
Cited by 130 · 2019
BERxiT: Early Exiting for BERT with Better Fine-Tuning and Extension to Regression
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 16th conference of the European chapter of the …, 2021
Cited by 112 · 2021
What the DAAM: Interpreting Stable Diffusion Using Cross Attention
R Tang*, L Liu*, A Pandey, Z Jiang, G Yang, K Kumar, P Stenetorp, J Lin, ...
arXiv preprint arXiv:2210.04885, 2022
Cited by 105 · 2022
Rapidly Bootstrapping a Question Answering Dataset for COVID-19
R Tang, R Nogueira, E Zhang, N Gupta, P Cam, K Cho, J Lin
arXiv preprint arXiv:2004.11339, 2020
Cited by 72 · 2020
Covidex: Neural Ranking Models and Keyword Search Infrastructure for the COVID-19 Open Research Dataset
E Zhang, N Gupta, R Tang, X Han, R Pradeep, K Lu, Y Zhang, R Nogueira, ...
arXiv preprint arXiv:2007.07846, 2020
Cited by 70 · 2020
“Low-Resource” Text Classification: A Parameter-Free Classification Method with Compressors
Z Jiang, M Yang, M Tsirlin, R Tang, Y Dai, J Lin
Findings of the Association for Computational Linguistics: ACL 2023, 6810-6828, 2023
Cited by 63 · 2023
An Experimental Analysis of the Power Consumption of Convolutional Neural Networks for Keyword Spotting
R Tang, W Wang, Z Tu, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
Cited by 58 · 2018
The Art of Abstention: Selective Prediction and Error Regularization for Natural Language Processing
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
Cited by 57 · 2021
Honk: A PyTorch Reimplementation of Convolutional Neural Networks for Keyword Spotting
R Tang, J Lin
arXiv preprint arXiv:1710.06554, 2017
Cited by 45 · 2017
FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks
R Tang, A Adhikari, J Lin
arXiv preprint arXiv:1811.03060, 2018
Cited by 38 · 2018
Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT
A Adhikari, A Ram, R Tang, WL Hamilton, J Lin
Proceedings of the 5th Workshop on Representation Learning for NLP, 72-77, 2020
Cited by 34 · 2020
Natural Language Generation for Effective Knowledge Distillation
R Tang, Y Lu, J Lin
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource …, 2019
Cited by 33 · 2019
Found in the Middle: Permutation Self-Consistency Improves Listwise Ranking in Large Language Models
R Tang*, X Zhang*, X Ma, J Lin, F Ture
arXiv preprint arXiv:2310.07712, 2023
Cited by 28 · 2023
Howl: A Deployed, Open-Source Wake Word Detection System
R Tang*, J Lee*, A Razi, J Cambre, I Bicking, J Kaye, J Lin
arXiv preprint arXiv:2008.09606, 2020
Cited by 17 · 2020
Incorporating Contextual and Syntactic Structures Improves Semantic Similarity Modeling
L Liu, W Yang, J Rao, R Tang, J Lin
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
Cited by 14 · 2019
Articles 1–20