Raphael Tang
Title · Cited by · Year
DocBERT: BERT for Document Classification
A Adhikari, A Ram, R Tang, J Lin
arXiv preprint arXiv:1904.08398, 2019
390 · 2019
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
R Tang*, Y Lu*, L Liu*, L Mou, O Vechtomova, J Lin
arXiv preprint arXiv:1903.12136, 2019
377 · 2019
DeeBERT: Dynamic early exiting for accelerating BERT inference
J Xin, R Tang, J Lee, Y Yu, J Lin
arXiv preprint arXiv:2004.12993, 2020
245 · 2020
Deep Residual Learning for Small-Footprint Keyword Spotting
R Tang, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
235 · 2018
Rethinking Complex Neural Network Architectures for Document Classification
A Adhikari*, A Ram*, R Tang, J Lin
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
111 · 2019
What Would Elsa Do? Freezing Layers During Transformer Fine-Tuning
J Lee, R Tang, J Lin
arXiv preprint arXiv:1911.03090, 2019
98 · 2019
BERxiT: Early exiting for BERT with better fine-tuning and extension to regression
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 16th conference of the European chapter of the …, 2021
72 · 2021
Rapidly Bootstrapping a Question Answering Dataset for COVID-19
R Tang, R Nogueira, E Zhang, N Gupta, P Cam, K Cho, J Lin
arXiv preprint arXiv:2004.11339, 2020
70 · 2020
Covidex: Neural Ranking Models and Keyword Search Infrastructure for the COVID-19 Open Research Dataset
E Zhang, N Gupta, R Tang, X Han, R Pradeep, K Lu, Y Zhang, R Nogueira, ...
arXiv preprint arXiv:2007.07846, 2020
67 · 2020
An Experimental Analysis of the Power Consumption of Convolutional Neural Networks for Keyword Spotting
R Tang, W Wang, Z Tu, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
52 · 2018
Honk: A PyTorch Reimplementation of Convolutional Neural Networks for Keyword Spotting
R Tang, J Lin
arXiv preprint arXiv:1710.06554, 2017
42 · 2017
The art of abstention: Selective prediction and error regularization for natural language processing
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
36 · 2021
Exploring the limits of simple learners in knowledge distillation for document classification with DocBERT
A Adhikari, A Ram, R Tang, WL Hamilton, J Lin
Proceedings of the 5th Workshop on Representation Learning for NLP, 72-77, 2020
31 · 2020
Natural Language Generation for Effective Knowledge Distillation
R Tang, Y Lu, J Lin
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource …, 2019
30 · 2019
FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks
R Tang, A Adhikari, J Lin
arXiv preprint arXiv:1811.03060, 2018
29 · 2018
What the DAAM: Interpreting Stable Diffusion Using Cross Attention
R Tang*, L Liu*, A Pandey, Z Jiang, G Yang, K Kumar, P Stenetorp, J Lin, ...
arXiv preprint arXiv:2210.04885, 2022
22 · 2022
“Low-Resource” Text Classification: A Parameter-Free Classification Method with Compressors
Z Jiang, M Yang, M Tsirlin, R Tang, Y Dai, J Lin
Findings of the Association for Computational Linguistics: ACL 2023, 6810-6828, 2023
16 · 2023
Incorporating Contextual and Syntactic Structures Improves Semantic Similarity Modeling
L Liu, W Yang, J Rao, R Tang, J Lin
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
14 · 2019
Howl: A Deployed, Open-Source Wake Word Detection System
R Tang*, J Lee*, A Razi, J Cambre, I Bicking, J Kaye, J Lin
arXiv preprint arXiv:2008.09606, 2020
13 · 2020
Inserting information bottlenecks for attribution in transformers
Z Jiang, R Tang, J Xin, J Lin
arXiv preprint arXiv:2012.13838, 2020
10 · 2020
Articles 1–20