Shaohui Lin (林绍辉)
Towards Optimal Structured CNN Pruning via Generative Adversarial Learning
S Lin, R Ji, C Yan, B Zhang, L Cao, Q Ye, F Huang, D Doermann
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2019
Accelerating Convolutional Networks via Global & Dynamic Filter Pruning
S Lin, R Ji, Y Li, Y Wu, F Huang, B Zhang
IJCAI, 2425-2432, 2018
Contrastive Learning for Compact Single Image Dehazing
H Wu, Y Qu, S Lin, J Zhou, R Qiao, Z Zhang, Y Xie, L Ma
CVPR, 2021
Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning
S Lin, R Ji, Y Li, C Deng, X Li
IEEE Transactions on Neural Networks and Learning Systems 31 (2), 574-588, 2019
Holistic CNN Compression via Low-Rank Decomposition with Knowledge Transfer
S Lin, R Ji, C Chen, D Tao, J Luo
IEEE Transactions on Pattern Analysis and Machine Intelligence 41 (12), 2889 …, 2018
Exploiting Kernel Sparsity and Entropy for Interpretable CNN Compression
Y Li, S Lin, B Zhang, J Liu, D Doermann, Y Wu, F Huang, R Ji
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2019
Masked Face Detection via a Modified LeNet
S Lin, L Cai, X Lin, R Ji
Neurocomputing 218, 197-202, 2016
Towards Convolutional Neural Networks Compression via Global Error Reconstruction
S Lin, R Ji, X Guo, X Li
Proceedings of the Twenty-Fifth International Joint Conference on Artificial …, 2016
Farewell to Mutual Information: Variational Distillation for Cross-Modal Person Re-Identification
X Tian, Z Zhang, S Lin, Y Qu, Y Xie, L Ma
CVPR (oral), 2021
PAMS: Quantized Super-Resolution via Parameterized Max Scale
H Li, C Yan, S Lin, X Zheng, Y Li, B Zhang, F Yang, R Ji
ECCV, 2020
Towards Compact CNNs via Collaborative Compression
Y Li, S Lin, J Liu, Q Ye, M Wang, F Chao, F Yang, J Ma, Q Tian, R Ji
CVPR, 2021
Pruning blocks for CNN compression and acceleration via online ensemble distillation
Z Wang, S Lin, J Xie, Y Lin
IEEE Access 7, 175703-175716, 2019
ESPACE: Accelerating Convolutional Neural Networks via Eliminating Spatial & Channel Redundancy
S Lin, R Ji, C Chen, F Huang
Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence …, 2017
Towards Compact Single Image Super-Resolution via Contrastive Self-distillation
Y Wang, S Lin, Y Qu, H Wu, Z Zhang, Y Xie, A Yao
IJCAI, 2021
Deep Neural Network Compression and Acceleration: A Review
R Ji, S Lin, F Chao, Y Wu, F Huang
Journal of Computer Research and Development 55 (9), 1871-1888, 2018
DisCo: Remedy Self-Supervised Learning on Lightweight Models with Distilled Contrastive Learning
Y Gao, JX Zhuang, S Lin, H Cheng, X Sun, K Li, C Shen
ECCV(oral), 2022
Neural network compression via learnable wavelet transforms
M Wolter, S Lin, A Yao
International Conference on Artificial Neural Networks, 39-51, 2020
HybridCR: Weakly-Supervised 3D Point Cloud Semantic Segmentation via Hybrid Contrastive Regularization
M Li, Y Xie, Y Shen, B Ke, R Qiao, B Ren, S Lin, L Ma
Training Convolutional Neural Networks with Cheap Convolutions and Online Distillation
J Xie, S Lin, Y Zhang, L Luo
arXiv preprint arXiv:1909.13063, 2019