Yury Nahshan
Unknown affiliation
No verified email address
Title
Cited by
Year
Post training 4-bit quantization of convolutional networks for rapid-deployment
R Banner, Y Nahshan, D Soudry
Advances in Neural Information Processing Systems 32, 2019
640*	2019
Accurate post training quantization with small calibration sets
I Hubara, Y Nahshan, Y Hanani, R Banner, D Soudry
International Conference on Machine Learning, 4466-4475, 2021
206*	2021
Loss aware post-training quantization
Y Nahshan, B Chmiel, C Baskin, E Zheltonozhskii, R Banner, ...
Machine Learning 110 (11), 3245-3262, 2021
138	2021
Robust quantization: One model to rule them all
B Chmiel, R Banner, G Shomron, Y Nahshan, A Bronstein, U Weiser
Advances in Neural Information Processing Systems 33, 5308-5317, 2020
73	2020
Linear Log-Normal Attention with Unbiased Concentration
Y Nahshan, J Kampeas, E Haleva
https://openreview.net/pdf?id=5nM2AHzqUj, 2023
2023
Rotation Invariant Quantization for Model Compression
J Kampeas, Y Nahshan, H Kremer, G Lederman, S Zaloshinski, Z Li, ...
arXiv preprint arXiv:2303.03106, 2023
2023
ACIQ: Analytical Clipping for Integer Quantization of Neural Networks
R Banner, Y Nahshan, E Hoffer, D Soudry
arXiv preprint arXiv:1810.05723, 2018
2018
Supplementary Material: Accurate Post Training Quantization With Small Calibration Sets
I Hubara, Y Nahshan, Y Hanani, R Banner, D Soudry
Articles 1–8