Elad Hoffer
PhD, Research @ Habana Labs
Verified email at habana.ai
Title · Cited by · Year
Deep metric learning using triplet network
E Hoffer, N Ailon
Similarity-based pattern recognition: third international workshop, SIMBAD …, 2015
Cited by 2567, 2015
Train longer, generalize better: closing the generalization gap in large batch training of neural networks
E Hoffer, I Hubara, D Soudry
Advances in neural information processing systems 30, 2017
Cited by 982, 2017
The implicit bias of gradient descent on separable data
D Soudry, E Hoffer, MS Nacson, S Gunasekar, N Srebro
Journal of Machine Learning Research 19 (70), 1-57, 2018
Cited by 976, 2018
Scalable methods for 8-bit training of neural networks
R Banner, I Hubara, E Hoffer, D Soudry
Advances in neural information processing systems 31, 2018
Cited by 402, 2018
Augment your batch: Improving generalization through instance repetition
E Hoffer, T Ben-Nun, I Hubara, N Giladi, T Hoefler, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 318*, 2020
Norm matters: efficient and accurate normalization schemes in deep networks
E Hoffer, R Banner, I Golan, D Soudry
Advances in Neural Information Processing Systems 31, 2018
Cited by 183, 2018
Bayesian gradient descent: Online variational Bayes learning with increased robustness to catastrophic forgetting and weight pruning
C Zeno, I Golan, E Hoffer, D Soudry
arXiv preprint arXiv:1803.10123, 2018
Cited by 130*, 2018
The knowledge within: Methods for data-free model compression
M Haroush, I Hubara, E Hoffer, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 112, 2020
Fix your classifier: the marginal value of training the last weight layer
E Hoffer, I Hubara, D Soudry
arXiv preprint arXiv:1801.04540, 2018
Cited by 108, 2018
Exponentially vanishing sub-optimal local minima in multilayer neural networks
D Soudry, E Hoffer
arXiv preprint arXiv:1702.05777, 2017
Cited by 105, 2017
ACIQ: Analytical clipping for integer quantization of neural networks
R Banner, Y Nahshan, E Hoffer, D Soudry
Cited by 85, 2018
Neural gradients are lognormally distributed: understanding sparse and quantized training
B Chmiel, L Ben-Uri, M Shkolnik, E Hoffer, R Banner, D Soudry
arXiv, 2020
Cited by 56*, 2020
Task-agnostic continual learning using online variational bayes with fixed-point updates
C Zeno, I Golan, E Hoffer, D Soudry
Neural Computation 33 (11), 3139-3177, 2021
Cited by 54*, 2021
Semi-supervised deep learning by metric embedding
E Hoffer, N Ailon
arXiv preprint arXiv:1611.01449, 2016
Cited by 42, 2016
Deep unsupervised learning through spatial contrasting
E Hoffer, I Hubara, N Ailon
arXiv preprint arXiv:1610.00243, 2016
Cited by 35, 2016
Mix & match: training convnets with mixed image sizes for improved accuracy, speed and scale resiliency
E Hoffer, B Weinstein, I Hubara, T Ben-Nun, T Hoefler, D Soudry
arXiv preprint arXiv:1908.08986, 2019
Cited by 27, 2019
Logarithmic unbiased quantization: Practical 4-bit training in deep learning
B Chmiel, R Banner, E Hoffer, HB Yaacov, D Soudry
Cited by 25*, 2021
At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks?
N Giladi, MS Nacson, E Hoffer, D Soudry
arXiv preprint arXiv:1909.12340, 2019
Cited by 20, 2019
Accurate neural training with 4-bit matrix multiplications at standard formats
B Chmiel, R Banner, E Hoffer, H Ben-Yaacov, D Soudry
The Eleventh International Conference on Learning Representations, 2023
Cited by 7, 2023
Quantized back-propagation: Training binarized neural networks with quantized gradients
I Hubara, E Hoffer, D Soudry
Cited by 6, 2018
Articles 1–20