Noam Wies
Verified email at cs.huji.ac.il
Deep autoregressive models for the efficient variational simulation of many-body quantum systems
O Sharir, Y Levine, N Wies, G Carleo, A Shashua
Physical Review Letters 124 (2), 020503, 2020
Cited by 116 · Year 2020

Limits to Depth Efficiencies of Self-Attention
Y Levine, N Wies, O Sharir, H Bata, A Shashua
Advances in Neural Information Processing Systems 34 (NeurIPS), 2020
Cited by 13* · Year 2020

The Inductive Bias of In-Context Learning: Rethinking Pretraining Example Design
Y Levine, N Wies, D Jannai, D Navon, Y Hoshen, A Shashua
arXiv preprint arXiv:2110.04541, 2021
Cited by 4 · Year 2021

Which transformer architecture fits my data? A vocabulary bottleneck in self-attention
N Wies, Y Levine, D Jannai, A Shashua
International Conference on Machine Learning, 11170-11181, 2021
Cited by 3 · Year 2021

Sub-Task Decomposition Enables Learning in Sequence to Sequence Tasks
N Wies, Y Levine, A Shashua
arXiv preprint arXiv:2204.02892, 2022
Year 2022

Tensors for deep learning theory: Analyzing deep learning architectures via tensorization
Y Levine, N Wies, O Sharir, N Cohen, A Shashua
Tensors for Data Processing, 215-248, 2022
Year 2022