Noam Wies
Deep autoregressive models for the efficient variational simulation of many-body quantum systems
O Sharir, Y Levine, N Wies, G Carleo, A Shashua
Physical Review Letters 124 (2), 020503, 2020
Limits to Depth Efficiencies of Self-Attention
Y Levine, N Wies, O Sharir, H Bata, A Shashua
Advances in Neural Information Processing Systems 33 (NeurIPS), 2020
The Inductive Bias of In-Context Learning: Rethinking Pretraining Example Design
Y Levine, N Wies, D Jannai, D Navon, Y Hoshen, A Shashua
arXiv preprint arXiv:2110.04541, 2021
Which transformer architecture fits my data? A vocabulary bottleneck in self-attention
N Wies, Y Levine, D Jannai, A Shashua
International Conference on Machine Learning, 11170-11181, 2021
Sub-Task Decomposition Enables Learning in Sequence to Sequence Tasks
N Wies, Y Levine, A Shashua
arXiv preprint arXiv:2204.02892, 2022
Tensors for deep learning theory: Analyzing deep learning architectures via tensorization
Y Levine, N Wies, O Sharir, N Cohen, A Shashua
Tensors for Data Processing, 215-248, 2022