Aleksandar Botev
Google DeepMind
Verified email at google.com
Title · Cited by · Year
A scalable Laplace approximation for neural networks
H Ritter, A Botev, D Barber
6th International Conference on Learning Representations, ICLR 2018 …, 2018
Cited by 395 · 2018
Online structured Laplace approximations for overcoming catastrophic forgetting
H Ritter, A Botev, D Barber
Advances in Neural Information Processing Systems 31, 2018
Cited by 284 · 2018
Practical Gauss-Newton optimisation for deep learning
A Botev, H Ritter, D Barber
International Conference on Machine Learning, 557-565, 2017
Cited by 216 · 2017
Hamiltonian generative networks
P Toth, DJ Rezende, A Jaegle, S Racanière, A Botev, I Higgins
arXiv preprint arXiv:1909.13789, 2019
Cited by 207 · 2019
Nesterov's accelerated gradient and momentum as approximations to regularised update descent
A Botev, G Lever, D Barber
2017 International Joint Conference on Neural Networks (IJCNN), 1899-1903, 2017
Cited by 176 · 2017
Better, faster fermionic neural networks
JS Spencer, D Pfau, A Botev, WMC Foulkes
arXiv preprint arXiv:2011.07125, 2020
Cited by 43 · 2020
Complementary Sum Sampling for Likelihood Approximation in Large Scale Classification
A Botev, B Zheng, D Barber
AISTATS 54, 1030-1038, 2017
Cited by 33 · 2017
Disentangling by subspace diffusion
D Pfau, I Higgins, A Botev, S Racanière
Advances in Neural Information Processing Systems 33, 17403-17415, 2020
Cited by 30 · 2020
International Conference on Learning Representations
P Toth, DJ Rezende, A Jaegle, S Racanière, A Botev, I Higgins
OpenReview, 2020
Cited by 29 · 2020
Deep learning without shortcuts: Shaping the kernel with tailored rectifiers
G Zhang, A Botev, J Martens
arXiv preprint arXiv:2203.08120, 2022
Cited by 27 · 2022
Which priors matter? Benchmarking models for learning latent dynamics
A Botev, A Jaegle, P Wirnsberger, D Hennes, I Higgins
arXiv preprint arXiv:2111.05458, 2021
Cited by 23 · 2021
Deep transformers without shortcuts: Modifying self-attention for faithful signal propagation
B He, J Martens, G Zhang, A Botev, A Brock, SL Smith, YW Teh
arXiv preprint arXiv:2302.10322, 2023
Cited by 18 · 2023
Sampling QCD field configurations with gauge-equivariant flow models
R Abbott, MS Albergo, A Botev, D Boyda, K Cranmer, DC Hackett, ...
Sissa Medialab, 2022
Cited by 15 · 2022
Aspects of scaling and scalability for flow-based sampling of lattice QCD
R Abbott, MS Albergo, A Botev, D Boyda, K Cranmer, DC Hackett, ...
The European Physical Journal A 59 (11), 257, 2023
Cited by 9 · 2023
Normalizing flows for lattice gauge theory in arbitrary space-time dimension
R Abbott, MS Albergo, A Botev, D Boyda, K Cranmer, DC Hackett, ...
arXiv preprint arXiv:2305.02402, 2023
Cited by 9 · 2023
SyMetric: Measuring the quality of learnt Hamiltonian dynamics inferred from vision
I Higgins, P Wirnsberger, A Jaegle, A Botev
Advances in Neural Information Processing Systems 34, 25591-25605, 2021
Cited by 8 · 2021
The Gauss-Newton matrix for deep learning models and its applications
A Botev
UCL (University College London), 2020
Cited by 5 · 2020
Dealing with a large number of classes--Likelihood, Discrimination or Ranking?
D Barber, A Botev
arXiv preprint arXiv:1606.06959, 2016
Cited by 5 · 2016
Overdispersed variational autoencoders
H Shah, D Barber, A Botev
2017 International Joint Conference on Neural Networks (IJCNN), 1109-1116, 2017
Cited by 2 · 2017
Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models
S De, SL Smith, A Fernando, A Botev, G Cristian-Muraru, A Gu, R Haroun, ...
arXiv preprint arXiv:2402.19427, 2024
Cited by 1 · 2024