Hadi Daneshmand
Inria Paris
Verified email at inria.fr
Title
Cited by
Year
Inferring causal molecular networks: empirical assessment through a community-based effort
SM Hill, LM Heiser, T Cokelaer, M Unger, NK Nesser, DE Carlin, Y Zhang, ...
Nature Methods 13 (4), 310-318, 2016
206 · 2016
Estimating diffusion network structures: Recovery conditions, sample complexity & soft-thresholding algorithm
H Daneshmand, M Gomez-Rodriguez, L Song, B Schölkopf
International Conference on Machine Learning, 793-801, 2014
120 · 2014
Exponential convergence rates for batch normalization: The power of length-direction decoupling in non-convex optimization
J Kohler, H Daneshmand, A Lucchi, T Hofmann, M Zhou, K Neymeyr
The 22nd International Conference on Artificial Intelligence and Statistics …, 2019
93* · 2019
Local saddle point optimization: A curvature exploitation approach
L Adolphs, H Daneshmand, A Lucchi, T Hofmann
The 22nd International Conference on Artificial Intelligence and Statistics …, 2019
93 · 2019
Escaping saddles with stochastic gradients
H Daneshmand, J Kohler, A Lucchi, T Hofmann
International Conference on Machine Learning, 1155-1164, 2018
88 · 2018
Adaptive Newton method for empirical risk minimization to statistical accuracy
A Mokhtari, H Daneshmand, A Lucchi, T Hofmann, A Ribeiro
Advances in Neural Information Processing Systems 29, 2016
43 · 2016
Starting small - learning with adaptive sample sizes
H Daneshmand, A Lucchi, T Hofmann
International Conference on Machine Learning, 1463-1471, 2016
39 · 2016
Estimating diffusion networks: Recovery conditions, sample complexity & soft-thresholding algorithm
M Gomez-Rodriguez, L Song, H Daneshmand, B Schölkopf
The Journal of Machine Learning Research 17 (1), 3092-3120, 2016
39* · 2016
A time-aware recommender system based on dependency network of items
SM Daneshmand, A Javari, SE Abtahi, M Jalili
The Computer Journal 58 (9), 1955-1966, 2015
18 · 2015
Batch normalization provably avoids ranks collapse for randomly initialised deep networks
H Daneshmand, J Kohler, F Bach, T Hofmann, A Lucchi
Advances in Neural Information Processing Systems 33, 18387-18398, 2020
15* · 2020
Batch normalization orthogonalizes representations in deep random networks
H Daneshmand, A Joudaki, F Bach
Advances in Neural Information Processing Systems 34, 2021
5 · 2021
Revisiting the Role of Euler Numerical Integration on Acceleration and Stability in Convex Optimization
P Zhang, A Orvieto, H Daneshmand, T Hofmann, RS Smith
International Conference on Artificial Intelligence and Statistics, 3979-3987, 2021
1 · 2021
Polynomial-time sparse measure recovery
H Daneshmand, F Bach
arXiv preprint arXiv:2204.07879, 2022
2022
Rethinking the Variational Interpretation of Accelerated Optimization Methods
P Zhang, A Orvieto, H Daneshmand
Advances in Neural Information Processing Systems 34, 2021
2021
Optimization for Neural Networks: Quest for Theoretical Understandings
H Daneshmand
ETH Zurich, 2020
2020
Mixing of Stochastic Accelerated Gradient Descent
P Zhang, H Daneshmand, T Hofmann
arXiv preprint arXiv:1910.14616, 2019
2019
Accelerated Dual Learning by Homotopic Initialization
H Daneshmand, H Hassani, T Hofmann
arXiv preprint arXiv:1706.03958, 2017
2017
Articles 1–17