Daniel Rothchild
Verified email at berkeley.edu
Title
Cited by
Year
Carbon emissions and large neural network training
D Patterson, J Gonzalez, Q Le, C Liang, LM Munguia, D Rothchild, D So, ...
arXiv preprint arXiv:2104.10350, 2021
Cited by 628, 2021
FetchSGD: Communication-Efficient Federated Learning with Sketching
D Rothchild, A Panda, E Ullah, N Ivkin, I Stoica, V Braverman, J Gonzalez, ...
International Conference on Machine Learning, 8253-8265, 2020
Cited by 356, 2020
The carbon footprint of machine learning training will plateau, then shrink
D Patterson, J Gonzalez, U Hölzle, Q Le, C Liang, LM Munguia, ...
Computer 55 (7), 18-28, 2022
Cited by 237, 2022
Communication-efficient distributed SGD with sketching
N Ivkin, D Rothchild, E Ullah, I Stoica, R Arora
Advances in Neural Information Processing Systems 32, 2019
Cited by 199, 2019
Strongly lensed SNe Ia in the era of LSST: observing cadence for lens discoveries and time-delay measurements
S Huber, SH Suyu, UM Noebauer, V Bonvin, D Rothchild, JHH Chan, ...
Astronomy & Astrophysics 631, A161, 2019
Cited by 51, 2019
SqueezeWave: Extremely Lightweight Vocoders for On-device Speech Synthesis
B Zhai, T Gao, F Xue, D Rothchild, B Wu, JE Gonzalez, K Keutzer
arXiv preprint arXiv:2001.05685, 2020
Cited by 33, 2020
The impact of observing strategy on cosmological constraints with LSST
M Lochner, D Scolnic, H Almoubayyed, T Anguita, H Awan, E Gawiser, ...
The Astrophysical Journal Supplement Series 259 (2), 58, 2022
Cited by 24, 2022
C5T5: Controllable Generation of Organic Molecules with Transformers
D Rothchild, A Tamkin, J Yu, U Misra, J Gonzalez
arXiv preprint arXiv:2108.10307, 2021
Cited by 24, 2021
Optimizing the LSST Observing Strategy for Dark Energy Science: DESC Recommendations for the Wide-Fast-Deep Survey
M Lochner, DM Scolnic, H Awan, N Regnault, P Gris, R Mandelbaum, ...
arXiv preprint arXiv:1812.00515, 2018
Cited by 10, 2018
ALTSched: Improved Scheduling for Time-domain Science with LSST
D Rothchild, C Stubbs, P Yoachim
Publications of the Astronomical Society of the Pacific 131 (1005), 115002, 2019
Cited by 5, 2019
Optimizing the LSST Observing Strategy for Dark Energy Science: DESC Recommendations for the Deep Drilling Fields and other Special Programs
DM Scolnic, M Lochner, P Gris, N Regnault, R Hložek, G Aldering, ...
arXiv preprint arXiv:1812.00516, 2018
Cited by 5, 2018
Copyright Implications of the Use of Code Repositories to Train a Machine Learning Model
J Rothchild, D Rothchild
Free Software Foundation, 2022
Cited by 3, 2022
lsst/rubin_sim: 0.12.1
P Yoachim, RL Jones, EH Neilsen, T Ribeiro, S Daniel, N Abrams, ...
Zenodo, 2022
Cited by 1, 2022
The LSST Dark Energy Science Collaboration Cadence Note
M Lochner, D Scolnic, H Almoubayyed, T Anguita, H Awan, E Gawiser, ...
LSST Survey Cadence Notes, 57, 2021
Cited by 1, 2021
Investigating the Behavior of Diffusion Models for Accelerating Electronic Structure Calculations
D Rothchild, AS Rosen, E Taw, C Robinson, JE Gonzalez, ...
arXiv preprint arXiv:2311.01491, 2023
2023
Accelerating Electronic Structure Calculations with Machine Learning
D Rothchild
University of California, Berkeley, 2023
2023
Using Automated Vehicle (AV) Technology to Smooth Traffic Flow and Reduce Greenhouse Gas Emissions
S Almatrudi, K Parvate, D Rothchild, U Vijay
2022
Deterministic Telescope Scheduling for Synoptic Surveys: An Alternative for LSST
D Rothchild, C Stubbs, P Yoachim
American Astronomical Society Meeting Abstracts #233, 456.08, 2019
2019