David Duvenaud

On Leave

Faculty Member, Vector Institute

Associate Professor, Department of Statistical Sciences and Department of Computer Science, University of Toronto

Canada CIFAR Artificial Intelligence Chair

David Duvenaud is an Associate Professor in Computer Science and Statistics at the University of Toronto. He holds a Sloan Research Fellowship, a Canada Research Chair in Generative Models, and a Canada CIFAR AI Chair. His research focuses on deep learning and AI governance. He completed his postdoctoral research at Harvard University and his Ph.D. at the University of Cambridge. He is a Founding Member of the Vector Institute for Artificial Intelligence.

Research Interests

  • Deep Learning 
  • Generative Models
  • AI Governance
  • AI Alignment

Highlights

  • Holds a Tier 2 Canada Research Chair in Generative Models
  • Co-founder of Fable Therapeutics
  • Alfred P. Sloan Research Fellow in Computer Science

Publications

Getting to the Point: Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming

A. Paszke, D. Johnson, D. Duvenaud, D. Vytiniotis, A. Radul, M. Johnson, J. Ragan-Kelley, and D. Maclaurin

2021

Meta-Learning for Semi-Supervised Few-Shot Classification

M. Ren, E. Triantafillou, S. Ravi, J. Snell, K. Swersky, J. B. Tenenbaum, et al.

International Conference on Learning Representations (ICLR) 2018

Stochastic Hyperparameter Optimization through Hypernetworks

J. Lorraine and D. Duvenaud

2018

Isolating Sources of Disentanglement in Variational Autoencoders

T. Q. Chen, X. Li, R. B. Grosse, and D. K. Duvenaud

Advances in Neural Information Processing Systems (NeurIPS) 2018

Noisy Natural Gradient as Variational Inference

G. Zhang, S. Sun, D. Duvenaud, and R. Grosse

Proceedings of the 35th International Conference on Machine Learning (ICML) 2018

Backpropagation through the Void: Optimizing control variates for black-box gradient estimation

W. Grathwohl, D. Choi, Y. Wu, G. Roeder, and D. Duvenaud

International Conference on Learning Representations (ICLR) 2018

Automatic chemical design using a data-driven continuous representation of molecules

R. Gómez-Bombarelli, J. N. Wei, D. Duvenaud, J. M. Hernández-Lobato, B. Sánchez-Lengeling, D. Sheberla, et al.

ACS Central Science 2018

Sticking the landing: Simple, lower-variance gradient estimators for variational inference

G. Roeder, Y. Wu, and D. Duvenaud

Advances in Neural Information Processing Systems (NIPS) 2017

Reinterpreting Importance-Weighted Autoencoders

C. Cremer, Q. Morris, and D. Duvenaud

International Conference on Learning Representations (ICLR) Workshop Track 2017