David Duvenaud

Faculty Member

Associate Professor, Department of Statistical Sciences and Department of Computer Science, Faculty of Arts & Science, University of Toronto

Canada CIFAR Artificial Intelligence Chair

Co-founder, Invenia

Canada Research Chair in Generative Models

David Duvenaud is an associate professor in computer science and statistics at the University of Toronto, and holds a Canada Research Chair in generative models. He is also a founding member of the Vector Institute. He was a postdoctoral fellow at Harvard University, where he worked on hyperparameter optimization, variational inference, deep learning, and automatic chemical design. He earned his Ph.D. at the University of Cambridge, studying Bayesian nonparametrics with Zoubin Ghahramani and Carl Rasmussen. David spent two summers on the machine vision team at Google Research, and also co-founded Invenia, an energy forecasting and trading company.

Research Interests

  • Approximate inference
  • Automatic model-building
  • Model-based optimization

Publications

Getting to the Point: Index Sets and Parallelism-Preserving Autodiff for Pointful Array Programming

Adam Paszke, Daniel Johnson, David Duvenaud, Dimitrios Vytiniotis, Alexey Radul, Matthew Johnson, Jonathan Ragan-Kelley, and Dougal Maclaurin

2021

Meta-Learning for Semi-Supervised Few-Shot Classification

M. Ren, E. Triantafillou, S. Ravi, J. Snell, K. Swersky, J. B. Tenenbaum, et al.

International Conference on Learning Representations (ICLR) 2018

Stochastic Hyperparameter Optimization through Hypernetworks

J. Lorraine and D. Duvenaud

2018

Isolating Sources of Disentanglement in Variational Autoencoders

T. Q. Chen, X. Li, R. B. Grosse, and D. K. Duvenaud

Advances in Neural Information Processing Systems (NeurIPS) 2018

Noisy Natural Gradient as Variational Inference

G. Zhang, S. Sun, D. Duvenaud, and R. Grosse

Proceedings of the 35th International Conference on Machine Learning (ICML) 2018

Backpropagation through the Void: Optimizing control variates for black-box gradient estimation

W. Grathwohl, D. Choi, Y. Wu, G. Roeder, and D. Duvenaud

International Conference on Learning Representations (ICLR) 2018

Automatic chemical design using a data-driven continuous representation of molecules

R. Gomez-Bombarelli, J. N. Wei, D. Duvenaud, J. M. Hernandez-Lobato, B. Sanchez-Lengeling, D. Sheberla, et al.

ACS Central Science 2018

Sticking the landing: Simple, lower-variance gradient estimators for variational inference

G. Roeder, Y. Wu, and D. Duvenaud

Advances in Neural Information Processing Systems (NIPS) 2017

Reinterpreting Importance-Weighted Autoencoders

C. Cremer, Q. Morris, and D. Duvenaud

International Conference on Learning Representations (ICLR) Workshop Track 2017