Daniel Roy
  • Associate Professor, Department of Statistical Sciences, Faculty of Arts & Science, University of Toronto
  • Associate Professor, Department of Computer and Mathematical Sciences, University of Toronto Scarborough
  • Faculty Member, Vector Institute
  • Canada CIFAR Artificial Intelligence Chair

    Website | Google Scholar

Research Interests

  • Foundations of machine learning
  • Statistical learning theory
  • Neural networks
  • Probabilistic programming
  • Bayesian-frequentist interface
  • PAC-Bayes
  • Computability

Biography

Daniel’s research blends computer science, statistics, and probability theory. He studies “probabilistic programming” and develops computational perspectives on fundamental ideas in probability theory and statistics. Daniel is particularly interested in representation theorems that connect computability, complexity, and probabilistic structures; stochastic processes, the use of recursion to define stochastic processes, and applications to nonparametric Bayesian statistics; and the complexity of probabilistic and statistical inference, especially in the context of probabilistic programming. Ultimately, Daniel is motivated by the long-term goal of making lasting contributions to our understanding of complex adaptive systems and especially Artificial Intelligence.

Highlights

  • Ontario Early Researcher Award
  • Google Faculty Research Award
  • Connaught New Researcher Award
  • Newton International Fellowship, Royal Academy of Engineering
  • Research Fellowship, Emmanuel College
  • MIT George M. Sprowls Doctoral Dissertation Award
  • Siebel Scholarship

Selected Papers

  • Data-dependent PAC-Bayes priors via differential privacy. Gintare Karolina Dziugaite and Daniel M. Roy. https://arxiv.org/abs/1802.09583
  • On the computability of graphons (with Nathanael L. Ackerman, Jeremy Avigad, Cameron E. Freer, and Jason M. Rute). https://arxiv.org/abs/1801.10387
  • Entropy-SGD optimizes the prior of a PAC-Bayes bound: Generalization properties of Entropy-SGD and data-dependent priors. Gintare Karolina Dziugaite and Daniel M. Roy. https://arxiv.org/abs/1712.09376
  • An estimator for the tail-index of graphex processes. Zacharie Naulet, Ekansh Sharma, Victor Veitch, and Daniel M. Roy. https://arxiv.org/abs/1712.01745
  • On Extended Admissible Procedures and their Nonstandard Bayes Risk. Haosui Duanmu and Daniel M. Roy. https://arxiv.org/abs/1612.09305
  • Sampling and Estimation for (Sparse) Exchangeable Graphs. Victor Veitch and Daniel M. Roy. https://arxiv.org/abs/1611.00843
  • The Class of Random Graphs Arising from Exchangeable Random Measures. Victor Veitch and Daniel M. Roy. https://arxiv.org/abs/1512.03099
  • Sequential Monte Carlo as Approximate Sampling: bounds, adaptive resampling via ∞-ESS, and an application to Particle Gibbs (with Jonathan Huggins). To appear in Bernoulli. http://arxiv.org/abs/1503.00966
  • Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data. Gintare Karolina Dziugaite and Daniel M. Roy. In Proc. Uncertainty in Artificial Intelligence (UAI), 2017. https://arxiv.org/abs/1703.11008
  • On computability and disintegration (with Nate Ackerman and Cameron Freer). In Mathematical Structures in Computer Science. https://arxiv.org/abs/1509.02992