I'm a PhD candidate in machine learning at UBC Computer Science, advised by Professors Nando de Freitas (now at Oxford and Google DeepMind) and Alex Bouchard-Côté of the Department of Statistics. I work on Bayesian optimization (with the unfortunate acronym BO) and statistical modelling for global optimization. I'm very interested in making BO a robust, practical tool that people can use with confidence in both industry and academia. Check out pybo!
Previously, I completed an MSc in applied mathematics at Simon Fraser University, where I worked with Prof. Steve Ruuth on solving partial differential equations on general surfaces. Before that, I earned a BSc in honours physics at McGill.

Recent news

  • I’m on the job market! Please feel free to contact me by email or LinkedIn with any interesting opportunities.

  • 25 Jan 2016: I gave a talk at the LCI forum. Here are the slides if you missed it or if you want to follow up on any references.

  • Going to Spain for AISTATS 2016! I’ll present my latest work on unbounded Bayesian optimization via regularization. Here’s a preview.

  • The 2015 edition of the NIPS Workshop on Bayesian Optimization was a huge success! Thanks to my co-organizers, particularly Roberto Calandra, for doing much of the legwork with me. Thanks also to the program committee, invited speakers, and of course the attendees for making the workshop and panel discussion so interesting! Finally, thank you to Microsoft Research for their generous sponsorship.

  • As co-organizer of the NIPS 2015 BayesOpt workshop, I also gave a talk introducing Bayesian optimization, focusing on challenges in scalability and flexibility.



Software

My collaborator Matt W. Hoffman and I have recently released pybo, our Python package for Bayesian optimization! Fork us on GitHub! All feedback, suggestions, and feature requests are welcome!
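For the curious, the core loop that packages like pybo implement can be sketched in a few lines of numpy. This is an illustrative toy, not pybo's actual API: a zero-mean GP with a squared-exponential kernel, a lower-confidence-bound acquisition, and a dense-grid inner optimizer on a 1-d problem. All names and hyperparameters here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Toy objective to minimize; the true minimizer is x = 0.3."""
    return (x - 0.3) ** 2

def rbf(a, b, ls=0.15):
    """Squared-exponential kernel on 1-d inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean and standard deviation of a zero-mean GP at Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    Ks = rbf(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)       # k(x, x) = 1 for this kernel
    return mu, np.sqrt(np.clip(var, 1e-12, None))

# The BO loop: condition the GP on all evaluations so far, minimize a
# lower-confidence-bound acquisition over a dense grid, evaluate f at
# the chosen point, and repeat.
grid = np.linspace(0.0, 1.0, 200)
X = rng.uniform(0.0, 1.0, 3)                 # a few random initial points
y = f(X)
for _ in range(15):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmin(mu - 2.0 * sd)]  # LCB acquisition, kappa = 2
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

x_best = X[np.argmin(y)]
```

Real implementations replace the grid search with a proper inner optimizer, marginalize the kernel hyperparameters, and offer better-behaved acquisitions such as expected improvement, but the fit-acquire-evaluate cycle is the same.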


A quick-and-dirty implementation of the Deep Network for Global Optimization (DNGO) from the nice paper by Snoek et al. (ICML 2015). Have fun playing around with it!
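DNGO's key idea is to replace the GP surrogate with Bayesian linear regression over basis functions learned by a deep network, which scales linearly rather than cubically in the number of observations. Here is a minimal numpy sketch of that regression step, using fixed random tanh features as a stand-in for the trained network's last hidden layer; the names and hyperparameters are illustrative, not the package's API.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the network's last hidden layer: fixed random tanh
# features. (DNGO trains a deep net and uses its learned final-layer
# basis; random features are enough to show the regression step.)
W = rng.normal(scale=3.0, size=(1, 50))
b = rng.uniform(-np.pi, np.pi, size=50)

def phi(x):
    """Map 1-d inputs to a 50-dimensional feature vector."""
    return np.tanh(x[:, None] @ W + b)

def blr_posterior(X, y, Xs, alpha=1.0, beta=100.0):
    """Bayesian linear regression over phi: weight-prior precision
    alpha, observation-noise precision beta."""
    P, Ps = phi(X), phi(Xs)
    A = alpha * np.eye(P.shape[1]) + beta * P.T @ P   # posterior precision
    m = beta * np.linalg.solve(A, P.T @ y)            # posterior mean weights
    mu = Ps @ m
    var = 1.0 / beta + np.sum(Ps * np.linalg.solve(A, Ps.T).T, axis=1)
    return mu, np.sqrt(var)

# Fit a toy curve and predict with uncertainty.
X = np.linspace(0.0, 1.0, 20)
y = np.sin(3.0 * X)
mu, sd = blr_posterior(X, y, X)
```

Because the expensive linear algebra happens in the (fixed) feature dimension rather than the number of data points, the posterior stays cheap even with tens of thousands of evaluations.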


For our work on portfolio Bayesian optimization I’ve had to experiment with Spearmint, so I wrote spex, a small Python package that handles much of the bookkeeping involved in running repeated experiments.


Publications

  1. Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., & de Freitas, N. (2016). Taking the Human Out of the Loop: A Review of Bayesian Optimization. Proceedings of the IEEE, 104, 1–28. (link)
  2. Shahriari, B., Bouchard-Côté, A., & de Freitas, N. (2016). Unbounded Bayesian Optimization via Regularization. In AISTATS. (link) (poster)
  3. Hoffman, M. W., & Shahriari, B. (2014). Modular mechanisms for Bayesian optimization. In NIPS workshop on Bayesian optimization. (pdf) (code)
  4. Shahriari, B., Wang, Z., Hoffman, M. W., Bouchard-Côté, A., & de Freitas, N. (2014). An entropy search portfolio for Bayesian optimization. In NIPS workshop on Bayesian optimization. (link) (poster)
  5. Hoffman, M. W., Shahriari, B., & de Freitas, N. (2014). On correlation and budget constraints in model-based bandit optimization with application to automatic machine learning. In AISTATS (pp. 365–374). (pdf) (poster) (talk)
  6. Shahriari, B. (2010). The modified Cahn–Hilliard equation on general surfaces. MSc thesis, Simon Fraser University. (link)