I am a master’s student in the Department of Computer Science at the University of British Columbia, where I am advised by Mark Schmidt. My research interests are in optimization and approximate Bayesian inference.
May 24, 2019: New work on line searches for stochastic gradient descent (with convergence rates under interpolation) is on arXiv!
September 4, 2018: Our paper on low-rank Gaussian variational inference for Bayesian neural networks was accepted at NeurIPS 2018! The paper is available on arXiv.
January 21 - June 30: I joined Emtiyaz Khan as an intern at the RIKEN Center for Advanced Intelligence Project (AIP).
Painless Stochastic Gradient: Interpolation, Line-Search, and Convergence Rates. S. Vaswani, A. Mishkin, I. Laradji, M. Schmidt, G. Gidel, S. Lacoste-Julien. arXiv, 2019.
SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient. A. Mishkin, F. Kunstner, D. Nielsen, M. Schmidt, M. E. Khan. NeurIPS, 2018.
Web ValueCharts: Analyzing Individual and Group Preferences with Interactive, Web-based Visualizations. A. Mishkin. Review of Undergraduate Computer Science, 2018.