Julie A. Nutini


PhD Student, Computer Science
University of British Columbia

 201 2366 Main Mall
 Vancouver, BC, V6T 1Z4
 Canada

jnutini@cs.ubc.ca

I was born and raised in the small ski town of Rossland, British Columbia, located in the West Kootenays. I graduated from the University of British Columbia (Okanagan Campus) with a BSc in General Mathematics (with honours) in 2010, and with an MSc in Mathematical Optimization in 2012, for which I received the Governor General's Gold Medal. My MSc thesis, supervised by Warren Hare, focused on derivative-free optimization methods for finite minimax problems. I am currently a PhD student supervised by Mark Schmidt at the University of British Columbia in Vancouver. I work on optimization and machine learning; my most recent project focuses on coordinate descent methods.

Curriculum Vitae


Publications:

*W. Hare and J. Nutini. "A derivative-free approximate gradient sampling algorithm for finite minimax problems", Computational Optimization and Applications, 56(1):1-38, 2013 [pdf] [slides].

*W. Hare, J. Nutini and S. Tesfamariam. "A survey of non-gradient optimization methods in structural engineering", Advances in Engineering Software, 59:19-28, 2013 [pdf].

*K. Bigdeli, W. Hare, J. Nutini and S. Tesfamariam. "Optimizing Damper Connectors for Adjacent Buildings", Optimization and Engineering, 17(1):47-75, 2016 [pdf].

J. Nutini, M. Schmidt, I. H. Laradji, M. Friedlander and H. Koepke. "Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection", ICML, 2015 [pdf] [slides] [poster] [talk].

J. Nutini, B. Sepehry, I. H. Laradji, M. Schmidt, H. Koepke and A. Virani. "Convergence Rates for Greedy Kaczmarz Algorithms, and Faster Randomized Kaczmarz Rules Using the Orthogonality Graph", 2016 [pdf] [poster] [code].

H. Karimi, J. Nutini and M. Schmidt. "Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Lojasiewicz Condition", ECML-PKDD, 2016 [pdf] [slides] [poster].

* authors listed in alphabetical order


UBC Machine Learning Reading Group talks:

  • Conditional Inference and Cutset Conditioning [slides].
  • Coordinate Descent and Ascent Methods [slides].
  • Principal and Independent Component Analysis [slides].
  • Feedforward Neural Nets and Backpropagation [slides].
  • Monte Carlo Methods [slides].

Miscellaneous Projects

  • Research Proficiency Exam (RPE) project — Putting the Curvature Back into Sparse Solvers [pdf] [slides].