## Julie Nutini

Senior Scientist, Planet Labs
Rossland, BC, Canada (Remote)
julie.nutini@planet.com

PhD, Computer Science, University of British Columbia
MSc, Mathematics, University of British Columbia (Okanagan)
BSc, Mathematics, University of British Columbia (Okanagan)

I was born and raised in the small mountain town of Rossland, British Columbia, located in the West Kootenays. I graduated from the University of British Columbia in 2018 with a PhD in Computer Science, supervised by Mark Schmidt. My PhD thesis focused on optimization methods for large-scale structured problems.

I am currently a Senior Scientist (SAR Specialist) at Planet Labs, where I use machine learning with complementary sensor datasets (e.g., synthetic aperture radar (SAR), optical imagery, and auxiliary data) to improve optical fusion products. I have also developed numerical optimization methods for problems such as satellite model refinement and radiometric/geometric correction of optical remote sensing data.

Curriculum Vitae

J. Nutini, I. Laradji and M. Schmidt. Let's Make Block Coordinate Descent Converge Faster: Faster Greedy Rules, Message-Passing, Active-Set Complexity, and Superlinear Convergence, *JMLR*, 2022 [pdf] [slides] [poster] [code].

Y. Sun, H. Jeong, J. Nutini and M. Schmidt. "Are we there yet? Manifold identification of gradient related proximal methods", *AISTATS*, 2019 [pdf] [poster].

J. Nutini, M. Schmidt and W. Hare. "Active-set complexity" of proximal gradient: How long does it take to find the sparsity pattern?, *Optimization Letters*, 2018 [pdf] [poster].

I. Laradji, J. Nutini and M. Schmidt. Graphical Newton for Huge-Block Coordinate Descent on Sparse Graphs, *NeurIPS Optimization Workshop*, 2017 [pdf] [poster].

H. Karimi, J. Nutini and M. Schmidt. Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition, *ECML-PKDD*, 2016 [pdf] [slides] [poster].

J. Nutini, B. Sepehry, I. H. Laradji, M. Schmidt, H. Koepke and A. Virani. Convergence Rates for Greedy Kaczmarz Algorithms, and Faster Randomized Kaczmarz Rules Using the Orthogonality Graph, *UAI*, 2016 [pdf] [poster] [code].

^{*}K. Bigdeli, W. Hare, J. Nutini and S. Tesfamariam. Optimizing Damper Connectors for Adjacent Buildings, *Optimization and Engineering*, 17(1):47-75, 2016 [pdf].

J. Nutini, M. Schmidt, I. H. Laradji, M. Friedlander and H. Koepke. Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection, *ICML*, 2015 [pdf] [slides] [poster] [video talk].

^{*}W. Hare, J. Nutini and S. Tesfamariam. A survey of non-gradient optimization methods in structural engineering, *Advances in Engineering Software*, 59:19-28, 2013 [pdf].

^{*}W. Hare and J. Nutini. A derivative-free approximate gradient sampling algorithm for finite minimax problems, *Computational Optimization and Applications*, 56(1):1-38, 2013 [pdf] [slides].

^{*} authors listed in alphabetical order