MSc Thesis Presentation - Yao Kuang

Name: Yao Kuang

Date: September 10, 2025

Time: 2 pm

Location: LSK 306

Supervisor’s name: Michael P. Friedlander

Thesis title: Communication-Efficient Algorithms for Decentralized Multi-Task Learning

Abstract
Distributed optimization requires nodes to coordinate, yet full synchronization scales poorly. When $n$ nodes collaborate through $m$ pairwise regularizers, standard methods demand $O(m)$ communications per iteration. We propose randomized local coordination: each node independently samples one regularizer uniformly at random and coordinates only with the nodes that share that term. This exploits partial separability, in which each regularizer $G_j$ depends only on a subset $S_j \subseteq \{1,\ldots,n\}$ of the nodes. For graph-guided regularizers, where $|S_j| = 2$, the expected communication cost is exactly 2 messages per iteration, regardless of network topology. Replacing the proximal map of the sum $\sum_j G_j$ with the proximal map of a single randomly selected regularizer $G_j$ preserves convergence while eliminating global coordination: the method requires $\tilde{O}(\epsilon^{-2})$ iterations for convex objectives with bounded subgradients and, under strong convexity, $O(\epsilon^{-1})$ iterations to an $\epsilon$-solution or $O(\log(1/\epsilon))$ iterations to a neighborhood of the solution. Experiments on synthetic and real-world datasets validate both the convergence rates and the communication savings.
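The sampled-regularizer step described above can be illustrated with a minimal sketch. This is not the thesis implementation: the quadratic local losses $f_i(x_i) = \tfrac12(x_i - a_i)^2$, the quadratic pairwise coupling $G_j(x_u, x_v) = \tfrac{\lambda}{2}(x_u - x_v)^2$, the step sizes, and the function names (`prox_pair`, `randomized_local_coordination`) are all illustrative choices. Each iteration performs a communication-free local gradient step at every node, then samples one edge uniformly and applies the proximal map of that single regularizer, so only the two nodes sharing the sampled edge exchange messages.

```python
import random

def prox_pair(u, v, lam, gamma):
    """Closed-form prox of G(u, v) = (lam/2) * (u - v)^2 with step gamma.

    Minimizing (gamma*lam/2)(a-b)^2 + (1/2)(a-u)^2 + (1/2)(b-v)^2 preserves
    the mean of (u, v) and shrinks their difference by 1/(1 + 2*lam*gamma).
    """
    mean = (u + v) / 2.0
    half_diff = (u - v) / (2.0 * (1.0 + 2.0 * lam * gamma))
    return mean + half_diff, mean - half_diff

def randomized_local_coordination(a, edges, lam=1.0, gamma=0.1,
                                  iters=2000, seed=0):
    """Sketch of randomized local coordination on a toy problem:
    minimize sum_i 0.5*(x_i - a_i)^2 + sum_j (lam/2)*(x_u - x_v)^2.

    Per iteration: local gradient steps (no communication), then the prox
    of ONE uniformly sampled pairwise regularizer -- 2 messages expected.
    """
    rng = random.Random(seed)
    m = len(edges)
    x = [0.0] * len(a)
    comms = 0
    for _ in range(iters):
        # Local gradient step on each node's private loss (no communication).
        x = [xi - gamma * (xi - ai) for xi, ai in zip(x, a)]
        # Sample one regularizer uniformly; only its two endpoints talk.
        u, v = edges[rng.randrange(m)]
        # Scaling lam by m keeps the sampled prox step unbiased on average
        # (a common correction; the thesis's exact scaling may differ).
        x[u], x[v] = prox_pair(x[u], x[v], lam * m, gamma)
        comms += 2  # one message in each direction
    return x, comms
```

On a two-node example (`a = [0.0, 1.0]`, one edge `(0, 1)`), the iterates converge to the minimizer $(1/3,\, 2/3)$ of the coupled objective while the communication count grows by exactly 2 per iteration, matching the expected cost stated in the abstract.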