Bypass and Beyond: Extension–Contraction Strategies for Escaping Training Stagnation and Achieving Lossless Pruning

Abstract

What can be done when gradient-based training slows down near saddle points or suboptimal local minima? In this talk, I introduce Bypass, a principled method that actively guides optimization away from stationary regions by temporarily extending the model space, exploring new descent directions, and contracting back to the original architecture while preserving the learned function. This extension–contraction framework is algebraically grounded, easy to implement, and remarkably effective in improving both convergence and generalization. Building on the same algebraic foundation, I present Catalyst, a novel regularization technique for structured pruning. Catalyst identifies the geometry of pruning-invariant sets and extends the parameter space with a geometry-aware regularizer that enables lossless pruning with clear bifurcation dynamics. It offers a theoretically sound and empirically robust alternative to conventional magnitude-based pruning methods. Together, Bypass and Catalyst demonstrate how algebraic insights can lead to practical improvements in both training and compression. This talk will be of interest to researchers and practitioners working on optimization, model efficiency, and the geometry of deep learning.
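The abstract describes the extension–contraction idea only at a high level. As a rough, purely illustrative sketch (not the speaker's actual Bypass algorithm), the snippet below shows one generic way a single linear layer could be extended into a wider factorized form without changing the function it computes, and then contracted back to the original shape; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Original layer: y = W @ x
m, n, extra = 4, 3, 2          # output dim, input dim, added inner width
W = rng.normal(size=(m, n))

# --- Extension: replace W with a product W2 @ W1 of larger inner width ---
# W1 stacks the identity on top of zeros and W2 appends arbitrary extra
# columns, so W2 @ W1 == W and the layer's function is unchanged at this point.
W1 = np.vstack([np.eye(n), np.zeros((extra, n))])      # (n + extra) x n
W2 = np.hstack([W, rng.normal(size=(m, extra))])       # m x (n + extra)
assert np.allclose(W2 @ W1, W)

# ... gradient steps would now update W1 and W2 in the extended space,
# opening descent directions unavailable to the original parameterization ...

# --- Contraction: collapse the product back to a single m x n matrix ---
W_contracted = W2 @ W1
x = rng.normal(size=n)
print(np.allclose(W_contracted @ x, W2 @ (W1 @ x)))    # same function, original shape
```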

Date
Nov 5, 2025 2:00 PM — 3:00 PM
Location
Room 7323, Korea Institute of Advanced Study
85 Hoegi-ro, Dongdaemun-gu, Seoul, 02455

We are excited about this talk! It is the first time an AIML@K student is giving a talk at KIAS.

Jaeheun Jung
Ph.D. Candidate

Inventing AI methods using mathematics