Dissertation Defense
Computational Mathematics Approaches to Deep Neural Network Architecture Design
Krishnanunni C G
Ph.D. Candidate
Aerospace Engineering and Engineering Mechanics
The University of Texas at Austin
Wednesday, April 1, 2026
2:30 pm - 4:30 pm
POB 4.304
Designing appropriate neural network architectures remains a central open challenge in deep learning, with performance depending critically on depth, width, and connectivity. Current approaches rely largely on trial and error or computationally prohibitive search algorithms. This dissertation develops principled, mathematically rigorous methods for neural network architecture adaptation, drawing on ideas from topology optimization and finite element-inspired error estimation.
In the first part of this presentation, we address adaptive architecture design for regression and classification tasks, tackling three fundamental questions: when to add capacity during training (i.e., when to insert a new layer into the network), where to add it, and how to initialize it. A topological derivative framework quantifies the sensitivity of a shape functional to the insertion of a new layer, yielding both the optimal insertion location and a principled initialization strategy for the added layer. Complementing this, a dual weighted residual approach from finite element theory decomposes approximation error across layers to guide adaptive depth refinement, forging formal connections between neural network approximation theory and classical numerical analysis.
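As background for the talk, the topological derivative concept can be sketched as follows; the notation (loss functional J, perturbation scale ε, candidate insertion depth ℓ) is illustrative and is not necessarily that used in the dissertation:

% Generic topological-derivative expansion for layer insertion
% (illustrative notation, not the dissertation's own).
\[
  J(\mathcal{N}_{\varepsilon,\ell}) = J(\mathcal{N})
    + \varepsilon\, d_T J(\ell) + o(\varepsilon),
  \qquad \varepsilon \to 0^{+},
\]
\[
  \ell^{\star} = \arg\min_{\ell}\, d_T J(\ell),
\]
% i.e., the new layer is placed at the depth where the derivative
% predicts the largest first-order decrease of the loss.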
The second part of the presentation introduces the Linear Latent Network (LiLaN), a novel architecture designed for learning solution operators of stiff nonlinear differential equations with significant computational speedup over traditional solvers. LiLaN maps states into a latent space where the dynamics admit closed-form solutions, eliminating numerical integration at inference time. Universal approximation guarantees are established theoretically, and numerical experiments in chemical kinetics, plasma physics, and phase-field modeling demonstrate up to three orders of magnitude speedup over traditional solvers. Together, these contributions provide a suite of computational mathematics tools for adaptive neural network architecture design and accelerated scientific machine learning.
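One natural reading of "a latent space where the dynamics admit closed-form solutions" is a linear latent ODE, which the following sketch illustrates; the encoder E, decoder D, latent matrix Λ, and parameter vector μ are hypothetical placeholders, not the dissertation's construction:

% Schematic linear-latent ansatz (hypothetical notation).
\[
  z_0 = E(u_0, \mu), \qquad
  \dot{z}(t) = \Lambda(\mu)\, z(t)
  \;\Longrightarrow\;
  z(t) = e^{\Lambda(\mu)\, t} z_0, \qquad
  u(t) \approx D\bigl(z(t)\bigr),
\]
% so each query time costs one matrix exponential and a decoder
% evaluation, with no numerical time-stepping at inference.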
Contact Tan Bui-Thanh (tanbui@oden.utexas.edu)
Sign Up for Seminar Announcements
To sign up for our weekly seminar announcements, send an email to sympa@utlists.utexas.edu with the subject line: Subscribe ase-em-seminars.