Math
Research
My current research concerns graphons, which are limits of sequences of large dense graphs. Because they are continuous objects, graphons make graph problems amenable to the tools
of functional analysis, and in particular non-local analysis. For example, my paper on graphon Ginzburg--Landau functionals
with functional analysis, and in particular, non-local analysis. For example, my paper on graphon Ginzburg--Landau functionals
uses Young measures and Gamma-convergence, a notion of convergence from the calculus of variations.
My (forthcoming) paper on graphon reaction--diffusion equations uses techniques from numerical analysis and stochastic analysis.
CV
CV (updated Dec 2024)
Publications
Google scholar
Ginzburg--Landau functionals in the large-graph limit (in revision with minor corrections
at Journal of Pure and Applied Functional Analysis, 2024)
- Ginzburg--Landau (GL) theory, which was originally developed to model superconductors, has more recently been applied to
a variety of problems, including graph clustering, image processing, and min-cut/max-flow. These applications typically
use the graph GL functional, which is an approximation of the graph-cut functional. The GL functional is the sum of a Dirichlet energy
term, which penalizes the variability of a function, and a double-well potential, which penalizes deviations from the values +1 and -1.
This paper defines a graphon GL functional and shows that the minimizers of the graph GL functional converge to the
minimizers of the graphon GL functional.
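In symbols (a schematic form; the notation here is my own and not necessarily the paper's): for a graph with edge weights w_{ij} and a function u on the nodes, the graph GL functional typically looks like

```latex
E_\varepsilon(u) \;=\; \frac{1}{2}\sum_{i,j} w_{ij}\,(u_i - u_j)^2
\;+\; \frac{1}{\varepsilon}\sum_i \Phi(u_i),
\qquad \Phi(u) = (u^2 - 1)^2,
```

where the first sum is the graph Dirichlet energy and \Phi is the double-well potential with wells at \pm 1; the graphon functional replaces the sums with integrals against the graphon.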
A Particle Algorithm for Mean-Field Variational Inference (preprint, 2024)
- As stated below (in the description of "On Representations of Mean-Field Variational Inference"), variational inference (VI) is
a Bayesian statistical inference method that relies on a constrained optimization problem. In this paper, we propose a particle-based
algorithm that efficiently approximates the mean-field VI solution, and we give convergence guarantees.
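As a rough illustration only (this is a generic coordinate-wise Langevin particle scheme, not the algorithm from the paper, and the Gaussian target is a made-up example), one way a particle approximation of mean-field VI can look:

```python
import numpy as np

# Hypothetical target: Gaussian with potential V(x) = 0.5 * x^T A x
A = np.array([[2.0, 0.5], [0.5, 1.0]])  # symmetric positive definite

def grad_V(x):
    # Gradient of V for each row of x (A is symmetric)
    return x @ A

rng = np.random.default_rng(0)
N, d = 500, 2                      # number of particles, dimension
X = rng.standard_normal((N, d))    # particle ensemble
dt, steps = 0.05, 2000

for _ in range(steps):
    drift = np.zeros_like(X)
    for k in range(d):
        # Mean-field coupling: pair each particle's k-th coordinate with the
        # other coordinates resampled independently from the ensemble,
        # mimicking the product-measure constraint.
        Z = X[rng.integers(0, N, size=N)].copy()
        Z[:, k] = X[:, k]
        drift[:, k] = grad_V(Z)[:, k]
    # Euler--Maruyama step of the coordinate-wise Langevin dynamics
    X = X - dt * drift + np.sqrt(2 * dt) * rng.standard_normal((N, d))

# For this target, the best product-measure Gaussian has marginal variances
# 1 / A_kk = [0.5, 1.0]; the empirical variances should land roughly nearby.
print(X.var(axis=0))
```

The resampling step is what enforces independence across coordinates: each marginal sees only the averaged effect of the others, which is the defining feature of the mean-field constraint.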
On Representations of Mean-Field Variational Inference (preprint, 2022)
- Variational inference is an approach to Bayesian inference that is based on an optimization problem.
It is widely used, but VI algorithms have lacked theoretical guarantees. The idea of VI is to seek a
constrained minimizer of the distance to the Bayesian posterior,
where the constraint set is the set of product measures (probability measures under which the components are mutually independent).
Minimizations in probability space can in principle be approximated by a gradient flow with respect to a Wasserstein metric; in the case
of the unconstrained Bayesian problem, the Wasserstein gradient flow is a Fokker--Planck equation.
The problem we tackle is: how do we describe the constrained problem as a gradient flow in constrained space?
We express variational inference as an approximate Wasserstein gradient flow, and we discuss theoretical implications.
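To fix notation (standard background, not specific to the paper): if the posterior has density proportional to e^{-V}, the unconstrained Wasserstein gradient flow of the KL divergence is the Fokker--Planck equation

```latex
\partial_t \rho \;=\; \nabla \cdot \big( \rho \, \nabla V \big) \;+\; \Delta \rho,
```

whose stationary solution is \rho \propto e^{-V}; mean-field VI instead constrains \rho to product measures \rho = \rho_1 \otimes \cdots \otimes \rho_d.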
Unveiling Mode-Connectivity of the ELBO Landscape (Workshop paper, 2021)
- Mode-connectivity is a striking property that has been discovered in neural-net loss landscapes, which are high-dimensional and nonconvex.
It has been linked to the "no bad local minima" behavior observed in practice, and is related to overparametrization.
This workshop paper demonstrates and begins to explain the existence of mode-connectivity in the ELBO, which is the loss function of
variational inference. This paper began as my final project for a course (and it was nominated by the professor in a "Best Student Project" competition!).
The workshop paper is an abbreviated version of my final project.
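For reference, the standard definition of the ELBO (evidence lower bound), in common notation with data x, latent variable z, model p(x, z), and variational distribution q(z):

```latex
\mathrm{ELBO}(q)
\;=\; \mathbb{E}_{q}\big[\log p(x, z) - \log q(z)\big]
\;=\; \log p(x) \;-\; \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big),
```

so maximizing the ELBO is equivalent to minimizing the KL divergence to the posterior, and the negative ELBO plays the role of the loss.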
In-Progress Publications
Graphon Reaction--Diffusion Equations
- A graph reaction--diffusion (RD) equation is a system of differential equations defined on the nodes of a graph.
Consider a sequence of growing graphs that converges to a limiting graphon.
In this paper, we show that the solutions of the graph RD equations converge to the solution of a limiting
non-local RD equation, which we call a graphon RD equation. Furthermore, we show that a sequence of stochastic particle
processes (that consist of a random walk and a birth--death process) on the sequence of graphs converges to the solution
of the graphon RD equation. This project is joint with Qiang Du
and James Scott.
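Schematically (in notation I am assuming, not necessarily the paper's): on a graph with n nodes, weights w_{ij}, and reaction term f, the graph RD equation and its graphon limit with kernel W read

```latex
\frac{du_i}{dt} \;=\; \frac{1}{n}\sum_{j=1}^{n} w_{ij}\,(u_j - u_i) \;+\; f(u_i),
\qquad
\partial_t u(x,t) \;=\; \int_0^1 W(x,y)\,\big(u(y,t) - u(x,t)\big)\,dy \;+\; f\big(u(x,t)\big),
```

where the integral term is the non-local (graphon) diffusion replacing the graph Laplacian.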
Teaching
Adjunct Professor: Calculus II at The Cooper Union for the Advancement of Science and Art (Spring 2025)
Grader: Mathematics for data science (Spring 2020), Linear algebra (Fall 2021), Intro to numerical methods (Spring 2022), Linear algebra (Fall 2022), Numerical analysis (Fall 2023)
TA: Partial Differential Equations (Fall 2019)
Math tutor: UVA Athletics, UVA Mathematics (2016-2019)
Seminar
Applied Math student seminar
This semester (Spring 2024), the Applied Math student seminar meets on Thursdays, 1-2pm in the APAM conference room.
I co-founded the AM seminar in spring 2023 as a venue for fellow graduate
students to present recent work and progress, practice giving talks, and exchange feedback on talks.