
Department of Mathematics

Applied and Computational Mathematics Seminars

We invite speakers to present original research in Applied and Computational Mathematics.

2024 – 2025 Academic Year

Organized by: Changhui Tan (tan@math.sc.edu) & Siming He (siming@mailbox.sc.edu)

Unless otherwise noted, the seminar will be held on Fridays from 2:30pm to 3:30pm in LeConte 440.

This page will be updated as new seminar information becomes available. Check back often for the most up-to-date information!

When: November 15, 2024, from 2:30 – 3:30 p.m.

Where: LeConte 440

Speaker: Xiantao Li (Penn State University)

Abstract: Quantum computing has recently emerged as a potential tool for large-scale scientific computing. In sharp contrast to their classical counterparts, quantum computers use qubits that can exist in superposition, potentially offering exponential speedup for many computational problems. Current quantum devices are noisy and error-prone, and in the near term a hybrid approach is more appropriate. I will discuss this hybrid framework using three examples: quantum machine learning, quantum algorithms for density-functional theory, and quantum optimal control. In particular, this talk will outline how quantum algorithms can be interfaced with classical methods, along with the convergence properties and the overall complexity.

When: November 8, 2024, from 2:30 – 3:30 p.m.

Where: LeConte 440

Speaker: Yuming Zhang (Auburn University)

Abstract: In tumor growth models, two primary approaches are commonly used. The first, described by Porous Medium type equations, models the tumor cells as a distribution evolving over space. The second, based on Hele-Shaw type flows, focuses on the evolution of the domain occupied by the cells. These two models are connected through the incompressible limit. In this talk, I will discuss the uniform strict propagation of the free boundaries and their convergence in the incompressible limit. As an outcome, we provide an upper bound on the Hausdorff dimension of free boundaries and show that the limiting free boundary has finite \((d-1)\)-dimensional Hausdorff measure. This is joint work with Jiajun Tong.
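
As background (notation chosen here for illustration, not taken from the talk), the porous medium type description evolves the cell density \(\rho\) by
\[ \partial_t \rho = \nabla \cdot (\rho \nabla p), \qquad p = \frac{m}{m-1}\,\rho^{m-1}, \]
possibly with growth terms, and the Hele-Shaw description of the occupied domain arises formally in the incompressible (stiff) limit \(m \to \infty\).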

When: November 1, 2024, from 2:30 – 3:30 p.m.

Where: LeConte 440

Speaker: Yupei Huang (Duke University)

Abstract: Classification of the steady states for the 2D Euler equation is a classical topic in fluid mechanics. In this talk, we consider the rigidity of analytic steady states in bounded, simply connected domains. By studying an over-determined elliptic problem of Serrin type, we show that the stream functions of the steady states are either radial functions or solutions to semi-linear elliptic equations. This is joint work with Tarek Elgindi, Ayman Said, and Chunjing Xie.
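
For orientation (standard facts, not part of the abstract): writing the velocity as \(u = \nabla^\perp \psi\) with stream function \(\psi\) and vorticity \(\omega = \Delta\psi\), a steady state of the 2D Euler equations satisfies \(\nabla^\perp\psi \cdot \nabla\Delta\psi = 0\), so that, at least locally where \(\nabla\psi \neq 0\), the vorticity is a function of the stream function:
\[ \Delta\psi = F(\psi). \]
This is the semi-linear elliptic structure referred to in the abstract.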

When: October 25, 2024, from 2:30 – 3:30 p.m.

Where: LeConte 440

Speaker: Mitchel J. Colebank (University of South Carolina)

Abstract: Mathematical modeling of the cardiovascular system is now recognized as a tool for disease management and is prominent in the development of physiological digital twins. A common model is the one-dimensional (1D) pulse wave propagation model, which is a hyperbolic system of PDEs derived from the Navier-Stokes equations in cylindrical coordinates with only one spatial coordinate. These models then need to be personalized, requiring the solution of a limited-data, noisy inverse problem. In this talk, we will introduce the 1D system and its application in computational blood flow modeling. We will discuss how emulation techniques (e.g., polynomial chaos expansions and Gaussian processes) can help speed up computation for both forward and inverse problems. We will then see this speedup applied to determining parameter identifiability using a robust method called the profile likelihood, and show how emulation is necessary for near real-time predictions as part of the future of digital twin technologies.
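
For readers unfamiliar with the model, a commonly used form of the 1D pulse wave propagation system, written in the cross-sectional area \(A(x,t)\) and volumetric flow \(Q(x,t)\) (the precise friction term \(f\) and pressure-area closure \(p = p(A)\) vary between formulations and are not taken from the talk), is
\[ \partial_t A + \partial_x Q = 0, \qquad \partial_t Q + \partial_x\!\left(\frac{Q^2}{A}\right) + \frac{A}{\rho}\,\partial_x p = f. \]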

When: October 11, 2024, from 2:30 – 3:30 p.m.

Where: LeConte 440

Speaker: Xiaoqian Xu (Duke Kunshan University)

Abstract: In the study of incompressible fluids, one fundamental phenomenon that arises in a wide variety of applications is dissipation enhancement by so-called mixing flows. In this talk, I will give a brief introduction to the idea of mixing flows and the role they play in the field of advection-diffusion-reaction equations. I will also discuss examples of such flows.
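
The basic passive scalar model behind this discussion (standard notation, not taken from the abstract) is the advection-diffusion equation
\[ \partial_t \theta + u \cdot \nabla \theta = \kappa \Delta \theta, \qquad \nabla \cdot u = 0, \]
and dissipation enhancement refers to the mixing velocity field \(u\) driving \(\theta\) to decay much faster than diffusion alone would.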

When: August 30, 2024, from 3:40 – 4:30 p.m.

Where: LeConte 440

Speaker: Viktor Stein (TU Berlin)

Abstract: We give a comprehensive description of Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals \(F_\nu := \text{MMD}_K^2(\cdot, \nu)\) towards given target measures \(\nu\) on the real line, where we focus on the negative distance kernel \(K(x,y) := -|x-y|\). In one dimension, the Wasserstein-2 space can be isometrically embedded into the cone \(C(0,1) \subset L_2(0,1)\) of quantile functions leading to a characterization of Wasserstein gradient flows via the solution of an associated Cauchy problem on \(L_2(0,1)\). Based on the construction of an appropriate counterpart of \(F_\nu\) on \(L_2(0,1)\) and its subdifferential, we provide a solution of the Cauchy problem. For discrete target measures \(\nu\), this results in a piecewise linear solution formula. We prove invariance and smoothing properties of the flow on subsets of \(C(0,1)\). For certain \(F_\nu\)-flows this implies that initial point measures instantly become absolutely continuous, and stay so over time. Finally, we illustrate the behavior of the flow by various numerical examples using an implicit Euler scheme and demonstrate differences to the explicit Euler scheme, which is easier to compute, but comes with limited convergence guarantees. This is joint work with Richard Duong (TU Berlin), Robert Beinert (TU Berlin), Johannes Hertrich (UCL) and Gabriele Steidl (TU Berlin).
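
For reference, the squared maximum mean discrepancy with kernel \(K\) is commonly written as
\[ \mathrm{MMD}_K^2(\mu,\nu) = \iint K(x,y)\,d\mu(x)\,d\mu(y) - 2\iint K(x,y)\,d\mu(x)\,d\nu(y) + \iint K(x,y)\,d\nu(x)\,d\nu(y), \]
so that with the negative distance kernel \(K(x,y) = -|x-y|\) the functional \(F_\nu\) coincides (up to convention) with the energy distance to the target \(\nu\).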

Joint Seminar with the RTG Seminars on Data Science

Previous Years

Organized by: Changhui Tan (tan@math.sc.edu) & Siming He (siming@mailbox.sc.edu)

Unless otherwise noted, the seminar will be held on Fridays from 2:30pm to 3:30pm in LeConte 440.


When: April 19th 2024 from 2:30 p.m. to 3:30 p.m.

Where: LeConte 440

Speaker: Weinan Wang (University of Oklahoma)

Abstract: In this talk, I will discuss some recent well-posedness and stability results for three incompressible fluid equations. More precisely, I will first discuss a global well-posedness result for the 2D Boussinesq equations with fractional dissipation and the long-time behavior of solutions. For the Oldroyd-B model, we show that small smooth data lead to global and stable solutions. When the Navier-Stokes equations are coupled with a magnetic field in the magneto-hydrodynamic (MHD) system, solutions near a background magnetic field are shown to be global in time; the magnetic field stabilizes the fluid. In the Oldroyd-B and MHD examples, the systems governing the perturbations can be converted to damped wave equations, which reveal the smoothing and stabilizing effect. If time permits, I will discuss some open problems.
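
For context, one common form of the 2D Boussinesq system with fractional dissipation (conventions differ; this is not taken verbatim from the talk) reads
\[ \partial_t u + u\cdot\nabla u + \nu \Lambda^{\alpha} u + \nabla P = \theta e_2, \qquad \partial_t \theta + u\cdot\nabla\theta + \kappa \Lambda^{\beta}\theta = 0, \qquad \nabla\cdot u = 0, \]
where \(\Lambda = (-\Delta)^{1/2}\) and \(e_2\) is the vertical unit vector.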

When: April 12th 2024 from 2:30 p.m. to 3:30 p.m.

Where: LeConte 440

Speaker: Feng Bao (Florida State University)

Abstract: Generative machine learning models, including variational auto-encoders (VAEs), normalizing flows (NFs), generative adversarial networks (GANs), and diffusion models, have dramatically improved the quality and realism of generated content, whether images, text, or audio. In science and engineering, generative models can be used as powerful tools for probability density estimation or high-dimensional sampling, capabilities that are critical in uncertainty quantification (UQ), e.g., Bayesian inference for parameter estimation. Studies on generative models for image/audio synthesis focus on improving the quality of individual samples, which often makes the generative models complicated and difficult to train. UQ tasks, on the other hand, usually focus on accurate approximation of statistics of interest without regard to the quality of any individual sample, so direct application of existing generative models to UQ tasks may lead to inaccurate approximations or an unstable training process. To alleviate these challenges, we developed several new generative diffusion models for various UQ tasks, including diffusion-model-assisted supervised learning of generative models, a score-based nonlinear filter for recursive Bayesian inference, and a training-free ensemble score filter for tracking high-dimensional stochastic dynamical systems. We will demonstrate the effectiveness of these methods in various UQ tasks, including density estimation and data assimilation problems.
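
As general background on score-based diffusion models (not a description of the specific filters in the talk), samples are generated by pairing a forward noising SDE with its reverse-time counterpart driven by the score \(\nabla_x \log p_t\):
\[ dX_t = f(X_t,t)\,dt + g(t)\,dW_t, \qquad dX_t = \bigl[f(X_t,t) - g(t)^2\,\nabla_x \log p_t(X_t)\bigr]dt + g(t)\,d\bar{W}_t, \]
with the score typically learned from data or estimated from an ensemble.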

When: March 1st 2024 from 2:30 p.m. to 3:30 p.m.

Where: LeConte 440

Speaker: Ruiwen Shu (University of Georgia)

Abstract: I will discuss the behavior of interaction energy minimizers on bounded domains. When the interaction potential is more singular than Newtonian, the mass does not tend to concentrate on the boundary; when it is Newtonian or less singular, the mass necessarily concentrates on the boundary for purely repulsive potentials. We also draw a connection between bounded-domain minimizers and whole-space minimizers.
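
Here the interaction energy of a probability measure \(\mu\) with interaction potential \(W\) is, in standard notation,
\[ E[\mu] = \frac{1}{2}\iint W(x-y)\,d\mu(x)\,d\mu(y), \]
and the minimization is taken over probability measures supported in the given bounded domain.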

When: February 28th 2024 from 2:30 p.m. to 3:30 p.m.

Where: Virtual via Zoom

Speaker: David Pardo (University of the Basque Country, Spain)

Abstract: Download here [PDF]

Join Zoom Meeting
Meeting ID: 982 8541 4873
Passcode: 839056

When: December 8th 2023 from 2:30pm to 3:30pm

Where: Virtual via Zoom

Speaker: Yuan-Nan Young (New Jersey Institute of Technology)

Host: Paula Vasquez

Abstract: The Stoichiometric Model for the interaction of centrosomes with cortically anchored pulling motors, through their associated microtubules (MTs), has been applied to study key steps in cell division such as spindle positioning and elongation. In this work we extend the original Stoichiometric Model to incorporate (1) overlap of the cortical motors, and (2) the velocity dependence of the detachment rate of MTs from the cortical motors. We examine the effects of motor overlap and a velocity-dependent detachment rate on the centrosome dynamics, such as the radial oscillation around the geometric center of the cell, the nonlinear nature (supercritical and subcritical Hopf bifurcation) of such oscillation, and the nonlinear orbital motions previously found for a centrosome. We explore biologically feasible parameter regimes where these effects may lead to significantly different centrosome/nucleus dynamics. Furthermore, we use this extended Stoichiometric Model to study the migration of a nucleus being positioned by a centrosome. This is joint work with Justin Maramuthal, Reza Farhadifar, and Michael Shelley.

Click to Join Zoom 

Meeting ID: 942 9769 4178
Passcode: 488494

When:  December 1st 2023 from 3:40pm to 4:40pm

Where:  LeConte 440

Speaker: Yuehaw Khoo (University of Chicago)

Host: Wuchen Li  (Joint RTG Seminar)

Abstract: Tensor-network ansatz has long been employed to solve the high-dimensional Schrödinger equation, demonstrating linear complexity scaling with respect to dimensionality. Recently, this ansatz has found applications in various machine learning scenarios, including supervised learning and generative modeling, where the data originates from a random process. In this talk, we present a new perspective on randomized linear algebra, showcasing its usage in estimating a density as a tensor-network from i.i.d. samples of a distribution, without the curse of dimensionality, and without the use of optimization techniques. Moreover, we illustrate how this concept can combine the strengths of particle and tensor-network methods for solving high-dimensional PDEs, resulting in enhanced flexibility for both approaches.
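
For readers unfamiliar with the ansatz, a tensor-network (tensor-train/matrix product) representation of a \(d\)-variate density takes the schematic form
\[ p(x_1,\dots,x_d) \approx G_1(x_1)\,G_2(x_2)\cdots G_d(x_d), \]
where each \(G_k(x_k)\) is an \(r_{k-1}\times r_k\) matrix, so storage and evaluation scale linearly in \(d\) for bounded ranks \(r_k\). (Notation chosen here for illustration, not taken from the talk.)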

When: November 17th, 2023 from 2:30pm to 3:30pm

Where:  LeConte 440

Speaker:  Quyuan Lin (Clemson University)

Host: Changhui Tan

Abstract: Large scale dynamics of the ocean and the atmosphere are governed by the primitive equations (PE). In this presentation, I will first review the derivation of the PE and some well-known results for this model, including well-posedness of the viscous PE and ill-posedness of the inviscid PE. The focus will then shift to discussing singularity formation and the stability of singularities for the inviscid PE, as well as the effect of fast rotation (Coriolis force) on the lifespan of the analytic solutions. Finally, I will talk about a machine learning algorithm, the physics-informed neural networks (PINNs), for solving the viscous PE, and its rigorous error estimate.
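
A defining feature of the primitive equations (standard background, not taken from the abstract) is that the vertical momentum equation of the underlying Navier-Stokes/Boussinesq system is replaced by hydrostatic balance,
\[ \partial_z p = -\rho g, \]
while the horizontal momentum equations and the incompressibility condition \(\nabla_h \cdot v + \partial_z w = 0\) are retained.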

When: November 3rd, 2023 from 2:30pm to 3:30pm

Where:  LeConte 440

Speaker:  Xiantao Li (Penn State University)

Host: Yi Sun

Abstract: Quantum computing has recently emerged as a potential tool for large-scale scientific computing. In sharp contrast to their classical counterparts, quantum computers use qubits that can exist in superposition, potentially offering exponential speedup for many computational problems. Current quantum devices are noisy and error-prone, and in the near term a hybrid approach is more appropriate. I will discuss this hybrid framework using three examples: quantum machine learning, quantum algorithms for density-functional theory, and quantum optimal control. In particular, this talk will outline how quantum algorithms can be interfaced with classical methods, along with the convergence properties and the overall complexity.

When: October 27th, 2023 from 2:30pm to 3:30pm

Where:  LeConte 440

Speaker: Adrian Tudorascu (West Virginia University)

Host: Changhui Tan

Abstract: We study Zeldovich's Sticky-Particles system when the evolution is confined to arbitrary closed subsets of the real line. Only the sticky boundary condition leads to a rigorous formulation of the initial value problem, whose well-posedness is proved under the Oleinik and initial strong continuity of energy conditions. For solutions confined to compact sets a long-time asymptotic limit is shown to exist.
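
For background (standard notation, not taken from the abstract), the sticky-particle system is the pressureless Euler system
\[ \partial_t \rho + \partial_x(\rho u) = 0, \qquad \partial_t(\rho u) + \partial_x(\rho u^2) = 0, \]
and the Oleinik condition referred to above is a one-sided Lipschitz bound of the type \(\partial_x u(\cdot,t) \le C/t\).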

When: October 27th, 2023 from 3:40pm to 4:40pm

Where:  LeConte 440

Speaker: Jiajia Yu (Duke University)

Host: Wuchen Li (Joint RTG Seminar)

Abstract: Mean-field games study the Nash Equilibrium in a non-cooperative game with infinitely many agents. Most existing works study solving the Nash Equilibrium with given cost functions. However, it is not always straightforward to obtain these cost functions. On the contrary, it is often possible to observe the Nash Equilibrium in real-world scenarios. In this talk, I will discuss a bilevel optimization approach for solving inverse mean-field game problems, i.e., identifying the cost functions that drive the observed Nash Equilibrium. With the bilevel formulation, we retain the essential characteristics of convex objective and linear constraint in the forward problem. This formulation permits us to solve the problem using a gradient-based optimization algorithm with a nice convergence guarantee. We focus on inverse mean-field games with unknown obstacles and unknown metrics and establish the numerical stability of these two inverse problems. In addition, we prove and numerically verify the unique identifiability for the inverse problem with unknown obstacles. This is a joint work with Quan Xiao (RPI), Rongjie Lai (Purdue) and Tianyi Chen (RPI).
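
Schematically (notation chosen here for illustration, not taken from the talk), the inverse problem has a bilevel structure of the form
\[ \min_{\theta} \; \ell\bigl(\rho(\theta), \rho^{\mathrm{obs}}\bigr) \quad \text{subject to} \quad \rho(\theta) \text{ solving the forward mean-field game with cost data } \theta, \]
where \(\theta\) encodes the unknown obstacle or metric and \(\ell\) measures the mismatch with the observed equilibrium.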

When: September 29th, 2023 from 3:40pm to 4:40pm

Where: LeConte 440

Speaker: Qi Feng (Florida State University)

Host: Wuchen Li (Joint RTG Seminar)

Abstract: In this talk, I will discuss long-time dynamical behaviors of Langevin dynamics, including Langevin dynamics on Lie groups and mean-field underdamped Langevin dynamics. We provide unified Hessian matrix conditions for different drift and diffusion coefficients. This matrix condition is derived from the dissipation of a selected Lyapunov functional, namely the auxiliary Fisher information functional. We verify the proposed matrix conditions in various examples. I will also talk about the application in distribution sampling and optimization. This talk is based on several joint works with Erhan Bayraktar and Wuchen Li.
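
As a point of reference (classical forms, not the Lie-group or mean-field versions discussed in the talk), the overdamped and underdamped Langevin dynamics with potential \(U\) at unit temperature read
\[ dX_t = -\nabla U(X_t)\,dt + \sqrt{2}\,dW_t, \qquad \begin{cases} dX_t = V_t\,dt,\\ dV_t = -\bigl(\gamma V_t + \nabla U(X_t)\bigr)dt + \sqrt{2\gamma}\,dB_t, \end{cases} \]
and the Lyapunov/Fisher-information approach quantifies their convergence to the invariant measure.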

When: September 22nd, 2023 from 3:40pm-4:40pm

Where: LeConte 440

Speaker: Guosheng Fu (University of Notre Dame)

Host: Wuchen Li (Joint RTG Seminar)

Abstract: We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized optimal transport metric spaces. We first review some examples of gradient flows in generalized optimal transport spaces from the Onsager principle. We then use a one-step time relaxation optimization problem for time-implicit schemes, namely generalized Jordan-Kinderlehrer-Otto schemes. Their minimizing systems satisfy implicit-in-time schemes for initial value gradient flows with first-order time accuracy. We adopt the first-order optimization scheme ALG2 (Augmented Lagrangian method) and high-order finite element methods in spatial discretization to compute the one-step optimization problem. This allows us to derive the implicit-in-time update of initial value gradient flows iteratively. We remark that the iteration in ALG2 has a simple-to-implement point-wise update based on optimal transport and Onsager's activation functions. The proposed method is unconditionally stable for convex cases. Numerical examples are presented to demonstrate the effectiveness of the methods in two-dimensional PDEs, including Wasserstein gradient flows, the Fisher-Kolmogorov-Petrovskii-Piskunov equation, and two- and four-species reversible reaction-diffusion systems. This is joint work with Stanley Osher from UCLA and Wuchen Li from the University of South Carolina.
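
The classical Jordan-Kinderlehrer-Otto (JKO) step, which the generalized schemes above extend to other transport metrics, reads
\[ \rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \; \frac{1}{2\tau} W_2^2(\rho,\rho^{k}) + E(\rho), \]
where \(W_2\) is the Wasserstein-2 distance, \(\tau\) the time step, and \(E\) the driving energy; its minimizers give a first-order implicit-in-time approximation of the gradient flow.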

When: September 1st, 2023 from 2:30pm-3:30pm

Where: LeConte 440

Speaker: Tianyi Lin (Massachusetts Institute of Technology)

Host: Wuchen Li (Joint RTG Seminar)

Abstract: Reliable and multi-agent machine learning has seen tremendous achievements in recent years; yet, the translation from minimization models to min-max optimization models and/or variational inequality models – two of the basic formulations for reliable and multi-agent machine learning – is not straightforward. In fact, finding an optimal solution of either nonconvex-nonconcave min-max optimization models or nonmonotone variational inequality models is computationally intractable in general. Fortunately, there exist special structures in many application problems, allowing us to define reasonable optimality criteria and develop simple and provably efficient algorithmic schemes. In this talk, I will present results on structure-driven algorithm design in reliable and multi-agent machine learning. More specifically, I will explain why nonconvex-concave min-max formulations make sense for reliable machine learning and show how to analyze the simple and widely used two-timescale gradient descent ascent by exploiting such special structure. I will also show how a simple and intuitive adaptive scheme leads to a class of optimal second-order variational inequality methods. Finally, I will discuss two future research directions for reliable and multi-agent machine learning with potential for significant practical impact: reliable multi-agent learning and reliable topic modeling.
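
The min-max models mentioned above take the generic form (standard notation, not specific to the talk)
\[ \min_{x\in\mathcal{X}} \; \max_{y\in\mathcal{Y}} \; f(x,y), \]
where in the nonconvex-concave setting \(f(\cdot,y)\) may be nonconvex while \(f(x,\cdot)\) is concave.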

Previous seminar information can be found on Dr. Changhui Tan's website

