
Projects

Here are a few ongoing research projects:

Randomized low-rank matrix approximation compresses a large matrix into a low-rank factored form while preserving the data as accurately as possible. This compression can accelerate machine learning algorithms for prediction and clustering, making it a vital tool for modern data science.

With Ethan Epperly and Joel Tropp, I am developing faster, more accurate algorithms for randomized low-rank matrix approximation, including "randomized block Krylov iteration" for general matrices and "randomly pivoted Cholesky" for positive semidefinite kernel matrices.


For more details, see:

  • Randomized algorithms for low-rank matrix approximation: Design, analysis, and applications [arXiv]
  • Randomly pivoted Cholesky: Practical approximation of a kernel matrix with few entry evaluations [arXiv]
  • 2023 seminar talk on YouTube

Clustering error
Our algorithm, randomly pivoted Cholesky (RPCholesky), leads to near-perfect clustering of chemistry data, while other algorithms converge slowly. See [arXiv] for details.
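In outline, randomly pivoted Cholesky builds a rank-k factorization one column at a time, sampling each pivot in proportion to the diagonal of the current residual. Below is a minimal NumPy sketch of this basic idea; the function name and interface are invented for illustration, and this is not the reference implementation.

```python
import numpy as np

def rp_cholesky(A, k, seed=None):
    """Toy randomly pivoted Cholesky: rank-k approximation of a
    positive semidefinite matrix A, reading only k columns of A.

    Returns F with A ≈ F @ F.T."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    F = np.zeros((n, k))
    d = np.diag(A).astype(float)          # diagonal of the residual A - F F^T
    for i in range(k):
        # sample the next pivot with probability proportional to the residual diagonal
        s = rng.choice(n, p=d / d.sum())
        # residual of column s, scaled so the approximation interpolates the pivot
        g = A[:, s] - F[:, :i] @ F[s, :i]
        F[:, i] = g / np.sqrt(g[s])
        d = np.maximum(d - F[:, i] ** 2, 0.0)  # update residual diagonal
    return F
```

When k matches the rank of A, the factorization is exact (up to floating-point error); for smaller k, the pivot sampling concentrates effort on the columns that still carry the most residual mass.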

Rare events can be highly impactful. Yet estimating the probability p of a rare event by direct numerical simulation requires a very large sample size (>100/p); for example, an event with probability p = 10⁻⁴ requires more than a million simulations. Since generating such a large sample can be prohibitively expensive, are there more practical methods for calculating rare event probabilities?

To help answer this question, I have investigated "splitting and killing" algorithms for rare event probability estimation. These algorithms "split" selected trajectories to promote progress toward a rare event and randomly "kill" other trajectories to control the computational cost.
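As a toy illustration of the splitting-and-killing idea, the sketch below estimates the probability that a Gaussian random walk crosses a sequence of levels. This is a hypothetical fixed-level scheme written for this page, not the algorithm used in the Mercury study; all names and parameters are invented.

```python
import numpy as np

def splitting_estimate(levels, n_walkers, steps_per_stage, seed=None):
    """Toy splitting-and-killing estimator: probability that a Gaussian
    random walk is above levels[j] after (j+1)*steps_per_stage steps.

    Walkers below the current level are killed; survivors are split
    (resampled with replacement) back up to n_walkers. The estimate is
    the product of the per-stage survival fractions."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_walkers)
    p_hat = 1.0
    for level in levels:
        for _ in range(steps_per_stage):
            x = x + rng.standard_normal(n_walkers)
        survivors = x[x >= level]
        if survivors.size == 0:
            return 0.0
        p_hat *= survivors.size / n_walkers
        # split: restart the full ensemble from the surviving trajectories
        x = rng.choice(survivors, size=n_walkers, replace=True)
    return p_hat
```

With a single level this reduces to plain Monte Carlo; with several levels, each stage only has to resolve a moderately unlikely step, so the walkers can reach events far too rare to sample directly.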

With Dorian Abbot, Sam Hadden, and Jonathan Weare, I recently applied splitting and killing to evaluate the probability that Mercury will become unstable and collide with another celestial body over the next 2.2 billion years. We calculated the probability to be ~10⁻⁴ and obtained a speed-up of nearly 100x over direct numerical simulation.


For more details, see:

QDMC
We applied splitting and killing every 0.2 Gyr, starting at 1.4 Gyr and ending at a target time of 2.4 Gyr. The vertical axis is the 30-Myr running average of Mercury's eccentricity. The x's signify a close encounter between Mercury and Venus.

The ground state and the first few excited states determine the fundamental properties of a quantum system at low temperatures. However, as the system size increases, it becomes exponentially more difficult to calculate eigenstates with traditional methods. To address this curse of dimensionality, I developed two modern Monte Carlo methods.

  1. With Michael Lindsey, I introduced the Rayleigh-Gauss-Newton (RGN) method, which uses Monte Carlo sampling to efficiently optimize a neural network model for the ground-state wavefunction.
  2. With Timothy Berkelbach, Samuel Greene, and Jonathan Weare, I developed Fast Randomized Iteration (FRI), which uses sampling to calculate the dominant eigenvalues and eigenvectors of a large matrix.

My collaborators and I have applied RGN to spin systems with up to 400 spins (hence 2⁴⁰⁰ possible spin configurations) and have applied FRI to molecules as large as oxo-Mn(salen) (which has 28 interacting electrons).
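The compression idea behind FRI can be illustrated with a toy power iteration that randomly sparsifies its iterate before each matrix multiply. The sketch assumes a matrix with nonnegative entries, so the dominant eigenvalue is the Perron eigenvalue; it is an invented illustration of the sampling idea, not the FRI production code.

```python
import numpy as np

def fri_dominant_eig(A, m, n_iter=300, burn=100, seed=None):
    """Toy fast randomized iteration for a matrix A with nonnegative
    entries: power iteration in which the iterate is randomly compressed
    to roughly m nonzero entries before each matrix multiply."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    v = np.ones(n) / n
    estimates = []
    for _ in range(n_iter):
        # keep entry i with probability min(1, m|v_i|/||v||_1), reweighting
        # survivors so the compressed vector is unbiased
        p = np.minimum(1.0, m * np.abs(v) / np.abs(v).sum())
        keep = rng.random(n) < p
        v_c = np.zeros(n)
        v_c[keep] = v[keep] / p[keep]
        w = A @ v_c
        estimates.append(w.sum() / v_c.sum())  # eigenvalue estimate
        v = w / w.sum()
    return float(np.mean(estimates[burn:]))
```

Taking m of order n recovers ordinary power iteration, while m much smaller than n keeps each iterate sparse, which is what makes the approach viable when the eigenvector is too large to store densely.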


For more details, see:

RGN Method
Our new RGN optimization method for ground state wavefunctions leads to faster convergence and lower energy errors compared to the conventional method of natural gradient descent.