A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
About me
This is a page not in the main menu.
Published:
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Published in Journal of Computational and Applied Mathematics, 2022
Analysis of eigenvalue methods for multivariate numerical rootfinding.
Published in 37th Annual Conference on Learning Theory (COLT), 2024
Establishes a separation in the representation cost and sample complexity needed to approximate functions with two vs. three layer neural networks.
Published in arXiv Preprint, 2025
Analyzes how deep linear neural networks trained via gradient descent with weight decay automatically adapt to structure in data from inverse problems.
Published in SIAM Journal on Mathematics of Data Science, 2025
Connects the representation cost of neural networks with 1 ReLU layer and many linear layers to the spectrum of the expected gradient outer product matrix (EGOP), showing that this architecture is biased towards single- and multi-index models.
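The EGOP is easy to see in the simplest case. For a single-index model f(x) = g(w·x), every gradient of f is parallel to the index direction w, so the EGOP E[∇f(x)∇f(x)ᵀ] is exactly rank one and its top eigenvector recovers w. A minimal numpy sketch (the choice of g = tanh and all variable names are mine, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 6
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

# Single-index model f(x) = tanh(w·x); its gradient (1 - tanh^2(w·x)) w
# is always parallel to the index direction w.
X = rng.standard_normal((5000, d))
grads = (1 - np.tanh(X @ w) ** 2)[:, None] * w

# Monte Carlo estimate of the EGOP, E[grad f(x) grad f(x)^T]
egop = grads.T @ grads / len(X)

# eigh returns eigenvalues in ascending order; the top eigenvector
# (last column) spans the index direction w, up to sign.
eigvals, eigvecs = np.linalg.eigh(egop)
```

A multi-index model f(x) = g(Wx) with W of rank k gives an EGOP of rank at most k in the same way, which is the structure the spectrum-based bias in the paper refers to.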
Published in PLOS Computational Biology, 2025
Highlights principles of effective strategies for trainee-led computational biology educational outreach.
Published:
A fundamental question in the theory of neural networks is the role of depth. Empirically, deeper networks tend to perform better than shallow ones, but the reason for this is not well understood. In this talk I will discuss the role of depth in the simplified case where most of the layers have a linear activation. Specifically, the regularization associated with training a neural network with many linear layers followed by a single ReLU layer using weight decay is equivalent to a function-space penalty that encourages the network to select a low-rank function, i.e. one with a small active subspace.
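The depth-2 linear case gives a concrete way to see how weight decay on factors turns into a function-space penalty: among all factorizations M = W2 W1, the minimum of ||W1||_F² + ||W2||_F² equals twice the nuclear norm of M, which favors low-rank maps. This classical identity (the talk concerns deeper stacks, where the induced penalty promotes low rank even more strongly) can be checked numerically; the balanced SVD factorization attains the minimum. Variable names below are mine:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 4))

# Balanced factorization from the SVD: M = (U sqrt(S)) (sqrt(S) V^T)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
W2 = U * np.sqrt(s)            # scale the columns of U
W1 = np.sqrt(s)[:, None] * Vt  # scale the rows of V^T

# Weight-decay cost of the factors; equals 2 * (nuclear norm of M),
# i.e. twice the sum of the singular values of M.
cost = np.linalg.norm(W1, "fro") ** 2 + np.linalg.norm(W2, "fro") ** 2
```

Minimizing a nuclear norm, rather than a Frobenius norm, is what biases the trained network toward maps with few active directions.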
Published:
Abstract: Machine learning methods are increasingly used to solve inverse problems, wherein a signal must be estimated from few measurements generated via a known acquisition procedure. While approaches based on neural networks perform well empirically, they have limited theoretical guarantees. Specifically, it is unclear whether neural networks can reliably take advantage of low-dimensional structure shared by signals of interest — thus facilitating recovery in settings where the signal dimension far exceeds the number of available measurements. In this talk, I will present a positive resolution to this question for the special case of underdetermined linear inverse problems. I will show that, when trained with standard techniques and without explicit guidance, deep linear neural networks automatically adapt to underlying low-dimensional structure in the data, resulting in improved robustness against noise. These results shed light on how neural networks generalize well in practice by naturally capturing hidden patterns in data.
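The phenomenon described in the abstract can be reproduced in a small simulation. The setup below is my own toy version, not taken from the talk: signals live in a 3-dimensional subspace of R^20 and are observed through a noisy 10-dimensional linear measurement; a two-layer linear network trained by plain gradient descent with weight decay, with no explicit rank constraint, ends up with an approximately rank-3 end-to-end map.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, m, n = 20, 3, 10, 500  # signal dim, subspace dim, measurements, samples

B = np.linalg.qr(rng.standard_normal((d, k)))[0]  # orthonormal basis of signal subspace
A = rng.standard_normal((m, d)) / np.sqrt(d)      # known measurement operator

X = B @ rng.standard_normal((k, n))               # signals x = B z
Y = A @ X + 0.1 * rng.standard_normal((m, n))     # noisy measurements y = A x + noise

# Two-layer linear network x_hat = W2 W1 y, trained with weight decay
W1 = 0.05 * rng.standard_normal((m, m))
W2 = 0.05 * rng.standard_normal((d, m))
eta, lam = 0.1, 0.05
for _ in range(3000):
    R = (W2 @ W1 @ Y - X) / n          # scaled residual
    gW2 = R @ Y.T @ W1.T + lam * W2    # gradient of loss + weight decay
    gW1 = W2.T @ R @ Y.T + lam * W1
    W1 -= eta * gW1
    W2 -= eta * gW2

# Singular values of the learned end-to-end map: the top k dominate,
# so the network has adapted to the k-dimensional signal structure.
s = np.linalg.svd(W2 @ W1, compute_uv=False)
```

The sharp drop after the third singular value is the "automatic adaptation" in the abstract: weight decay suppresses the directions of y that carry only noise, even though nothing in the training procedure mentions the subspace.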
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Workshop, University 1, Department, 2015