Sums of dilates

Speaker: 

Jeck Lim

Institution: 

Caltech

Time: 

Wednesday, October 30, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

For any subset $A$ of a commutative ring $R$ (or, more generally, an $R$-module $M$) and any elements $\lambda_1, \dots, \lambda_k$ of $R$, let

\[\lambda_1 \cdot A + \cdots + \lambda_k \cdot A = \{\lambda_1 a_1 + \cdots + \lambda_k a_k : a_1, \dots, a_k \in A\}.\]

Such sums of dilates have attracted considerable attention in recent years, with the basic problem asking for an estimate on the minimum size of $|\lambda_1 \cdot A + \cdots + \lambda_k \cdot A|$ given $|A|$. In this talk, I will discuss various generalizations and settings of this problem, and share recent progress. This is based on joint work with David Conlon.
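To make the definition concrete, here is a brute-force computation over the integers (the function name `sum_of_dilates` is illustrative, not from the talk). For $A = \{0,1,2,3\}$ and coefficients $(1, 2)$, the sum of dilates $A + 2\cdot A$ fills out $\{0, 1, \dots, 9\}$, matching the well-known lower bound $|A + 2\cdot A| \ge 3|A| - 2$, which arithmetic progressions attain.

```python
from itertools import product

def sum_of_dilates(A, coeffs):
    """Compute {l1*a1 + ... + lk*ak : a1, ..., ak in A} by brute force."""
    return {sum(l * a for l, a in zip(coeffs, choice))
            for choice in product(A, repeat=len(coeffs))}

# A + 2*A for the arithmetic progression A = {0, 1, 2, 3}
A = {0, 1, 2, 3}
S = sum_of_dilates(A, [1, 2])
# S = {0, 1, ..., 9}, so |A + 2*A| = 10 = 3|A| - 2
```

This only illustrates the definition; the talk concerns lower bounds on $|\lambda_1 \cdot A + \cdots + \lambda_k \cdot A|$ in terms of $|A|$, not computation.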

Independence number of hypergraphs under degree conditions

Speaker: 

Marcelo Sales

Institution: 

UCI

Time: 

Wednesday, October 23, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

A well-known result of Ajtai, Komlós, Pintz, Spencer, and Szemerédi from 1982 states that every $k$-graph $H$ on $n$ vertices with girth at least five and average degree $t^{k-1}$ contains an independent set of size $cn\frac{(\log t)^{1/(k-1)}}{t}$ for some constant $c>0$. In this talk, we explore a related problem in which the girth condition is relaxed, allowing certain cycles of length 2, 3, and 4. We will also present lower bounds on the size of independent sets in hypergraphs under specific degree conditions. This is joint work with Vojtěch Rödl and Yi Zhao.
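To fix terminology, a set of vertices in a $k$-graph is independent if it contains no edge entirely. The sketch below (function names are mine, for illustration) checks independence and runs a naive greedy baseline on a small 3-graph; it carries no guarantee resembling the bounds in the talk.

```python
def is_independent(S, edges):
    """S is independent iff no hyperedge lies entirely inside S."""
    S = set(S)
    return not any(set(e) <= S for e in edges)

def greedy_independent_set(n, edges):
    """Scan vertices 0..n-1, keeping each one that preserves independence.
    A naive baseline with no optimality guarantee."""
    S = set()
    for v in range(n):
        if is_independent(S | {v}, edges):
            S.add(v)
    return S

# A tight path in a 3-graph on 5 vertices
edges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
S = greedy_independent_set(5, edges)  # {0, 1, 3, 4}: avoids vertex 2
```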

Randomly sparsified Richardson iteration is really fast

Speaker: 

Robert Webber

Institution: 

UCSD

Time: 

Wednesday, November 6, 2024 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

Recently, a class of algorithms combining classical fixed point iterations with repeated random sparsification of approximate solution vectors has been successfully applied to eigenproblems with matrices as large as $10^{108} \times 10^{108}$. So far, a complete mathematical explanation for their success has proven elusive. The family of random sparsification methods has not yet been extended to the important case of linear system solves. This talk proposes a new algorithm based on repeated random sparsification that is capable of solving linear systems in extremely high dimensions and provides a complete mathematical analysis of the new algorithm. The analysis establishes a faster-than-Monte Carlo convergence rate and justifies use of the scheme even when the solution vector is too large to store.
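To give the flavor of this class of methods (this is a generic sketch, not the algorithm analyzed in the talk): classical Richardson iteration solves $Ax = b$ via $x \leftarrow x + \omega(b - Ax)$, which converges when the spectral radius of $I - \omega A$ is below 1. The randomized variant sparsifies each iterate by keeping entry $i$ with probability $p_i = \min(1, m|x_i|/\|x\|_1)$ and rescaling by $1/p_i$, which is unbiased and keeps about $m$ nonzeros in expectation; averaging the iterates damps the injected noise.

```python
import numpy as np

def sparsify(x, m, rng):
    """Unbiased random sparsification: keep entry i with probability
    p_i = min(1, m*|x_i|/||x||_1) and rescale by 1/p_i, so E[result] = x
    and about m entries survive in expectation."""
    p = np.abs(x)
    total = p.sum()
    if total == 0:
        return x
    p = np.minimum(1.0, m * p / total)
    keep = rng.random(x.size) < p
    y = np.zeros_like(x)
    y[keep] = x[keep] / p[keep]
    return y

def sparse_richardson(A, b, m, omega=0.5, iters=500, seed=0):
    """Richardson iteration x <- x + omega*(b - A@x) with sparsified iterates.
    Requires spectral radius of I - omega*A below 1. Returns the running
    average of the iterates, which damps the sparsification noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros(len(b), dtype=float)
    avg = np.zeros(len(b), dtype=float)
    for t in range(iters):
        x = sparsify(x + omega * (b - A @ x), m, rng)
        avg += (x - avg) / (t + 1)  # running mean of iterates
    return avg
```

The talk's contribution is a complete analysis of such a scheme for linear systems, including the faster-than-Monte Carlo rate; none of that is reflected in this toy version.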

Fairness and Foundations in Machine Learning

Speaker: 

Deanna Needell

Institution: 

UCLA

Time: 

Wednesday, November 13, 2024 - 3:00pm to 4:00pm

In this talk, we will address areas of recent work centered around the themes of fairness and foundations in machine learning as well as highlight the challenges in this area. We will discuss recent results involving linear algebraic tools for learning, such as methods in non-negative matrix factorization that include tailored approaches for fairness. We will showcase our derived theoretical guarantees as well as practical applications of those approaches.  Then, we will discuss new foundational results that theoretically justify phenomena like benign overfitting in neural networks.  Throughout the talk, we will include example applications from collaborations with community partners, using machine learning to help organizations with fairness and justice goals. 
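For readers unfamiliar with the linear algebraic tools mentioned above: non-negative matrix factorization approximates a non-negative data matrix $X$ by a product $WH$ with $W, H \ge 0$. The sketch below uses the standard Lee–Seung multiplicative updates for the Frobenius objective; it is a generic baseline, not the fairness-tailored variants discussed in the talk.

```python
import numpy as np

def nmf(X, r, iters=1000, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for X ~ W @ H with W, H >= 0,
    minimizing ||X - WH||_F. Updates preserve non-negativity because
    they multiply by ratios of non-negative quantities."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, r)) + 0.1
    H = rng.random((r, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

On exactly low-rank non-negative data this typically recovers a near-exact factorization; the fairness-aware methods from the talk modify the objective rather than these mechanics.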

Introduction to Robust Statistics III

Speaker: 

Pedro Abdalla

Institution: 

UCI

Time: 

Wednesday, October 16, 2024 - 3:00pm to 4:00pm

Location: 

RH 510R

Robust Statistics is a classical topic that dates back to the seminal work of Huber in the 1960s. In essence, the main goal of the field is to account for the effect of outliers when performing estimation tasks, such as mean estimation. A recent line of research, inspired by the seminal work of Catoni, has revisited some classical problems in robust statistics from a non-asymptotic perspective. The goal of this short seminar series is to introduce the key ideas related to robust estimation and discuss various notions of robustness, including heavy-tailed distributions and adversarial contamination. The primary example will be the mean estimation problem, and, if time permits, I will also cover covariance estimation.
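A standard example in this Catoni-inspired line of work is the median-of-means estimator (an illustration of the genre, not necessarily the estimator analyzed in the lectures): split the sample into blocks, average each block, and take the median of the block means. A single gross outlier can ruin at most one block, so the median is barely affected.

```python
import numpy as np

def median_of_means(x, k):
    """Split the sample into k blocks, average each block, and return the
    median of the block means. Robust to a small number of outliers, since
    each outlier corrupts at most one block mean."""
    x = np.asarray(x, dtype=float)
    blocks = np.array_split(x, k)
    return float(np.median([b.mean() for b in blocks]))
```

With 1000 standard normal samples and one sample replaced by $10^6$, the empirical mean is shifted by about 1000, while median-of-means with 20 blocks stays near 0.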

Introduction to Robust Statistics II

Speaker: 

Pedro Abdalla

Institution: 

UCI

Time: 

Wednesday, October 9, 2024 - 3:00pm to 4:00pm

Location: 

RH 510R

Robust Statistics is a classical topic that dates back to the seminal work of Huber in the 1960s. In essence, the main goal of the field is to account for the effect of outliers when performing estimation tasks, such as mean estimation. A recent line of research, inspired by the seminal work of Catoni, has revisited some classical problems in robust statistics from a non-asymptotic perspective. The goal of this short seminar series is to introduce the key ideas related to robust estimation and discuss various notions of robustness, including heavy-tailed distributions and adversarial contamination. The primary example will be the mean estimation problem, and, if time permits, I will also cover covariance estimation.

Introduction to Robust Statistics I

Speaker: 

Pedro Abdalla

Institution: 

UCI

Time: 

Wednesday, October 2, 2024 - 3:00pm to 4:00pm

Location: 

RH 510R

Robust Statistics is a classical topic that dates back to the seminal work of Huber in the 1960s. In essence, the main goal of the field is to account for the effect of outliers when performing estimation tasks, such as mean estimation. A recent line of research, inspired by the seminal work of Catoni, has revisited some classical problems in robust statistics from a non-asymptotic perspective. The goal of this short seminar series is to introduce the key ideas related to robust estimation and discuss various notions of robustness, including heavy-tailed distributions and adversarial contamination. The primary example will be the mean estimation problem, and, if time permits, I will also cover covariance estimation.

A new proof of Friedman's second eigenvalue theorem with strong implications

Speaker: 

Jorge Garza Vargas

Institution: 

Caltech

Time: 

Wednesday, June 5, 2024 - 2:00pm

Location: 

510R Rowland Hall

In 2004, J. Friedman wrote a ~100-page paper proving a conjecture of Alon which stated that random $d$-regular graphs are nearly optimal expanders. Since then, Friedman's result has been refined and generalized in several directions, perhaps most notably by Bordenave and Collins, who in 2019 established strong convergence of independent random permutation matrices (a massive generalization of Friedman's theorem), a result that led to breakthroughs in spectral theory and geometry.

In this talk I will present joint work with C. Chen, J. Tropp, and R. van Handel, where we introduce a new proof technique that allows one to convert qualitative results in random matrix theory into quantitative ones. This technique yields a fundamentally new approach to the study of strong convergence which is more flexible and significantly simpler than the existing techniques. Concretely, we are able to obtain (1) a remarkably short proof of Friedman's theorem, (2) a quantitative version of the result of Bordenave and Collins, and (3) a proof of strong convergence for arbitrary stable representations of the symmetric group, which constitutes a substantial generalization of the result of Bordenave and Collins.
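The phenomenon behind Friedman's theorem can be observed numerically (a small experiment of mine, not from the talk): build a random $2c$-regular multigraph from $c$ uniform permutations (the permutation model), and the second-largest adjacency eigenvalue concentrates near the optimal expander bound $2\sqrt{d-1}$, while the top eigenvalue equals the degree $d$.

```python
import numpy as np

def random_regular_adjacency(n, c, seed=0):
    """Adjacency matrix of a random 2c-regular multigraph on n vertices:
    the sum P_1 + P_1^T + ... + P_c + P_c^T for c uniform permutation
    matrices (the permutation model; multi-edges and loops are allowed)."""
    rng = np.random.default_rng(seed)
    A = np.zeros((n, n))
    for _ in range(c):
        perm = rng.permutation(n)
        P = np.eye(n)[perm]
        A += P + P.T
    return A

n, c = 500, 2                     # degree d = 2c = 4
A = random_regular_adjacency(n, c)
ev = np.sort(np.linalg.eigvalsh(A))
# ev[-1] = d = 4 exactly; Alon's conjecture / Friedman's theorem predicts
# the remaining eigenvalues fall near or below 2*sqrt(d-1) ~ 3.46
```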
