One-Bit Phase Retrieval

Speaker: 

Junren Chen

Institution: 

The University of Hong Kong

Time: 

Wednesday, March 5, 2025 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

Abstract: In this talk, we present our results for the 1-bit phase retrieval problem: the recovery of a possibly sparse signal $x$ from the observations $y=\mathrm{sign}(|Ax|-1)$. This problem is closely related to 1-bit compressed sensing, where we observe $y=\mathrm{sign}(Ax)$, and to phase retrieval, where we observe $y=|Ax|$. We show that the major findings of 1-bit compressed sensing theory, including hyperplane tessellations, optimal rates, and a very recent efficient and optimal algorithm, carry over to this phase retrieval setting. This is joint work with Ming Yuan: https://arxiv.org/abs/2405.04733
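As a minimal numpy sketch of the measurement model in the abstract (not the recovery algorithm, which is in the paper), the following generates one-bit phase retrieval observations alongside the two related models; the dimensions, sparsity level, and Gaussian sensing matrix are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 100, 500, 5            # signal dimension, measurements, sparsity (arbitrary)

# An s-sparse unit-norm signal x; one-bit data loses the overall scale anyway
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
x /= np.linalg.norm(x)

A = rng.standard_normal((m, n))  # Gaussian sensing matrix

# One-bit phase retrieval observations: y_i = sign(|<a_i, x>| - 1)
y = np.sign(np.abs(A @ x) - 1)

# The two related models mentioned in the abstract, for comparison
y_1bitcs = np.sign(A @ x)        # 1-bit compressed sensing
y_pr = np.abs(A @ x)             # (unquantized) phase retrieval
```

Note that $y$ records only whether each magnitude measurement $|\langle a_i, x\rangle|$ exceeds the threshold 1, so both the phase and the size of each measurement are quantized away.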

The delocalization conjecture for random band matrices

Speaker: 

Jun Yin

Institution: 

UCLA

Time: 

Wednesday, February 26, 2025 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

In this talk, we present a new proof of the delocalization conjecture for random band matrices. The conjecture asserts that for an $N\times N$ random matrix with bandwidth $W$, the eigenvectors in the bulk of the spectrum are delocalized provided that $W \gg N^{1/2}$. Moreover, in this regime, the eigenvalue distribution aligns with that of the Gaussian random ensembles (i.e., GOE or GUE). Our proof employs a novel loop hierarchy method and leverages the sum-zero property, a concept that was instrumental in earlier work on high-dimensional random matrices.


This is joint work with H.T. Yau.
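As a heuristic numerical illustration (not part of the proof), one can build a symmetric Gaussian band matrix and check that a bulk eigenvector spreads over order-$N$ sites, so its sup-norm is close to $N^{-1/2}$ rather than order one; the sizes below are arbitrary but satisfy $W > N^{1/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
N, W = 400, 40                   # illustrative; the conjecture concerns W >> N^{1/2}

# Symmetric random band matrix: Gaussian entries for |i - j| <= W, zero otherwise
H = rng.standard_normal((N, N))
H = (H + H.T) / np.sqrt(2)
i, j = np.indices((N, N))
H[np.abs(i - j) > W] = 0.0
H /= np.sqrt(2 * W)              # crude normalization keeping the spectrum order one

eigvals, eigvecs = np.linalg.eigh(H)

# A delocalized eigenvector has sup-norm ~ N^{-1/2}; a localized one, order one.
v = eigvecs[:, N // 2]           # eigenvector in the bulk of the spectrum
print(np.max(np.abs(v)), 1 / np.sqrt(N))
```

In the localized regime $W \ll N^{1/2}$ the same experiment would instead show mass concentrated on roughly $W^2$ sites.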

Locally Sampleable Uniform Symmetric Distributions

Speaker: 

Kewen Wu

Institution: 

UC Berkeley

Time: 

Wednesday, January 29, 2025 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

We characterize the power of constant-depth Boolean circuits in generating uniform symmetric distributions. Let $f:\{0,1\}^m \rightarrow \{0,1\}^n$ be a Boolean function where each output bit of $f$ depends only on $O(1)$ input bits. Assume the output distribution of $f$ on uniform input bits is close to a uniform distribution $D$ with a symmetric support. We show that $D$ is essentially one of the following six possibilities: (1) point distribution on $0^n$, (2) point distribution on $1^n$, (3) uniform over $\{0^n,1^n\}$, (4) uniform over strings with even Hamming weights, (5) uniform over strings with odd Hamming weights, and (6) uniform over all strings. This confirms a conjecture of Filmus, Leigh, Riazanov, and Sokolov (RANDOM 2023).
Joint work with Daniel M. Kane and Anthony Ostuni.
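As a tiny sanity check of one of the six cases, the locally computable map $y_i = x_i \oplus x_{(i+1) \bmod n}$ (each output bit depends on just 2 input bits) generates exactly the uniform distribution over even-Hamming-weight strings, i.e., case (4); the exhaustive enumeration below is an illustration, not taken from the paper:

```python
from itertools import product
from collections import Counter

n = 4  # small enough to enumerate all 2^n inputs

counts = Counter()
for x in product([0, 1], repeat=n):
    # y_i = x_i XOR x_{(i+1) mod n}: each output bit reads only 2 input bits
    y = tuple(x[i] ^ x[(i + 1) % n] for i in range(n))
    counts[y] += 1

# Output is uniform over the 2^(n-1) strings of even Hamming weight:
# x and its bitwise complement map to the same y, so each appears exactly twice.
for y, c in counts.items():
    assert sum(y) % 2 == 0 and c == 2
assert len(counts) == 2 ** (n - 1)
```

The theorem says that, up to closeness in distribution, no locally sampleable uniform symmetric distribution escapes this short list.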

Bias and Free Infinite Divisibility

Speaker: 

Todd Kemp

Institution: 

UCSD

Time: 

Wednesday, January 22, 2025 - 3:00pm to 4:00pm

Location: 

510R Rowland Hall

Size bias and other distributional transforms play an important role in sampling problems.  They are also very useful tools in sharp Normal (and other distributional) approximation, giving slick quantitative Stein's method proofs of Central Limit Theorems.  Recently, Goldstein and Schmook discovered a connection between size bias and infinitely divisible distributions, yielding a new kind of Lévy--Khintchine formula for positively supported distributions.
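To make the size bias transform concrete: for a nonnegative $X$ with mean $\mu$, the size-biased variable $X^s$ satisfies $P(X^s = k) = k\,P(X = k)/\mu$. A small Monte Carlo sketch (an illustration of the classical transform, not of the talk's free-probability results) checks the textbook fact that size-biasing a Poisson distribution shifts it by one:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, trials = 3.0, 200_000       # arbitrary rate and sample size

# Size-bias transform of nonnegative X: P(X^s = k) = k * P(X = k) / E[X].
# Empirically: resample the draws of X with probability proportional to value.
X = rng.poisson(lam, trials)
Xs = rng.choice(X, size=trials, p=X / X.sum())

# Classical fact: the size bias of Poisson(lam) has the law of 1 + Poisson(lam),
# so its mean is 1 + lam.
print(Xs.mean(), 1 + lam)        # the two numbers should nearly agree
```

The same value-weighted resampling idea is what makes size bias natural in sampling problems: items are observed with probability proportional to their size.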


In this talk, I will discuss joint work with Goldstein exploring the free probability analogues of bias transforms, with applications to freely infinitely divisible distributions.  In most cases the classical results transfer, but a significant change in perspective is required.  Among other things, this approach gives a probabilistic meaning to the (free) Lévy--Khintchine measure, and yields a new result giving a simple characterization of those distributions that are positively freely infinitely divisible.
