**Speaker**: Ata Kabán (Birmingham)

**Title**: Uncovering benign input geometries with random projections

**Abstract**: In this talk we consider two fundamental questions in statistical machine learning (one old, one new):

Q1: Given a machine learning problem, what kinds of data distributions make it easier or harder to solve? For instance, it is known that a large margin makes classification problems easier.

Q2: Given a big data problem in machine learning, when can we solve it from a few random projections of the data, perhaps trading some accuracy for a gain in computational efficiency? This is the compressed learning problem.

We will present a selection of our results, and work in progress, highlighting parallels between these two questions. These parallels lead to the hypothesis that learning with random projections is not just a timely subject with practical relevance to big data problems; it can also make a previously elusive question more approachable. On the flip side, the parallels help us broaden the previous understanding of compressed learning, and start figuring out which compressive learning problems become easier if the original data had a sparse representation, and which problems are unaffected.
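To give a flavour of the compressed learning setting in Q2, the following is a minimal, illustrative sketch (not the speaker's method): a Gaussian random projection maps high-dimensional data down to a few dimensions, and a Johnson–Lindenstrauss-style check confirms that squared norms (and hence distances and margins) are approximately preserved. All dimensions and variable names here are assumptions for illustration.

```python
# Illustrative sketch: compress data with a Gaussian random projection
# and check that geometry (squared norms) is roughly preserved.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 1000, 50                        # samples, original dim, compressed dim

X = rng.standard_normal((n, d))                # original high-dimensional data
R = rng.standard_normal((d, k)) / np.sqrt(k)   # random projection matrix, scaled so
                                               # E[||x R||^2] = ||x||^2
Z = X @ R                                      # compressed data: n x k

# Ratio of projected to original squared norms; concentrates around 1,
# with fluctuations on the order of 1/sqrt(k).
ratio = np.sum(Z**2, axis=1) / np.sum(X**2, axis=1)
print(ratio.mean(), ratio.std())
```

A learner can then be trained on the k-dimensional `Z` instead of the d-dimensional `X`, which is the accuracy-for-efficiency trade-off mentioned above.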