
Large Low-Rank Matrices

Abstract:

The rank of a matrix is the number of linearly independent rows (or columns) it has. Low-rank matrix modeling typically works best in the many scenarios where data has high ambient dimension but low intrinsic dimension; indeed, methods such as PCA and matrix factorization are now bedrock tools in data analysis.


In this talk we develop new techniques for low-rank modeling in scenarios where the matrix is very large. We cover two tasks:

(1) computing a low-rank approximation (i.e., PCA) in only two passes over the data.
(2) jointly identifying outliers and fitting a low-rank model to the remaining data points.
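As background for task (1), a rank-k approximation of a matrix is classically obtained from its truncated SVD. The sketch below (NumPy, synthetic data; the dimensions, noise level, and setup are illustrative assumptions, not the speaker's two-pass algorithm) shows why low intrinsic dimension makes this work:

```python
import numpy as np

# Illustrative sketch: standard rank-k approximation via truncated SVD.
# The talk's two-pass method is more sophisticated; this is only background.
rng = np.random.default_rng(0)

# Synthetic data: high ambient dimension (200), low intrinsic dimension (5).
n, d, k = 500, 200, 5
A = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))
A += 0.01 * rng.standard_normal((n, d))  # small noise

# Truncated SVD: keep only the top-k singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = (U[:, :k] * s[:k]) @ Vt[:k]

# The rank-k approximation captures nearly all of the energy in A.
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_err)  # small, since A is close to rank k
```

The catch motivating the talk: a full SVD requires holding the matrix in memory and costs O(n d^2), which is exactly what breaks down when the matrix is very large.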

For both tasks we provide best-in-class theoretical guarantees; however, this talk will focus on the methods and their computational advantages.
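For flavor on task (2), one common heuristic for jointly handling outliers is to alternate between fitting a low-rank model and trimming the worst-fitting rows. The toy sketch below is an assumption-laden illustration of that general idea (it even assumes the number of outliers is known), not the speaker's algorithm or its guarantees:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k, n_out = 300, 50, 3, 15

# Inlier rows lie on a rank-k subspace; a few rows are gross outliers.
X = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))
outliers = rng.choice(n, size=n_out, replace=False)
X[outliers] += 10 * rng.standard_normal((n_out, d))

# Alternate: fit a rank-k subspace, drop the worst-fitting rows, refit.
keep = np.arange(n)
for _ in range(5):
    _, _, Vt = np.linalg.svd(X[keep], full_matrices=False)
    proj = Vt[:k].T @ Vt[:k]               # projector onto fitted subspace
    resid = np.linalg.norm(X - X @ proj, axis=1)
    keep = np.argsort(resid)[: n - n_out]  # keep the best-fitting rows

flagged = set(range(n)) - set(keep)        # rows identified as outliers
```

On this easy synthetic instance the trimming loop typically recovers the planted outliers; the hard part, which the talk addresses, is doing this with guarantees and at scale.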

The talk will be accessible to a broad audience.

Speaker: 

Sujay Sanghavi, Ph.D., is an Associate Professor at UT Austin. His interests lie in large-scale machine learning and performance modeling of communication networks. He is a recipient of the NSF CAREER award and the DTRA Young Investigator award, and has been a visiting scientist at Google and Qualcomm.

Registration:

Location:

THE ADVISORY BOARD - BUILDING 7 (map - http://bit.ly/PA804c)

Room Number: Suite 100

12357-C Riata Trace Parkway
Bldg 7, Suite 100
Austin, Texas 78727
United States

Meeting Agenda:

6:30 p.m. Networking and Gathering (with free food and drinks)

6:50 p.m. Call to Order, Announcements

7:00 p.m. Presentation, with Q/A

8:30 p.m. Meeting Evaluation, Adjourn

