Speaker: Dr. Linglong Kong
Department of Mathematical and Statistical Sciences
The University of Alberta
Abstract: Matrix factorization has wide applications in recommender systems and signal processing. Existing matrix factorization methods are mostly based on the squared loss and aim to yield a low-rank matrix that interprets conditional sample means. However, in many real applications with extreme data, least squares cannot capture the central tendency or tail behavior of the data, leading to undesirable estimates. In this talk, we will present a general framework of quantile matrix factorization (QMF), which introduces the check loss, originating from quantile regression, into matrix factorization. The non-smoothness of the check loss, however, poses significant challenges for numerical computation. We propose a nearly optimal and efficient algorithm to solve QMF by extending Nesterov's optimal smooth approximation procedure to the case of matrix factorization. We theoretically show that under certain conditions, the optimal solution of the proposed smooth approximation converges to the optimal solution of the original nonsmooth and nonconvex QMF problem at a competitive convergence rate. We will also present numerical simulations on synthetic and real-world data to verify our theoretical findings and the algorithm's performance.
Biography: Dr. Linglong Kong is an associate professor in the Department of Mathematical and Statistical Sciences at the University of Alberta. He has published more than 30 peer-reviewed manuscripts, including papers in top journals such as AOS, JASA, and JRSSB, and in top conferences including ICML, AAAI, and IJCAI. He currently serves as an associate editor of the Journal of the American Statistical Association, the International Journal of Imaging Systems and Technology, and the Canadian Journal of Statistics, and as program chair of the ASA Statistical Imaging Section. His research interests include high-dimensional data analysis, neuroimaging data analysis, robust statistics and quantile regression, and statistical machine learning.