• School Intranet
Search
Back to Main Site
English
  • About the School
    • Overview
    • Academic Areas
    • Dean's Message
    • School Publications
      • Brochure
      • Quarterly Newsletter
      • Annual Report
    • FAQ
    • Contact Us
  • Programmes
    • Introduction
    • Undergraduate
      • Data Science and Big Data Technology
      • Statistics
      • Computer Science and Technology
      • Financial Engineering
      • 2+2 Double Major
        • Interdisciplinary Data Analytics + X Double Major Programme
        • Aerospace Science and Earth Informatics + X Double Major Programme
      • 3+2 Accelerated Master's Programme with Columbia University School of Engineering (Columbia Class)
    • Master's
      • MSc in Data Science
      • MSc in Financial Engineering (Full-time/Part-time)
      • MSc in Artificial Intelligence and Robotics
      • MSc in Computer Science
      • MSc in Statistics
      • MSc in Bioinformatics
    • Doctoral (MPhil-PhD)
      • MPhil-PhD in Data Science
      • MPhil-PhD in Computer Science
  • Faculty
    • Faculty Members
    • Emeritus Professors
    • Adjunct Members
    • Research/Visiting Staff
    • "数说名师" Faculty Interviews
  • SDS Students
    • Undergraduate Academic Advising System
    • PhD Students
    • Student Interviews
  • News & Announcements
    • News
    • Announcements
  • School Events
    • Academic Conferences
      • DDTOR 2025
      • CSAMSE 2023
      • RMTA 2023
      • ICASSP 2022
      • Mostly OM 2019
    • Academic Activities
    • Distinguished Lecture Series in Data Science
    • Other Events
  • Research
  • Recruitment
    • Faculty Positions
    • Postdoctoral Positions
  • Career Development
    • Further Study and Employment
    • International Exchange


[Seminar] Topics in Random Matrix Theory

2022-10-26 · Academic Activities

Speakers and Talk Details

Zhenyu LIAO
Huazhong University of Science and Technology (HUST)

Topic: Random Matrix Methods for Machine Learning: An Application to "Lossless" Compression of Large and Deep Neural Networks

Abstract:

The advent of the Big Data era has triggered a renewed interest in large-dimensional machine learning (ML) and deep neural networks (DNNs). These methods, having been developed from small-dimensional intuitions, often behave dramatically differently from their original designs and tend to be inefficient on large-dimensional datasets. By assuming that both the dimension and the size of the datasets are large, recent advances in random matrix theory (RMT) provide novel insights, allowing for a renewed understanding and the possibility of designing more efficient machine learning approaches, thereby opening the door to completely new paradigms.

In this talk, we will start with the "curse of dimensionality" phenomenon in high dimensions and highlight how such counterintuitive phenomena arise in ML practice when large-dimensional data are considered. Focusing on the concrete problem of compressing both shallow and deep neural networks, we show how the proposed theory can be applied to design efficient DNN compression schemes with strong performance guarantees.
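As a quick illustration of the counterintuitive high-dimensional behavior mentioned above (a minimal sketch, not from the talk; all parameter choices are illustrative), the following shows one classic "curse of dimensionality" effect: pairwise Euclidean distances between i.i.d. Gaussian points concentrate around a single value as the dimension grows, so low-dimensional distance intuition breaks down.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_spread(p, n=200):
    """Relative spread (std / mean) of the pairwise Euclidean
    distances among n i.i.d. standard Gaussian points in R^p."""
    X = rng.standard_normal((n, p))
    sq = (X**2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T   # squared distances
    iu = np.triu_indices(n, k=1)                   # distinct pairs only
    d = np.sqrt(np.maximum(d2[iu], 0.0))
    return d.std() / d.mean()

low = distance_spread(p=2)
high = distance_spread(p=2000)
print(low, high)   # the relative spread collapses as p grows
```

In dimension 2 the distances vary widely relative to their mean, while in dimension 2000 they are all nearly equal; this is the kind of effect that makes small-dimensional geometric intuition misleading for large-dimensional ML.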

Biography:

Zhenyu Liao received his M.Sc. in Signal and Image Processing in 2016 and his Ph.D. in Computer Science in 2019, both from the University of Paris-Saclay, France. In 2020 he was a postdoctoral researcher with the Department of Statistics, University of California, Berkeley. He is currently an associate professor at Huazhong University of Science and Technology (HUST), China. His research interests are broadly in machine learning, signal processing, random matrix theory, and high-dimensional statistics. He has published more than 20 papers at top-tier machine learning conferences such as ICML, NeurIPS, ICLR, COLT, and AISTATS, and he co-authored the book "Random Matrix Methods for Machine Learning." He is the recipient of the 2021 Wuhan Youth Talent Fellowship and the 2019 ED STIC Ph.D. Student Award of the University of Paris-Saclay, France.

Tiefeng JIANG
University of Minnesota

Topic: Distances between Random Orthogonal Matrices and Independent Normals

Abstract:

We study the distance between Haar-distributed orthogonal matrices and independent normal random variables. The distance is measured by the total variation distance, the Kullback-Leibler distance, the Hellinger distance, and the Euclidean distance, which exhibit different features. Optimal rates are obtained. This is joint work with Yutao Ma.
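The closeness of Haar orthogonal matrices to independent normals can be probed numerically. The sketch below (illustrative only, not the talk's analysis) samples a Haar-distributed orthogonal matrix via the QR decomposition of a Gaussian matrix with the standard sign correction, and checks that its entries, scaled by √n, look approximately like independent standard normals.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_orthogonal(n):
    """Sample an n x n Haar-distributed orthogonal matrix: take the QR
    decomposition of a Gaussian matrix and flip column signs so the
    diagonal of R is positive (the standard correction that makes the
    distribution of Q exactly Haar)."""
    Z = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(Z)
    return Q * np.sign(np.diag(R))

n = 500
O = haar_orthogonal(n)

# Scaled by sqrt(n), the entries of O are approximately N(0, 1).
entries = (np.sqrt(n) * O).ravel()
tail = np.mean(np.abs(entries) > 1.96)   # ~0.05 for a standard normal
m4 = np.mean(entries**4)                 # ~3 for a standard normal
print(tail, m4)
```

The entries are of course not exactly independent normals (the rows are constrained to be orthonormal), and quantifying that gap in total variation and the other distances above is precisely the subject of the talk.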

Biography:

Prof. Jiang obtained his Ph.D. from Stanford University in 1999. His research covers random matrices, random graphs, probability, symmetric polynomials, high-dimensional statistics, and their connections to physics and computer science.

Cosme LOUART
GIPSA Lab, Université Joseph Fourier

Topic: Rigorous Non-Gaussian Approach to the Statistics of Convex Problems in Machine Learning

Abstract:

Convex problem formulations are a very convenient setting for machine learning algorithms because the uniqueness of the solution ensures the reproducibility and efficiency of the computation. However, the implicit definition of the solution and its non-linearity make the study of its statistics and performance quite difficult.

This talk takes M-estimators as a toy model for the study of convex problems. Heuristics have already provided closed-form formulas for the statistics, and in this presentation we will provide mathematical tools to prove those heuristics rigorously. The study goes beyond the traditional Gaussian assumption thanks to a concentration-of-measure hypothesis on the data, which has two advantages: (i) it represents a very wide range of realistic applications, and (ii) it shares with the Gaussian distribution the concentration properties necessary for statistical inference. Concentration of measure fits the requirements of random matrix theory perfectly, which allows the high dimensionality of the problem to be tackled smoothly and rigorously.
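To make the implicit definition of an M-estimator concrete, here is a minimal toy example (not from the talk; the loss and solver choices are illustrative): a Huber location M-estimator computed by iteratively reweighted averaging. The minimizer is defined only through its first-order condition, which is exactly the kind of implicit, non-linear solution whose statistics the talk analyzes at scale.

```python
import numpy as np

rng = np.random.default_rng(2)

def huber_location(x, delta=1.0, iters=100):
    """Huber location M-estimator: argmin_m sum_i rho_delta(x_i - m).
    The solution is defined only implicitly, via the first-order
    condition sum_i psi_delta(x_i - m) = 0; here it is computed by
    iteratively reweighted averaging."""
    m = np.median(x)
    for _ in range(iters):
        r = np.abs(x - m)
        # Huber weights: 1 inside the quadratic region, delta/|r| outside.
        w = np.where(r <= delta, 1.0, delta / np.maximum(r, delta))
        m = np.sum(w * x) / np.sum(w)
    return m

# Gaussian bulk plus a few gross outliers: the sample mean is dragged
# toward the outliers, while the Huber M-estimator stays near zero.
x = np.concatenate([rng.standard_normal(1000), [50.0, 80.0, 120.0]])
print(np.mean(x), huber_location(x))
```

Even in this one-dimensional toy, there is no closed-form expression for the estimator; in the high-dimensional regression setting of the talk, characterizing the statistics of such implicitly defined solutions is what requires the random matrix and concentration machinery.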

Biography:

After a diploma from ENS Paris, Cosme Louart received an M.Sc. in Machine Learning (MVA) at ENS Paris-Saclay. He then started a Ph.D. at the GIPSA lab in Grenoble, to be defended at the beginning of next year. He has published several articles in mathematics and machine learning venues (ICML, AISTATS, IEEE, Annals of Applied Probability).

Zeng LI
Southern University of Science and Technology

Topic: Asymptotic Normality for Eigenvalue Statistics of a General Sample Covariance Matrix When p/n → ∞ and Applications

Abstract:

The asymptotic normality of a large family of eigenvalue statistics of a general sample covariance matrix is derived under the ultra-high-dimensional setting, that is, when the dimension-to-sample-size ratio p/n → ∞. Based on this CLT result, we extend the covariance matrix test problem to the new ultra-high-dimensional context and apply it to test a matrix-valued white noise. Simulation experiments are conducted to investigate the finite-sample properties of the general asymptotic normality of eigenvalue statistics, as well as the two developed tests.
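A rough numerical sanity check of such a CLT can be run in a few lines (a sketch under simplifying assumptions: identity population covariance and Gaussian data, whereas the talk's result covers a general sample covariance matrix). It simulates the eigenvalue statistic tr(S²) of a sample covariance matrix S = XXᵀ/n in a p/n ≫ 1 regime and checks that, across replications, the statistic is approximately normally distributed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative ultra-high-dimensional regime: p/n = 10.
p, n, reps = 400, 40, 1000
stats = np.empty(reps)
for r in range(reps):
    X = rng.standard_normal((p, n))
    G = X.T @ X                       # n x n Gram matrix; tr((X X^T)^2) = ||X^T X||_F^2
    stats[r] = np.sum(G**2) / n**2    # eigenvalue statistic tr(S^2) with S = X X^T / n

# Standardize across replications and inspect the third and fourth moments.
z = (stats - stats.mean()) / stats.std()
skew = np.mean(z**3)                  # ~0 for a normal distribution
kurt = np.mean(z**4)                  # ~3 for a normal distribution
print(skew, kurt)
```

Working with the n × n Gram matrix XᵀX rather than the p × p matrix XXᵀ keeps the simulation cheap, since the two share the same nonzero eigenvalues.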

Biography:

Dr. Li is currently an associate professor in the Department of Statistics and Data Science, Southern University of Science and Technology. Previously she was a postdoctoral fellow in the Department of Statistics at the Pennsylvania State University. Dr. Li obtained her Ph.D. from the Department of Statistics and Actuarial Science at the University of Hong Kong. Her research covers random matrix theory and high-dimensional statistics.


Address: 3-6/F, Daoyuan Building, 2001 Longxiang Boulevard, Longgang District, Shenzhen, Guangdong
Email: sds@cuhk.edu.cn
WeChat: cuhksz-sds

sds.cuhk.edu.cn

Copyright © School of Data Science, The Chinese University of Hong Kong, Shenzhen