Jianfei Chen

2021.10.08 09:05


Name: Jianfei Chen

Title: Assistant Professor

Email: jianfeic@mail.tsinghua.edu.cn

Homepage: https://ml.cs.tsinghua.edu.cn/~jianfei/

Education:

B.Eng. in Computer Science, Tsinghua University, China, 2014

Ph.D. in Computer Science, Tsinghua University, China, 2019

Professional Service:

2022, Area Chair, AAAI

2021-present, Corresponding Member, Machine Learning Technical Committee, Chinese Association for Artificial Intelligence (CAAI)

Research Areas:

Machine learning, deep learning, Bayesian methods

Research Overview:

His research focuses on efficient algorithms and foundational theory for deep learning. To address the heavy computational cost, slow convergence, and high memory consumption of deep learning algorithms, he has proposed a number of efficient algorithms with theoretical guarantees for specific problems: (1) fast, memory-efficient neural network training algorithms based on stochastic quantization; (2) efficient machine learning algorithms based on stochastic sampling and variance reduction; (3) efficient algorithms for deep generative models; and (4) efficient algorithms and large-scale training systems for topic models. These results have been released as open-source software, including ActNN, WarpLDA, and ZhuSuan. This line of work has produced more than 20 papers over consecutive years at top machine learning conferences including NeurIPS, ICML, and ICLR. His research is supported by a Young Scientists Fund grant from the National Natural Science Foundation of China.
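As an illustrative sketch only (not the actual ActNN implementation), the core idea behind stochastic-quantization-based, memory-efficient training is unbiased stochastic rounding: a tensor is compressed to a few bits per entry, but rounded up or down at random so that the dequantized value equals the original in expectation. A minimal NumPy version, with hypothetical helper names, might look like:

```python
import numpy as np

def stochastic_quantize(x, bits=2):
    """Unbiased stochastic quantization of x onto a `bits`-bit grid.

    x is affinely mapped to the integer grid [0, 2**bits - 1] and each
    entry is rounded up with probability equal to its fractional part,
    so that E[dequantize(q, lo, scale)] == x.
    """
    levels = 2 ** bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (x - lo) / scale               # values now lie in [0, levels]
    floor = np.floor(normalized)
    prob_up = normalized - floor                # round up with this probability
    q = floor + (np.random.rand(*x.shape) < prob_up)
    return q.astype(np.uint8), lo, scale        # 2-bit codes + affine params

def dequantize(q, lo, scale):
    """Map integer codes back to the original value range."""
    return q.astype(np.float64) * scale + lo

# Unbiasedness check: averaging many dequantized samples recovers x.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
np.random.seed(0)
est = np.mean(
    [dequantize(*stochastic_quantize(x, bits=2)) for _ in range(20000)],
    axis=0,
)
```

In activation-compressed training, this kind of unbiased compressor is applied to the activations saved for the backward pass: the quantization noise adds gradient variance but introduces no bias, which is what makes theoretical convergence guarantees possible.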

Awards and Honors:

CCF Outstanding Doctoral Dissertation Award, China Computer Federation (2019)

Shuimu Tsinghua Scholar, Tsinghua University (2019)

Publications:

[1] Jianfei Chen*, Lianmin Zheng*, Zhewei Yao, Dequan Wang, Ion Stoica, Michael W. Mahoney, and Joseph E. Gonzalez. “ActNN: Reducing Training Memory Footprint via 2-Bit Activation Compressed Training”. In ICML. 2021.

[2] Jianfei Chen, Yu Gai, Zhewei Yao, Michael W. Mahoney, and Joseph E. Gonzalez. “A Statistical Framework for Low-bitwidth Training of Deep Neural Networks”. In NeurIPS. 2020.

[3] Jianfei Chen, Cheng Lu, Biqi Chenli, Jun Zhu, and Tian Tian. “VFlow: More Expressive Generative Flows with Variational Data Augmentation”. In ICML. 2020.

[4] Jianfei Chen, Jun Zhu, Yee Whye Teh, and Tong Zhang. “Stochastic Expectation Maximization with Variance Reduction”. In NeurIPS. 2018.

[5] Jianfei Chen, Jun Zhu, and Le Song. “Stochastic Training of Graph Convolutional Networks with Variance Reduction”. In ICML. 2018.

[6] Jianfei Chen, Jun Zhu, Jie Lu, and Shixia Liu. “Scalable Inference for Nested Chinese Restaurant Process Topic Models”. In VLDB. 2018.

[7] Kaiwei Li, Jianfei Chen, Wenguang Chen, and Jun Zhu. “SaberLDA: Sparsity-Aware Learning of Topic Models on GPUs”. In ASPLOS. 2017.

[8] Jianfei Chen, Chongxuan Li, Jun Zhu, and Bo Zhang. “Population Matching Discrepancy and Applications in Deep Learning”. In NIPS. 2017.

[9] Jianfei Chen, Kaiwei Li, Jun Zhu, and Wenguang Chen. “WarpLDA: A Cache Efficient O(1) Algorithm for Latent Dirichlet Allocation”. In VLDB. 2016.

[10] Jianfei Chen, Xun Zheng, Zi Wang, Jun Zhu, and Bo Zhang. “Scalable Inference for Logistic-Normal Topic Models”. In NIPS. 2013.
