Jingfeng Wu

I am a third-year master's student at the School of Mathematical Sciences, Peking University, where I am supervised by Prof. Jinwen Ma and Prof. Zhanxing Zhu. Previously, I obtained my B.S. from the same school.

Email  /  CV  /  Google Scholar  /  Github

News
  • One paper is accepted to ICML-2019: "The Anisotropic Noise in Stochastic Gradient Descent: Its Behavior of Escaping from Minima and Regularization Effects".
  • I will start my PhD this fall at the Department of Computer Science, Johns Hopkins University. Looking forward to meeting potential collaborators!
  • Our paper "Tangent-Normal Adversarial Regularization for Semi-supervised Learning" is accepted as an oral presentation to CVPR-2019!
Research

I'm interested in both the theoretical and applied sides of machine learning. Currently, I follow cutting-edge research in: 1) machine learning theory, 2) stochastic algorithms, 3) generative models, 4) adversarial learning, 5) semi-supervised learning, 6) applications in computer vision.

Tangent-Normal Adversarial Regularization for Semi-supervised Learning
Bing Yu*, Jingfeng Wu*, Jinwen Ma, Zhanxing Zhu
Conference on Computer Vision and Pattern Recognition (CVPR), 2019, oral
bibtex / arXiv

We present a novel manifold regularization method for semi-supervised learning, realized via adversarial training.

The Anisotropic Noise in Stochastic Gradient Descent: Its Behavior of Escaping from Minima and Regularization Effects
Zhanxing Zhu*, Jingfeng Wu*, Bing Yu, Lei Wu, Jinwen Ma
International Conference on Machine Learning (ICML), 2019
bibtex / arXiv

We study the noise structure of stochastic gradient descent and demonstrate its benefit in helping the dynamics escape from sharp minima.

Experience
Notes

Template: this