Jingfeng Wu

I am a first-year Ph.D. student in the Computer Science Department at Johns Hopkins University, advised by Prof. Vladimir Braverman.

Previously, I obtained my B.S. and M.S. from the School of Mathematical Sciences at Peking University. During my master's studies, I was fortunate to work with Prof. Jinwen Ma and Prof. Zhanxing Zhu.

Feel free to contact me if you would like to discuss or collaborate. Most of the time you can find me on the 2nd floor of Malone Hall; I always enjoy meeting people.

Email  /  CV  /  Google Scholar  /  GitHub

News
  • I will serve as a reviewer for ICML 2020.
Research

I am interested in mathematically motivated machine learning. My current interests include: 1) stochastic algorithms, 2) sketching algorithms, 3) reinforcement learning, and 4) explainable computer vision.


Obtaining Regularization for Free via Iterate Averaging
Jingfeng Wu, Vladimir Braverman, Lin F. Yang
Workshop on Optimization for Machine Learning (OPT), 2019, oral

For general strongly convex and smooth losses, regularization can be obtained for free by properly averaging the optimization path. The resulting regularization works similarly to weight decay.
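As a rough illustration of the idea (my own toy sketch, not the paper's algorithm: a geometrically weighted average of plain gradient-descent iterates on a least-squares problem, where the averaging weight beta, step size lr, and penalty strength lam are all hypothetical choices), the averaged iterate approximately tracks the explicitly L2-regularized (ridge) solution while the final iterate does not:

    # Toy sketch only: geometric averaging of unregularized GD iterates
    # compared against an explicit L2 (weight-decay / ridge) solution.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 10
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
    H, b = X.T @ X / n, X.T @ y / n      # quadratic loss: f(w) = w^T H w / 2 - b^T w + const

    lr, lam, steps = 0.01, 1.0, 20000
    beta = lr * lam                      # hypothetical averaging weight, chosen to mimic penalty lam
    w, w_avg, coef = np.zeros(d), np.zeros(d), beta

    for t in range(steps):
        w_avg += coef * w                # weight beta * (1 - beta)^t on the t-th iterate
        coef *= 1.0 - beta
        w -= lr * (H @ w - b)            # plain (unregularized) gradient descent step

    w_ridge = np.linalg.solve(H + lam * np.eye(d), b)   # explicit L2-regularized solution
    print(np.linalg.norm(w_avg - w_ridge), np.linalg.norm(w - w_ridge))

In this sketch the averaged iterate stays close to the ridge solution, whereas the last iterate converges to the unregularized least-squares solution; see the paper for the precise averaging scheme and guarantees.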


On the Noisy Gradient Descent that Generalizes as SGD
Jingfeng Wu, Wenqing Hu, Haoyi Xiong, Jun Huan, Vladimir Braverman, Zhanxing Zhu
Submitted
bibtex / arXiv

We study how the distribution class of the injected noise affects the regularization effects of noisy gradient methods.


Tangent-Normal Adversarial Regularization for Semi-supervised Learning
Bing Yu*, Jingfeng Wu*, Jinwen Ma, Zhanxing Zhu
Conference on Computer Vision and Pattern Recognition (CVPR), 2019, oral
bibtex / arXiv / slides / poster / code

We present a novel manifold regularization method for semi-supervised learning, realized via adversarial training.


The Anisotropic Noise in Stochastic Gradient Descent: Its Behavior of Escaping from Minima and Regularization Effects
Zhanxing Zhu*, Jingfeng Wu*, Bing Yu, Lei Wu, Jinwen Ma
International Conference on Machine Learning (ICML), 2019
bibtex / arXiv / slides / poster

We study the anisotropic noise structure of stochastic gradient descent and demonstrate how it helps the dynamics escape from sharp minima, yielding regularization effects.

Services
  • Reviewer for ICML 2020.
Experience
Notes

Template: this