Hi, I am a postdoctoral fellow in the VITA group and the Institute for Foundations of Machine Learning (IFML) at UT Austin, under the supervision of Atlas Wang. I obtained my Ph.D. at the Eindhoven University of Technology (TU/e), the Netherlands, under the supervision of Mykola Pechenizkiy and Decebal Constantin Mocanu.
Machine Learning, Deep Learning, Sparse Neural Network Training, Sparsity, Computer Vision, Efficient Neural Networks.
1/2023, four papers got accepted at ICLR 2023: Ramanujan Graph Pruning (oral, top 5%), Sparsity May Cry Benchmark (spotlight, top 25%), MoE as Dropout (spotlight, top 25%), and SLaK: 51x51 Large Conv. Looking forward to meeting everyone and having fun in Rwanda.
12/2022, our Untrained GNNs paper received the Best Paper Award from LoG 2022.
11/2022, our Lottery-Pools paper got accepted in AAAI 2023.
9/2022, our sparse large kernel on time series paper got accepted at NeurIPS 2022.
25/8/2022, I moved to Austin, Texas, USA as a postdoctoral fellow in the VITA group and the Institute for Foundations of Machine Learning (IFML) at UT Austin, under the supervision of Atlas Wang.
7/2022, our Brain-inspired Highly Sparse NN paper got accepted for publication in the Machine Learning Journal.
5/2022, our Sup-Tickets got accepted by UAI 2022.
4/2022, our tutorial Sparse Neural Networks Training has been accepted at ECMLPKDD 2022.
6/4/2022, I received my PhD cum laude (distinguished thesis).
3/2022, my PhD thesis abstract got accepted by IDA 2022, which was also the first conference (symposium) I attended in the first year of my PhD. PhD life is a cycle :).
2/2022, I am honored to receive the postdoctoral fellowship at IFML of The University of Texas at Austin.
1/2022, (2/3) two of my first-author papers were accepted by ICLR 2022: Random Pruning and FreeTickets.
12/2021, I received the “Outstanding Intern” honor at JD Explore Academy.
9/2021, (1/1) my first-author paper was accepted by NeurIPS 2021: GraNet.
6/2021, I moved to Beijing, China for my internship at JD Explore Academy, under the supervision of Li Shen and Dacheng Tao.
5/2021, (2/2) two of my first-author papers were accepted by ICML 2021: In-Time Over-Parameterization and Selfish RNN.
 Tianjin Huang, Tianlong Chen, Meng Fang, Vlado Menkovski, Jiaxu Zhao, Lu Yin, Yulong Pei, Decebal Constantin Mocanu, Zhangyang Wang, Mykola Pechenizkiy, Shiwei Liu. “You Can Have Better Graph Neural Networks by Not Training Weights at All: Finding Untrained Graph Tickets.” Learning on Graphs Conference (LoG), 2022. Best Paper Award. [Paper].
 Duc N.M Hoang, Shiwei Liu, Radu Marculescu, Zhangyang Wang. “Revisiting Pruning at Initialization Through the Lens of Ramanujan Graph.” International Conference on Learning Representations (ICLR), 2023. Notable-Top-5% Oral. [Paper].
 Shiwei Liu*, Tianlong Chen*, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, and Zhangyang Wang. “Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!” International Conference on Learning Representations (ICLR), 2023. Notable-Top-25% Spotlight. [Paper].
 Shiwei Liu, Tianlong Chen*, Xiaohan Chen*, Xuxi Chen, Qiao Xiao, Boqian Wu, Mykola Pechenizkiy, Decebal Mocanu, and Zhangyang Wang. “More Convnets in the 2020s: Scaling Up Kernels Beyond 51x51 Using Sparsity.” International Conference on Learning Representations (ICLR), 2023. [Paper].
 Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, and Mykola Pechenizkiy. “Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training”. The Thirty-Eighth International Conference on Machine Learning (ICML), PMLR, 2021. [Paper].
Honors and Awards
🏆 Best Paper Award, Learning on Graphs Conference (LoG 2022)
🏆 Cum Laude (distinguished Ph.D. thesis), Eindhoven University of Technology (TU/e)
🏆 Outstanding Intern, JD Explore Academy
🏆 IFML Postdoctoral Fellowship, UT Austin, US
🏆 Outstanding Graduate, North University of China, China
|Aug. 2022 - Present||Postdoc||The University of Texas at Austin|
|Jun. 2021 - Nov. 2021||Research Intern||JD Explore Academy|
|Mar. 2018 - Mar. 2022||Ph.D.||Eindhoven University of Technology|