I am a Ph.D. student in Computer Science at UCLA, supervised by Cho-Jui Hsieh. I obtained my Bachelor's degree in Electrical Engineering and History from National Taiwan University. Before starting my Ph.D. journey, I worked with Shen-Shyang Ho as a Research Assistant at Nanyang Technological University.
My current research focuses on Efficient Machine Learning (EML). The goal of EML is to build smaller models with faster inference across devices. This includes compressing machine learning models, approximating model inference operations, or achieving both simultaneously. In addition to EML, I am also keen on exploring fundamental problems in machine learning such as adversarial robustness and catastrophic forgetting.
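As one concrete illustration of an EML technique, the sketch below compresses a dense weight matrix with a truncated SVD, replacing one large matrix with two thin factors. This is a minimal, generic example: the function name, shapes, and rank are illustrative assumptions, not the method of any particular paper.

```python
import numpy as np

def low_rank_compress(W, rank):
    """Approximate W with a rank-`rank` factorization via truncated SVD.

    Illustrative sketch only; rank selection in practice is data- and
    task-dependent.
    """
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # shape (m, rank)
    B = Vt[:rank, :]             # shape (rank, n)
    return A, B

# Replacing a dense layer y = W @ x with y = A @ (B @ x) reduces the
# parameter count from m*n to rank*(m+n), which also speeds up
# inference when rank is small relative to m and n.
W = np.random.randn(512, 512)
A, B = low_rank_compress(W, rank=64)
x = np.random.randn(512)
y_approx = A @ (B @ x)
```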
Selected Publications [Full List]
- DRONE: Data-aware Low-rank Compression for Large NLP Models. In Advances in Neural Information Processing Systems (NeurIPS), 2021.
- GroupReduce: Block-Wise Low-Rank Approximation for Neural Language Model Shrinking. In Advances in Neural Information Processing Systems (NeurIPS), 2018.
- Overcoming Catastrophic Forgetting by Bayesian Generative Regularization. In International Conference on Machine Learning (ICML), 2021.