news

May 20, 2024 Joining the Advanced Technologies group at Dolby Laboratories as a Ph.D. Research Intern! I will be working on novel, efficient fine-tuning methods for both LLMs and multimodal VLMs.
Mar 05, 2023 Preliminary work on communication-efficient federated learning accepted at the ICLR 2023 Sparse Neural Networks workshop; the full paper is now available on arXiv.
Oct 31, 2022 Our paper, “Explicit Group Sparse Projection with Applications to Deep Learning and NMF,” has been published in Transactions on Machine Learning Research (TMLR). Available at OpenReview.
May 09, 2022 Joined FAIR at Meta AI as a Research Scientist Intern to work on efficient ML and model sparsity research. My compression research library was integrated into the open-source Fairscale library.
Oct 20, 2021 Our paper, “Single-Shot Pruning for Offline Reinforcement Learning,” was accepted at the Offline Reinforcement Learning Workshop, NeurIPS 2021. Paper.
Aug 27, 2021 Started my Ph.D. at Georgia Tech. I will be developing new techniques for sparse deep learning, with potential applications in federated, multi-task, and multimodal learning.