news

Sep 25, 2024 Our latest work, Efficient Reinforcement Learning by Discovering Neural Pathways, was accepted at NeurIPS 2024.
Sep 03, 2024 Excited to join the model efficiency team at Cohere as a Research Intern!
May 20, 2024 Joining the Advanced Technologies group at Dolby Laboratories as a Ph.D. Research Intern! Will be working on novel efficient finetuning methods for both LLMs and multimodal VLMs.
Mar 05, 2023 Preliminary work on communication-efficient federated learning accepted at the ICLR 2023 Sparse Neural Networks workshop; the full version is now available on arXiv.
Oct 31, 2022 Our work, Explicit Group Sparse Projection with Applications to Deep Learning and NMF, has been published in Transactions on Machine Learning Research (TMLR). Available at: OpenReview
May 09, 2022 Joined FAIR at Meta AI as a Research Scientist Intern to work on efficient ML and model sparsity research. My compression research library was integrated as part of the open source Fairscale library.
Oct 20, 2021 Our paper, “Single-Shot Pruning for Offline Reinforcement Learning,” was accepted at the Offline Reinforcement Learning Workshop, NeurIPS 2021. Paper.
Aug 27, 2021 Started my Ph.D. at Georgia Tech. I will be working on developing new techniques for sparse deep learning, with potential applications in federated, multi-task, and multimodal learning.