Riyasat Ohib

Georgia Institute of Technology. Ph.D. Candidate


I am a Graduate Student in the Department of Electrical and Computer Engineering at the Georgia Institute of Technology. While I have a broad interest in learning algorithms, my current research centers on developing sparse and efficient neural networks and on understanding the intricacies of their training process. I work as a Graduate Research Assistant at the Center for Translational Research in Neuroimaging and Data Science (TReNDS), a joint research center of Georgia Tech, Georgia State, and Emory University, under the supervision of Dr. Vince Calhoun and Dr. Sergey Plis.

My interests span most of deep learning, with a current focus on efficient AI, sparse deep learning, and multimodal learning. I have also dabbled in research on multi-task reinforcement learning.

Please reach out if you would like to know more about my research, discuss AI research, or collaborate.


Georgia Tech

Aug 2019 - Present

FAIR at Meta AI

Summer 2022

Dolby Labs

Summer 2024

TReNDS Center

Aug 2019 - Present


news

May 20, 2024 Joining the Advanced Technologies group at Dolby Laboratories as a Ph.D. Research Intern! I will be working on novel efficient fine-tuning methods for both LLMs and multimodal VLMs.
Mar 05, 2023 Preliminary work on communication-efficient federated learning accepted at the ICLR 2023 Sparse Neural Networks workshop; the full paper is now out on arXiv.
Oct 31, 2022 Our work, Explicit Group Sparse Projection with Applications to Deep Learning and NMF, has been published in Transactions on Machine Learning Research (TMLR). Available at: OpenReview
May 09, 2022 Joined FAIR at Meta AI as a Research Scientist Intern to work on efficient ML and model sparsity research. My compression research library was integrated into the open-source Fairscale library.
Oct 20, 2021 Our paper, “Single-Shot Pruning for Offline Reinforcement Learning”, was accepted at the Offline Reinforcement Learning Workshop, NeurIPS 2021. - Paper.
Aug 27, 2021 Started my Ph.D. at Georgia Tech. I will be working on developing new techniques for sparse deep learning, with potential applications in federated, multi-task, and multimodal learning.

selected publications

  1. arXiv
    Unmasking Efficiency: Learning Salient Sparse Models in Non-IID Federated Learning
    Riyasat Ohib, Bishal Thapaliya, Gintare Karolina Dziugaite, and 3 more authors
    arXiv, 2024
  2. ICLR SNN
    SalientGrads: Sparse Models for Communication Efficient and Data Aware Distributed Federated Training
    Riyasat Ohib, Bishal Thapaliya, Pratyush Reddy, and 3 more authors
    ICLR Sparse Neural Networks Workshop, 2023
  3. TMLR
    Explicit Group Sparse Projection with Applications to Deep Learning and NMF
    Riyasat Ohib, Nicolas Gillis, Niccolò Dalmasso, and 3 more authors
    Transactions on Machine Learning Research, 2022
  4. NeurIPS Off-RL
    Single-Shot Pruning for Offline Reinforcement Learning
    Samin Yeasar, Riyasat Ohib, Sergey Plis, and 1 more author
    NeurIPS Offline RL Workshop, 2021
  5. ICLR HAET
    Grouped Sparse Projection for Deep Learning
    Riyasat Ohib, Nicolas Gillis, Sergey Plis, and 1 more author
    ICLR Hardware Aware Efficient Training Workshop, 2021