Riyasat Ohib

Ph.D. Candidate, Georgia Institute of Technology

I am a graduate student in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. My research focuses on sparse representations of neural networks and on machine learning more broadly. I currently work as a Graduate Research Assistant at the Center for Translational Research in Neuroimaging and Data Science (TReNDS), a joint research center of Georgia Tech, Georgia State University, and Emory University, under the supervision of Dr. Vince Calhoun and Dr. Sergey Plis.

I am primarily interested in sparsity in deep learning, model compression, efficient AI, and sparse learning. Beyond the efficiency that sparsity brings, I am interested in its potential to find better solutions. I am also involved in federated, multi-task, and multimodal learning.

Georgia Institute of Technology
Fall 2019 - Present

TReNDS Center
Fall 2019 - Present


news

Mar 5, 2023 Preliminary work on communication-efficient federated learning accepted at the ICLR 2023 Sparse Neural Networks workshop. Details coming soon!
Oct 31, 2022 Our work, “Explicit Group Sparse Projection with Applications to Deep Learning and NMF,” has been published in Transactions on Machine Learning Research (TMLR). Available on OpenReview.
May 9, 2022 Joined FAIR at Meta AI as a Research Scientist Intern to work on efficient ML and model sparsity research. My compression research library was integrated into the open-source FairScale library.
Oct 20, 2021 Our paper, “Single-Shot Pruning for Offline Reinforcement Learning,” was accepted at the Offline Reinforcement Learning Workshop, NeurIPS 2021. Paper.
Aug 27, 2021 Started my Ph.D. at Georgia Tech. I will be working on new techniques for sparse deep learning, with potential applications in federated, multi-task, and multimodal learning.
Aug 20, 2019 Started the ECE Master’s program at the Georgia Institute of Technology.

selected publications

  1. ICLR SNN
SalientGrads: Sparse Models for Communication Efficient and Data Aware Distributed Federated Training [coming soon]
    Riyasat Ohib, Bishal Thapaliya, Pratyush Reddy, Jingyu Liu, Vince Calhoun, and Sergey Plis
    ICLR Sparse Neural Networks Workshop 2023
  2. TMLR
    Explicit Group Sparse Projection with Applications to Deep Learning and NMF
    Riyasat Ohib, Nicolas Gillis, Niccolò Dalmasso, Sameena Shah, Vamsi Potluru, and Sergey Plis
    Transactions on Machine Learning Research 2022
  3. NeurIPS Off-RL
    Single-Shot Pruning for Offline Reinforcement Learning
Samin Yeasar Arnob, Riyasat Ohib, Sergey Plis, and Doina Precup
    NeurIPS Offline RL Workshop 2021
  4. ICLR HAET
    Grouped Sparse Projection for Deep Learning
    Riyasat Ohib, Nicolas Gillis, Sergey Plis, and Vamsi Potluru
ICLR Hardware Aware Efficient Training Workshop 2021