Pinned

  1. Universal_Decoder (Public)

    A decoder-only variant based on the "Universal Transformers" paper (originally an encoder-decoder), trained on natural data rather than algorithmic tasks.

    Python 2
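The defining idea of the Universal Transformer is that one weight-tied layer is applied repeatedly, rather than stacking distinct layers. A minimal sketch of that recurrence, with illustrative names (`shared_step`, `universal_decode`) that are not taken from the repo, and a toy affine update standing in for a real decoder block:

```python
# Hypothetical sketch of the Universal Transformer's core idea: a single
# weight-tied layer applied repeatedly, instead of a stack of distinct layers.

def shared_step(state, weights):
    # Stand-in for one decoder block (self-attention + feed-forward);
    # a simple affine update keeps the example self-contained.
    return [weights["w"] * s + weights["b"] for s in state]

def universal_decode(state, weights, num_steps):
    # The same `weights` are reused at every step -- this weight tying is
    # what distinguishes the Universal Transformer from a vanilla stack.
    for _ in range(num_steps):
        state = shared_step(state, weights)
    return state

out = universal_decode([1.0, 2.0], {"w": 0.5, "b": 1.0}, num_steps=3)
# -> [1.875, 2.0]
```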

  2. FP_vs_1Bit_InductionHeadCircuit (Public)

    Comparing full-precision query, key, and value matrices with their 1-bit counterparts in a two-layer, attention-only transformer trained on a synthetic copying task.

    Python 1
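A 1-bit weight matrix in the sense the description suggests can be obtained by sign binarization with a per-matrix scale (as in BinaryConnect/BitNet-style schemes). A minimal sketch, with an illustrative function name not taken from the repo:

```python
# Hypothetical sketch of 1-bit weight quantization: each entry collapses
# to +scale or -scale, where scale is the mean absolute value, so the
# binarized matrix roughly preserves the original magnitude.

def binarize(matrix):
    flat = [abs(v) for row in matrix for v in row]
    scale = sum(flat) / len(flat)
    # Only the sign of each weight survives: 1 bit of information per entry.
    return [[scale if v >= 0 else -scale for v in row] for row in matrix]

W = [[0.3, -0.7], [1.1, -0.1]]
W_1bit = binarize(W)
# scale = (0.3 + 0.7 + 1.1 + 0.1) / 4 = 0.55
```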

  3. baby_mamba (Public)

    A minimal implementation of Mamba, built to understand how it works.

    Python 2
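At Mamba's core is a selective state-space recurrence, h_t = A·h_{t-1} + B_t·x_t with readout y_t = C_t·h_t, where B_t and C_t depend on the input ("selective"). A scalar-state sketch under that assumption; the toy input dependence and the name `selective_ssm` are illustrative, not taken from the repo:

```python
# Hypothetical sketch of a selective state-space recurrence, scalar state
# for brevity. In Mamba, B and C come from learned input projections; a
# toy dependence stands in for that here.

def selective_ssm(xs, a=0.9):
    h, ys = 0.0, []
    for x in xs:
        b = 1.0 + 0.1 * x          # input-dependent ("selective") gate
        c = 1.0
        h = a * h + b * x          # state update
        ys.append(c * h)           # readout
    return ys

ys = selective_ssm([1.0, 0.0, 2.0])
```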

  4. BERT_with_Residual_vs_Highway (Public)

    Comparing the residual stream with the highway stream in transformers (BERT).

    Python 3
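The contrast being compared can be stated in two lines: a residual connection adds the sub-layer output to the input unconditionally, while a highway connection gates between them. A minimal sketch with illustrative names and a stand-in transformation:

```python
# Hypothetical sketch of the two streams. `block` stands in for a
# transformer sub-layer (attention or feed-forward).

def block(x):
    return 2.0 * x

def residual(x):
    # Residual stream: y = x + f(x); the input always passes through.
    return x + block(x)

def highway(x, gate):
    # Highway stream: y = g * f(x) + (1 - g) * x, with a learned gate g.
    return gate * block(x) + (1.0 - gate) * x

y_res = residual(1.0)            # 1 + 2 = 3.0
y_hwy = highway(1.0, gate=0.25)  # 0.25*2 + 0.75*1 = 1.25
```

Note that with gate = 0.5 the highway form differs from the residual form only by a factor of two, which is one reason the two streams are worth comparing head-to-head.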

  5. ResLM (Public)

    Querying a language model in a residual path for a drug-target interaction task.

    Python 2

  6. bonaventuredossou/afrivec (Public)

    Implementation of the paper "AfriVEC: Word Embedding Models for African Languages. Case Study of Fon and Nobiin" by Bonaventure F. P. Dossou and Mohammed Sabry.

    Python 4 2