
Aloha. I'm Shizhe Diao 👋

Hello! I'm Shizhe Diao, a final-year Ph.D. candidate in Computer Science at the Hong Kong University of Science and Technology (HKUST), supervised by Prof. Tong Zhang. My research focuses on the pre-training, efficient tuning, and adaptation of large language models.

Besides programming, I have an immense interest in swimming 🏊, kayaking 🚣, windsurfing 🏄, dinghy sailing ⛵, and stand-up paddling!


Pinned

  1. OptimalScale/LMFlow (Public)

     An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.

     Python · 8k stars · 809 forks

  2. active-prompt (Public)

     Source code for the paper "Active Prompting with Chain-of-Thought for Large Language Models".

     Python · 196 stars · 23 forks

  3. extreme-bert/extreme-bert (Public)

     ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper "ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized Language Models".

     Python · 282 stars · 15 forks

  4. ChatGPTPapers (Public)

     Must-read papers, related blogs, and API tools on pre-training and tuning methods for ChatGPT.

     314 stars · 18 forks

  5. ZEN (Public, forked from sinovation/ZEN)

     A BERT-based Chinese Text Encoder Enhanced by N-gram Representations.

     Python · 1 star · 1 fork

  6. T-DNA (Public)

     Source code for the ACL-IJCNLP 2021 paper "T-DNA: Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation" by Shizhe Diao et al.

     Python · 19 stars · 5 forks