- Homepage: link
- 🌱 I'm a PhD student at Tsinghua University, in the THUNLP lab.
- 🔭 I'm currently working on efficient LLMs and parameter-efficient tuning.
- ⚡ I'm one of the maintainers of the following open-source projects: BMTrain, ModelCenter, OpenPrompt, and OpenDelta.
Pinned repositories:
- OpenBMB/BMTrain: Efficient training (including pre-training and fine-tuning) for big models
- OpenBMB/ModelCenter: Efficient, low-resource, distributed transformer implementation based on BMTrain
- thunlp/OpenPrompt: An open-source framework for prompt-learning
- thunlp/OpenDelta: A plug-and-play library for parameter-efficient tuning (delta tuning)
- OpenBMB/MiniCPM: MiniCPM3-4B, an edge-side LLM that surpasses GPT-3.5-Turbo
- thunlp/Ouroboros: Speculative decoding with large-model-enhanced drafting (EMNLP 2024 main)