[ACL 2024] AbsInstruct: Eliciting Abstraction Ability from LLMs through Explanation Tuning with Plausibility Estimation
This repository is the official implementation of ACL 2024 paper AbsInstruct: Eliciting Abstraction Ability from LLMs through Explanation Tuning with Plausibility Estimation.
The Python version is 3.8.5.
Requirements:
bert_score==0.3.13
datasets==2.13.1
evaluate==0.4.1
nltk==3.8.1
numpy==1.24.4
pandas==2.0.3
peft==0.6.2
rouge_score==0.1.2
scikit_learn==1.3.0
spacy==2.3.2
torch==2.0.1
transformers==4.34.0
You can install all requirements with the command
pip install -r requirements.txt
Our framework collects abstraction-detection instructions and combines them with Alpaca. The mixed instructions are in the data folder.
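Alpaca-style instruction records are plain JSON objects with `instruction`, `input`, and `output` fields. A minimal sketch of reading such data is below; the file name `mixed_instructions.json` and the record contents are hypothetical, so check the data folder for the actual files.

```python
import json

def load_instructions(path):
    """Load a list of Alpaca-style records (instruction/input/output)."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

# Illustrative Alpaca-format record; the content below is made up.
example = {
    "instruction": "Judge whether the abstraction of the event is plausible.",
    "input": "swim -> move",
    "output": "Plausible: swimming is one way of moving.",
}
print(example["instruction"])
```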
Usage and License Notices: Since the Alpaca dataset is released under CC BY-NC 4.0 (allowing only non-commercial use), our datasets are also released under the CC BY-NC 4.0 license. Models trained with our dataset should not be used outside of research purposes. Meanwhile, the code is licensed under Apache 2.0, a more permissive license.
If you use our code to build mixed datasets with other instruction datasets, the results are subject to both the Apache 2.0 license of our code and the licenses of the instruction datasets you use.
We provide the command to instruction-tune Llama2 (7B) in train_script.sh. You can replace it with other models you want. Remember to replace the placeholder [DIR_PATH] with your own path.
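Assuming the script contains [DIR_PATH] as a literal token, it can be filled in with sed before launching. The example line and path below are hypothetical stand-ins, not the actual contents of train_script.sh:

```shell
# Hypothetical path; point it at your own directory.
DIR_PATH="/path/to/AbsInstruct"
# Preview the substitution on a made-up example line:
echo "python train.py --output_dir [DIR_PATH]/ckpt" \
  | sed "s|\[DIR_PATH\]|${DIR_PATH}|g"
# Then apply it to the real script and run it:
# sed -i "s|\[DIR_PATH\]|${DIR_PATH}|g" train_script.sh
# bash train_script.sh
```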
The inference command is in inference_script.sh.
Please cite the following if you use the data or code in this repo:
@inproceedings{wang2024absinstruct,
title={AbsInstruct: Eliciting Abstraction Ability from LLMs through Explanation Tuning with Plausibility Estimation},
author={Wang, Zhaowei and Fan, Wei and Zong, Qing and Zhang, Hongming and Choi, Sehyun and Fang, Tianqing and Liu, Xin and Song, Yangqiu and Wong, Ginny Y and See, Simon},
booktitle={Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics},
year={2024}
}
This repo is maintained by Zhaowei Wang.