Does Human Collaboration Enhance the Accuracy of Identifying LLM-Generated Deepfake Texts?

Arxiv Articles Repo

This repository contains the data and code for the Amazon MTurk human studies investigating whether human collaboration can improve the accuracy of detecting LLM-generated deepfake texts.

This work supports the AAAI HCOMP 2023 paper "Does Human Collaboration Enhance the Accuracy of Identifying LLM-Generated Deepfake Texts?" by Adaku Uchendu*, Jooyoung Lee*, Hua Shen*, Thai Le, Ting-Hao 'Kenneth' Huang, and Dongwon Lee.

Citation

If you find this repo helpful to your research, please cite the paper:

@inproceedings{uchendu2023understanding,
  title={Does Human Collaboration Enhance the Accuracy of Identifying LLM-Generated Deepfake Texts?},
  author={Uchendu, Adaku and Lee, Jooyoung and Shen, Hua and Le, Thai and Huang, Ting-Hao 'Kenneth' and Lee, Dongwon},
  booktitle={Proceedings of the AAAI Conference on Human Computation and Crowdsourcing},
  year={2023},
}

Add a link to download only datasets (3 paragraph CSV file)?

Download the dataset

You can check and download the dataset used in this study.

The dataset includes 50 instances (articles), where each instance consists of three paragraphs. Within each instance, one randomly selected paragraph is generated by an LLM and the other two are written by humans.
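As a rough guide, the snippet below sketches how such a CSV could be loaded with pandas; the file name and column names (paragraph_1, machine_paragraph_index, ...) are illustrative assumptions, not the repo's exact schema.

# Minimal loading sketch; file name and column names below are hypothetical.
import pandas as pd

df = pd.read_csv("deepfake_paragraphs.csv")   # hypothetical file name
print(len(df))                                # expected: 50 articles

for _, row in df.iterrows():
    # hypothetical columns: three paragraphs plus a label marking the LLM-generated one
    paragraphs = [row["paragraph_1"], row["paragraph_2"], row["paragraph_3"]]
    machine_idx = row["machine_paragraph_index"]
    # one paragraph is LLM-generated, the other two are human-written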

Download all data in repo

You can also download all the data in this repo as a zip file, which includes the aforementioned dataset and the code for conducting the human study on Amazon MTurk.

Human Study: Distinguishing Paragraphs Generated by Humans or AI Machines

Step 1: Design the templates for "Group1" and "Group2"

import os

root_dir = 'the project path'  # path to your local copy of this repository
group1_template = os.path.join(root_dir, "generate_hit_htmls", "group1_template.html")
group2_template = os.path.join(root_dir, "generate_hit_htmls", "group2_template.html")

Step 2: Generate all HITs from the data and templates

$ cd root_dir/generate_hit_htmls
$ python generate_hits.py

All generated HIT HTML files are saved in "root_dir/generate_hit_htmls/htmls/group1" and "root_dir/generate_hit_htmls/htmls/group2".
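For reference, generate_hits.py presumably substitutes each article's three paragraphs into the group templates and writes one HTML page per HIT. The sketch below shows how such template filling might look with Python's string.Template; the placeholder names are assumptions, not the actual placeholders used in the templates.

# Illustrative sketch of producing a HIT HTML page from a template.
# Placeholder names ($paragraph_1, ...) are hypothetical; see generate_hits.py
# for the actual logic.
import os
from string import Template

def render_hit(template_path, out_dir, article_id, paragraphs):
    with open(template_path) as f:
        template = Template(f.read())
    # fill the three paragraphs of one article into the page
    html = template.substitute(
        paragraph_1=paragraphs[0],
        paragraph_2=paragraphs[1],
        paragraph_3=paragraphs[2],
    )
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, f"hit_{article_id}.html")
    with open(out_path, "w") as f:
        f.write(html)
    return out_path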

Step 3: Deploy HITs to MTurk

$ cd root_dir/deploy_mturk/product

To create HITs:
$ python create_hit_product.py

To retrieve HITs:
$ python retrieve_hit_product.py
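As a rough illustration of what these scripts do, the following boto3 sketch creates one HIT from a rendered HTML page and later retrieves its assignments. The reward, durations, file path, and sandbox endpoint are illustrative assumptions; consult create_hit_product.py and retrieve_hit_product.py for the study's actual settings.

# Minimal boto3 sketch (illustrative, not the repo's exact code):
# post one rendered HTML page as an MTurk HIT, then pull its assignments.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",  # sandbox for safe testing
)

with open("htmls/group1/hit_0.html") as f:   # a HIT page generated in Step 2 (hypothetical file name)
    html_content = f.read()

# Wrap the page in MTurk's HTMLQuestion schema
question_xml = f"""
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[{html_content}]]></HTMLContent>
  <FrameHeight>800</FrameHeight>
</HTMLQuestion>
"""

# Create the HIT (parameters are illustrative)
response = mturk.create_hit(
    Title="Which paragraph was written by AI?",
    Description="Read three paragraphs and identify the machine-generated one.",
    Keywords="text, classification, deepfake",
    Reward="0.50",
    MaxAssignments=3,
    LifetimeInSeconds=7 * 24 * 3600,
    AssignmentDurationInSeconds=30 * 60,
    Question=question_xml,
)
hit_id = response["HIT"]["HITId"]

# Retrieve submitted answers for the HIT
assignments = mturk.list_assignments_for_hit(
    HITId=hit_id,
    AssignmentStatuses=["Submitted", "Approved"],
)["Assignments"]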
