multi gpu training in one machine for BERT
wipen/BERT-multi-gpu
BERT MULTI GPU

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

REQUIREMENTS

Python 3

TensorFlow 1.12.0

TRAINING

0. Edit the input and output file names in create_pretraining_data.py and run_pretraining_gpu.py.

1. Run create_pretraining_data.py to convert the raw text into pretraining examples.

2. Run run_pretraining_gpu.py to start multi-GPU pretraining.
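Since the file names are hard-coded in the scripts rather than passed on the command line, the two run steps reduce to plain invocations (a sketch; the comments describe the expected behavior, not verified output):

```shell
# After editing the hard-coded input/output file names in both scripts
# (step 0 above), the remaining steps are plain script invocations:
python create_pretraining_data.py   # writes the pretraining example file(s)
python run_pretraining_gpu.py       # trains BERT across n_gpus GPUs
```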

PARAMETERS

Set n_gpus in run_pretraining_gpu.py to the number of GPUs to use on the machine.
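The n_gpus value drives a standard single-machine data-parallel pattern: the global batch is split into one slice per GPU, each GPU (tower) computes gradients on its slice, and the averaged gradients are applied once. A pure-Python sketch of that splitting and averaging (the function names and the toy "gradient" are ours, not from the repo; no TensorFlow involved):

```python
# Toy stand-in for the multi-GPU data-parallel pattern: shard the batch,
# compute a per-tower "gradient", then average across towers.

n_gpus = 2                      # mirrors n_gpus in run_pretraining_gpu.py
global_batch = list(range(8))   # stand-in for one global training batch

def shard(batch, n):
    """Split a batch into n equal per-GPU slices."""
    per_gpu = len(batch) // n
    return [batch[i * per_gpu:(i + 1) * per_gpu] for i in range(n)]

def tower_gradient(examples):
    """Toy per-tower 'gradient': here, just the mean of the examples."""
    return sum(examples) / len(examples)

def average_gradients(grads):
    """Average the per-tower gradients before the single weight update."""
    return sum(grads) / len(grads)

shards = shard(global_batch, n_gpus)
grads = [tower_gradient(s) for s in shards]
update = average_gradients(grads)
```

Because each GPU sees batch_size / n_gpus examples, the effective global batch (and memory use per GPU) scales with n_gpus.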

DATA

In sample_text.txt, each sentence ends with \n, and paragraphs are separated by an empty line.
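A minimal reader for that convention, one sentence per line and a blank line between paragraphs (the function name is ours; create_pretraining_data.py has its own input reader):

```python
# Parse text in the sample_text.txt format: one sentence per line,
# paragraphs separated by a blank line.

def read_documents(text):
    documents, current = [], []
    for line in text.split("\n"):
        line = line.strip()
        if not line:                 # blank line: paragraph boundary
            if current:
                documents.append(current)
                current = []
        else:                        # non-empty line: one sentence
            current.append(line)
    if current:                      # flush a trailing paragraph
        documents.append(current)
    return documents

sample = "First sentence.\nSecond sentence.\n\nNew paragraph starts here.\n"
docs = read_documents(sample)
```

Each returned document is a list of sentences, which is the grouping the pretraining-data script needs for next-sentence-prediction pairs.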
