v1.0
- Parallelformers is based on Megatron-LM and is designed to make model parallelization easier.
- You can parallelize various models from HuggingFace Transformers across multiple GPUs with a single line of code (see the sketch below).
- Currently, Parallelformers only supports inference. Training features are NOT included.
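
The single-line usage looks roughly like the following. This is a minimal sketch assuming the `parallelize` entry point exposed by the `parallelformers` package; the model name, GPU count, and generation arguments are illustrative placeholders, not part of this release note.

```python
# Minimal sketch: parallelize a HuggingFace model for inference.
# Assumes `parallelize` from the parallelformers package; model name
# and num_gpus are placeholders chosen for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from parallelformers import parallelize

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-2.7B")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")

# The single line: split the model across 2 GPUs for inference,
# using fp16 to reduce per-GPU memory.
parallelize(model, num_gpus=2, fp16=True)

# After parallelization, the model is used like any other
# Transformers model for inference.
inputs = tokenizer("Parallelformers is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```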