From 34d52ab90ef9995ab51298b02827e58f96a4a7b5 Mon Sep 17 00:00:00 2001
From: kshama-msft <66488860+kshama-msft@users.noreply.github.com>
Date: Fri, 27 Oct 2023 20:11:54 +0000
Subject: [PATCH] More content for large model training
---
src/routes/training/+page.svelte | 6 ++++--
1 file changed, 4 insertions(+), 2 deletions(-)
diff --git a/src/routes/training/+page.svelte b/src/routes/training/+page.svelte
index 5babb9b9382c5..2130c9fe8ebb6 100644
--- a/src/routes/training/+page.svelte
+++ b/src/routes/training/+page.svelte
@@ -27,8 +27,10 @@
  ORTModule accelerates training of large transformer based PyTorch models. The training time and
- cost is reduced with a few lines of code change. It is built on top of highly successful and
- proven technologies of ONNX Runtime and ONNX format.
+ training cost is reduced with a few lines of code change. It is built on top of highly successful and
+ proven technologies of ONNX Runtime and ONNX format. It is composable with technologies like DeepSpeed and
+ accelerates pre-training and fine-tuning for state-of-the-art LLMs. It is integrated into the Hugging Face Optimum
+ library, which provides an ORTTrainer API to use ONNX Runtime as the backend for training acceleration.
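
For context, the "few lines of code change" described in the new copy typically means wrapping an existing PyTorch model with ORTModule so the forward and backward passes run through ONNX Runtime. A minimal sketch follows, assuming the torch-ort package is installed; the model, data, and hyperparameters are toy placeholders, not part of this patch.

```python
# Illustrative sketch: wrap an existing PyTorch nn.Module with ORTModule.
# Assumes torch-ort is installed (pip install torch-ort); everything else
# in the training loop stays plain PyTorch.
import torch
from torch_ort import ORTModule


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(128, 10)

    def forward(self, x):
        return self.linear(x)


# The "few lines of code change": wrap the existing model in ORTModule.
model = ORTModule(TinyModel())
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

# One toy training step; forward/backward now execute via ONNX Runtime.
inputs = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()
optimizer.step()
```

The Hugging Face Optimum integration mentioned in the new text works at a higher level: optimum.onnxruntime provides an ORTTrainer that can be swapped in for the standard Trainer to use ONNX Runtime as the training backend.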