diff --git a/docs/source/index.mdx b/docs/source/index.mdx
index 52e35d61d7..0c9a9c87f1 100644
--- a/docs/source/index.mdx
+++ b/docs/source/index.mdx
@@ -33,6 +33,10 @@ As such, Optimum enables developers to efficiently use any of these platforms wi
     AWS Trainium/Inferentia
     Accelerate your training and inference workflows with AWS Trainium and AWS Inferentia
+    AMD Instinct GPUs
+    Available soon for AMD Instinct GPUs
     FuriosaAI
     Fast and efficient inference on FuriosaAI WARBOY
diff --git a/docs/source/installation.mdx b/docs/source/installation.mdx
index 5313839dfd..d8a41d973f 100644
--- a/docs/source/installation.mdx
+++ b/docs/source/installation.mdx
@@ -25,6 +25,7 @@ If you'd like to use the accelerator-specific features of 🤗 Optimum, you can
 | [ONNX Runtime](https://onnxruntime.ai/docs/) | `pip install --upgrade-strategy eager optimum[onnxruntime]` |
 | [Intel Neural Compressor (INC)](https://www.intel.com/content/www/us/en/developer/tools/oneapi/neural-compressor.html) | `pip install --upgrade-strategy eager optimum[neural-compressor]` |
 | [Intel OpenVINO](https://docs.openvino.ai/latest/index.html) | `pip install --upgrade-strategy eager optimum[openvino,nncf]` |
+| [AMD Instinct GPUs](https://www.amd.com/en/graphics/instinct-server-accelerators) | Available soon |
 | [Habana Gaudi Processor (HPU)](https://habana.ai/training/) | `pip install --upgrade-strategy eager optimum[habana]` |
 | [FuriosaAI](https://www.furiosa.ai/) | `pip install --upgrade-strategy eager optimum[furiosa]` |
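
Each extra in the table above exposes its own Optimum subpackage, so an installation can be sanity-checked right away. The snippet below is a minimal sketch assuming the `onnxruntime` extra has been installed; the checkpoint name is only an illustrative example, not something this patch prescribes.

```python
# Quick check that `pip install --upgrade-strategy eager optimum[onnxruntime]` worked:
# export a Transformers checkpoint to ONNX and run it through ONNX Runtime.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

# Example checkpoint, chosen only for illustration; any sequence-classification model works.
model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# export=True converts the PyTorch weights to ONNX on the fly.
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Installing Optimum with the onnxruntime extra worked."))
```

The other extras follow the same pattern, each exposing its own subpackage (for example `optimum.intel` or `optimum.habana`) once installed.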