diff --git a/docs/genai/tutorials/finetune.md b/docs/genai/tutorials/finetune.md
index 5d0302b896dfc..3dd739d80340b 100644
--- a/docs/genai/tutorials/finetune.md
+++ b/docs/genai/tutorials/finetune.md
@@ -65,7 +65,7 @@ Olive generates models and adapters in ONNX format. These models and adapters ca
 
-   Note: this operations requires a system with an NVIDIA GPU, with CUDA installed
+   Note: this operation requires a system with an NVIDIA GPU and CUDA installed
 
-   Use the `olive fine-tune` command: https://microsoft.github.io/Olive/features/cli.html#finetune
+   Use the `olive fine-tune` command: https://microsoft.github.io/Olive/how-to/cli/cli-finetune.html
 
    Here is an example usage of the command:
 
@@ -75,12 +75,12 @@ Olive generates models and adapters in ONNX format. These models and adapters ca
 
 2. Optionally, quantize your model
 
-   Use the `olive quantize` command: https://microsoft.github.io/Olive/features/cli.html#quantize
+   Use the `olive quantize` command: https://microsoft.github.io/Olive/how-to/cli/cli-quantize.html
 
 3. Generate the ONNX model and adapter using the quantized model
 
-   Use the `olive auto-opt` command for this step: https://microsoft.github.io/Olive/features/cli.html#auto-opt
+   Use the `olive auto-opt` command for this step: https://microsoft.github.io/Olive/how-to/cli/cli-auto-opt.html
 
-   The `--adapter path` can either be a HuggingFace adapter reference, or a path to the adapter you fine-tuned above.
+   The `--adapter_path` can either be a HuggingFace adapter reference or a path to the adapter you fine-tuned above.
@@ -162,4 +162,4 @@ python app.py -m -a <.onnx_adapter files> -t -s
 
 ## References
 * [Python API docs](../api/python.md#adapter-class)
-* [Olive CLI docs](https://microsoft.github.io/Olive/features/cli.html)
+* [Olive CLI docs](https://microsoft.github.io/Olive/how-to/index.html#working-with-the-cli)
diff --git a/docs/tutorials/OpenVINO_EP_samples/squeezenet_classification_cpp.md b/docs/tutorials/OpenVINO_EP_samples/squeezenet_classification_cpp.md
index d0759ac028d55..46d32902393b7 100644
--- a/docs/tutorials/OpenVINO_EP_samples/squeezenet_classification_cpp.md
+++ b/docs/tutorials/OpenVINO_EP_samples/squeezenet_classification_cpp.md
@@ -14,7 +14,7 @@ The source code for this sample is available [here](https://github.com/microsoft
 
 # How to build
 
 ## Prerequisites
-1. [The Intel® Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+1. [The Intel® Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)
 2. Use opencv (use the same opencv package that comes builtin with Intel® Distribution of OpenVINO toolkit)
 
diff --git a/docs/tutorials/OpenVINO_EP_samples/tiny_yolo_v2_object_detection_python.md b/docs/tutorials/OpenVINO_EP_samples/tiny_yolo_v2_object_detection_python.md
index e6c6e756a2087..3ee8c610ef9d6 100644
--- a/docs/tutorials/OpenVINO_EP_samples/tiny_yolo_v2_object_detection_python.md
+++ b/docs/tutorials/OpenVINO_EP_samples/tiny_yolo_v2_object_detection_python.md
@@ -14,7 +14,7 @@ The source code for this sample is available [here](https://github.com/microsoft
 
 # How to build
 
 ## Prerequisites
-1. [The Intel® Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+1. [The Intel® Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)
 2. Download the latest tinyYOLOv2 model from the ONNX Model Zoo.
-This model was adapted from [ONNX Model Zoo](https://github.com/onnx/models).Download the latest version of the [tinyYOLOv2](https://github.com/onnx/models/tree/main/validated/vision/object_detection_segmentation/tiny-yolov2) model from here.
+This model was adapted from the [ONNX Model Zoo](https://github.com/onnx/models). Download the latest version of the [tinyYOLOv2](https://github.com/onnx/models/tree/main/validated/vision/object_detection_segmentation/tiny-yolov2) model.
diff --git a/docs/tutorials/csharp/yolov3_object_detection_csharp.md b/docs/tutorials/csharp/yolov3_object_detection_csharp.md
index dce5c44694eea..56f00b2a758eb 100644
--- a/docs/tutorials/csharp/yolov3_object_detection_csharp.md
+++ b/docs/tutorials/csharp/yolov3_object_detection_csharp.md
@@ -23,7 +23,7 @@ The source code for this sample is available [here](https://github.com/microsoft
 
 ## Prerequisites
 
-1. Install [.NET Core 3.1](https://dotnet.microsoft.com/download/dotnet-core/3.1) or higher for you OS (Mac, Windows or Linux).
+1. Install [.NET Core 3.1](https://dotnet.microsoft.com/download/dotnet-core/3.1) or higher for your OS (Mac, Windows, or Linux).
-2. [The Intel® Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+2. [The Intel® Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)
 3. Use any sample Image as input to the sample.
 