Commit

Fixed all links.
MaanavD committed Dec 10, 2024
1 parent f9cbd18 commit 824c734
Showing 4 changed files with 7 additions and 7 deletions.
8 changes: 4 additions & 4 deletions docs/genai/tutorials/finetune.md
@@ -65,7 +65,7 @@ Olive generates models and adapters in ONNX format. These models and adapters ca

Note: this operation requires a system with an NVIDIA GPU and CUDA installed.

- Use the `olive fine-tune` command: https://microsoft.github.io/Olive/features/cli.html#finetune
+ Use the `olive fine-tune` command: https://microsoft.github.io/Olive/how-to/cli/cli-finetune.html

Here is an example usage of the command:
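A minimal invocation might look like the sketch below. The flag spellings, model name, and dataset are illustrative assumptions drawn from the Olive CLI docs, not taken from this tutorial; check `olive fine-tune --help` for the exact options in your installed version:

```shell
# Fine-tune a base model with LoRA (sketch; flags and names are assumptions).
olive fine-tune \
    --method lora \
    -m <HuggingFace model name or local model path> \
    -d <dataset name> \
    -o <output folder for the fine-tuned model and adapter>
```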

@@ -75,12 +75,12 @@ Olive generates models and adapters in ONNX format. These models and adapters ca

2. Optionally, quantize your model

- Use the `olive quantize` command: https://microsoft.github.io/Olive/features/cli.html#quantize
+ Use the `olive quantize` command: https://microsoft.github.io/Olive/how-to/cli/cli-quantize.html
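As a rough sketch, quantization takes the fine-tuned model as input and writes a quantized copy; the algorithm name and flags below are assumptions, so consult `olive quantize --help` before use:

```shell
# Quantize the fine-tuned model (sketch; algorithm choice and flag
# spellings are assumptions, not taken from this tutorial).
olive quantize \
    -m <path to fine-tuned model> \
    --algorithm awq \
    -o <output folder for the quantized model>
```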


3. Generate the ONNX model and adapter using the quantized model

- Use the `olive auto-opt` command for this step: https://microsoft.github.io/Olive/features/cli.html#auto-opt
+ Use the `olive auto-opt` command for this step: https://microsoft.github.io/Olive/how-to/cli/cli-auto-opt.html

The `--adapter path` can either be a HuggingFace adapter reference, or a path to the adapter you fine-tuned above.
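Putting the step above together, an invocation might look like the following template; the exact flag names (including the adapter flag) are assumptions based on the Olive CLI docs, so verify them with `olive auto-opt --help`:

```shell
# Produce the ONNX model and adapter from the quantized model
# (sketch; flag names and device choice are assumptions).
olive auto-opt \
    -m <path to quantized model> \
    --adapter_path <HuggingFace adapter reference or local adapter path> \
    --device cpu \
    -o <output folder for the ONNX model and adapter>
```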

@@ -162,4 +162,4 @@ python app.py -m <model folder> -a <.onnx_adapter files> -t <prompt template> -s
## References

* [Python API docs](../api/python.md#adapter-class)
- * [Olive CLI docs](https://microsoft.github.io/Olive/features/cli.html)
+ * [Olive CLI docs](https://microsoft.github.io/Olive/how-to/index.html#working-with-the-cli)
@@ -14,7 +14,7 @@ The source code for this sample is available [here](https://github.com/microsoft
# How to build

## Prerequisites
- 1. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+ 1. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)

2. Use OpenCV (use the same OpenCV package that comes bundled with the Intel<sup>®</sup> Distribution of OpenVINO toolkit)

@@ -14,7 +14,7 @@ The source code for this sample is available [here](https://github.com/microsoft
# How to build

## Prerequisites
- 1. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+ 1. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)

2. Download the latest tinyYOLOv2 model from the ONNX Model Zoo.
This model was adapted from the [ONNX Model Zoo](https://github.com/onnx/models). Download the latest version of the [tinyYOLOv2](https://github.com/onnx/models/tree/main/validated/vision/object_detection_segmentation/tiny-yolov2) model.
2 changes: 1 addition & 1 deletion docs/tutorials/csharp/yolov3_object_detection_csharp.md
@@ -23,7 +23,7 @@ The source code for this sample is available [here](https://github.com/microsoft
## Prerequisites
1. Install [.NET Core 3.1](https://dotnet.microsoft.com/download/dotnet-core/3.1) or higher for your OS (Mac, Windows, or Linux).

- 2. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://docs.openvinotoolkit.org/latest/index.html)
+ 2. [The Intel<sup>®</sup> Distribution of OpenVINO toolkit](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html)

3. Use any sample Image as input to the sample.
