Commit 4a0a4ce: Improve generate() API build instructions (#20181)
Authored by natke, Apr 5, 2024. Changed file: docs/genai/howto/build-from-source.md (85 additions, 70 deletions).

### Prerequisites

`cmake`

## Clone the onnxruntime-genai repo

```bash
git clone https://github.com/microsoft/onnxruntime-genai
cd onnxruntime-genai
```

## Install ONNX Runtime

By default, the onnxruntime-genai build expects to find the ONNX Runtime include and binaries in a folder called `ort` in the root directory of onnxruntime-genai. You can put the ONNX Runtime files in a different location and specify this location to the onnxruntime-genai build. These instructions use `ORT_HOME` as the location.

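If you use a custom location, it can help to export it as an environment variable so that the later copy commands can reference it. A minimal sketch (the `$HOME/ort` path is only an example, not a requirement):

```shell
# Pick a location for the ONNX Runtime headers and libraries.
# $HOME/ort is an example path; use any directory you like.
export ORT_HOME="$HOME/ort"
mkdir -p "$ORT_HOME/include" "$ORT_HOME/lib"
```

The copy commands in the following sections can then target `$ORT_HOME/include` and `$ORT_HOME/lib`.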
### Option 1: Install from release

These instructions are for the Linux GPU build of ONNX Runtime. Replace `linux-x64-gpu` in the file names below with your operating system and architecture of choice.

```bash
cd <ORT_HOME>
curl -LO https://github.com/microsoft/onnxruntime/releases/download/v1.17.1/onnxruntime-linux-x64-gpu-1.17.1.tgz
tar xvzf onnxruntime-linux-x64-gpu-1.17.1.tgz
mv onnxruntime-linux-x64-gpu-1.17.1/include .
mv onnxruntime-linux-x64-gpu-1.17.1/lib .
```

### Option 2: Install from nightly

Download the nightly NuGet package `Microsoft.ML.OnnxRuntime` from: https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT-Nightly.

Extract the NuGet package.

```bash
tar xvf Microsoft.ML.OnnxRuntime.1.18.0-dev-20240322-0323-ca825cb6e6.nupkg
```

Copy the include and lib files into `ORT_HOME`.

On Windows

An example is given for `win-x64`. Change this to your architecture if different.

```cmd
copy build\native\include\onnxruntime_c_api.h <ORT_HOME>\include
copy runtimes\win-x64\native\*.dll <ORT_HOME>\lib
```

On Linux

```bash
cp build/native/include/onnxruntime_c_api.h <ORT_HOME>/include
cp build/linux-x64/native/libonnxruntime*.so* <ORT_HOME>/lib
```

### Option 3: Build from source

```bash
git clone https://github.com/microsoft/onnxruntime.git
cd onnxruntime
```

Create `include` and `lib` folders in the `ORT_HOME` directory

```bash
mkdir <ORT_HOME>/include
mkdir <ORT_HOME>/lib
```

Build from source and copy the include and libraries into `ORT_HOME`

On Windows

```cmd
build.bat --build_shared_lib --skip_tests --parallel [--use_cuda]
copy include\onnxruntime\core\session\onnxruntime_c_api.h <ORT_HOME>\include
copy build\Windows\Debug\Debug\*.dll <ORT_HOME>\lib
```

On Linux

```bash
./build.sh --build_shared_lib --skip_tests --parallel [--use_cuda]
cp include/onnxruntime/core/session/onnxruntime_c_api.h <ORT_HOME>/include
cp build/Linux/RelWithDebInfo/libonnxruntime*.so* <ORT_HOME>/lib
```

## Build onnxruntime-genai

### Build for CPU

```bash
cd ..
python build.py [--ort_home <ORT_HOME>]
```

### Build for CUDA

These instructions assume you already have CUDA installed.

```bash
cd ..
python build.py --cuda_home <path to cuda home> [--ort_home <ORT_HOME>]
```
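The `--cuda_home` argument should point at the root of your CUDA toolkit installation. A sketch of locating it (the `/usr/local/cuda` fallback is an assumption based on the common Linux install prefix; adjust for your system):

```shell
# Use an existing CUDA_HOME if the environment already defines one;
# otherwise fall back to the common default install prefix.
CUDA_HOME="${CUDA_HOME:-/usr/local/cuda}"
echo "Using CUDA from: $CUDA_HOME"
```

You would then pass it to the build as `python build.py --cuda_home "$CUDA_HOME"`.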

## Install the library into your application

### Install Python wheel

```bash
cd build/wheel
pip install *.whl
```

### Install NuGet package

_Coming soon_

### Install C/C++ header file and library

_Coming soon_
