This sample uses the Code Llama model to help you write code.
Please check the repo's README for prerequisites for running this example.
```bash
$ cd api
$ spin build --up
```
Note: If you are using the Cloud GPU component, remember to reference the `runtime-config.toml` file, e.g.: `spin build --up --runtime-config-file ./runtime-config.toml`.
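As a point of reference, a `runtime-config.toml` for the Cloud GPU component routes LLM inference to a remote HTTP endpoint. The file is generated for you by the cloud-gpu plugin, so the URL and token below are placeholders, not values to copy:

```toml
# Illustrative sketch only: the cloud-gpu plugin generates this file,
# and the URL and auth token are specific to your own deployment.
[llm_compute]
type = "remote_http"
url = "https://your-cloud-gpu-endpoint.fermyon.app"
auth_token = "<token printed by the cloud-gpu plugin>"
```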
You will be using a client application written in Rust to send requests to the Spin application. Run the following command from the root directory to send a request to the application.
```bash
$ cargo r --bin client -- -l bash 'Find how large each directory named "target" is that is found in any subdirectory of my home directory'
```
To deploy the application, run:

```bash
$ cd api
$ spin deploy
```
After deploying, make sure to update the URL referenced in the `./client/src/main.rs` file so the client targets your deployed application instead of localhost.
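As a minimal sketch of what that change looks like, the client's target URL is typically a single hard-coded value. The constant name `API_URL` and the local default below are hypothetical and may not match the actual contents of `./client/src/main.rs`:

```rust
// Hypothetical sketch: the constant name and default value are illustrative,
// not the actual source of ./client/src/main.rs.
const API_URL: &str = "http://127.0.0.1:3000/api";

fn main() {
    // After `spin deploy`, replace API_URL with the URL printed by the
    // deploy command (a *.fermyon.app address for your application).
    println!("requests will be sent to: {}", API_URL);
}
```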