Merge branch 'main' into open-webui
shivaraj-bh committed Jun 13, 2024
2 parents d3d0ced + 4cb3824 commit bd7d185
Showing 3 changed files with 116 additions and 2 deletions.
8 changes: 6 additions & 2 deletions doc/ollama.md
@@ -1,6 +1,6 @@
 # Ollama
 
-[Ollama](https://github.com/ollama/ollama) enables you to get up and running with Llama 3, Mistral, Gemma, and other large language models.
+[Ollama](https://github.com/ollama/ollama) enables you to easily run large language models (LLMs) locally. It supports Llama 3, Mistral, Gemma and [many others](https://ollama.com/library).
 
 ## Getting Started
Expand All @@ -15,7 +15,9 @@

By default Ollama uses the CPU for inference. To enable GPU acceleration:

### Cuda
### CUDA

For NVIDIA GPUs.

```nix
# In `perSystem.process-compose.<name>`
@@ -29,6 +31,8 @@ By default Ollama uses the CPU for inference. To enable GPU acceleration:
 
 ### ROCm
 
+For Radeon GPUs.
+
 ```nix
 # In `perSystem.process-compose.<name>`
 {
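Both GPU snippets are truncated in this diff view; they follow the same pattern. A minimal sketch of a complete configuration, assuming the services-flake `services.ollama` module exposes an `acceleration` option and using `"ollama1"` as a hypothetical instance name:

```nix
# In `perSystem.process-compose.<name>`
{
  services.ollama."ollama1" = {
    enable = true;
    # Assumed option: "cuda" for NVIDIA GPUs, "rocm" for Radeon GPUs.
    acceleration = "cuda";
  };
}
```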
106 changes: 106 additions & 0 deletions example/llm/flake.lock


4 changes: 4 additions & 0 deletions flake.nix
@@ -17,6 +17,10 @@
       inherit overrideInputs;
       dir = "./example/simple";
     };
+    llm-example = {
+      inherit overrideInputs;
+      dir = "./example/llm";
+    };
     share-services-example = {
       overrideInputs = {
         inherit (overrideInputs) services-flake;
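The hunk above registers `./example/llm` as a flake check but does not show its contents. A hypothetical sketch of what a minimal `example/llm/flake.nix` along these lines might contain (input names and module paths are assumptions, not taken from this diff):

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
    flake-parts.url = "github:hercules-ci/flake-parts";
    process-compose-flake.url = "github:Platonic-Systems/process-compose-flake";
    services-flake.url = "github:juspay/services-flake";
  };
  outputs = inputs:
    inputs.flake-parts.lib.mkFlake { inherit inputs; } {
      systems = [ "x86_64-linux" "aarch64-darwin" ];
      imports = [ inputs.process-compose-flake.flakeModule ];
      perSystem = { ... }: {
        # One process-compose group running a single Ollama instance.
        process-compose."default" = {
          imports = [ inputs.services-flake.processComposeModules.default ];
          services.ollama."ollama1".enable = true;
        };
      };
    };
}
```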
