llama.cpp server-cuda-b4078 (Public, Latest)
Install from the command line
$ docker pull ghcr.io/agray3/llama.cpp:server-cuda-b4078
linux/amd64:
$ docker pull ghcr.io/agray3/llama.cpp:server-cuda-b4078@sha256:07628121816deb2f0a8eacdd7e7fdf1c546c91d3196cfebee15c1781a1cf70b0
unknown/unknown:
$ docker pull ghcr.io/agray3/llama.cpp:server-cuda-b4078@sha256:ac50b58e1df9aa648d1237f302f97891d2db31f7f30b1941ee7950a826f96bf4
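Once pulled, the image can be started as a server. A minimal sketch, assuming a local GGUF model directory, the default llama.cpp server port, and a host with the NVIDIA Container Toolkit installed; the model path and GPU layer count are illustrative, not part of this package page:

```shell
# Run the CUDA-enabled llama.cpp server from the pulled image.
# /path/to/models and model.gguf are placeholders for your own files.
docker run --gpus all -p 8080:8080 \
  -v /path/to/models:/models \
  ghcr.io/agray3/llama.cpp:server-cuda-b4078 \
  -m /models/model.gguf \
  --host 0.0.0.0 --port 8080 \
  --n-gpu-layers 99
```

With the container running, the server exposes an HTTP API on port 8080 (e.g. `curl http://localhost:8080/health`).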
Recent tagged image versions
- 4 Version downloads
Last published: 2 months ago