
llama.cpp: server-cuda-b4078 (public, latest)

Install from the command line
$ docker pull ghcr.io/agray3/llama.cpp:server-cuda-b4078
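Once pulled, the image can be started the same way as the upstream llama.cpp server CUDA images. A minimal sketch, assuming an NVIDIA GPU with the container toolkit installed and a GGUF model file at `./models/model.gguf` (the model path and generation flags here are illustrative, not part of this package's metadata):

```shell
# Mount a local models directory, expose the HTTP API on port 8080,
# and offload all layers to the GPU (standard llama-server flags).
docker run --gpus all \
  -v "$(pwd)/models:/models" \
  -p 8080:8080 \
  ghcr.io/agray3/llama.cpp:server-cuda-b4078 \
  -m /models/model.gguf \
  --host 0.0.0.0 --port 8080 \
  --n-gpu-layers 99
```

The server then accepts OpenAI-compatible requests on `http://localhost:8080`; adjust `-m` and `--n-gpu-layers` for your model and VRAM budget.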

Recent tagged image versions

  • Published about 2 months ago · Digest sha256:0fa72138445ddef56fb297737a3234f443c5a6cdc69dcfc3ad33e53dae5f0232 · 4 version downloads
  • Published about 2 months ago · Digest sha256:a51f1ac652f754979c849b8fc3e861c4b0608b5f8306ed4408282fd81ea36c75 · 4 version downloads
  • Published about 2 months ago · Digest sha256:341776b3ac5688258e507343130c9678b2f471c640d36c99be5d0cae932c6243 · 4 version downloads
  • Published about 2 months ago · Digest sha256:34cc20378a718560c327658781e33cd55913173515b9ca3f2e046ac730bde1c0 · 4 version downloads
  • Published about 2 months ago · Digest sha256:88f5b5b6c73807abd982853182c79d4e6f2a3676e3808108c7d7fcedde374ef9 · 4 version downloads

Details

Last published: 2 months ago
Total downloads: 2.15K