From 3efae70443697a5796f7f4b36230f108538489ce Mon Sep 17 00:00:00 2001
From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com>
Date: Wed, 24 Apr 2024 11:49:30 +0000
Subject: [PATCH] awesome-stars category by topic update by github actions
 cron, created by starred

---
 topics.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/topics.md b/topics.md
index 662d9bc..87b57cd 100644
--- a/topics.md
+++ b/topics.md
@@ -579,7 +579,7 @@
 - [microsoft/LLMLingua](https://github.com/microsoft/LLMLingua) - To speed up LLMs' inference and enhance LLM's perceive of key information, compress the prompt and KV-Cache, which achieves up to 20x compression with minimal performance loss.
 - [thunlp/LLaVA-UHD](https://github.com/thunlp/LLaVA-UHD) - LLaVA-UHD: an LMM Perceiving Any Aspect Ratio and High-Resolution Images
-- [Blaizzy/mlx-vlm](https://github.com/Blaizzy/mlx-vlm) -
+- [Blaizzy/mlx-vlm](https://github.com/Blaizzy/mlx-vlm) - MLX-VLM is a package for running Vision LLMs locally on your Mac using MLX.
 - [hrishioa/tough-llm-tests](https://github.com/hrishioa/tough-llm-tests) - Some tough questions to test new models.
 - [meta-llama/llama3](https://github.com/meta-llama/llama3) - The official Meta Llama 3 GitHub site
 - [timpaul/form-extractor-prototype](https://github.com/timpaul/form-extractor-prototype) -