LLM Compress Prompt is a tool for reducing the size of prompts sent to LLMs.
This library is based on Microsoft's LLMLingua (https://github.com/microsoft/LLMLingua) prompt compression technique. LLMLingua is a highly performant prompt compression technique, but it is designed for organizations that host their own models. LLM Compress Prompt provides similar prompt compression without requiring a GPU: instead of running a local model, it uses third-party LLMs to perform the compression.
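To make the idea concrete, here is a minimal sketch of what compression via a third-party LLM can look like. The provider (OpenAI), model name, instruction wording, and the compress_prompt helper are illustrative assumptions for this sketch, not this library's actual API.

```python
# Hypothetical sketch: compress a prompt by asking a hosted LLM to shorten it.
# The provider, model name, and instruction wording below are assumptions,
# not the interface exposed by LLM Compress Prompt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def compress_prompt(prompt: str, target_ratio: float = 0.5) -> str:
    """Ask a third-party LLM to rewrite `prompt` at roughly `target_ratio`
    of its original length while keeping the key information."""
    instruction = (
        f"Compress the following prompt to about {int(target_ratio * 100)}% "
        "of its original length. Preserve all facts, constraints, and named "
        "entities; drop filler words and redundant phrasing.\n\n"
        f"{prompt}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": instruction}],
        temperature=0,
    )
    return response.choices[0].message.content


long_prompt = "Summarize the following meeting notes ..."  # verbose original prompt
short_prompt = compress_prompt(long_prompt, target_ratio=0.4)
print(short_prompt)  # compressed prompt, ready to send to the downstream task
```

Because the compression is delegated to a hosted model, no local GPU is needed; the trade-off is an extra API call (and its cost and latency) before the main request.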