Prompt Compress

Online LLM prompt compression tool with AI analysis.

What it does

Prompt Compress is a tool for optimizing LLM prompts for efficiency and cost-effectiveness. Its main function is to reduce the number of tokens in a prompt while preserving its core meaning and intent, which matters because many AI models charge by token usage. Users supply a prompt and configure an optimization pipeline built from industry best-practice token-reduction techniques. The tool then sanity-checks the LLM's response to the compressed prompt and provides an analysis comparing it against the response to the original input. Prompt Compress uses Gemini at every stage of the transformation: to gather responses, count tokens, and perform the final analysis. A minimal sketch of that pipeline appears below.
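
The sketch below shows one way the compress / count / respond / analyze loop could be wired up with Gemini, assuming the @google/generative-ai SDK for Node. The compressPrompt helper, the GEMINI_API_KEY variable, and the model name are illustrative placeholders, not the project's actual implementation.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Assumed setup: API key in an environment variable, model choice illustrative.
const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

// Hypothetical local compression step: a couple of simple token-reduction
// techniques (drop filler words, collapse whitespace). The real pipeline
// is user-configurable and applies more sophisticated techniques.
function compressPrompt(prompt: string): string {
  const fillers = /\b(please|kindly|very|really|basically|actually)\b/gi;
  return prompt.replace(fillers, "").replace(/\s+/g, " ").trim();
}

async function compareAndAnalyze(original: string) {
  const compressed = compressPrompt(original);

  // Token counts for both prompts via Gemini.
  const [origTokens, compTokens] = await Promise.all([
    model.countTokens(original),
    model.countTokens(compressed),
  ]);

  // Responses to both prompts, so the compressed one can be sanity-checked.
  const [origResp, compResp] = await Promise.all([
    model.generateContent(original),
    model.generateContent(compressed),
  ]);

  // Analysis stage: ask Gemini whether the compressed prompt's response
  // still satisfies the original prompt's intent, using the original
  // response as a baseline.
  const analysis = await model.generateContent(
    `Original prompt:\n${original}\n\n` +
      `Response to the compressed prompt:\n${compResp.response.text()}\n\n` +
      `Baseline response to the original prompt:\n${origResp.response.text()}\n\n` +
      `Does the compressed-prompt response preserve the original intent? ` +
      `Summarize any meaningful differences.`
  );

  return {
    compressed,
    tokensSaved: origTokens.totalTokens - compTokens.totalTokens,
    analysis: analysis.response.text(),
  };
}
```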

Built with

  • Web/Chrome
  • Google Translate

Team

By

Glen Baker and Sam Partridge

From

United States