Explore Projects

23 analyzed repos ready for contributors.

og-drizzles/trache

11 · Python

Trache is a CLI tool that caches a Trello board locally in SQLite and provides Git-style pull/push semantics for reading and mutating cards without hitting the Trello API repeatedly. It's specifica...

Collab 4 · Activity 3
medium barrier · cli-tool · ai-agent · api-wrapper · memory-layer · claude-code-helper · dev-tooling

Great for people interested in building local-first caching layers for external SaaS APIs, or optimizing AI agent workflows to reduce expensive API round-trips — specifically in the Trello/project-management space.
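The pull/push caching idea the description sketches can be shown in a minimal Python sketch, assuming an SQLite table with a dirty flag; function names and schema are illustrative, not trache's actual implementation.

```python
import json
import sqlite3

# Hypothetical sketch of git-style pull/push caching over SQLite.
# pull() takes one API round-trip; edits stay local until push().

def pull(conn, fetch_remote):
    """Refresh the local cache from the remote API in one round-trip."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cards "
        "(id TEXT PRIMARY KEY, data TEXT, dirty INTEGER DEFAULT 0)")
    for card in fetch_remote():
        conn.execute(
            "INSERT OR REPLACE INTO cards (id, data, dirty) VALUES (?, ?, 0)",
            (card["id"], json.dumps(card)))
    conn.commit()

def edit(conn, card_id, **fields):
    """Mutate a card locally; no API call happens here."""
    (data,) = conn.execute(
        "SELECT data FROM cards WHERE id = ?", (card_id,)).fetchone()
    card = json.loads(data)
    card.update(fields)
    conn.execute(
        "UPDATE cards SET data = ?, dirty = 1 WHERE id = ?",
        (json.dumps(card), card_id))
    conn.commit()

def push(conn, send_remote):
    """Write only locally-mutated ('dirty') cards back to the API."""
    for (data,) in conn.execute("SELECT data FROM cards WHERE dirty = 1"):
        send_remote(json.loads(data))
    conn.execute("UPDATE cards SET dirty = 0")
    conn.commit()
```

Reads then hit only the local table, so repeated lookups cost zero API calls between pulls.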

agulaya24/baselayer

13 · Python

Base Layer is a CLI tool and MCP server that processes personal text corpora (ChatGPT exports, journals, books) through a 4-step LLM pipeline (extract facts → author identity layers → compose brief...

Collab 3 · Activity 3
high barrier · mcp-server · mcp-client · memory-layer · cli-tool · ai-agent · api-wrapper

Great for people interested in personal AI memory systems, behavioral modeling from text corpora, or LLM-as-judge evaluation frameworks — specifically the problem of compressing identity signal from unstructured personal data into dense, injectable context.

dandaka/traul

1 · TypeScript

Traul is a local-first CLI tool that syncs messages from Slack, Discord, Telegram, Gmail, Linear, WhatsApp, and Claude Code sessions into a SQLite database, then lets you search them with FTS5 keyw...

Collab 5 · Activity 3
medium barrier · cli-tool · memory-layer · ai-agent · automation · api-wrapper

People building local-first personal data pipelines, or anyone interested in giving AI agents searchable memory over real communication data (Slack, Telegram, Gmail) using SQLite FTS5 + vector embeddings without sending data to a cloud service.
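The FTS5 keyword-search layer described above can be sketched in a few lines of Python; the table and column names here are assumptions, not Traul's actual schema (Traul itself is TypeScript), and the sketch needs an SQLite build with FTS5 enabled, which standard Python distributions include.

```python
import sqlite3

# Illustrative sketch: index synced messages in an FTS5 virtual table,
# then run ranked keyword search. Schema is hypothetical.

def build_index(conn, messages):
    """messages: iterable of (source, sender, body) tuples."""
    conn.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS msg_fts "
        "USING fts5(source, sender, body)")
    conn.executemany(
        "INSERT INTO msg_fts (source, sender, body) VALUES (?, ?, ?)",
        messages)
    conn.commit()

def search(conn, query, limit=10):
    # bm25() ranks better matches with lower scores, so ascending order
    # puts the most relevant hits first.
    return conn.execute(
        "SELECT source, sender, body FROM msg_fts "
        "WHERE msg_fts MATCH ? ORDER BY bm25(msg_fts) LIMIT ?",
        (query, limit)).fetchall()
```

Everything stays in the local database file, which is the "without sending data to a cloud service" property the blurb highlights.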

agentailor/slimcontext-mcp-server

14 · TypeScript

A thin MCP server wrapper around the SlimContext npm library that exposes two chat history compression tools: token-based trimming (drop oldest messages) and AI-powered summarization via OpenAI. It...

Collab 2 · Activity 1
low barrier · mcp-server · api-wrapper · memory-layer · ai-agent

Great for people interested in context window management for LLM applications, specifically implementing chat history compression strategies within the MCP ecosystem.
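The token-based trimming tool described above (drop the oldest messages until the history fits a budget) can be sketched like this; the 4-characters-per-token estimate is an assumption rather than SlimContext's tokenizer, and the sketch is Python rather than the project's TypeScript.

```python
# Minimal sketch of token-budget history trimming: system messages are
# pinned, and the oldest non-system messages are dropped first.

def estimate_tokens(message):
    # Crude heuristic: ~4 characters per token (assumption, not the
    # library's real tokenizer).
    return max(1, len(message["content"]) // 4)

def trim_history(messages, max_tokens):
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(map(estimate_tokens, system + rest)) > max_tokens:
        rest.pop(0)  # drop the oldest message first
    return system + rest
```

Pinning system messages matters because dropping them would change the model's instructions, not just its memory of the conversation.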

chopratejas/headroom

7743 · Python

Headroom is a transparent LLM context compression proxy that sits between your application and providers like OpenAI/Anthropic. It intercepts prompt messages and compresses tool outputs, JSON array...

Collab 8 · Activity 6
medium barrier · library · framework · cli-tool · ai-agent · memory-layer · api-wrapper · dev-tooling

people interested in LLM cost optimization infrastructure, specifically the engineering problem of compressing heterogeneous agent context (JSON tool outputs, logs, code, RAG results) without degrading answer quality — touching NLP compression algorithms, statistical anomaly detection, and LLM proxy architecture
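One strategy the description mentions, shrinking large homogeneous JSON arrays in tool outputs, might look roughly like the sketch below. Headroom's real compressors and anomaly detection are more involved; the function name and the keep-first-N policy are illustrative assumptions.

```python
import json

# Hedged sketch: if a tool output is a large JSON array of similar
# items, keep a small sample and tell the model how much was elided.
# Non-JSON output passes through untouched.

def compress_tool_output(text, keep=3):
    try:
        data = json.loads(text)
    except ValueError:
        return text  # not JSON; leave it alone
    if isinstance(data, list) and len(data) > keep:
        elided = len(data) - keep
        return json.dumps({"items": data[:keep],
                           "note": f"{elided} similar items elided"})
    return text
```

The explicit "elided" note is what keeps compression from silently degrading answer quality: the model knows the sample is partial.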

samuelfaj/distill

3299 · TypeScript

distill is a CLI pipe tool that compresses verbose command outputs (test logs, git diffs, terraform plans, etc.) before they're consumed by an LLM agent, saving tokens by using a local or remote LL...

Collab 4 · Activity 5
low barrier · cli-tool · dev-tooling · ai-agent · claude-code-helper · api-wrapper

Great for people building token-efficient LLM agent workflows, especially those integrating tools like Claude Code, Codex, or OpenCode into CI/heavy-output shell pipelines.
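The pipe pattern distill implements can be approximated in a few lines. This Python stand-in filters by keyword instead of calling an LLM (distill itself is TypeScript and uses a local or remote model), and the keyword list is an assumption.

```python
import sys

# Rough sketch of an stdin-to-stdout filter in the spirit of a compression
# pipe. Wire main() under `if __name__ == "__main__":` and use it as, e.g.:
#   pytest 2>&1 | python filter_sketch.py | your-agent

KEEP = ("error", "fail", "warn")  # assumed heuristic, not distill's logic

def compress_lines(lines):
    """Keep lines that look actionable; summarize the rest as a count."""
    kept, dropped = [], 0
    for line in lines:
        if any(word in line.lower() for word in KEEP):
            kept.append(line)
        else:
            dropped += 1
    if dropped:
        kept.append(f"[{dropped} uninteresting lines omitted]")
    return kept

def main():
    out = compress_lines(sys.stdin.read().splitlines())
    sys.stdout.write("\n".join(out) + "\n")
```

The payoff is the same either way: the agent downstream of the pipe sees failures and a one-line summary instead of thousands of tokens of passing-test noise.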
