# sugar-crush

A chat-shell TUI for AI coding assistants.

- Pluggable backends: ship your own Anthropic, OpenAI, Ollama, or shell-out adapter.
- Markdown rendering of replies via CandyShine.
- Scrollback above a fixed input box.
## Install

```sh
composer require candycore/sugar-crush
composer install
./bin/sugarcrush
```
## Wire up a backend

```sh
# By default it ships with EchoBackend (offline). Wire it to a real LLM:
export SUGARCRUSH_BACKEND_CMD=~/bin/anthropic-stream.sh
./bin/sugarcrush
```
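The backend command reads the conversation history as JSON on stdin. The exact schema isn't documented here; the adapter script below passes it straight through as an Anthropic-style `messages` array, so a plausible shape is:

```sh
# Assumed history shape (role/content pairs, oldest first); the field
# names are inferred from the adapter script, not confirmed docs.
history='[
  {"role": "user", "content": "Explain this stack trace"},
  {"role": "assistant", "content": "It looks like a null dereference."},
  {"role": "user", "content": "How do I fix it?"}
]'
# The newest turn is the last element:
printf '%s' "$history" | jq -r 'last.content'
# → How do I fix it?
```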
```sh
#!/usr/bin/env bash
# ~/bin/anthropic-stream.sh
# Reads JSON chat history on stdin, prints the assistant reply on stdout.
payload=$(jq -nc --argjson h "$(cat)" \
  '{model: "claude-opus-4-7", max_tokens: 4096, messages: $h}')
curl -sN https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d "$payload" \
  | jq -r '.content[0].text'
```
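Because the curl step needs a real `ANTHROPIC_API_KEY`, it helps to check the payload-building half offline first. This exercises only the jq line above, with a hypothetical one-message history:

```sh
jq -nc --argjson h '[{"role": "user", "content": "hi"}]' \
  '{model: "claude-opus-4-7", max_tokens: 4096, messages: $h}'
# → {"model":"claude-opus-4-7","max_tokens":4096,"messages":[{"role":"user","content":"hi"}]}
```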
## Backends

A backend is anything implementing `Backend::send(History): string`. Two options ship out of the box:

- `EchoBackend`: offline, used for default runs and demos.
- `$SUGARCRUSH_BACKEND_CMD`: a shell-out adapter; hand it the JSON history on stdin and read the reply from its stdout.

## Demos

VHS-recorded GIFs of every example ship with the app, regenerated automatically on every push that touches the source.
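The shell-out contract (JSON history in on stdin, reply text out on stdout) can be exercised with a throwaway stub before wiring in a real LLM. The script path and reply format here are made up for illustration:

```sh
# Hypothetical offline backend at $HOME/bin/parrot.sh:
mkdir -p "$HOME/bin"
cat > "$HOME/bin/parrot.sh" <<'EOF'
#!/usr/bin/env bash
# Contract: JSON history on stdin, plain-text reply on stdout.
# This stub replies with the newest user message.
jq -r 'map(select(.role == "user")) | "You said: " + last.content'
EOF
chmod +x "$HOME/bin/parrot.sh"

# Exercise the contract without launching the TUI:
printf '%s' '[{"role":"user","content":"hi"}]' | "$HOME/bin/parrot.sh"
# → You said: hi
```

Then `export SUGARCRUSH_BACKEND_CMD="$HOME/bin/parrot.sh"` wires the stub into the chat shell.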