chore: auto-commit uncommitted changes

parent 0eb25aa2a6
commit 851ed58a93

HEARTBEAT.md | 36
@@ -130,7 +130,7 @@ Check and update Docker containers on 192.168.1.253 and HAOS on 192.168.1.252:
 - `docker compose pull` → `docker compose up -d` → `docker image prune -f`
 3. Report what was updated in the weekly briefing

-Services: immich, clickhouse, jellyfin, signal, qbittorrent-vpn
+Services: immich, clickhouse, jellyfin, qbittorrent-vpn

 **qbittorrent-vpn: PULL ONLY, do NOT start.** Johan uses it on-demand.

 SSH: `ssh johan@192.168.1.253`
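The update sequence in the hunk above can be sketched as a dry run. This is a hypothetical illustration only (the real task runs `docker compose` on 192.168.1.253); the service names come from the new `Services:` line, and `PLAN`/`PULL_ONLY` are invented here to make the qbittorrent-vpn exception (pull the image, never start the container) explicit:

```shell
# Dry-run sketch: build the action plan instead of executing docker.
# PULL_ONLY encodes the exception: pull the image, but never `up -d` it.
SERVICES="immich clickhouse jellyfin qbittorrent-vpn"
PULL_ONLY="qbittorrent-vpn"

PLAN=""
for svc in $SERVICES; do
  PLAN="$PLAN pull:$svc"
  case " $PULL_ONLY " in
    *" $svc "*) ;;                    # pull only — do not start
    *) PLAN="$PLAN start:$svc" ;;
  esac
done
PLAN="$PLAN prune"                    # docker image prune -f at the end
echo "$PLAN"
```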
@@ -245,13 +245,34 @@ Update `memory/heartbeat-state.json` with `lastTechScan` timestamp after running
 **State:** Track `lastIntraDayXScan` in `memory/heartbeat-state.json`. Skip if checked < 2h ago.

 ### ⚠️ ALWAYS SPAWN A SUBAGENT — never run inline
 ```
 sessions_spawn(task="Intra-day X scan: ...", label="x-watch")
 ```
 X scanning = multiple bird calls + web searches = context pollution. Offload it.

+### De-duplication (MANDATORY — do both before posting)
+
+**1. Last 24h only:** Only surface posts with timestamps within the last 24 hours. Discard anything older regardless of how interesting it is.
+
+**2. Check what was already posted:** Before posting anything to the dashboard or pinging Johan, fetch recent dashboard news:
+
+```bash
+curl -s http://localhost:9200/api/news | python3 -c "import json,sys; items=json.load(sys.stdin).get('news',[]); [print(i['title']) for i in items[:20]]"
+```
+
+If a story with a similar title or topic was already posted today, **skip it**. Don't post NemoClaw twice. Don't post the same OC release three times.
+
+**3. Save what you surfaced:** After the scan, write a short summary of what was posted to `memory/x-watch-last.md` (overwrite each time):
+
+```
+# Last X Watch: <ISO timestamp>
+- NemoClaw announced at GTC (steipete, NVIDIA)
+- MiniMax M2.7 benchmarks circulating
+```
+
+The next subagent reads this file at the start and skips anything already covered.
+
+### How to start each scan
+1. Read `memory/x-watch-last.md` — know what was already covered
+2. Fetch dashboard news (last 20 items) — know what's already posted
+3. Run bird scans, filter to last 24h only
+4. Only post what's genuinely new
+
 ### Accounts to scan every run
-Check recent posts (last ~4h) from each:
+Check recent posts (last ~24h) from each:
 - **@Cloudflare** — MCP, Workers, AI integrations, platform announcements
 - **@openclaw** — releases, features, community highlights
 - **@steipete** — Peter Steinberger, OpenClaw creator
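The "last 24h only" rule above can be sketched as a small filter. This is a hypothetical helper, not part of the repo; it assumes GNU `date` (the `-d` relative/ISO parsing) and generates its own test timestamps so the logic runs offline:

```shell
# Freshness filter: keep only posts timestamped within the last 24 hours.
now=$(date -u +%s)

is_fresh() {
  local ts
  ts=$(date -u -d "$1" +%s) || return 1   # parse ISO 8601 timestamp
  [ $(( now - ts )) -le 86400 ]           # within 24h window
}

recent=$(date -u -d '2 hours ago' '+%Y-%m-%dT%H:%M:%SZ')
stale=$(date -u -d '3 days ago' '+%Y-%m-%dT%H:%M:%SZ')

is_fresh "$recent" && R1=keep || R1=drop
is_fresh "$stale"  && R2=keep || R2=drop
echo "$R1 $R2"
```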
@@ -288,9 +309,10 @@ Use: `bird user-tweets @handle` → filter for posts newer than last scan timestamp
 - Anything from @OpenAI/@MiniMax_AI/@Kimi_Moonshot/@ZhipuAI/@GeminiApp that isn't a model release, pricing change, or major product launch — these accounts post constantly, only hard news counts

 ### Subagent reports back with
-- Any items surfaced (title + URL)
+- Any items surfaced (title + URL) — new ones only, not repeats
-- "Nothing significant" if quiet
+- "Nothing new since last scan" if quiet
 - **Do NOT list accounts with no news** — not even in a "dropped" or "nothing from X" section. Only mention accounts that had something worth surfacing.
+- Always write `memory/x-watch-last.md` even if nothing new — update the timestamp so the next scan knows when it last ran.

 ---
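The "already posted?" check above can be sketched as a title comparison. This is a hypothetical sketch: in the real flow the posted list would come from `memory/x-watch-last.md` plus the dashboard `/api/news` titles, and topic matching would be fuzzier than the exact-line comparison used here:

```shell
# De-dup check: is this candidate title already in the posted list?
posted="NemoClaw announced at GTC (steipete, NVIDIA)
MiniMax M2.7 benchmarks circulating"

is_new() {
  # Exact whole-line match, case-insensitive (-i), fixed string (-F), full line (-x).
  ! printf '%s\n' "$posted" | grep -qiFx "$1"
}

is_new "Kimi releases new attention paper" && N1=post || N1=skip
is_new "nemoclaw announced at gtc (steipete, nvidia)" && N2=post || N2=skip
echo "$N1 $N2"
```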
Binary file not shown.
@@ -1,9 +1,9 @@
 {
-"last_updated": "2026-03-16T16:02:42.128566Z",
+"last_updated": "2026-03-16T22:00:01.624085Z",
 "source": "api",
 "session_percent": 6,
-"session_resets": "2026-03-16T20:00:00.069453+00:00",
+"session_resets": "2026-03-17T01:00:00.576842+00:00",
-"weekly_percent": 21,
+"weekly_percent": 23,
-"weekly_resets": "2026-03-20T03:00:00.069481+00:00",
+"weekly_resets": "2026-03-20T03:00:00.576862+00:00",
-"sonnet_percent": 26
+"sonnet_percent": 28
 }
@@ -14,9 +14,9 @@
 "lastDocInbox": "2026-02-25T22:01:42.532628Z",
 "lastTechScan": 1773416379.4425044,
 "lastMemoryReview": "2026-03-13T16:32:00.000Z",
-"lastIntraDayXScan": 1773623402.5845382,
+"lastIntraDayXScan": 1773695083.9935617,
-"lastInouSuggestion": "2026-03-16T01:10:02.584545+00:00",
+"lastInouSuggestion": "2026-03-16T21:04:43.993567+00:00",
-"lastEmail": 1773623361.0962188,
+"lastEmail": 1773695083.9935625,
 "pendingBriefingItems": [],
 "lastOvernightAgentWork": "2026-02-28T12:20:00Z",
 "pendingReminders": []
@ -0,0 +1,13 @@
|
||||||
|
# Last X Watch: 2026-03-16T22:09:00Z
|
||||||
|
|
||||||
|
## Already surfaced today — skip these:
|
||||||
|
- NemoClaw announced at NVIDIA GTC by Jensen Huang (built on OpenClaw, steipete on stage)
|
||||||
|
- OpenClaw 2026.3.11, 2026.3.12, 2026.3.13 releases (Chrome DevTools MCP, Android 7MB, Dashboard v2, /fast mode, Ollama provider, Hunter & Healer Alpha)
|
||||||
|
- Ollama now official OpenClaw provider
|
||||||
|
- steipete teasing Claude Code/Codex plugin bundle support + leaner core
|
||||||
|
- CodexBar 0.18 (new providers: Kilo, Ollama, OpenRouter)
|
||||||
|
- MiniMax M2.7 benchmarks circulating on X — release imminent
|
||||||
|
- Kimi Moonshot: Attention Residuals architecture paper; founder at GTC Tue Mar 17
|
||||||
|
- Susie Wiles (WH Chief of Staff) breast cancer diagnosis — early stage, continuing full-time
|
||||||
|
- Operation Epic Fury US/Iran military strikes ongoing
|
||||||
|
- Trump anti-fraud EO signed (Vance chairs task force)
|
||||||
|
|
@@ -7,7 +7,6 @@ set -euo pipefail
 SCRIPT_DIR="$(dirname "$0")"
 DB="$SCRIPT_DIR/../memory/claude-usage.db"
 USAGE_JSON="$SCRIPT_DIR/../memory/claude-usage.json"
-SIGNAL_SKILL="$SCRIPT_DIR/../skills/signal-notify"

 # Run the fetch first
 "$SCRIPT_DIR/claude-usage-check.sh" || true
@@ -27,10 +27,9 @@ if [ -d "$SESSION_DIR" ]; then
 rm -f "$SESSION_DIR"/*.jsonl
 systemctl --user restart clawdbot-gateway

-# Alert via Signal (if configured)
-# curl -X POST http://localhost:8080/v1/send -d '{"number":"+31...", "message":"K2 KILLED: runaway loop detected"}'
 log "Gateway restarted, sessions cleared"
+/home/johan/clawd/scripts/notify.sh -t "K2 killed" -p 4 -T "warning,robot" "Runaway K2 session cleared — $LINES API calls (~${EST_TOKENS} tokens). Gateway restarted."
 exit 1
 elif [ "$EST_TOKENS" -gt "$THRESHOLD_TOKENS" ]; then
 log "WARN: Session $session has $LINES calls (~${EST_TOKENS} tokens). Monitoring."
@@ -0,0 +1,107 @@
+#!/bin/bash
+# notify.sh — Centralized notification dispatcher
+# Usage: notify.sh [OPTIONS] "message"
+#
+# Options:
+#   -t TITLE     Title (default: "forge alert")
+#   -p PRIORITY  1-5 (default: 3). 1=min, 3=default, 5=urgent
+#   -T TAGS      Comma-separated ntfy tag/emoji shortcodes (default: "bell")
+#   -c CHANNEL   Target channel: forge (default), inou, dashboard, all
+#   -u           Urgent mode: priority 5 + "rotating_light" tag
+#
+# Examples:
+#   notify.sh "Disk usage at 87%"
+#   notify.sh -t "K2 Killed" -p 4 -T "warning,robot" "Runaway session cleared"
+#   notify.sh -c inou -t "inou error" "API returned 500"
+#   notify.sh -c all "Critical: forge unreachable"
+
+set -euo pipefail
+
+# ── Config ────────────────────────────────────────────────────────────────────
+NTFY_URL="https://ntfy.inou.com"
+NTFY_TOKEN="tk_k120jegay3lugeqbr9fmpuxdqmzx5"
+DASHBOARD_URL="http://localhost:9200/api/news"
+
+TOPIC_FORGE="forge-alerts"
+TOPIC_INOU="inou-alerts"
+
+# ── Defaults ──────────────────────────────────────────────────────────────────
+TITLE="forge alert"
+PRIORITY=3
+TAGS="bell"
+CHANNEL="forge"
+URGENT=0
+
+# ── Parse args ────────────────────────────────────────────────────────────────
+while getopts ":t:p:T:c:u" opt; do
+  case $opt in
+    t) TITLE="$OPTARG" ;;
+    p) PRIORITY="$OPTARG" ;;
+    T) TAGS="$OPTARG" ;;
+    c) CHANNEL="$OPTARG" ;;
+    u) URGENT=1 ;;
+    *) echo "Unknown option: -$OPTARG" >&2; exit 1 ;;
+  esac
+done
+shift $((OPTIND - 1))
+
+MESSAGE="${1:-}"
+if [ -z "$MESSAGE" ]; then
+  echo "Usage: notify.sh [OPTIONS] \"message\"" >&2
+  exit 1
+fi
+
+if [ "$URGENT" -eq 1 ]; then
+  PRIORITY=5
+  TAGS="rotating_light"
+fi
+
+# ── Send to ntfy ──────────────────────────────────────────────────────────────
+ntfy_send() {
+  local topic="$1"
+  curl -s "$NTFY_URL/$topic" \
+    -H "Authorization: Bearer $NTFY_TOKEN" \
+    -H "Title: $TITLE" \
+    -H "Priority: $PRIORITY" \
+    -H "Tags: $TAGS" \
+    -H "Markdown: yes" \
+    -d "$MESSAGE" \
+    > /dev/null
+}
+
+# ── Send to dashboard news ────────────────────────────────────────────────────
+dashboard_send() {
+  # Map priority to dashboard type
+  local type="info"
+  [ "$PRIORITY" -ge 4 ] && type="warning"
+  [ "$PRIORITY" -ge 5 ] && type="error"
+
+  curl -s -X POST "$DASHBOARD_URL" \
+    -H "Content-Type: application/json" \
+    -d "{\"title\":\"$TITLE\",\"body\":\"$MESSAGE\",\"type\":\"$type\",\"source\":\"notify\"}" \
+    > /dev/null
+}
+
+# ── Dispatch ──────────────────────────────────────────────────────────────────
+case "$CHANNEL" in
+  forge)
+    ntfy_send "$TOPIC_FORGE"
+    ;;
+  inou)
+    ntfy_send "$TOPIC_INOU"
+    ;;
+  dashboard)
+    dashboard_send
+    ;;
+  all)
+    ntfy_send "$TOPIC_FORGE"
+    ntfy_send "$TOPIC_INOU"
+    dashboard_send
+    ;;
+  *)
+    echo "Unknown channel: $CHANNEL (forge|inou|dashboard|all)" >&2
+    exit 1
+    ;;
+esac
+
+exit 0
@@ -16,15 +16,18 @@ for model in "${MODELS[@]}"; do
 model_short=$(basename "$model")
 echo "FOUND: $model"

-# Post to dashboard
-curl -s -X POST http://localhost:9200/api/news \
-  -H 'Content-Type: application/json' \
-  -d "{\"title\":\"GGUF Available: $model_short\",\"body\":\"$model is now available for download on HuggingFace.\",\"type\":\"success\",\"source\":\"qwen-gguf-watch\"}" \
-  > /dev/null
-
-# Signal Johan
-curl -s -X POST "http://localhost:8080/api/v1/rpc" \
-  -H "Content-Type: application/json" \
-  -d "{\"jsonrpc\":\"2.0\",\"method\":\"send\",\"params\":{\"recipient\":[\"+17272252475\"],\"message\":\"⚡ GGUF ready: $model_short — https://huggingface.co/$model\"},\"id\":1}" \
-  > /dev/null
+# Post to dashboard + notify
+/home/johan/clawd/scripts/notify.sh -c dashboard -t "GGUF Available: $model_short" -T "package" "$model is now available for download on HuggingFace."
+/home/johan/clawd/scripts/notify.sh -t "GGUF ready: $model_short" -T "package" -p 3 "https://huggingface.co/$model"
+
+FOUND=$((FOUND + 1))
 fi
 done
+
+if [ $FOUND -eq 0 ]; then
+  echo "No Qwen3.5 GGUFs yet."
+fi
@@ -1,67 +0,0 @@
-#!/usr/bin/env node
-/**
- * Simple webhook proxy for Uptime Kuma → Signal
- * Translates Uptime Kuma webhook payloads to signal-cli JSON-RPC
- *
- * Run: node signal-webhook-proxy.js
- * Listens on port 8085
- */
-
-const http = require('http');
-
-const SIGNAL_RPC_URL = 'http://localhost:8080/api/v1/rpc';
-const RECIPIENT = '+31634481877';
-const PORT = 8085;
-
-async function sendSignal(message) {
-  const payload = {
-    jsonrpc: '2.0',
-    method: 'send',
-    params: { recipient: RECIPIENT, message },
-    id: Date.now()
-  };
-
-  const res = await fetch(SIGNAL_RPC_URL, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify(payload)
-  });
-  return res.json();
-}
-
-const server = http.createServer(async (req, res) => {
-  if (req.method !== 'POST') {
-    res.writeHead(405);
-    res.end('Method not allowed');
-    return;
-  }
-
-  let body = '';
-  req.on('data', chunk => body += chunk);
-  req.on('end', async () => {
-    try {
-      const data = JSON.parse(body);
-
-      // Extract message from Uptime Kuma payload
-      // Uptime Kuma sends: { msg, monitor: { name }, heartbeat: { status, msg } }
-      const message = data.msg ||
-        `[${data.monitor?.name || 'Unknown'}] ${data.heartbeat?.status === 1 ? '🟢 UP' : '🔴 DOWN'}`;
-
-      console.log(`[${new Date().toISOString()}] Forwarding to Signal: ${message.substring(0, 100)}...`);
-
-      const result = await sendSignal(message);
-
-      res.writeHead(200, { 'Content-Type': 'application/json' });
-      res.end(JSON.stringify({ ok: true, result }));
-    } catch (err) {
-      console.error('Error:', err.message);
-      res.writeHead(500, { 'Content-Type': 'application/json' });
-      res.end(JSON.stringify({ ok: false, error: err.message }));
-    }
-  });
-});
-
-server.listen(PORT, '0.0.0.0', () => {
-  console.log(`Signal webhook proxy listening on http://0.0.0.0:${PORT}`);
-  console.log(`Configure Uptime Kuma webhook URL: http://james:${PORT}/`);
-});