feat: audit hardening, webhook retry, and local Claude session tracking (#68)

Security hardening:
- Fix timing-safe comparison bugs in webhooks.ts and auth.ts (was comparing buffer with itself)
- Harden rate limiter IP extraction — use rightmost untrusted IP from XFF chain with MC_TRUSTED_PROXIES support
- Add 12-char minimum password validation in Zod schema and runtime check
- Add Zod validation on PUT /api/tasks bulk status update
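The timing-safe fix above can be sketched as follows. This is a minimal illustration of the corrected pattern, not the repo's actual code; `verifySignature` is a hypothetical helper name. The point is that the provided signature must be compared against a freshly computed HMAC — comparing a buffer with itself always succeeds and verifies nothing.

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Hypothetical helper illustrating the fixed pattern: compare the caller's
// signature against a freshly computed HMAC, never a buffer against itself.
function verifySignature(payload: string, secret: string, provided: string): boolean {
  const expected = createHmac('sha256', secret).update(payload).digest('hex')
  const a = Buffer.from(expected, 'utf8')
  const b = Buffer.from(provided, 'utf8')
  // timingSafeEqual throws on length mismatch, so guard first.
  if (a.length !== b.length) return false
  return timingSafeEqual(a, b)
}
```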

Webhook retry system (completing in-progress feature):
- Exponential backoff with circuit breaker in webhooks.ts
- POST /api/webhooks/retry endpoint for manual retry
- GET /api/webhooks/verify-docs endpoint for signature verification docs
- Scheduler integration for automatic retry processing
- Unit tests for signature verification and backoff logic
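The backoff and circuit-breaker logic can be sketched like this — a simplified model with illustrative constants (the shipped base delay, cap, and failure threshold may differ):

```typescript
// Hypothetical sketch of the retry schedule: the delay doubles per attempt
// and is capped, and the circuit opens after consecutive failures.
function retryDelayMs(attempt: number, baseMs = 1000, capMs = 60_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt)
}

const FAILURE_THRESHOLD = 5 // illustrative, not the shipped value
function circuitOpen(consecutiveFailures: number): boolean {
  return consecutiveFailures >= FAILURE_THRESHOLD
}
```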

Local Claude Code session tracking:
- New claude-sessions.ts scanner parses JSONL transcripts from ~/.claude/projects/
- Extracts model, tokens, messages, cost estimates, active status per session
- Migration 020 adds claude_sessions table
- GET/POST /api/claude/sessions endpoint with filtering and aggregate stats
- Scheduler runs scan every 60s with MC_CLAUDE_HOME config
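The transcript aggregation can be sketched as below. This assumes each JSONL line is an event whose optional `message.usage` carries token counts — the field names here are illustrative, not a documented Claude Code transcript schema:

```typescript
// Hypothetical sketch of per-session aggregation over a JSONL transcript.
type Usage = { input_tokens?: number; output_tokens?: number }
type TranscriptEvent = { message?: { model?: string; usage?: Usage } }

function summarizeTranscript(jsonl: string) {
  let input = 0, output = 0, messages = 0
  let model: string | undefined
  for (const line of jsonl.split('\n')) {
    if (!line.trim()) continue
    let event: TranscriptEvent
    try { event = JSON.parse(line) } catch { continue } // skip malformed lines
    if (!event.message) continue
    messages++
    model = event.message.model ?? model
    input += event.message.usage?.input_tokens ?? 0
    output += event.message.usage?.output_tokens ?? 0
  }
  return { model, messages, input_tokens: input, output_tokens: output }
}
```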

Quality improvements:
- Replace all console.error/warn with structured logger across 31 API routes
- Add Docker HEALTHCHECK directive
- Add vitest coverage config with v8 provider (60% threshold)
- Update README with new features, API docs, env vars, and roadmap items
- Fix E2E tests for password length and rate limiter IP changes

nyk 2026-03-02 22:17:35 +07:00 committed by GitHub
parent b2703b37d5
commit 96168fe2f4
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
54 changed files with 1090 additions and 188 deletions

View File

@ -23,8 +23,11 @@ COPY --from=build /app/.next/static ./.next/static
COPY --from=build /app/public* ./public/
# Create data directory with correct ownership for SQLite
RUN mkdir -p .data && chown nextjs:nodejs .data
RUN apt-get update && apt-get install -y curl --no-install-recommends && rm -rf /var/lib/apt/lists/*
USER nextjs
EXPOSE 3000
ENV PORT=3000
ENV HOSTNAME=0.0.0.0
HEALTHCHECK --interval=30s --timeout=5s --start-period=10s --retries=3 \
CMD curl -f http://localhost:3000/api/status || exit 1
CMD ["node", "server.js"]

View File

@ -33,6 +33,8 @@ Running AI agents at scale means juggling sessions, tasks, costs, and reliabilit
## Quick Start
> **Requires [pnpm](https://pnpm.io/installation)** — Mission Control uses pnpm for dependency management. Install it with `npm install -g pnpm` or `corepack enable`.
```bash
git clone https://github.com/builderz-labs/mission-control.git
cd mission-control
@ -89,6 +91,9 @@ Scheduled tasks for database backups, stale record cleanup, and agent heartbeat
### Direct CLI Integration
Connect Claude Code, Codex, or any CLI tool directly to Mission Control without requiring a gateway. Register connections, send heartbeats with inline token reporting, and auto-register agents.
### Claude Code Session Tracking
Automatically discovers and tracks local Claude Code sessions by scanning `~/.claude/projects/`. Extracts token usage, model info, message counts, cost estimates, and active status from JSONL transcripts. Scans every 60 seconds via the background scheduler.
### GitHub Issues Sync
Inbound sync from GitHub repositories with label and assignee mapping. Synced issues appear on the task board alongside agent-created tasks.
@ -113,7 +118,8 @@ mission-control/
│ ├── lib/
│ │ ├── auth.ts # Session + API key auth, RBAC
│ │ ├── db.ts # SQLite (better-sqlite3, WAL mode)
│ │ ├── migrations.ts # 18 schema migrations
│ │ ├── claude-sessions.ts # Local Claude Code session scanner
│ │ ├── migrations.ts # 20 schema migrations
│ │ ├── scheduler.ts # Background task scheduler
│ │ ├── webhooks.ts # Outbound webhook delivery
│ │ └── websocket.ts # Gateway WebSocket client
@ -234,6 +240,8 @@ All endpoints require authentication unless noted. Full reference below.
|--------|------|------|-------------|
| `GET/POST/PUT/DELETE` | `/api/webhooks` | admin | Webhook CRUD |
| `POST` | `/api/webhooks/test` | admin | Test delivery |
| `POST` | `/api/webhooks/retry` | admin | Manually retry a failed delivery |
| `GET` | `/api/webhooks/verify-docs` | viewer | Signature verification docs |
| `GET` | `/api/webhooks/deliveries` | admin | Delivery history |
| `GET/POST/PUT/DELETE` | `/api/alerts` | admin | Alert rules |
| `GET/POST/PUT/DELETE` | `/api/gateways` | admin | Gateway connections |
@ -276,6 +284,16 @@ All endpoints require authentication unless noted. Full reference below.
</details>
<details>
<summary><strong>Claude Code Sessions</strong></summary>
| Method | Path | Role | Description |
|--------|------|------|-------------|
| `GET` | `/api/claude/sessions` | viewer | List discovered sessions (filter: `?active=1`, `?project=`) |
| `POST` | `/api/claude/sessions` | operator | Trigger manual session scan |
</details>
<details>
<summary><strong>Pipelines</strong></summary>
@ -300,6 +318,8 @@ See [`.env.example`](.env.example) for the complete list. Key variables:
| `OPENCLAW_GATEWAY_HOST` | No | Gateway host (default: `127.0.0.1`) |
| `OPENCLAW_GATEWAY_PORT` | No | Gateway WebSocket port (default: `18789`) |
| `OPENCLAW_MEMORY_DIR` | No | Memory browser root (see note below) |
| `MC_CLAUDE_HOME` | No | Path to `~/.claude` directory (default: `~/.claude`) |
| `MC_TRUSTED_PROXIES` | No | Comma-separated trusted proxy IPs for XFF parsing |
| `MC_ALLOWED_HOSTS` | No | Host allowlist for production |
*Memory browser, log viewer, and gateway config require `OPENCLAW_HOME`.
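The `MC_TRUSTED_PROXIES` behavior can be sketched as follows — a simplified model of rightmost-untrusted XFF parsing, with `clientIp` as a hypothetical helper rather than the shipped function. Walking from the right skips known proxies and stops at the first address the infrastructure did not append, so a client cannot spoof its IP by prepending entries:

```typescript
// Hypothetical sketch: take the rightmost X-Forwarded-For entry that is not
// a trusted proxy. Trusting the leftmost entry lets clients spoof their IP.
function clientIp(xff: string, trustedProxies: Set<string>): string | null {
  const chain = xff.split(',').map(s => s.trim()).filter(Boolean)
  for (let i = chain.length - 1; i >= 0; i--) {
    if (!trustedProxies.has(chain[i])) return chain[i]
  }
  return chain[0] ?? null
}
```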
@ -360,15 +380,18 @@ See [open issues](https://github.com/builderz-labs/mission-control/issues) for p
- [x] OpenAPI 3.1 documentation with Scalar UI ([#60](https://github.com/builderz-labs/mission-control/pull/60))
- [x] GitHub Issues sync — inbound sync with label/assignee mapping ([#63](https://github.com/builderz-labs/mission-control/pull/63))
- [x] Webhook retry with exponential backoff and circuit breaker
- [x] Webhook signature verification (HMAC-SHA256 with constant-time comparison)
- [x] Local Claude Code session tracking — auto-discover sessions from `~/.claude/projects/`
- [x] Rate limiter IP extraction hardening with trusted proxy support
**Up next:**
- [ ] Agent-agnostic gateway support — connect any orchestration framework (OpenClaw, ZeroClaw, OpenFang, NeoBot, IronClaw, etc.), not just OpenClaw
- [ ] Native macOS app (Electron or Tauri)
- [ ] First-class per-agent cost breakdowns — dedicated panel with per-agent token usage and spend (currently derivable from per-session data)
- [ ] Webhook retry with exponential backoff
- [ ] OAuth approval UI improvements
- [ ] API token rotation UI
- [ ] Webhook signature verification
## Contributing

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server';
import { getDatabase, Activity } from '@/lib/db';
import { requireRole } from '@/lib/auth'
import { requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* GET /api/activities - Get activity stream or stats
@ -21,7 +22,7 @@ export async function GET(request: NextRequest) {
// Default activities endpoint
return handleActivitiesRequest(request);
} catch (error) {
console.error('GET /api/activities error:', error);
logger.error({ err: error }, 'GET /api/activities error');
return NextResponse.json({ error: 'Failed to process request' }, { status: 500 });
}
}
@ -115,7 +116,7 @@ async function handleActivitiesRequest(request: NextRequest) {
}
}
} catch (error) {
console.warn(`Failed to fetch entity details for activity ${activity.id}:`, error);
logger.warn({ err: error, activityId: activity.id }, 'Failed to fetch entity details for activity');
}
return {
@ -157,7 +158,7 @@ async function handleActivitiesRequest(request: NextRequest) {
hasMore: offset + activities.length < countResult.total
});
} catch (error) {
console.error('GET /api/activities (activities) error:', error);
logger.error({ err: error }, 'GET /api/activities (activities) error');
return NextResponse.json({ error: 'Failed to fetch activities' }, { status: 500 });
}
}
@ -219,7 +220,7 @@ async function handleStatsRequest(request: NextRequest) {
}))
});
} catch (error) {
console.error('GET /api/activities (stats) error:', error);
logger.error({ err: error }, 'GET /api/activities (stats) error');
return NextResponse.json({ error: 'Failed to fetch activity stats' }, { status: 500 });
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server';
import { getDatabase, db_helpers } from '@/lib/db';
import { requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* GET /api/agents/[id]/heartbeat - Agent heartbeat check
@ -161,7 +162,7 @@ export async function GET(
});
} catch (error) {
console.error('GET /api/agents/[id]/heartbeat error:', error);
logger.error({ err: error }, 'GET /api/agents/[id]/heartbeat error');
return NextResponse.json({ error: 'Failed to perform heartbeat check' }, { status: 500 });
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server';
import { getDatabase, db_helpers } from '@/lib/db';
import { requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* GET /api/agents/[id]/memory - Get agent's working memory
@ -58,7 +59,7 @@ export async function GET(
size: workingMemory.length
});
} catch (error) {
console.error('GET /api/agents/[id]/memory error:', error);
logger.error({ err: error }, 'GET /api/agents/[id]/memory error');
return NextResponse.json({ error: 'Failed to fetch working memory' }, { status: 500 });
}
}
@ -147,7 +148,7 @@ export async function PUT(
size: newContent.length
});
} catch (error) {
console.error('PUT /api/agents/[id]/memory error:', error);
logger.error({ err: error }, 'PUT /api/agents/[id]/memory error');
return NextResponse.json({ error: 'Failed to update working memory' }, { status: 500 });
}
}
@ -207,7 +208,7 @@ export async function DELETE(
updated_at: now
});
} catch (error) {
console.error('DELETE /api/agents/[id]/memory error:', error);
logger.error({ err: error }, 'DELETE /api/agents/[id]/memory error');
return NextResponse.json({ error: 'Failed to clear working memory' }, { status: 500 });
}
}

View File

@ -3,6 +3,7 @@ import { getDatabase, db_helpers, logAuditEvent } from '@/lib/db'
import { getUserFromRequest, requireRole } from '@/lib/auth'
import { writeAgentToConfig } from '@/lib/agent-sync'
import { eventBus } from '@/lib/event-bus'
import { logger } from '@/lib/logger'
/**
* GET /api/agents/[id] - Get a single agent by ID or name
@ -36,7 +37,7 @@ export async function GET(
return NextResponse.json({ agent: parsed })
} catch (error) {
console.error('GET /api/agents/[id] error:', error)
logger.error({ err: error }, 'GET /api/agents/[id] error')
return NextResponse.json({ error: 'Failed to fetch agent' }, { status: 500 })
}
}
@ -158,7 +159,7 @@ export async function PUT(
agent: { ...agent, config: newConfig, role: role || agent.role, updated_at: now },
})
} catch (error: any) {
console.error('PUT /api/agents/[id] error:', error)
logger.error({ err: error }, 'PUT /api/agents/[id] error')
return NextResponse.json({ error: error.message || 'Failed to update agent' }, { status: 500 })
}
}
@ -203,7 +204,7 @@ export async function DELETE(
return NextResponse.json({ success: true, deleted: agent.name })
} catch (error) {
console.error('DELETE /api/agents/[id] error:', error)
logger.error({ err: error }, 'DELETE /api/agents/[id] error')
return NextResponse.json({ error: 'Failed to delete agent' }, { status: 500 })
}
}

View File

@ -5,6 +5,7 @@ import { join } from 'path';
import { config } from '@/lib/config';
import { resolveWithin } from '@/lib/paths';
import { getUserFromRequest, requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* GET /api/agents/[id]/soul - Get agent's SOUL content
@ -44,7 +45,7 @@ export async function GET(
.map(file => file.replace('.md', ''));
}
} catch (error) {
console.warn('Could not read soul templates directory:', error);
logger.warn({ err: error }, 'Could not read soul templates directory');
}
return NextResponse.json({
@ -58,7 +59,7 @@ export async function GET(
updated_at: agent.updated_at
});
} catch (error) {
console.error('GET /api/agents/[id]/soul error:', error);
logger.error({ err: error }, 'GET /api/agents/[id]/soul error');
return NextResponse.json({ error: 'Failed to fetch SOUL content' }, { status: 500 });
}
}
@ -118,7 +119,7 @@ export async function PUT(
return NextResponse.json({ error: 'Template not found' }, { status: 404 });
}
} catch (error) {
console.error('Error loading soul template:', error);
logger.error({ err: error }, 'Error loading soul template');
return NextResponse.json({ error: 'Failed to load template' }, { status: 500 });
}
}
@ -155,7 +156,7 @@ export async function PUT(
updated_at: now
});
} catch (error) {
console.error('PUT /api/agents/[id]/soul error:', error);
logger.error({ err: error }, 'PUT /api/agents/[id]/soul error');
return NextResponse.json({ error: 'Failed to update SOUL content' }, { status: 500 });
}
}
@ -226,7 +227,7 @@ export async function PATCH(
return NextResponse.json({ templates });
} catch (error) {
console.error('PATCH /api/agents/[id]/soul error:', error);
logger.error({ err: error }, 'PATCH /api/agents/[id]/soul error');
return NextResponse.json({ error: 'Failed to fetch templates' }, { status: 500 });
}
}

View File

@ -2,6 +2,7 @@ import { NextRequest, NextResponse } from 'next/server'
import { getDatabase, db_helpers } from '@/lib/db'
import { runOpenClaw } from '@/lib/command'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
export async function POST(
request: NextRequest,
@ -57,7 +58,7 @@ export async function POST(
stdout: stdout.trim()
})
} catch (error) {
console.error('POST /api/agents/[id]/wake error:', error)
logger.error({ err: error }, 'POST /api/agents/[id]/wake error')
return NextResponse.json({ error: 'Failed to wake agent' }, { status: 500 })
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from "next/server"
import { getDatabase, Message } from "@/lib/db"
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
/**
* GET /api/agents/comms - Inter-agent communication stats and timeline
@ -153,7 +154,7 @@ export async function GET(request: NextRequest) {
source: { mode: source, seededCount, liveCount },
})
} catch (error) {
console.error("GET /api/agents/comms error:", error)
logger.error({ err: error }, "GET /api/agents/comms error")
return NextResponse.json({ error: "Failed to fetch agent communications" }, { status: 500 })
}
}

View File

@ -4,6 +4,7 @@ import { runOpenClaw } from '@/lib/command'
import { requireRole } from '@/lib/auth'
import { validateBody, createMessageSchema } from '@/lib/validation'
import { mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
export async function POST(request: NextRequest) {
const auth = requireRole(request, 'operator')
@ -61,7 +62,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ success: true })
} catch (error) {
console.error('POST /api/agents/message error:', error)
logger.error({ err: error }, 'POST /api/agents/message error')
return NextResponse.json({ error: 'Failed to send message' }, { status: 500 })
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { requireRole } from '@/lib/auth'
import { syncAgentsFromConfig, previewSyncDiff } from '@/lib/agent-sync'
import { logger } from '@/lib/logger'
/**
* POST /api/agents/sync - Trigger agent config sync from openclaw.json
@ -19,7 +20,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json(result)
} catch (error: any) {
console.error('POST /api/agents/sync error:', error)
logger.error({ err: error }, 'POST /api/agents/sync error')
return NextResponse.json({ error: error.message || 'Sync failed' }, { status: 500 })
}
}
@ -36,7 +37,7 @@ export async function GET(request: NextRequest) {
const diff = await previewSyncDiff()
return NextResponse.json(diff)
} catch (error: any) {
console.error('GET /api/agents/sync error:', error)
logger.error({ err: error }, 'GET /api/agents/sync error')
return NextResponse.json({ error: error.message || 'Preview failed' }, { status: 500 })
}
}

View File

@ -2,6 +2,7 @@ import { NextRequest, NextResponse } from 'next/server'
import { getUserFromRequest, updateUser, requireRole } from '@/lib/auth'
import { logAuditEvent } from '@/lib/db'
import { verifyPassword } from '@/lib/password'
import { logger } from '@/lib/logger'
export async function GET(request: Request) {
const auth = requireRole(request, 'viewer')
@ -105,7 +106,7 @@ export async function PATCH(request: NextRequest) {
},
})
} catch (error) {
console.error('PATCH /api/auth/me error:', error)
logger.error({ err: error }, 'PATCH /api/auth/me error')
return NextResponse.json({ error: 'Failed to update profile' }, { status: 500 })
}
}

View File

@ -3,6 +3,7 @@ import { getUserFromRequest, getAllUsers, createUser, updateUser, deleteUser , r
import { logAuditEvent } from '@/lib/db'
import { validateBody, createUserSchema } from '@/lib/validation'
import { mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
/**
* GET /api/auth/users - List all users (admin only)
@ -62,7 +63,7 @@ export async function POST(request: NextRequest) {
if (error.message?.includes('UNIQUE constraint failed')) {
return NextResponse.json({ error: 'Username already exists' }, { status: 409 })
}
console.error('POST /api/auth/users error:', error)
logger.error({ err: error }, 'POST /api/auth/users error')
return NextResponse.json({ error: 'Failed to create user' }, { status: 500 })
}
}
@ -117,7 +118,7 @@ export async function PUT(request: NextRequest) {
}
})
} catch (error) {
console.error('PUT /api/auth/users error:', error)
logger.error({ err: error }, 'PUT /api/auth/users error')
return NextResponse.json({ error: 'Failed to update user' }, { status: 500 })
}
}

View File

@ -5,6 +5,7 @@ import { config, ensureDirExists } from '@/lib/config'
import { join, dirname } from 'path'
import { readdirSync, statSync, unlinkSync } from 'fs'
import { heavyLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
const BACKUP_DIR = join(dirname(config.dbPath), 'backups')
const MAX_BACKUPS = 10
@ -79,7 +80,7 @@ export async function POST(request: NextRequest) {
},
})
} catch (error: any) {
console.error('Backup failed:', error)
logger.error({ err: error }, 'Backup failed')
return NextResponse.json({ error: `Backup failed: ${error.message}` }, { status: 500 })
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
/**
* GET /api/chat/conversations - List conversations derived from messages
@ -94,7 +95,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ conversations: withLastMessage, total: countRow.total, page: Math.floor(offset / limit) + 1, limit })
} catch (error) {
console.error('GET /api/chat/conversations error:', error)
logger.error({ err: error }, 'GET /api/chat/conversations error')
return NextResponse.json({ error: 'Failed to fetch conversations' }, { status: 500 })
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase, Message } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
/**
* GET /api/chat/messages/[id] - Get a single message
@ -29,7 +30,7 @@ export async function GET(
}
})
} catch (error) {
console.error('GET /api/chat/messages/[id] error:', error)
logger.error({ err: error }, 'GET /api/chat/messages/[id] error')
return NextResponse.json({ error: 'Failed to fetch message' }, { status: 500 })
}
}
@ -69,7 +70,7 @@ export async function PATCH(
}
})
} catch (error) {
console.error('PATCH /api/chat/messages/[id] error:', error)
logger.error({ err: error }, 'PATCH /api/chat/messages/[id] error')
return NextResponse.json({ error: 'Failed to update message' }, { status: 500 })
}
}

View File

@ -4,6 +4,7 @@ import { runOpenClaw } from '@/lib/command'
import { getAllGatewaySessions } from '@/lib/sessions'
import { eventBus } from '@/lib/event-bus'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
type ForwardInfo = {
attempted: boolean
@ -166,7 +167,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ messages: parsed, total: countRow.total, page: Math.floor(offset / limit) + 1, limit })
} catch (error) {
console.error('GET /api/chat/messages error:', error)
logger.error({ err: error }, 'GET /api/chat/messages error')
return NextResponse.json({ error: 'Failed to fetch messages' }, { status: 500 })
}
}
@ -287,7 +288,7 @@ export async function POST(request: NextRequest) {
{ status: 'offline', reason: 'no_active_session' }
)
} catch (e) {
console.error('Failed to create offline status reply:', e)
logger.error({ err: e }, 'Failed to create offline status reply')
}
}
} else {
@ -332,7 +333,7 @@ export async function POST(request: NextRequest) {
}
} else {
forwardInfo.reason = 'gateway_send_failed'
console.error('Failed to forward message via gateway:', err)
logger.error({ err }, 'Failed to forward message via gateway')
// For coordinator messages, emit visible status when send fails
if (typeof conversation_id === 'string' && conversation_id.startsWith('coord:')) {
@ -347,7 +348,7 @@ export async function POST(request: NextRequest) {
{ status: 'delivery_failed', reason: 'gateway_send_failed' }
)
} catch (e) {
console.error('Failed to create gateway failure status reply:', e)
logger.error({ err: e }, 'Failed to create gateway failure status reply')
}
}
}
@ -370,7 +371,7 @@ export async function POST(request: NextRequest) {
{ status: 'accepted', runId: forwardInfo.runId || null }
)
} catch (e) {
console.error('Failed to create accepted status reply:', e)
logger.error({ err: e }, 'Failed to create accepted status reply')
}
// Best effort: wait briefly and surface completion/error feedback.
@ -477,7 +478,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ message: parsedMessage, forward: forwardInfo }, { status: 201 })
} catch (error) {
console.error('POST /api/chat/messages error:', error)
logger.error({ err: error }, 'POST /api/chat/messages error')
return NextResponse.json({ error: 'Failed to send message' }, { status: 500 })
}
}

View File

@ -0,0 +1,102 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { syncClaudeSessions } from '@/lib/claude-sessions'
import { logger } from '@/lib/logger'
/**
* GET /api/claude/sessions - List discovered local Claude Code sessions
*
* Query params:
* active=1 - only active sessions
* project=slug - filter by project slug
* limit=50 - max results (default 50, max 200)
* offset=0 - pagination offset
*/
export async function GET(request: NextRequest) {
const auth = requireRole(request, 'viewer')
if ('error' in auth) return NextResponse.json({ error: auth.error }, { status: auth.status })
try {
const db = getDatabase()
const { searchParams } = new URL(request.url)
const active = searchParams.get('active')
const project = searchParams.get('project')
const limit = Math.min(parseInt(searchParams.get('limit') || '50', 10) || 50, 200)
const offset = Math.max(parseInt(searchParams.get('offset') || '0', 10) || 0, 0)
let query = 'SELECT * FROM claude_sessions WHERE 1=1'
const params: any[] = []
if (active === '1') {
query += ' AND is_active = 1'
}
if (project) {
query += ' AND project_slug = ?'
params.push(project)
}
query += ' ORDER BY last_message_at DESC LIMIT ? OFFSET ?'
params.push(limit, offset)
const sessions = db.prepare(query).all(...params)
// Get total count
let countQuery = 'SELECT COUNT(*) as total FROM claude_sessions WHERE 1=1'
const countParams: any[] = []
if (active === '1') {
countQuery += ' AND is_active = 1'
}
if (project) {
countQuery += ' AND project_slug = ?'
countParams.push(project)
}
const { total } = db.prepare(countQuery).get(...countParams) as { total: number }
// Aggregate stats
const stats = db.prepare(`
SELECT
COUNT(*) as total_sessions,
SUM(CASE WHEN is_active = 1 THEN 1 ELSE 0 END) as active_sessions,
SUM(input_tokens) as total_input_tokens,
SUM(output_tokens) as total_output_tokens,
SUM(estimated_cost) as total_estimated_cost,
COUNT(DISTINCT project_slug) as unique_projects
FROM claude_sessions
`).get() as any
return NextResponse.json({
sessions,
total,
stats: {
total_sessions: stats.total_sessions || 0,
active_sessions: stats.active_sessions || 0,
total_input_tokens: stats.total_input_tokens || 0,
total_output_tokens: stats.total_output_tokens || 0,
total_estimated_cost: Math.round((stats.total_estimated_cost || 0) * 100) / 100,
unique_projects: stats.unique_projects || 0,
},
})
} catch (error) {
logger.error({ err: error }, 'GET /api/claude/sessions error')
return NextResponse.json({ error: 'Failed to fetch Claude sessions' }, { status: 500 })
}
}
/**
* POST /api/claude/sessions - Trigger a manual scan of local Claude sessions
*/
export async function POST(request: NextRequest) {
const auth = requireRole(request, 'operator')
if ('error' in auth) return NextResponse.json({ error: auth.error }, { status: auth.status })
try {
const result = await syncClaudeSessions()
return NextResponse.json(result)
} catch (error) {
logger.error({ err: error }, 'POST /api/claude/sessions error')
return NextResponse.json({ error: 'Failed to scan Claude sessions' }, { status: 500 })
}
}

View File

@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { requireRole } from '@/lib/auth'
import { config } from '@/lib/config'
import { logger } from '@/lib/logger'
import fs from 'node:fs'
import path from 'node:path'
@ -89,7 +90,7 @@ function saveCronFile(data: OpenClawCronFile): boolean {
fs.writeFileSync(filePath, JSON.stringify(data, null, 2))
return true
} catch (err) {
console.error('Failed to write cron file:', err)
logger.error({ err }, 'Failed to write cron file')
return false
}
}
@ -189,7 +190,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Cron API error:', error)
logger.error({ err: error }, 'Cron API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -338,7 +339,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Cron management error:', error)
logger.error({ err: error }, 'Cron management error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

View File

@ -4,6 +4,7 @@ import { join } from 'path'
import { config } from '@/lib/config'
import { requireRole } from '@/lib/auth'
import { readLimiter, mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
const LOGS_PATH = config.logsDir
@ -248,7 +249,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Logs API error:', error)
logger.error({ err: error }, 'Logs API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -283,7 +284,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Logs API error:', error)
logger.error({ err: error }, 'Logs API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

View File

@ -5,6 +5,7 @@ import { config } from '@/lib/config'
import { resolveWithin } from '@/lib/paths'
import { requireRole } from '@/lib/auth'
import { readLimiter, mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
const MEMORY_PATH = config.memoryDir
@ -96,7 +97,7 @@ async function buildFileTree(dirPath: string, relativePath: string = ''): Promis
})
}
} catch (error) {
console.error(`Error reading ${itemPath}:`, error)
logger.error({ err: error, path: itemPath }, 'Error reading file')
}
}
@ -108,7 +109,7 @@ async function buildFileTree(dirPath: string, relativePath: string = ''): Promis
return a.name.localeCompare(b.name)
})
} catch (error) {
console.error(`Error reading directory ${dirPath}:`, error)
logger.error({ err: error, path: dirPath }, 'Error reading directory')
return []
}
}
@ -216,7 +217,7 @@ export async function GET(request: NextRequest) {
}
}
} catch (error) {
console.error(`Error searching directory ${dirPath}:`, error)
logger.error({ err: error, path: dirPath }, 'Error searching directory')
}
}
@ -230,7 +231,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Memory API error:', error)
logger.error({ err: error }, 'Memory API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -290,7 +291,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Memory POST API error:', error)
logger.error({ err: error }, 'Memory POST API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -329,7 +330,7 @@ export async function DELETE(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Memory DELETE API error:', error)
logger.error({ err: error }, 'Memory DELETE API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

View File

@ -2,6 +2,7 @@ import { NextRequest, NextResponse } from 'next/server';
import { getDatabase, Notification, db_helpers } from '@/lib/db';
import { runOpenClaw } from '@/lib/command';
import { requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* POST /api/notifications/deliver - Notification delivery daemon endpoint
@ -144,7 +145,7 @@ export async function POST(request: NextRequest) {
error: error.message
});
console.error(`Failed to deliver notification ${notification.id} to ${notification.recipient}:`, error);
logger.error({ err: error, notificationId: notification.id, recipient: notification.recipient }, 'Failed to deliver notification');
}
}
@ -175,7 +176,7 @@ export async function POST(request: NextRequest) {
error_details: errors
});
} catch (error) {
console.error('POST /api/notifications/deliver error:', error);
logger.error({ err: error }, 'POST /api/notifications/deliver error');
return NextResponse.json({ error: 'Failed to deliver notifications' }, { status: 500 });
}
}
@ -252,7 +253,7 @@ export async function GET(request: NextRequest) {
agent_filter: agent
});
} catch (error) {
console.error('GET /api/notifications/deliver error:', error);
logger.error({ err: error }, 'GET /api/notifications/deliver error');
return NextResponse.json({ error: 'Failed to get delivery status' }, { status: 500 });
}
}

View File

@ -3,6 +3,7 @@ import { getDatabase, Notification } from '@/lib/db';
import { requireRole } from '@/lib/auth';
import { mutationLimiter } from '@/lib/rate-limit';
import { validateBody, notificationActionSchema } from '@/lib/validation';
import { logger } from '@/lib/logger';
/**
* GET /api/notifications - Get notifications for a specific recipient
@ -91,7 +92,7 @@ export async function GET(request: NextRequest) {
}
}
} catch (error) {
console.warn(`Failed to fetch source details for notification ${notification.id}:`, error);
logger.warn({ err: error, notificationId: notification.id }, 'Failed to fetch source details for notification');
}
return {
@ -127,7 +128,7 @@ export async function GET(request: NextRequest) {
unreadCount: unreadCount.count
});
} catch (error) {
console.error('GET /api/notifications error:', error);
logger.error({ err: error }, 'GET /api/notifications error');
return NextResponse.json({ error: 'Failed to fetch notifications' }, { status: 500 });
}
}
@ -185,7 +186,7 @@ export async function PUT(request: NextRequest) {
}, { status: 400 });
}
} catch (error) {
console.error('PUT /api/notifications error:', error);
logger.error({ err: error }, 'PUT /api/notifications error');
return NextResponse.json({ error: 'Failed to update notifications' }, { status: 500 });
}
}
@ -239,7 +240,7 @@ export async function DELETE(request: NextRequest) {
}, { status: 400 });
}
} catch (error) {
console.error('DELETE /api/notifications error:', error);
logger.error({ err: error }, 'DELETE /api/notifications error');
return NextResponse.json({ error: 'Failed to delete notifications' }, { status: 500 });
}
}
@ -291,7 +292,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 });
}
} catch (error) {
console.error('POST /api/notifications error:', error);
logger.error({ err: error }, 'POST /api/notifications error');
return NextResponse.json({ error: 'Failed to process notification action' }, { status: 500 });
}
}


@ -3,6 +3,7 @@ import { getDatabase, db_helpers } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { validateBody, createPipelineSchema } from '@/lib/validation'
import { mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
export interface PipelineStep {
template_id: number
@ -60,7 +61,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ pipelines: parsed })
} catch (error) {
console.error('GET /api/pipelines error:', error)
logger.error({ err: error }, 'GET /api/pipelines error')
return NextResponse.json({ error: 'Failed to fetch pipelines' }, { status: 500 })
}
}
@ -106,7 +107,7 @@ export async function POST(request: NextRequest) {
const pipeline = db.prepare('SELECT * FROM workflow_pipelines WHERE id = ?').get(insertResult.lastInsertRowid) as Pipeline
return NextResponse.json({ pipeline: { ...pipeline, steps: JSON.parse(pipeline.steps) } }, { status: 201 })
} catch (error) {
console.error('POST /api/pipelines error:', error)
logger.error({ err: error }, 'POST /api/pipelines error')
return NextResponse.json({ error: 'Failed to create pipeline' }, { status: 500 })
}
}
@ -153,7 +154,7 @@ export async function PUT(request: NextRequest) {
const updated = db.prepare('SELECT * FROM workflow_pipelines WHERE id = ?').get(id) as Pipeline
return NextResponse.json({ pipeline: { ...updated, steps: JSON.parse(updated.steps) } })
} catch (error) {
console.error('PUT /api/pipelines error:', error)
logger.error({ err: error }, 'PUT /api/pipelines error')
return NextResponse.json({ error: 'Failed to update pipeline' }, { status: 500 })
}
}
@ -175,7 +176,7 @@ export async function DELETE(request: NextRequest) {
db.prepare('DELETE FROM workflow_pipelines WHERE id = ?').run(parseInt(id))
return NextResponse.json({ success: true })
} catch (error) {
console.error('DELETE /api/pipelines error:', error)
logger.error({ err: error }, 'DELETE /api/pipelines error')
return NextResponse.json({ error: 'Failed to delete pipeline' }, { status: 500 })
}
}


@ -2,6 +2,7 @@ import { NextRequest, NextResponse } from 'next/server'
import { getDatabase, db_helpers } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { eventBus } from '@/lib/event-bus'
import { logger } from '@/lib/logger'
interface PipelineStep {
template_id: number
@ -79,7 +80,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ runs: parsed })
} catch (error) {
console.error('GET /api/pipelines/run error:', error)
logger.error({ err: error }, 'GET /api/pipelines/run error')
return NextResponse.json({ error: 'Failed to fetch runs' }, { status: 500 })
}
}
@ -106,7 +107,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action. Use: start, advance, cancel' }, { status: 400 })
} catch (error) {
console.error('POST /api/pipelines/run error:', error)
logger.error({ err: error }, 'POST /api/pipelines/run error')
return NextResponse.json({ error: 'Failed to process pipeline run' }, { status: 500 })
}
}


@ -3,6 +3,7 @@ import { getDatabase, db_helpers } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { validateBody, qualityReviewSchema } from '@/lib/validation'
import { mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
export async function GET(request: NextRequest) {
const auth = requireRole(request, 'viewer')
@ -59,7 +60,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ reviews })
} catch (error) {
console.error('GET /api/quality-review error:', error)
logger.error({ err: error }, 'GET /api/quality-review error')
return NextResponse.json({ error: 'Failed to fetch quality reviews' }, { status: 500 })
}
}
@ -99,7 +100,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ success: true, id: result.lastInsertRowid })
} catch (error) {
console.error('POST /api/quality-review error:', error)
logger.error({ err: error }, 'POST /api/quality-review error')
return NextResponse.json({ error: 'Failed to create quality review' }, { status: 500 })
}
}


@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { getAllGatewaySessions } from '@/lib/sessions'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
export async function GET(request: NextRequest) {
const auth = requireRole(request, 'viewer')
@ -31,7 +32,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ sessions })
} catch (error) {
console.error('Sessions API error:', error)
logger.error({ err: error }, 'Sessions API error')
return NextResponse.json({ sessions: [] })
}
}


@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server';
import { getDatabase, db_helpers } from '@/lib/db';
import { requireRole } from '@/lib/auth';
import { logger } from '@/lib/logger';
/**
* POST /api/standup/generate - Generate daily standup report
@ -197,7 +198,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ standup: standupReport });
} catch (error) {
console.error('POST /api/standup/generate error:', error);
logger.error({ err: error }, 'POST /api/standup/generate error');
return NextResponse.json({ error: 'Failed to generate standup' }, { status: 500 });
}
}
@ -244,7 +245,7 @@ export async function GET(request: NextRequest) {
limit
});
} catch (error) {
console.error('GET /api/standup/history error:', error);
logger.error({ err: error }, 'GET /api/standup/history error');
return NextResponse.json({ error: 'Failed to fetch standup history' }, { status: 500 });
}
}


@ -7,6 +7,7 @@ import { getDatabase } from '@/lib/db'
import { getAllGatewaySessions, getAgentLiveStatuses } from '@/lib/sessions'
import { requireRole } from '@/lib/auth'
import { MODEL_CATALOG } from '@/lib/models'
import { logger } from '@/lib/logger'
export async function GET(request: NextRequest) {
const auth = requireRole(request, 'viewer')
@ -43,7 +44,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Status API error:', error)
logger.error({ err: error }, 'Status API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -167,7 +168,7 @@ function getDbStats() {
webhookCount,
}
} catch (err) {
console.error('getDbStats error:', err)
logger.error({ err }, 'getDbStats error')
return null
}
}
@ -190,7 +191,7 @@ async function getSystemStatus() {
const bootTime = new Date(uptimeOutput.trim())
status.uptime = Date.now() - bootTime.getTime()
} catch (error) {
console.error('Error getting uptime:', error)
logger.error({ err: error }, 'Error getting uptime')
}
try {
@ -209,7 +210,7 @@ async function getSystemStatus() {
}
}
} catch (error) {
console.error('Error getting memory info:', error)
logger.error({ err: error }, 'Error getting memory info')
}
try {
@ -228,7 +229,7 @@ async function getSystemStatus() {
}
}
} catch (error) {
console.error('Error getting disk info:', error)
logger.error({ err: error }, 'Error getting disk info')
}
try {
@ -251,7 +252,7 @@ async function getSystemStatus() {
.filter((proc) => /clawdbot|openclaw/i.test(proc.command))
status.processes = processes
} catch (error) {
console.error('Error getting process info:', error)
logger.error({ err: error }, 'Error getting process info')
}
try {
@ -283,10 +284,10 @@ async function getSystemStatus() {
)
}
} catch (dbErr) {
console.error('Error syncing agent statuses:', dbErr)
logger.error({ err: dbErr }, 'Error syncing agent statuses')
}
} catch (error) {
console.error('Error reading session stores:', error)
logger.error({ err: error }, 'Error reading session stores')
}
return status
@ -321,7 +322,7 @@ async function getGatewayStatus() {
try {
gatewayStatus.port_listening = await isPortOpen(config.gatewayHost, config.gatewayPort)
} catch (error) {
console.error('Error checking port:', error)
logger.error({ err: error }, 'Error checking port')
}
try {
@ -371,7 +372,7 @@ async function getAvailableModels() {
}
})
} catch (error) {
console.error('Error checking Ollama models:', error)
logger.error({ err: error }, 'Error checking Ollama models')
}
return models


@ -2,6 +2,7 @@ import { NextRequest, NextResponse } from 'next/server'
import { getDatabase, db_helpers } from '@/lib/db'
import { runOpenClaw } from '@/lib/command'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
export async function POST(
request: NextRequest,
@ -86,7 +87,7 @@ export async function POST(
return NextResponse.json({ sent, skipped })
} catch (error) {
console.error('POST /api/tasks/[id]/broadcast error:', error)
logger.error({ err: error }, 'POST /api/tasks/[id]/broadcast error')
return NextResponse.json({ error: 'Failed to broadcast message' }, { status: 500 })
}
}


@ -3,6 +3,7 @@ import { getDatabase, Comment, db_helpers } from '@/lib/db';
import { requireRole } from '@/lib/auth';
import { validateBody, createCommentSchema } from '@/lib/validation';
import { mutationLimiter } from '@/lib/rate-limit';
import { logger } from '@/lib/logger';
/**
* GET /api/tasks/[id]/comments - Get all comments for a task
@ -74,7 +75,7 @@ export async function GET(
total: comments.length
});
} catch (error) {
console.error(`GET /api/tasks/[id]/comments error:`, error);
logger.error({ err: error }, 'GET /api/tasks/[id]/comments error');
return NextResponse.json({ error: 'Failed to fetch comments' }, { status: 500 });
}
}
@ -201,7 +202,7 @@ export async function POST(
}
}, { status: 201 });
} catch (error) {
console.error(`POST /api/tasks/[id]/comments error:`, error);
logger.error({ err: error }, 'POST /api/tasks/[id]/comments error');
return NextResponse.json({ error: 'Failed to add comment' }, { status: 500 });
}
}


@ -4,7 +4,7 @@ import { eventBus } from '@/lib/event-bus';
import { requireRole } from '@/lib/auth';
import { mutationLimiter } from '@/lib/rate-limit';
import { logger } from '@/lib/logger';
import { validateBody, createTaskSchema } from '@/lib/validation';
import { validateBody, createTaskSchema, bulkUpdateTaskStatusSchema } from '@/lib/validation';
function hasAegisApproval(db: ReturnType<typeof getDatabase>, taskId: number): boolean {
const review = db.prepare(`
@ -208,11 +208,9 @@ export async function PUT(request: NextRequest) {
try {
const db = getDatabase();
const { tasks } = await request.json();
if (!Array.isArray(tasks)) {
return NextResponse.json({ error: 'Tasks must be an array' }, { status: 400 });
}
const validated = await validateBody(request, bulkUpdateTaskStatusSchema);
if ('error' in validated) return validated.error;
const { tasks } = validated.data;
const now = Math.floor(Date.now() / 1000);


@ -4,6 +4,7 @@ import { dirname } from 'path'
import { config, ensureDirExists } from '@/lib/config'
import { requireRole } from '@/lib/auth'
import { getAllGatewaySessions } from '@/lib/sessions'
import { logger } from '@/lib/logger'
const DATA_PATH = config.tokensPath
@ -385,7 +386,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Invalid action' }, { status: 400 })
} catch (error) {
console.error('Tokens API error:', error)
logger.error({ err: error }, 'Tokens API error')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
@ -430,7 +431,7 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ success: true, record })
} catch (error) {
console.error('Error saving token usage:', error)
logger.error({ err: error }, 'Error saving token usage')
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}


@ -1,6 +1,7 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { logger } from '@/lib/logger'
/**
* GET /api/webhooks/deliveries - Get delivery history for a webhook
@ -44,7 +45,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ deliveries, total })
} catch (error) {
console.error('GET /api/webhooks/deliveries error:', error)
logger.error({ err: error }, 'GET /api/webhooks/deliveries error')
return NextResponse.json({ error: 'Failed to fetch deliveries' }, { status: 500 })
}
}


@ -0,0 +1,63 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { deliverWebhookPublic } from '@/lib/webhooks'
import { logger } from '@/lib/logger'
/**
* POST /api/webhooks/retry - Manually retry a failed delivery
*/
export async function POST(request: NextRequest) {
const auth = requireRole(request, 'admin')
if ('error' in auth) return NextResponse.json({ error: auth.error }, { status: auth.status })
try {
const db = getDatabase()
const { delivery_id } = await request.json()
if (!delivery_id) {
return NextResponse.json({ error: 'delivery_id is required' }, { status: 400 })
}
const delivery = db.prepare(`
SELECT wd.*, w.id as w_id, w.name as w_name, w.url as w_url, w.secret as w_secret,
w.events as w_events, w.enabled as w_enabled
FROM webhook_deliveries wd
JOIN webhooks w ON w.id = wd.webhook_id
WHERE wd.id = ?
`).get(delivery_id) as any
if (!delivery) {
return NextResponse.json({ error: 'Delivery not found' }, { status: 404 })
}
const webhook = {
id: delivery.w_id,
name: delivery.w_name,
url: delivery.w_url,
secret: delivery.w_secret,
events: delivery.w_events,
enabled: delivery.w_enabled,
}
// Parse the original payload
let parsedPayload: Record<string, any>
try {
const parsed = JSON.parse(delivery.payload)
parsedPayload = parsed.data ?? parsed
} catch {
parsedPayload = {}
}
const result = await deliverWebhookPublic(webhook, delivery.event_type, parsedPayload, {
attempt: (delivery.attempt ?? 0) + 1,
parentDeliveryId: delivery.id,
allowRetry: false, // Manual retries don't auto-schedule further retries
})
return NextResponse.json(result)
} catch (error) {
logger.error({ err: error }, 'POST /api/webhooks/retry error')
return NextResponse.json({ error: 'Failed to retry delivery' }, { status: 500 })
}
}


@ -24,12 +24,15 @@ export async function GET(request: NextRequest) {
ORDER BY w.created_at DESC
`).all() as any[]
// Parse events JSON, mask secret
// Parse events JSON, mask secret, add circuit breaker status
const maxRetries = parseInt(process.env.MC_WEBHOOK_MAX_RETRIES || '5', 10) || 5
const result = webhooks.map((wh) => ({
...wh,
events: JSON.parse(wh.events || '["*"]'),
secret: wh.secret ? '••••••' + wh.secret.slice(-4) : null,
enabled: !!wh.enabled,
consecutive_failures: wh.consecutive_failures ?? 0,
circuit_open: (wh.consecutive_failures ?? 0) >= maxRetries,
}))
return NextResponse.json({ webhooks: result })
@ -92,7 +95,7 @@ export async function PUT(request: NextRequest) {
try {
const db = getDatabase()
const body = await request.json()
const { id, name, url, events, enabled, regenerate_secret } = body
const { id, name, url, events, enabled, regenerate_secret, reset_circuit } = body
if (!id) {
return NextResponse.json({ error: 'Webhook ID is required' }, { status: 400 })
@ -117,6 +120,12 @@ export async function PUT(request: NextRequest) {
if (events !== undefined) { updates.push('events = ?'); params.push(JSON.stringify(events)) }
if (enabled !== undefined) { updates.push('enabled = ?'); params.push(enabled ? 1 : 0) }
// Reset circuit breaker: clear failure count and re-enable
if (reset_circuit) {
updates.push('consecutive_failures = 0')
updates.push('enabled = 1')
}
let newSecret: string | null = null
if (regenerate_secret) {
newSecret = randomBytes(32).toString('hex')


@ -1,7 +1,8 @@
import { NextRequest, NextResponse } from 'next/server'
import { getDatabase } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { createHmac } from 'crypto'
import { deliverWebhookPublic } from '@/lib/webhooks'
import { logger } from '@/lib/logger'
/**
* POST /api/webhooks/test - Send a test event to a webhook
@ -23,78 +24,18 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: 'Webhook not found' }, { status: 404 })
}
const body = JSON.stringify({
event: 'test.ping',
timestamp: Math.floor(Date.now() / 1000),
data: {
message: 'This is a test webhook from Mission Control',
webhook_id: webhook.id,
webhook_name: webhook.name,
triggered_by: auth.user.username,
},
})
const headers: Record<string, string> = {
'Content-Type': 'application/json',
'User-Agent': 'MissionControl-Webhook/1.0',
'X-MC-Event': 'test.ping',
const payload = {
message: 'This is a test webhook from Mission Control',
webhook_id: webhook.id,
webhook_name: webhook.name,
triggered_by: auth.user.username,
}
if (webhook.secret) {
const sig = createHmac('sha256', webhook.secret).update(body).digest('hex')
headers['X-MC-Signature'] = `sha256=${sig}`
}
const result = await deliverWebhookPublic(webhook, 'test.ping', payload, { allowRetry: false })
const start = Date.now()
let statusCode: number | null = null
let responseBody: string | null = null
let error: string | null = null
try {
const controller = new AbortController()
const timeout = setTimeout(() => controller.abort(), 10000)
const res = await fetch(webhook.url, {
method: 'POST',
headers,
body,
signal: controller.signal,
})
clearTimeout(timeout)
statusCode = res.status
responseBody = await res.text().catch(() => null)
if (responseBody && responseBody.length > 1000) {
responseBody = responseBody.slice(0, 1000) + '...'
}
} catch (err: any) {
error = err.name === 'AbortError' ? 'Timeout (10s)' : err.message
}
const durationMs = Date.now() - start
// Log the test delivery
db.prepare(`
INSERT INTO webhook_deliveries (webhook_id, event_type, payload, status_code, response_body, error, duration_ms)
VALUES (?, ?, ?, ?, ?, ?, ?)
`).run(webhook.id, 'test.ping', body, statusCode, responseBody, error, durationMs)
db.prepare(`
UPDATE webhooks SET last_fired_at = unixepoch(), last_status = ?, updated_at = unixepoch()
WHERE id = ?
`).run(statusCode ?? -1, webhook.id)
const success = statusCode !== null && statusCode >= 200 && statusCode < 300
return NextResponse.json({
success,
status_code: statusCode,
response_body: responseBody,
error,
duration_ms: durationMs,
})
return NextResponse.json(result)
} catch (error) {
console.error('POST /api/webhooks/test error:', error)
logger.error({ err: error }, 'POST /api/webhooks/test error')
return NextResponse.json({ error: 'Failed to test webhook' }, { status: 500 })
}
}


@ -0,0 +1,40 @@
import { NextRequest, NextResponse } from 'next/server'
import { requireRole } from '@/lib/auth'
/**
* GET /api/webhooks/verify-docs - Returns webhook signature verification documentation
* No secrets exposed. Accessible to any authenticated user (viewer+).
*/
export async function GET(request: NextRequest) {
const auth = requireRole(request, 'viewer')
if ('error' in auth) return NextResponse.json({ error: auth.error }, { status: auth.status })
return NextResponse.json({
algorithm: 'HMAC-SHA256',
header: 'X-MC-Signature',
format: 'sha256=<hex-digest>',
description: 'Mission Control signs webhook payloads using HMAC-SHA256. The signature is sent in the X-MC-Signature header.',
verification_steps: [
'1. Extract the raw request body as a UTF-8 string (do NOT parse JSON first).',
'2. Read the X-MC-Signature header value.',
'3. Compute HMAC-SHA256 of the raw body using your webhook secret.',
'4. Format the expected value as: sha256=<hex-digest>',
'5. Compare the computed value with the header using a constant-time comparison.',
'6. Reject the request if they do not match.',
],
example_nodejs: `
const crypto = require('crypto');
function verifySignature(secret, rawBody, signatureHeader) {
const expected = 'sha256=' + crypto.createHmac('sha256', secret).update(rawBody).digest('hex');
const sigBuf = Buffer.from(signatureHeader);
const expBuf = Buffer.from(expected);
if (sigBuf.length !== expBuf.length) return false;
return crypto.timingSafeEqual(sigBuf, expBuf);
}
// In your Express/Fastify handler:
// const isValid = verifySignature(MY_SECRET, req.rawBody, req.headers['x-mc-signature']);
`.trim(),
})
}


@ -3,6 +3,7 @@ import { getDatabase, db_helpers } from '@/lib/db'
import { requireRole } from '@/lib/auth'
import { validateBody, createWorkflowSchema } from '@/lib/validation'
import { mutationLimiter } from '@/lib/rate-limit'
import { logger } from '@/lib/logger'
export interface WorkflowTemplate {
id: number
@ -38,7 +39,7 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ templates: parsed })
} catch (error) {
console.error('GET /api/workflows error:', error)
logger.error({ err: error }, 'GET /api/workflows error')
return NextResponse.json({ error: 'Failed to fetch templates' }, { status: 500 })
}
}
@ -74,7 +75,7 @@ export async function POST(request: NextRequest) {
template: { ...template, tags: template.tags ? JSON.parse(template.tags) : [] }
}, { status: 201 })
} catch (error) {
console.error('POST /api/workflows error:', error)
logger.error({ err: error }, 'POST /api/workflows error')
return NextResponse.json({ error: 'Failed to create template' }, { status: 500 })
}
}
@ -127,7 +128,7 @@ export async function PUT(request: NextRequest) {
const updated = db.prepare('SELECT * FROM workflow_templates WHERE id = ?').get(id) as WorkflowTemplate
return NextResponse.json({ template: { ...updated, tags: updated.tags ? JSON.parse(updated.tags) : [] } })
} catch (error) {
console.error('PUT /api/workflows error:', error)
logger.error({ err: error }, 'PUT /api/workflows error')
return NextResponse.json({ error: 'Failed to update template' }, { status: 500 })
}
}
@ -152,7 +153,7 @@ export async function DELETE(request: NextRequest) {
db.prepare('DELETE FROM workflow_templates WHERE id = ?').run(parseInt(id))
return NextResponse.json({ success: true })
} catch (error) {
console.error('DELETE /api/workflows error:', error)
logger.error({ err: error }, 'DELETE /api/workflows error')
return NextResponse.json({ error: 'Failed to delete template' }, { status: 500 })
}
}


@ -12,7 +12,7 @@ describe('createRateLimiter', () => {
function makeRequest(ip: string = '127.0.0.1'): Request {
return new Request('http://localhost/api/test', {
headers: new Headers({ 'x-forwarded-for': ip }),
headers: new Headers({ 'x-real-ip': ip }),
})
}


@ -130,7 +130,7 @@ describe('createUserSchema', () => {
it('accepts valid input', () => {
const result = createUserSchema.safeParse({
username: 'alice',
password: 'secret123',
password: 'secure-pass-12chars',
})
expect(result.success).toBe(true)
if (result.success) {


@ -0,0 +1,82 @@
import { describe, expect, it } from 'vitest'
import { createHmac } from 'crypto'
import { verifyWebhookSignature, nextRetryDelay } from '../webhooks'
describe('verifyWebhookSignature', () => {
const secret = 'test-secret-key-1234'
const body = '{"event":"test.ping","timestamp":1700000000,"data":{"message":"hello"}}'
it('returns true for a correct signature', () => {
const sig = `sha256=${createHmac('sha256', secret).update(body).digest('hex')}`
expect(verifyWebhookSignature(secret, body, sig)).toBe(true)
})
it('returns false for a wrong signature', () => {
const wrongSig = `sha256=${createHmac('sha256', 'wrong-secret').update(body).digest('hex')}`
expect(verifyWebhookSignature(secret, body, wrongSig)).toBe(false)
})
it('returns false for a tampered body', () => {
const sig = `sha256=${createHmac('sha256', secret).update(body).digest('hex')}`
expect(verifyWebhookSignature(secret, body + 'tampered', sig)).toBe(false)
})
it('returns false for missing signature header', () => {
expect(verifyWebhookSignature(secret, body, null)).toBe(false)
expect(verifyWebhookSignature(secret, body, undefined)).toBe(false)
expect(verifyWebhookSignature(secret, body, '')).toBe(false)
})
it('returns false for empty secret', () => {
const sig = `sha256=${createHmac('sha256', secret).update(body).digest('hex')}`
expect(verifyWebhookSignature('', body, sig)).toBe(false)
})
})
describe('nextRetryDelay', () => {
// Expected base delays: 30s, 300s, 1800s, 7200s, 28800s
const expectedBases = [30, 300, 1800, 7200, 28800]
it('returns delays within ±20% jitter range for each attempt', () => {
for (let attempt = 0; attempt < expectedBases.length; attempt++) {
const base = expectedBases[attempt]
const minExpected = base * 0.8
const maxExpected = base * 1.2
// Run multiple times to test jitter randomness
for (let i = 0; i < 20; i++) {
const delay = nextRetryDelay(attempt)
expect(delay).toBeGreaterThanOrEqual(Math.floor(minExpected))
expect(delay).toBeLessThanOrEqual(Math.ceil(maxExpected))
}
}
})
it('clamps attempts beyond the backoff array length', () => {
const lastBase = expectedBases[expectedBases.length - 1]
const delay = nextRetryDelay(100)
expect(delay).toBeGreaterThanOrEqual(Math.floor(lastBase * 0.8))
expect(delay).toBeLessThanOrEqual(Math.ceil(lastBase * 1.2))
})
it('returns a rounded integer', () => {
for (let i = 0; i < 50; i++) {
const delay = nextRetryDelay(0)
expect(Number.isInteger(delay)).toBe(true)
}
})
})
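The `nextRetryDelay` helper these tests pin down can be sketched as follows. This is an illustrative reconstruction from the expected base delays and ±20% jitter asserted above, not the actual webhooks.ts implementation:

```typescript
// Illustrative sketch of nextRetryDelay, inferred from the tests above.
// Assumes the 30s/5m/30m/2h/8h schedule and uniform ±20% jitter.
const BACKOFF_BASES_S = [30, 300, 1800, 7200, 28800]

function nextRetryDelay(attempt: number): number {
  // Clamp so attempts past the end of the schedule reuse the final base.
  const base = BACKOFF_BASES_S[Math.min(attempt, BACKOFF_BASES_S.length - 1)]
  const jitter = 0.8 + Math.random() * 0.4 // uniform in [0.8, 1.2)
  return Math.round(base * jitter)
}

console.log(nextRetryDelay(0)) // somewhere in [24, 36]
```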
describe('circuit breaker logic', () => {
it('consecutive_failures >= maxRetries means circuit is open', () => {
const maxRetries = 5
// Simulate the circuit_open derivation used in the API
const isCircuitOpen = (failures: number) => failures >= maxRetries
expect(isCircuitOpen(0)).toBe(false)
expect(isCircuitOpen(3)).toBe(false)
expect(isCircuitOpen(4)).toBe(false)
expect(isCircuitOpen(5)).toBe(true)
expect(isCircuitOpen(10)).toBe(true)
})
})


@ -10,7 +10,9 @@ export function safeCompare(a: string, b: string): boolean {
const bufA = Buffer.from(a)
const bufB = Buffer.from(b)
if (bufA.length !== bufB.length) {
timingSafeEqual(bufA, bufA)
// Compare against dummy buffer to avoid timing leak on length mismatch
const dummy = Buffer.alloc(bufA.length)
timingSafeEqual(bufA, dummy)
return false
}
return timingSafeEqual(bufA, bufB)
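Standalone, the fixed comparison behaves like this (a self-contained copy for illustration; the real helper lives in auth.ts):

```typescript
import { timingSafeEqual } from 'crypto'

// Self-contained copy of the fixed safeCompare for illustration.
function safeCompare(a: string, b: string): boolean {
  const bufA = Buffer.from(a)
  const bufB = Buffer.from(b)
  if (bufA.length !== bufB.length) {
    // Burn a comparison against a same-length dummy buffer so the
    // early return does not leak the length mismatch via timing.
    timingSafeEqual(bufA, Buffer.alloc(bufA.length))
    return false
  }
  return timingSafeEqual(bufA, bufB)
}

console.log(safeCompare('secret-token', 'secret-token')) // true
console.log(safeCompare('secret-token', 'Secret-token')) // false
console.log(safeCompare('secret-token', 'short')) // false
```

The previous code compared `bufA` with itself on length mismatch, which always succeeds and does no useful work; the dummy-buffer comparison keeps the mismatch path's timing close to the match path's.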
@ -176,6 +178,7 @@ export function createUser(
options?: { provider?: 'local' | 'google'; provider_user_id?: string | null; email?: string | null; avatar_url?: string | null; is_approved?: 0 | 1; approved_by?: string | null; approved_at?: number | null }
): User {
const db = getDatabase()
if (password.length < 12) throw new Error('Password must be at least 12 characters')
const passwordHash = hashPassword(password)
const provider = options?.provider || 'local'
const result = db.prepare(`

src/lib/claude-sessions.ts (new file, 298 lines)

@ -0,0 +1,298 @@
/**
* Claude Code Local Session Scanner
*
* Discovers and tracks local Claude Code sessions by scanning ~/.claude/projects/.
* Each project directory contains JSONL session transcripts that record every
* user message, assistant response, and tool call with timestamps and token usage.
*
* This module parses those JSONL files to extract:
* - Session metadata (model, project, git branch, timestamps)
* - Message counts (user, assistant, tool uses)
* - Token usage (input, output, estimated cost)
* - Activity status (active if last message < 5 minutes ago)
*/
import { readdirSync, readFileSync, statSync } from 'fs'
import { join } from 'path'
import { config } from './config'
import { getDatabase } from './db'
import { logger } from './logger'
// Rough per-token pricing (USD) for cost estimation
const MODEL_PRICING: Record<string, { input: number; output: number }> = {
'claude-opus-4-6': { input: 15 / 1_000_000, output: 75 / 1_000_000 },
'claude-sonnet-4-6': { input: 3 / 1_000_000, output: 15 / 1_000_000 },
'claude-haiku-4-5': { input: 0.8 / 1_000_000, output: 4 / 1_000_000 },
}
const DEFAULT_PRICING = { input: 3 / 1_000_000, output: 15 / 1_000_000 }
// Session is "active" if last message was within this window
const ACTIVE_THRESHOLD_MS = 5 * 60 * 1000
interface SessionStats {
sessionId: string
projectSlug: string
projectPath: string | null
model: string | null
gitBranch: string | null
userMessages: number
assistantMessages: number
toolUses: number
inputTokens: number
outputTokens: number
estimatedCost: number
firstMessageAt: string | null
lastMessageAt: string | null
lastUserPrompt: string | null
isActive: boolean
}
interface JSONLEntry {
type?: string
sessionId?: string
timestamp?: string
isSidechain?: boolean
gitBranch?: string
cwd?: string
message?: {
role?: string
content?: string | Array<{ type: string; text?: string; id?: string }>
model?: string
usage?: {
input_tokens?: number
output_tokens?: number
cache_read_input_tokens?: number
cache_creation_input_tokens?: number
}
}
}
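For illustration, one transcript line matching this shape might parse to the following (all field values here are hypothetical):

```typescript
// Hypothetical single JSONL transcript line, shown via its parsed form.
// Field names follow the JSONLEntry interface above; values are made up.
const rawLine = JSON.stringify({
  type: 'assistant',
  sessionId: '3f2b1c9a-example',
  timestamp: '2026-03-01T12:00:00.000Z',
  gitBranch: 'main',
  cwd: '/home/user/my-project',
  message: {
    role: 'assistant',
    model: 'claude-sonnet-4-6',
    usage: { input_tokens: 1200, output_tokens: 350, cache_read_input_tokens: 8000 },
    content: [{ type: 'text', text: 'Done.' }, { type: 'tool_use', id: 'tu_01' }],
  },
})

const entry = JSON.parse(rawLine)
console.log(entry.message.usage.input_tokens) // 1200
```

A line like this would count as one assistant message with one tool use, and contribute 1200 + 8000 input tokens and 350 output tokens under the accumulation logic below.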
/** Parse a single JSONL file and extract session stats */
function parseSessionFile(filePath: string, projectSlug: string): SessionStats | null {
try {
const content = readFileSync(filePath, 'utf-8')
const lines = content.split('\n').filter(Boolean)
if (lines.length === 0) return null
let sessionId: string | null = null
let model: string | null = null
let gitBranch: string | null = null
let projectPath: string | null = null
let userMessages = 0
let assistantMessages = 0
let toolUses = 0
let inputTokens = 0
let outputTokens = 0
let firstMessageAt: string | null = null
let lastMessageAt: string | null = null
let lastUserPrompt: string | null = null
for (const line of lines) {
let entry: JSONLEntry
try {
entry = JSON.parse(line)
} catch {
continue
}
// Extract session ID from first entry that has one
if (!sessionId && entry.sessionId) {
sessionId = entry.sessionId
}
// Extract git branch
if (!gitBranch && entry.gitBranch) {
gitBranch = entry.gitBranch
}
// Extract project working directory
if (!projectPath && entry.cwd) {
projectPath = entry.cwd
}
// Track timestamps
if (entry.timestamp) {
if (!firstMessageAt) firstMessageAt = entry.timestamp
lastMessageAt = entry.timestamp
}
// Skip sidechain messages (subagent work) for counts
if (entry.isSidechain) continue
if (entry.type === 'user' && entry.message) {
userMessages++
// Extract last user prompt text
const msg = entry.message
if (typeof msg.content === 'string' && msg.content.length > 0) {
lastUserPrompt = msg.content.slice(0, 500)
}
}
if (entry.type === 'assistant' && entry.message) {
assistantMessages++
// Extract model
if (entry.message.model) {
model = entry.message.model
}
// Extract token usage
const usage = entry.message.usage
if (usage) {
inputTokens += (usage.input_tokens || 0)
+ (usage.cache_read_input_tokens || 0)
+ (usage.cache_creation_input_tokens || 0)
outputTokens += (usage.output_tokens || 0)
}
// Count tool uses in assistant content
if (Array.isArray(entry.message.content)) {
for (const block of entry.message.content) {
if (block.type === 'tool_use') toolUses++
}
}
}
}
if (!sessionId) return null
// Estimate cost
const pricing = (model && MODEL_PRICING[model]) || DEFAULT_PRICING
const estimatedCost = inputTokens * pricing.input + outputTokens * pricing.output
// Determine if active
const isActive = lastMessageAt
? (Date.now() - new Date(lastMessageAt).getTime()) < ACTIVE_THRESHOLD_MS
: false
return {
sessionId,
projectSlug,
projectPath,
model,
gitBranch,
userMessages,
assistantMessages,
toolUses,
inputTokens,
outputTokens,
estimatedCost: Math.round(estimatedCost * 10000) / 10000,
firstMessageAt,
lastMessageAt,
lastUserPrompt,
isActive,
}
} catch (err) {
logger.warn({ err, filePath }, 'Failed to parse Claude session file')
return null
}
}
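The token-accumulation loop above can be sketched in isolation. The two transcript entries below are invented for illustration; only the field names (`type`, `message.usage`, the `*_tokens` counters) mirror what the parser actually reads.

```typescript
// Minimal sketch of the JSONL accumulation in parseSessionFile, with two
// hypothetical transcript entries. Values are invented for illustration.
const lines = [
  JSON.stringify({ type: 'user', sessionId: 'abc', message: { content: 'fix the login bug' } }),
  JSON.stringify({
    type: 'assistant',
    message: {
      model: 'claude-sonnet-4',
      usage: { input_tokens: 1200, cache_read_input_tokens: 800, output_tokens: 350 },
    },
  }),
]

let inputTokens = 0
let outputTokens = 0
for (const line of lines) {
  const entry = JSON.parse(line)
  const usage = entry.message?.usage
  if (entry.type === 'assistant' && usage) {
    // Cache reads and cache writes both count toward input, matching the parser.
    inputTokens += (usage.input_tokens || 0)
      + (usage.cache_read_input_tokens || 0)
      + (usage.cache_creation_input_tokens || 0)
    outputTokens += usage.output_tokens || 0
  }
}
console.log(inputTokens, outputTokens) // → 2000 350
```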
/** Scan all Claude Code projects and discover sessions */
export function scanClaudeSessions(): SessionStats[] {
const claudeHome = config.claudeHome
if (!claudeHome) return []
const projectsDir = join(claudeHome, 'projects')
let projectDirs: string[]
try {
projectDirs = readdirSync(projectsDir)
} catch {
return [] // No projects directory — Claude Code not installed or never used
}
const sessions: SessionStats[] = []
for (const projectSlug of projectDirs) {
const projectDir = join(projectsDir, projectSlug)
let stat
try {
stat = statSync(projectDir)
} catch {
continue
}
if (!stat.isDirectory()) continue
// Find JSONL files in this project
let files: string[]
try {
files = readdirSync(projectDir).filter(f => f.endsWith('.jsonl'))
} catch {
continue
}
for (const file of files) {
const filePath = join(projectDir, file)
const parsed = parseSessionFile(filePath, projectSlug)
if (parsed) sessions.push(parsed)
}
}
return sessions
}
/** Scan and upsert sessions into the database */
export async function syncClaudeSessions(): Promise<{ ok: boolean; message: string }> {
try {
const sessions = scanClaudeSessions()
if (sessions.length === 0) {
return { ok: true, message: 'No Claude sessions found' }
}
const db = getDatabase()
const now = Math.floor(Date.now() / 1000)
const upsert = db.prepare(`
INSERT INTO claude_sessions (
session_id, project_slug, project_path, model, git_branch,
user_messages, assistant_messages, tool_uses,
input_tokens, output_tokens, estimated_cost,
first_message_at, last_message_at, last_user_prompt,
is_active, scanned_at, updated_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(session_id) DO UPDATE SET
model = excluded.model,
git_branch = excluded.git_branch,
user_messages = excluded.user_messages,
assistant_messages = excluded.assistant_messages,
tool_uses = excluded.tool_uses,
input_tokens = excluded.input_tokens,
output_tokens = excluded.output_tokens,
estimated_cost = excluded.estimated_cost,
last_message_at = excluded.last_message_at,
last_user_prompt = excluded.last_user_prompt,
is_active = excluded.is_active,
scanned_at = excluded.scanned_at,
updated_at = excluded.updated_at
`)
let upserted = 0
db.transaction(() => {
      // Reset all sessions to inactive; the upsert below re-marks the ones still active
db.prepare('UPDATE claude_sessions SET is_active = 0').run()
for (const s of sessions) {
upsert.run(
s.sessionId, s.projectSlug, s.projectPath, s.model, s.gitBranch,
s.userMessages, s.assistantMessages, s.toolUses,
s.inputTokens, s.outputTokens, s.estimatedCost,
s.firstMessageAt, s.lastMessageAt, s.lastUserPrompt,
s.isActive ? 1 : 0, now, now,
)
upserted++
}
})()
const active = sessions.filter(s => s.isActive).length
return {
ok: true,
message: `Scanned ${upserted} session(s), ${active} active`,
}
} catch (err: any) {
logger.error({ err }, 'Claude session sync failed')
return { ok: false, message: `Scan failed: ${err.message}` }
}
}

View File

@ -10,6 +10,9 @@ const openclawHome =
''
export const config = {
claudeHome:
process.env.MC_CLAUDE_HOME ||
path.join(os.homedir(), '.claude'),
dataDir: process.env.MISSION_CONTROL_DATA_DIR || defaultDataDir,
dbPath:
process.env.MISSION_CONTROL_DB_PATH ||

View File

@ -495,6 +495,58 @@ const migrations: Migration[] = [
CREATE INDEX IF NOT EXISTS idx_token_usage_model ON token_usage(model);
`)
}
},
{
id: '019_webhook_retry',
up: (db) => {
// Add retry columns to webhook_deliveries
const deliveryCols = db.prepare(`PRAGMA table_info(webhook_deliveries)`).all() as Array<{ name: string }>
const hasCol = (name: string) => deliveryCols.some((c) => c.name === name)
if (!hasCol('attempt')) db.exec(`ALTER TABLE webhook_deliveries ADD COLUMN attempt INTEGER NOT NULL DEFAULT 0`)
if (!hasCol('next_retry_at')) db.exec(`ALTER TABLE webhook_deliveries ADD COLUMN next_retry_at INTEGER`)
if (!hasCol('is_retry')) db.exec(`ALTER TABLE webhook_deliveries ADD COLUMN is_retry INTEGER NOT NULL DEFAULT 0`)
if (!hasCol('parent_delivery_id')) db.exec(`ALTER TABLE webhook_deliveries ADD COLUMN parent_delivery_id INTEGER`)
// Add circuit breaker column to webhooks
const webhookCols = db.prepare(`PRAGMA table_info(webhooks)`).all() as Array<{ name: string }>
if (!webhookCols.some((c) => c.name === 'consecutive_failures')) {
db.exec(`ALTER TABLE webhooks ADD COLUMN consecutive_failures INTEGER NOT NULL DEFAULT 0`)
}
// Partial index for retry queue processing
db.exec(`CREATE INDEX IF NOT EXISTS idx_webhook_deliveries_retry ON webhook_deliveries(next_retry_at) WHERE next_retry_at IS NOT NULL`)
}
},
{
id: '020_claude_sessions',
up: (db) => {
db.exec(`
CREATE TABLE IF NOT EXISTS claude_sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id TEXT NOT NULL UNIQUE,
project_slug TEXT NOT NULL,
project_path TEXT,
model TEXT,
git_branch TEXT,
user_messages INTEGER NOT NULL DEFAULT 0,
assistant_messages INTEGER NOT NULL DEFAULT 0,
tool_uses INTEGER NOT NULL DEFAULT 0,
input_tokens INTEGER NOT NULL DEFAULT 0,
output_tokens INTEGER NOT NULL DEFAULT 0,
estimated_cost REAL NOT NULL DEFAULT 0,
first_message_at TEXT,
last_message_at TEXT,
last_user_prompt TEXT,
is_active INTEGER NOT NULL DEFAULT 0,
scanned_at INTEGER NOT NULL,
created_at INTEGER NOT NULL DEFAULT (unixepoch()),
updated_at INTEGER NOT NULL DEFAULT (unixepoch())
)
`)
db.exec(`CREATE INDEX IF NOT EXISTS idx_claude_sessions_active ON claude_sessions(is_active) WHERE is_active = 1`)
db.exec(`CREATE INDEX IF NOT EXISTS idx_claude_sessions_project ON claude_sessions(project_slug)`)
}
}
]

View File

@ -13,6 +13,31 @@ interface RateLimiterOptions {
critical?: boolean
}
// Trusted proxy IPs (comma-separated). Only parse XFF when behind known proxies.
const TRUSTED_PROXIES = new Set(
(process.env.MC_TRUSTED_PROXIES || '').split(',').map(s => s.trim()).filter(Boolean)
)
/**
* Extract client IP from request headers.
* When MC_TRUSTED_PROXIES is set, takes the rightmost untrusted IP from x-forwarded-for.
* Without trusted proxies, falls back to x-real-ip or 'unknown'.
*/
export function extractClientIp(request: Request): string {
const xff = request.headers.get('x-forwarded-for')
if (xff && TRUSTED_PROXIES.size > 0) {
// Walk the chain from right to left, skip trusted proxies, return first untrusted
const ips = xff.split(',').map(s => s.trim())
for (let i = ips.length - 1; i >= 0; i--) {
if (!TRUSTED_PROXIES.has(ips[i])) return ips[i]
}
}
// Fallback: x-real-ip (set by nginx/caddy) or 'unknown'
return request.headers.get('x-real-ip')?.trim() || 'unknown'
}
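A minimal sketch of the rightmost-untrusted walk, with a hardcoded trusted set standing in for `MC_TRUSTED_PROXIES`:

```typescript
// Hypothetical trusted proxy set; in the real code this comes from
// the MC_TRUSTED_PROXIES environment variable.
const trusted = new Set(['10.0.0.1', '10.0.0.2'])

function pickClientIp(xff: string): string | null {
  const ips = xff.split(',').map((s) => s.trim())
  // Walk right to left: the rightmost hop we do not operate is the safest
  // attribution, since anything left of it is attacker-controllable.
  for (let i = ips.length - 1; i >= 0; i--) {
    if (!trusted.has(ips[i])) return ips[i]
  }
  return null // every hop was a trusted proxy
}

// '203.0.113.7' is the client; both 10.x hops are our proxies.
console.log(pickClientIp('203.0.113.7, 10.0.0.1, 10.0.0.2')) // → 203.0.113.7
// A spoofed leftmost entry is ignored because it sits behind an untrusted hop.
console.log(pickClientIp('1.2.3.4, 203.0.113.7, 10.0.0.2')) // → 203.0.113.7
```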
export function createRateLimiter(options: RateLimiterOptions) {
const store = new Map<string, RateLimitEntry>()
@ -29,7 +54,7 @@ export function createRateLimiter(options: RateLimiterOptions) {
return function checkRateLimit(request: Request): NextResponse | null {
// Allow disabling non-critical rate limiting for E2E tests
if (process.env.MC_DISABLE_RATE_LIMIT === '1' && !options.critical) return null
const ip = request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() || 'unknown'
const ip = extractClientIp(request)
const now = Date.now()
const entry = store.get(ip)

View File

@ -4,6 +4,8 @@ import { config, ensureDirExists } from './config'
import { join, dirname } from 'path'
import { readdirSync, statSync, unlinkSync } from 'fs'
import { logger } from './logger'
import { processWebhookRetries } from './webhooks'
import { syncClaudeSessions } from './claude-sessions'
const BACKUP_DIR = join(dirname(config.dbPath), 'backups')
@ -246,9 +248,27 @@ export function initScheduler() {
running: false,
})
tasks.set('webhook_retry', {
name: 'Webhook Retry',
intervalMs: TICK_MS, // Every 60s, matching scheduler tick resolution
lastRun: null,
nextRun: now + TICK_MS,
enabled: true,
running: false,
})
tasks.set('claude_session_scan', {
name: 'Claude Session Scan',
intervalMs: TICK_MS, // Every 60s — lightweight file stat checks
lastRun: null,
nextRun: now + 5_000, // First scan 5s after startup
enabled: true,
running: false,
})
// Start the tick loop
tickInterval = setInterval(tick, TICK_MS)
logger.info('Scheduler initialized - backup at ~3AM, cleanup at ~4AM, heartbeat every 5m')
logger.info('Scheduler initialized - backup at ~3AM, cleanup at ~4AM, heartbeat every 5m, webhook retry every 60s, claude scan every 60s')
}
/** Calculate ms until next occurrence of a given hour (UTC) */
@ -272,13 +292,18 @@ async function tick() {
// Check if this task is enabled in settings (heartbeat is always enabled)
const settingKey = id === 'auto_backup' ? 'general.auto_backup'
: id === 'auto_cleanup' ? 'general.auto_cleanup'
: id === 'webhook_retry' ? 'webhooks.retry_enabled'
: id === 'claude_session_scan' ? 'general.claude_session_scan'
: 'general.agent_heartbeat'
if (!isSettingEnabled(settingKey, id === 'agent_heartbeat')) continue
const defaultEnabled = id === 'agent_heartbeat' || id === 'webhook_retry' || id === 'claude_session_scan'
if (!isSettingEnabled(settingKey, defaultEnabled)) continue
task.running = true
try {
const result = id === 'auto_backup' ? await runBackup()
: id === 'agent_heartbeat' ? await runHeartbeatCheck()
: id === 'webhook_retry' ? await processWebhookRetries()
: id === 'claude_session_scan' ? await syncClaudeSessions()
: await runCleanup()
task.lastResult = { ...result, timestamp: now }
} catch (err: any) {
@ -306,11 +331,14 @@ export function getSchedulerStatus() {
for (const [id, task] of tasks) {
const settingKey = id === 'auto_backup' ? 'general.auto_backup'
: id === 'auto_cleanup' ? 'general.auto_cleanup'
: id === 'webhook_retry' ? 'webhooks.retry_enabled'
: id === 'claude_session_scan' ? 'general.claude_session_scan'
: 'general.agent_heartbeat'
const defaultEnabled = id === 'agent_heartbeat' || id === 'webhook_retry' || id === 'claude_session_scan'
result.push({
id,
name: task.name,
enabled: isSettingEnabled(settingKey, id === 'agent_heartbeat'),
enabled: isSettingEnabled(settingKey, defaultEnabled),
lastRun: task.lastRun,
nextRun: task.nextRun,
running: task.running,
@ -326,6 +354,8 @@ export async function triggerTask(taskId: string): Promise<{ ok: boolean; messag
if (taskId === 'auto_backup') return runBackup()
if (taskId === 'auto_cleanup') return runCleanup()
if (taskId === 'agent_heartbeat') return runHeartbeatCheck()
if (taskId === 'webhook_retry') return processWebhookRetries()
if (taskId === 'claude_session_scan') return syncClaudeSessions()
return { ok: false, message: `Unknown task: ${taskId}` }
}

View File

@ -54,6 +54,13 @@ export const createAgentSchema = z.object({
write_to_gateway: z.boolean().optional(),
})
export const bulkUpdateTaskStatusSchema = z.object({
tasks: z.array(z.object({
id: z.number().int().positive(),
status: z.enum(['inbox', 'assigned', 'in_progress', 'review', 'quality_review', 'done']),
})).min(1, 'At least one task is required').max(100),
})
export const createWebhookSchema = z.object({
name: z.string().min(1, 'Name is required').max(200),
url: z.string().url('Invalid URL'),
@ -140,7 +147,7 @@ export const spawnAgentSchema = z.object({
export const createUserSchema = z.object({
username: z.string().min(1, 'Username is required'),
password: z.string().min(1, 'Password is required'),
password: z.string().min(12, 'Password must be at least 12 characters'),
display_name: z.string().optional(),
role: z.enum(['admin', 'operator', 'viewer']).default('operator'),
provider: z.enum(['local', 'google']).default('local'),

View File

@ -1,4 +1,4 @@
import { createHmac } from 'crypto'
import { createHmac, timingSafeEqual } from 'crypto'
import { eventBus, type ServerEvent } from './event-bus'
import { logger } from './logger'
@ -9,8 +9,29 @@ interface Webhook {
secret: string | null
events: string // JSON array
enabled: number
consecutive_failures?: number
}
interface DeliverOpts {
attempt?: number
parentDeliveryId?: number | null
allowRetry?: boolean
}
interface DeliveryResult {
success: boolean
status_code: number | null
response_body: string | null
error: string | null
duration_ms: number
delivery_id?: number
}
// Backoff schedule in seconds: 30s, 5m, 30m, 2h, 8h
const BACKOFF_SECONDS = [30, 300, 1800, 7200, 28800]
const MAX_RETRIES = parseInt(process.env.MC_WEBHOOK_MAX_RETRIES || '5', 10) || 5
// Map event bus events to webhook event types
const EVENT_MAP: Record<string, string> = {
'activity.created': 'activity', // Dynamically becomes activity.<type>
@ -22,6 +43,42 @@ const EVENT_MAP: Record<string, string> = {
'task.deleted': 'activity.task_deleted',
}
/**
* Compute the next retry delay in seconds, with ±20% jitter.
*/
export function nextRetryDelay(attempt: number): number {
const base = BACKOFF_SECONDS[Math.min(attempt, BACKOFF_SECONDS.length - 1)]
const jitter = base * 0.2 * (2 * Math.random() - 1) // ±20%
return Math.round(base + jitter)
}
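The schedule and jitter behavior can be sanity-checked in isolation; this sketch mirrors the constants above rather than importing them:

```typescript
// Mirrors BACKOFF_SECONDS and nextRetryDelay for illustration.
const SCHEDULE = [30, 300, 1800, 7200, 28800]

function retryDelay(attempt: number): number {
  const base = SCHEDULE[Math.min(attempt, SCHEDULE.length - 1)]
  const jitter = base * 0.2 * (2 * Math.random() - 1) // uniform in ±20% of base
  return Math.round(base + jitter)
}

// Every sampled delay stays within 80%..120% of its base, and attempts past
// the end of the schedule clamp to the last entry (8h).
for (let attempt = 0; attempt < 8; attempt++) {
  const base = SCHEDULE[Math.min(attempt, SCHEDULE.length - 1)]
  const d = retryDelay(attempt)
  console.assert(d >= base * 0.8 && d <= base * 1.2, `attempt ${attempt}: ${d}`)
}
```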
/**
* Verify a webhook signature using constant-time comparison.
* Consumers can use this to validate incoming webhook deliveries.
*/
export function verifyWebhookSignature(
secret: string,
rawBody: string,
signatureHeader: string | null | undefined
): boolean {
if (!signatureHeader || !secret) return false
const expected = `sha256=${createHmac('sha256', secret).update(rawBody).digest('hex')}`
// Constant-time comparison
const sigBuf = Buffer.from(signatureHeader)
const expectedBuf = Buffer.from(expected)
if (sigBuf.length !== expectedBuf.length) {
    // Burn an equal-cost comparison so a length mismatch does not return faster
const dummy = Buffer.alloc(expectedBuf.length)
timingSafeEqual(expectedBuf, dummy)
return false
}
return timingSafeEqual(sigBuf, expectedBuf)
}
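Consumer-side verification can look like the sketch below. It mirrors `verifyWebhookSignature` rather than importing it, and the exact signature header name used by deliveries is not shown in this diff, so treat the wiring as an assumption:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Recompute the signature over the raw request body and compare in constant
// time. The `sha256=<hex>` format matches verifyWebhookSignature above.
function verify(secret: string, rawBody: string, header: string | null): boolean {
  if (!header || !secret) return false
  const expected = `sha256=${createHmac('sha256', secret).update(rawBody).digest('hex')}`
  const sigBuf = Buffer.from(header)
  const expectedBuf = Buffer.from(expected)
  if (sigBuf.length !== expectedBuf.length) return false
  return timingSafeEqual(sigBuf, expectedBuf)
}

const secret = 'whsec_example' // hypothetical secret for illustration
const body = JSON.stringify({ event: 'task.created', data: { id: 1 } })
const sig = `sha256=${createHmac('sha256', secret).update(body).digest('hex')}`

console.assert(verify(secret, body, sig) === true)
console.assert(verify(secret, body + ' ', sig) === false) // tampered body
console.assert(verify(secret, body, null) === false) // missing header
```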
/**
* Subscribe to the event bus and fire webhooks for matching events.
* Called once during server initialization.
@ -92,15 +149,31 @@ async function fireWebhooksAsync(eventType: string, payload: Record<string, any>
})
await Promise.allSettled(
matchingWebhooks.map((wh) => deliverWebhook(wh, eventType, payload))
matchingWebhooks.map((wh) => deliverWebhook(wh, eventType, payload, { allowRetry: true }))
)
}
/**
* Public wrapper for API routes (test endpoint, manual retry).
* Returns delivery result fields for the response.
*/
export async function deliverWebhookPublic(
webhook: Webhook,
eventType: string,
payload: Record<string, any>,
opts?: DeliverOpts
): Promise<DeliveryResult> {
  return deliverWebhook(webhook, eventType, payload, { allowRetry: false, ...opts })
}
async function deliverWebhook(
webhook: Webhook,
eventType: string,
payload: Record<string, any>
) {
payload: Record<string, any>,
opts: DeliverOpts = {}
): Promise<DeliveryResult> {
const { attempt = 0, parentDeliveryId = null, allowRetry = true } = opts
const body = JSON.stringify({
event: eventType,
timestamp: Math.floor(Date.now() / 1000),
@ -146,14 +219,17 @@ async function deliverWebhook(
}
const durationMs = Date.now() - start
const success = statusCode !== null && statusCode >= 200 && statusCode < 300
let deliveryId: number | undefined
// Log delivery attempt
// Log delivery attempt and handle retry/circuit-breaker logic
try {
const { getDatabase } = await import('./db')
const db = getDatabase()
db.prepare(`
INSERT INTO webhook_deliveries (webhook_id, event_type, payload, status_code, response_body, error, duration_ms)
VALUES (?, ?, ?, ?, ?, ?, ?)
const insertResult = db.prepare(`
INSERT INTO webhook_deliveries (webhook_id, event_type, payload, status_code, response_body, error, duration_ms, attempt, is_retry, parent_delivery_id)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`).run(
webhook.id,
eventType,
@ -161,8 +237,12 @@ async function deliverWebhook(
statusCode,
responseBody,
error,
durationMs
durationMs,
attempt,
attempt > 0 ? 1 : 0,
parentDeliveryId
)
deliveryId = Number(insertResult.lastInsertRowid)
// Update webhook last_fired
db.prepare(`
@ -170,6 +250,31 @@ async function deliverWebhook(
WHERE id = ?
`).run(statusCode ?? -1, webhook.id)
// Circuit breaker + retry scheduling (skip for test deliveries)
if (allowRetry) {
if (success) {
// Reset consecutive failures on success
db.prepare(`UPDATE webhooks SET consecutive_failures = 0 WHERE id = ?`).run(webhook.id)
} else {
// Increment consecutive failures
db.prepare(`UPDATE webhooks SET consecutive_failures = consecutive_failures + 1 WHERE id = ?`).run(webhook.id)
if (attempt < MAX_RETRIES - 1) {
// Schedule retry
const delaySec = nextRetryDelay(attempt)
const nextRetryAt = Math.floor(Date.now() / 1000) + delaySec
db.prepare(`UPDATE webhook_deliveries SET next_retry_at = ? WHERE id = ?`).run(nextRetryAt, deliveryId)
} else {
// Exhausted retries — trip circuit breaker
const wh = db.prepare(`SELECT consecutive_failures FROM webhooks WHERE id = ?`).get(webhook.id) as { consecutive_failures: number } | undefined
if (wh && wh.consecutive_failures >= MAX_RETRIES) {
db.prepare(`UPDATE webhooks SET enabled = 0, updated_at = unixepoch() WHERE id = ?`).run(webhook.id)
logger.warn({ webhookId: webhook.id, name: webhook.name }, 'Webhook circuit breaker tripped — disabled after exhausting retries')
}
}
}
}
// Prune old deliveries (keep last 200 per webhook)
db.prepare(`
DELETE FROM webhook_deliveries
@ -177,7 +282,83 @@ async function deliverWebhook(
SELECT id FROM webhook_deliveries WHERE webhook_id = ? ORDER BY created_at DESC LIMIT 200
)
`).run(webhook.id, webhook.id)
} catch {
// Silent - delivery logging is best-effort
} catch (logErr) {
logger.error({ err: logErr, webhookId: webhook.id }, 'Webhook delivery logging/pruning failed')
}
return { success, status_code: statusCode, response_body: responseBody, error, duration_ms: durationMs, delivery_id: deliveryId }
}
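The retry/circuit-breaker branching above can be expressed as a pure function for clarity. This is an illustrative reduction of the decision logic, not the actual code path; `MAX` matches the default `MC_WEBHOOK_MAX_RETRIES` of 5.

```typescript
const MAX = 5

type Decision = { scheduleRetry: boolean; tripBreaker: boolean }

// After a failed delivery at `attempt` (0-based), either schedule another
// retry or, once retries are exhausted, trip the breaker when the webhook's
// consecutive failure count has reached the threshold.
function afterFailure(attempt: number, consecutiveFailures: number): Decision {
  if (attempt < MAX - 1) return { scheduleRetry: true, tripBreaker: false }
  return { scheduleRetry: false, tripBreaker: consecutiveFailures >= MAX }
}

console.assert(afterFailure(0, 1).scheduleRetry === true) // first failure retries
console.assert(afterFailure(4, 5).tripBreaker === true) // exhausted and failing
console.assert(afterFailure(4, 3).tripBreaker === false) // exhausted, under threshold
```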
/**
* Process pending webhook retries. Called by the scheduler.
* Picks up deliveries where next_retry_at has passed and re-delivers them.
*/
export async function processWebhookRetries(): Promise<{ ok: boolean; message: string }> {
try {
const { getDatabase } = await import('./db')
const db = getDatabase()
const now = Math.floor(Date.now() / 1000)
// Find deliveries ready for retry (limit batch to 50)
const pendingRetries = db.prepare(`
SELECT wd.id, wd.webhook_id, wd.event_type, wd.payload, wd.attempt,
w.id as w_id, w.name as w_name, w.url as w_url, w.secret as w_secret,
w.events as w_events, w.enabled as w_enabled, w.consecutive_failures as w_consecutive_failures
FROM webhook_deliveries wd
JOIN webhooks w ON w.id = wd.webhook_id AND w.enabled = 1
WHERE wd.next_retry_at IS NOT NULL AND wd.next_retry_at <= ?
LIMIT 50
`).all(now) as Array<{
id: number; webhook_id: number; event_type: string; payload: string; attempt: number
w_id: number; w_name: string; w_url: string; w_secret: string | null
w_events: string; w_enabled: number; w_consecutive_failures: number
}>
if (pendingRetries.length === 0) {
return { ok: true, message: 'No pending retries' }
}
// Clear next_retry_at immediately to prevent double-processing
const clearStmt = db.prepare(`UPDATE webhook_deliveries SET next_retry_at = NULL WHERE id = ?`)
for (const row of pendingRetries) {
clearStmt.run(row.id)
}
// Re-deliver each
let succeeded = 0
let failed = 0
for (const row of pendingRetries) {
const webhook: Webhook = {
id: row.w_id,
name: row.w_name,
url: row.w_url,
secret: row.w_secret,
events: row.w_events,
enabled: row.w_enabled,
consecutive_failures: row.w_consecutive_failures,
}
// Parse the original payload from the stored JSON body
let parsedPayload: Record<string, any>
try {
const parsed = JSON.parse(row.payload)
parsedPayload = parsed.data ?? parsed
} catch {
parsedPayload = {}
}
const result = await deliverWebhook(webhook, row.event_type, parsedPayload, {
attempt: row.attempt + 1,
parentDeliveryId: row.id,
allowRetry: true,
})
if (result.success) succeeded++
else failed++
}
return { ok: true, message: `Processed ${pendingRetries.length} retries (${succeeded} ok, ${failed} failed)` }
} catch (err: any) {
return { ok: false, message: `Webhook retry failed: ${err.message}` }
}
}

View File

@ -126,7 +126,7 @@ export async function createTestUser(
const username = `e2e-user-${uid()}`
const res = await request.post('/api/auth/users', {
headers: API_KEY_HEADER,
data: { username, password: 'testpass123', display_name: username, ...overrides },
data: { username, password: 'e2e-testpass-123', display_name: username, ...overrides },
})
const body = await res.json()
return { id: body.user?.id as number, username, res, body }

View File

@ -13,7 +13,7 @@ test.describe('Login Rate Limiting (Issue #8)', () => {
for (let i = 0; i < 7; i++) {
const res = await request.post('/api/auth/login', {
data: { username: 'testadmin', password: 'wrongpassword' },
headers: { 'x-forwarded-for': '10.99.99.99' }
headers: { 'x-real-ip': '10.99.99.99' }
})
results.push(res.status())
}
@ -26,7 +26,7 @@ test.describe('Login Rate Limiting (Issue #8)', () => {
test('successful login is not blocked for fresh IP', async ({ request }) => {
const res = await request.post('/api/auth/login', {
data: { username: 'testadmin', password: 'testpass123' },
headers: { 'x-forwarded-for': '10.88.88.88' }
headers: { 'x-real-ip': '10.88.88.88' }
})
// Should succeed (200) or at least not be rate limited
expect(res.status()).not.toBe(429)

View File

@ -31,7 +31,7 @@ test.describe('User Management', () => {
headers: API_KEY_HEADER,
data: {
username: first.user.username,
password: 'testpass123',
password: 'e2e-testpass-123',
},
})
expect(res.status()).toBe(409)

View File

@ -13,6 +13,17 @@ export default defineConfig(async () => {
globals: true,
setupFiles: ['src/test/setup.ts'],
include: ['src/**/*.test.ts', 'src/**/*.test.tsx'],
coverage: {
provider: 'v8' as const,
include: ['src/lib/**/*.ts'],
exclude: ['src/lib/__tests__/**', 'src/**/*.test.ts'],
thresholds: {
lines: 60,
functions: 60,
branches: 60,
statements: 60,
},
},
},
}
})