Compare commits: d8c09eeefa...d952b973a8 (3 commits: d952b973a8, 2facb2a1e2, 374408834b), 11 changed files with 636 additions and 127 deletions.

README.md
@@ -3,10 +3,34 @@

Troutbot is the final solution to protecting the trout population. It's
environmental protection incarnate!

Well, in reality, it is a GitHub bot that analyzes issues and pull requests
using real signals such as CI check results, diff quality, and body structure
and then posts trout-themed comments about the findings. Now you know whether
your changes hurt or help the trout population.
## Operation Modes

Troutbot supports two operation modes:

### Webhook Mode (Real-time)

GitHub sends webhook events to troutbot when issues/PRs are opened or updated.
Troutbot responds immediately. Best for:

- Single or few repositories
- You have admin access to configure webhooks
- You can expose a public endpoint

### Polling Mode (Periodic)

Troutbot periodically polls configured repositories for `@troutbot` mentions in
comments. Best for:

- Monitoring dozens of repositories without webhook setup
- Running behind a firewall or on dynamic IPs
- Simplified deployment without webhook secrets

Both modes use the same analysis engine and produce the same results.

## Quick Start
@@ -17,18 +41,17 @@ $ npm install

```bash
# Populate the environment config
$ cp .env.example .env

# Set up application config
$ cp config.example.ts config.ts

# Edit .env and config.ts, then build and start.
# If `.env` is not populated, Troutbot will start in dry-run mode.
$ pnpm run build && pnpm start
```
## How It Works

Troutbot has three analysis backends that analyze issues and PRs:

### `checks`

@@ -71,6 +94,75 @@ checks 0.4, diff 0.3, quality 0.3). Backends that return zero confidence (e.g.,
no CI checks found yet) are excluded from the average. If combined confidence
falls below `confidenceThreshold`, the result is forced to neutral.
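The combining step just described can be sketched as follows. This is a minimal illustration; `BackendScore` and `combine` are hypothetical names, not troutbot's actual API.

```typescript
interface BackendScore {
  score: number;      // -1 (hurts the trout) .. 1 (helps the trout)
  confidence: number; // 0 .. 1
}

function combine(
  scores: Record<string, BackendScore>,
  weights: Record<string, number>,
  confidenceThreshold: number
): { score: number; confidence: number } {
  // Backends that return zero confidence (e.g. no CI checks found yet)
  // are excluded from the weighted average.
  const active = Object.entries(scores).filter(([, s]) => s.confidence > 0);
  if (active.length === 0) return { score: 0, confidence: 0 };

  let weightSum = 0;
  let score = 0;
  let confidence = 0;
  for (const [name, s] of active) {
    const w = weights[name] ?? 0;
    weightSum += w;
    score += w * s.score;
    confidence += w * s.confidence;
  }
  score /= weightSum;
  confidence /= weightSum;

  // Below the threshold, the result is forced to neutral.
  if (confidence < confidenceThreshold) return { score: 0, confidence };
  return { score, confidence };
}
```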
## Webhook Mode

In webhook mode, troutbot receives real-time events from GitHub.

### GitHub Webhook Setup

1. Go to your repository's **Settings > Webhooks > Add webhook**
2. **Payload URL**: `https://your-host/webhook`
3. **Content type**: `application/json`
4. **Secret**: Generate with `openssl rand -hex 32` and set as `WEBHOOK_SECRET`
5. **Events**: Select **Issues**, **Pull requests**, and optionally **Check
   suites** (for re-analysis when CI finishes)

If you enable **Check suites** and set `response.allowUpdates: true` in your
config, troutbot will update its comment on a PR once CI results are available.

### Webhook Security

- **`WEBHOOK_SECRET` is strongly recommended.** Without it, anyone who can reach
  the `/webhook` endpoint can trigger analysis and post comments. Always set a
  secret and configure the same value in your GitHub webhook settings.
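Signature verification is why the secret matters: GitHub signs each delivery with an HMAC-SHA256 of the raw body, sent in the `X-Hub-Signature-256` header as `sha256=<hex>`. A sketch using Node's built-in `crypto` (troutbot's internal implementation may differ):

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

function verifySignature(secret: string, rawBody: string, signatureHeader: string): boolean {
  // Recompute the expected signature over the raw request body.
  const expected = 'sha256=' + createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // timingSafeEqual throws on length mismatch, so check lengths first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```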
## Polling Mode

In polling mode, troutbot periodically checks configured repositories for
`@troutbot` mentions in comments.

### Configuration

Enable polling in your `config.ts`:

```typescript
polling: {
  enabled: true,
  intervalMinutes: 5, // Check every 5 minutes
  lookbackMinutes: 10, // Look back 10 minutes for new comments
}
```
### How It Works

1. On startup, troutbot fetches recent comments from all configured repositories
2. It scans each comment for `@troutbot` mentions
3. When found, it analyzes the associated issue/PR and posts a response
4. Processed comments are tracked to avoid duplicate responses
5. The cycle repeats every `intervalMinutes`
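The mention-scanning and dedupe steps above can be sketched as follows. `findUnprocessedMentions` is an illustrative helper, not troutbot's actual function; the GitHub API calls are abstracted away.

```typescript
interface Comment {
  id: number;
  body: string;
}

const MENTION = /@troutbot\b/i;

function findUnprocessedMentions(comments: Comment[], processed: Set<number>): Comment[] {
  const hits: Comment[] = [];
  for (const c of comments) {
    if (processed.has(c.id)) continue; // already answered; avoid duplicate responses
    if (MENTION.test(c.body)) {
      processed.add(c.id); // track so the next poll cycle skips it
      hits.push(c);
    }
  }
  return hits;
}
```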
### On-Demand Analysis

Users can trigger analysis by mentioning `@troutbot` in any comment:

```plaintext
Hey @troutbot, can you take a look at this?
```

The bot will analyze the issue/PR and respond with a trout-themed assessment.

### Rate Limiting

Polling uses the GitHub REST API and respects rate limits. The default settings
(5 min interval, 10 min lookback) are conservative and work well within GitHub's
5000 requests/hour limit for personal access tokens.
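A rough sanity check on that budget, assuming about two API calls per repository per poll (an assumption; real call counts depend on pagination and how many issues/PRs must be fetched):

```typescript
// Requests per hour = repos * calls-per-repo * polls-per-hour.
function requestsPerHour(repos: number, intervalMinutes: number, callsPerRepo: number): number {
  const pollsPerHour = 60 / intervalMinutes;
  return repos * callsPerRepo * pollsPerHour;
}

// Even 100 repos at the default 5-minute interval stay well under 5000/hour.
```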
### Requirements

- `GITHUB_TOKEN` with read access to all watched repositories
- Repositories configured in `config.repositories`
- Write access to post comments
## GitHub Account & Token Setup

Troutbot is designed to run as a dedicated bot account on GitHub. Create a

@@ -89,8 +181,7 @@ The bot account needs access to every repository it will comment on:
- **For organization repos**: Invite the bot account as a collaborator with
  **Write** access, or add it to a team with write permissions.
- **For personal repos**: Add the bot account as a collaborator under
  `Settings > Collaborators`.

The bot needs write access to post comments. Read access alone is not enough.
@@ -98,10 +189,10 @@

Log in as the bot account and create a fine-grained PAT:

1. Go to
   `Settings > Developer settings > Personal access tokens > Fine-grained tokens`
2. Click **Generate new token**
3. Set a descriptive name (e.g., `troutbot-production`)
4. Set **Expiration** - pick a long-lived duration or no expiration, since this
   runs unattended
5. Under **Repository access**, select the specific repositories the bot will
@@ -121,19 +212,7 @@ Set this as the `GITHUB_TOKEN` environment variable.

> `repo` scope. Fine-grained tokens are recommended because they follow the
> principle of least privilege.

## Configuring Troutbot

### Environment Variables
@@ -142,7 +221,7 @@

| Variable         | Description                                           | Required                     |
| ---------------- | ----------------------------------------------------- | ---------------------------- |
| `GITHUB_TOKEN`   | Fine-grained PAT from the bot account (see above)     | No (dry-run without it)      |
| `WEBHOOK_SECRET` | Secret for verifying webhook signatures               | No (only for webhook mode)   |
| `PORT`           | Server port (overrides `server.port` in config)       | No                           |
| `CONFIG_PATH`    | Path to config file                                   | No (defaults to `config.ts`) |
| `LOG_LEVEL`      | Log level override (`debug`, `info`, `warn`, `error`) | No                           |
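The `PORT` override in the table can be sketched like this; `resolvePort` is a hypothetical helper, and troutbot's actual resolution logic may differ:

```typescript
// PORT env var wins when present and numeric; otherwise fall back to config.
function resolvePort(env: Record<string, string | undefined>, configPort: number): number {
  const fromEnv = env.PORT !== undefined ? Number.parseInt(env.PORT, 10) : NaN;
  return Number.isNaN(fromEnv) ? configPort : fromEnv;
}
```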
@@ -156,10 +235,11 @@ default-exports a `Config` object - full type checking and autocompletion in
your editor.

```typescript
import type { Config } from './src/types';

const config: Config = {
  server: { port: 3000 },
  repositories: [{ owner: 'myorg', repo: 'myrepo' }],
  engine: {
    backends: {
      checks: { enabled: true },
      // ...
    },
    weights: { checks: 0.4, diff: 0.3, quality: 0.3 },
    confidenceThreshold: 0.1,
  },
  polling: {
    enabled: true,
    intervalMinutes: 5,
    lookbackMinutes: 10,
  },
  // ...
};
```

@@ -180,28 +265,13 @@ pre-compilation needed.

See `config.example.ts` for the full annotated reference.
## Production Configuration

When deploying troutbot to production, keep the following in mind:

- **Use a reverse proxy with TLS.** If using webhook mode, GitHub sends payloads
  over HTTPS. Put nginx, Caddy, or a cloud load balancer in front of troutbot
  and terminate TLS there. Polling mode doesn't require a public endpoint.
- **Set `NODE_ENV=production`.** This is set automatically in the Docker image.
  For standalone deployments, export it in your environment. Express uses this
  to enable performance optimizations.
@@ -218,22 +288,19 @@

## Deployment

### Standalone (Node.js)

```bash
npm ci
npm run build
export NODE_ENV=production
export GITHUB_TOKEN="ghp_..."
# Only needed for webhook mode:
# export WEBHOOK_SECRET="your-secret"
npm start
```
### Nix

**Flake** (NixOS or flake-enabled systems):
@@ -248,8 +315,8 @@

```nix
{
  services.troutbot = {
    enable = true;
    environmentFile = "/path/to/.env";
    configPath = "/path/to/config.ts";
  };
}
];
```
@@ -264,10 +331,7 @@

```bash
nix run github:notashelf/troutbot
```

### Docker
```bash
docker build -t troutbot .

docker run -d \
  --name troutbot \
  -p 127.0.0.1:3000:3000 \
  -e GITHUB_TOKEN="ghp_..." \
  -v $(pwd)/config.ts:/app/config.ts:ro \
  --restart unless-stopped \
  troutbot
```
@@ -283,17 +346,14 @@

Multi-stage build, non-root user, built-in health check, `STOPSIGNAL SIGTERM`.

### Docker Compose
```yaml
services:
  troutbot:
    build: .
    ports:
      - '127.0.0.1:3000:3000'
    env_file: .env
    volumes:
      - ./config.ts:/app/config.ts:ro
    # ...
    logging:
      driver: json-file
      options:
        max-size: '10m'
        max-file: '3'
```
### Systemd

Create `/etc/systemd/system/troutbot.service`:
```ini
[Unit]
Description=Troutbot GitHub Bot
After=network.target

[Service]
# ...
```
@@ -345,10 +402,9 @@

```bash
sudo systemctl daemon-reload
sudo systemctl enable --now troutbot
```

### Reverse Proxy (nginx)

Only needed for webhook mode:
```nginx
server {
  # ...
    proxy_set_header X-Forwarded-Proto $scheme;
  }

  # Optional: nginx-level rate limiting for webhooks
  # limit_req_zone $binary_remote_addr zone=webhook:10m rate=10r/s;
  # location /webhook {
  #   limit_req zone=webhook burst=20 nodelay;
  # ...
}
```
## API Endpoints

<!--markdownlint-disable MD013-->

| Method   | Path          | Description                                                                              |
| -------- | ------------- | ---------------------------------------------------------------------------------------- |
| `GET`    | `/health`     | Health check - returns `status`, `uptime` (seconds), `version`, `dryRun`, and `backends` |
| `POST`   | `/webhook`    | GitHub webhook receiver (rate limited, webhook mode only)                                |
| `GET`    | `/dashboard`  | Web UI dashboard with status, events, and config editor                                  |
| `GET`    | `/api/status` | JSON status: uptime, version, dry-run, backends, repo count                              |
| `GET`    | `/api/events` | Recent events from the in-memory ring buffer                                             |
| `DELETE` | `/api/events` | Clear the event ring buffer                                                              |
| `GET`    | `/api/config` | Current runtime configuration as JSON                                                    |
| `PUT`    | `/api/config` | Partial config update: deep-merges, validates, and applies in-place                      |

<!--markdownlint-enable MD013-->
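A small consumer of the `/health` response, based on the fields in the table above. The exact `status` value (`'ok'` here) is an assumption; check your deployment's actual response:

```typescript
// Shape of /health per the endpoint table; fields beyond those listed are unknown.
interface HealthResponse {
  status: string;
  uptime: number; // seconds
  version: string;
  dryRun: boolean;
  backends: string[];
}

function isHealthy(raw: string): boolean {
  const h = JSON.parse(raw) as HealthResponse;
  // Treat dry-run as "not fully healthy" for alerting purposes.
  return h.status === 'ok' && !h.dryRun;
}
```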
## Dashboard & Runtime API

Troutbot ships with a built-in web dashboard and JSON API for monitoring and

@@ -405,9 +463,8 @@ running). The dashboard provides:

- **Status card** - uptime, version, dry-run state, active backends, and repo
  count. Auto-refreshes every 30 seconds.
- **Event log** - table of recent events showing repo, PR/issue number, action,
  impact rating, and confidence score. Keeps the last 100 events in memory.
- **Config editor** - read-only JSON view of the current runtime config with an
  "Edit" toggle that lets you modify and save changes without restarting.
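The deep-merge behavior behind `PUT /api/config` can be sketched like this. It is a simplified stand-in for whatever `deepMerge` troutbot actually uses; here arrays and primitives are replaced wholesale, which may not match the real implementation:

```typescript
function deepMerge(
  base: Record<string, unknown>,
  patch: Record<string, unknown>
): Record<string, unknown> {
  const out: Record<string, unknown> = { ...base };
  for (const [key, value] of Object.entries(patch)) {
    const existing = out[key];
    const bothPlainObjects =
      value !== null && typeof value === 'object' && !Array.isArray(value) &&
      existing !== null && typeof existing === 'object' && !Array.isArray(existing);
    if (bothPlainObjects) {
      // Recurse into nested objects so untouched sibling keys survive.
      out[key] = deepMerge(
        existing as Record<string, unknown>,
        value as Record<string, unknown>
      );
    } else {
      out[key] = value; // primitives and arrays are replaced, not merged
    }
  }
  return out;
}
```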
@@ -441,8 +498,8 @@ original config remains unchanged if validation fails.

### Event Buffer API

The event buffer stores the last 100 processed events in memory (from both
webhooks and polling). Events are lost on restart.

```bash
# List recent events
```
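The ring-buffer semantics described above (keep the last 100 events, drop the oldest) can be sketched as follows; this is illustrative, and troutbot's internal structure may differ:

```typescript
class RingBuffer<T> {
  private items: T[] = [];

  constructor(private capacity: number) {}

  push(item: T): void {
    this.items.push(item);
    // Evict the oldest entry once capacity is exceeded.
    if (this.items.length > this.capacity) this.items.shift();
  }

  list(): T[] {
    return [...this.items]; // defensive copy, like GET /api/events
  }

  clear(): void {
    this.items = []; // like DELETE /api/events
  }
}
```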
---

@@ -15,9 +15,11 @@ const config: Config = {

```typescript
    include: [],
    exclude: ['bot-ignore'],
  },

  authors: {
    exclude: ['dependabot', 'renovate[bot]'],
  },

  branches: {
    include: [], // empty = all branches
  },
```

@@ -62,7 +64,9 @@ const config: Config = {

```typescript
    includeReasoning: true,

    // One message is picked at random from the list matching the impact.
    // Placeholders:
    // - {type} (issue/pull request),
    // - {impact} (positive/negative/neutral)
    messages: {
      positive: [
        'This {type} looks great for the trout! All signals point upstream.',
```

@@ -89,6 +93,14 @@ const config: Config = {

```typescript
    level: 'info',
    file: 'troutbot.log',
  },

  // Polling mode: Watch for @troutbot mentions without webhooks.
  // Useful for monitoring multiple repos without needing webhook configuration.
  polling: {
    enabled: false,
    intervalMinutes: 5, // how often to check for new comments
    lookbackMinutes: 10, // how far back to look for comments on each poll
  },
};

export default config;
```
---

@@ -9,8 +9,7 @@ export function createDashboardRouter(config: Config): express.Router {

```typescript
  router.use(express.json());

  // API routes

  router.get('/api/status', (_req, res) => {
    const enabledBackends = Object.entries(config.engine.backends)
      .filter(([, v]) => v.enabled)
```

@@ -41,7 +40,10 @@ export function createDashboardRouter(config: Config): express.Router {

```typescript
  router.put('/api/config', (req, res) => {
    try {
      const partial = req.body as Partial<Config>;
      const merged = deepMerge(
        config as Record<string, unknown>,
        partial as Record<string, unknown>
      ) as Config;
      validate(merged);

      // Apply in-place
```
---

@@ -81,15 +81,12 @@ export class ChecksBackend implements EngineBackend {

```typescript
    // Classify failures by severity
    const criticalFailures = failed.filter((r) => classifyCheck(r.name) === 'critical');
    const advisoryFailures = failed.filter((r) => classifyCheck(r.name) === 'advisory');
    const standardFailures = failed.filter((r) => classifyCheck(r.name) === 'standard');

    // Weighted scoring: critical failures count 3x, advisory 0.5x
    const failureScore =
      criticalFailures.length * 3 + standardFailures.length * 1 + advisoryFailures.length * 0.5;
    const totalWeight = completed
      .filter((r) => !skipped.includes(r))
      .reduce((s, r) => {
        const cls = classifyCheck(r.name);
```

@@ -117,13 +114,20 @@ export class ChecksBackend implements EngineBackend {

```typescript
    // Build detailed reasoning
    const parts: string[] = [];
    if (passed.length > 0)
      parts.push(`${passed.length} passed (${passed.map((r) => r.name).join(', ')})`);
    if (criticalFailures.length > 0)
      parts.push(
        `${criticalFailures.length} critical failure(s) (${criticalFailures.map((r) => r.name).join(', ')})`
      );
    if (advisoryFailures.length > 0)
      parts.push(
        `${advisoryFailures.length} advisory failure(s) (${advisoryFailures.map((r) => r.name).join(', ')})`
      );
    if (standardFailures.length > 0)
      parts.push(
        `${standardFailures.length} other failure(s) (${standardFailures.map((r) => r.name).join(', ')})`
      );
    if (skipped.length > 0) parts.push(`${skipped.length} skipped`);
    if (pending.length > 0) parts.push(`${pending.length} still running`);
```
---

@@ -15,7 +15,9 @@ const RISKY_FILE_PATTERN =

```typescript
const DOC_FILE_PATTERN = /\.(md|mdx|txt|rst|adoc)$|^(README|CHANGELOG|LICENSE|CONTRIBUTING)/i;

function categorizeFiles(
  files: { filename: string; additions: number; deletions: number; changes: number }[]
) {
  const src: typeof files = [];
  const tests: typeof files = [];
  const generated: typeof files = [];
```

@@ -89,7 +91,11 @@ export class DiffBackend implements EngineBackend {

```typescript
    } else if (totalChanges <= this.config.maxChanges) {
      signals.push({ name: `large PR (${totalChanges} lines)`, positive: false, weight: 0.8 });
    } else {
      signals.push({
        name: `very large PR (${totalChanges} lines, exceeds limit)`,
        positive: false,
        weight: 1.5,
      });
    }

    // --- Focus signals ---
```

@@ -98,9 +104,17 @@ export class DiffBackend implements EngineBackend {

```typescript
    } else if (meaningful.length <= 10) {
      signals.push({ name: 'focused changeset', positive: true, weight: 0.8 });
    } else if (meaningful.length > 30) {
      signals.push({
        name: `sprawling changeset (${meaningful.length} files)`,
        positive: false,
        weight: 1.2,
      });
    } else if (meaningful.length > 20) {
      signals.push({
        name: `broad changeset (${meaningful.length} files)`,
        positive: false,
        weight: 0.6,
      });
    }

    // --- Test coverage ---
```

@@ -129,10 +143,17 @@ export class DiffBackend implements EngineBackend {

```typescript
    // --- Churn detection (files with high add+delete suggesting rewrites) ---
    const highChurnFiles = src.filter(
      (f) =>
        f.additions > 50 &&
        f.deletions > 50 &&
        Math.min(f.additions, f.deletions) / Math.max(f.additions, f.deletions) > 0.6
    );
    if (highChurnFiles.length >= 3) {
      signals.push({
        name: `high churn in ${highChurnFiles.length} files (possible refactor)`,
        positive: false,
        weight: 0.5,
      });
    }

    // --- Risky files ---
```

@@ -180,7 +201,11 @@ export class DiffBackend implements EngineBackend {

```typescript
    const totalSignalWeight = positiveWeight + negativeWeight;
    const confidence =
      signals.length > 0
        ? Math.min(
            1,
            (Math.abs(positiveWeight - negativeWeight) / Math.max(totalSignalWeight, 1)) * 0.6 +
              0.25
          )
        : 0;

    // Build reasoning
```
|
|
|
||||||
|
|
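The confidence formula in the last hunk above maps the imbalance between positive and negative signal weights into a bounded score. A minimal sketch of that computation in isolation (the `Signal` shape matches the pushes in the diff; the `confidenceFor` helper name is illustrative, not part of the codebase):

```typescript
interface Signal {
  name: string;
  positive: boolean;
  weight: number;
}

// Mirrors the DiffBackend computation: the gap between positive and
// negative weight, normalized by total weight, scaled into [0.25, 0.85].
function confidenceFor(signals: Signal[]): number {
  const positiveWeight = signals.filter((s) => s.positive).reduce((a, s) => a + s.weight, 0);
  const negativeWeight = signals.filter((s) => !s.positive).reduce((a, s) => a + s.weight, 0);
  const total = positiveWeight + negativeWeight;
  return signals.length > 0
    ? Math.min(1, (Math.abs(positiveWeight - negativeWeight) / Math.max(total, 1)) * 0.6 + 0.25)
    : 0;
}

const sample: Signal[] = [
  { name: 'focused changeset', positive: true, weight: 0.8 },
  { name: 'sprawling changeset (31 files)', positive: false, weight: 1.2 },
];
// gap 0.4 over total 2.0 → 0.4 / 2 * 0.6 + 0.25 = 0.37
console.log(confidenceFor(sample).toFixed(2)); // "0.37"
```

Note the parenthesization fix in the diff is cosmetic: `/` and `*` already associate left to right, so old and new expressions compute the same value.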
```diff
@@ -44,7 +44,11 @@ export class QualityBackend implements EngineBackend {
     if (body.length === 0) {
       signals.push({ name: 'empty description', positive: false, weight: 2 });
     } else if (body.length < this.config.minBodyLength) {
-      signals.push({ name: `short description (${body.length} chars)`, positive: false, weight: 1.2 });
+      signals.push({
+        name: `short description (${body.length} chars)`,
+        positive: false,
+        weight: 1.2,
+      });
     } else if (body.length >= this.config.minBodyLength) {
       signals.push({ name: 'adequate description', positive: true, weight: 1 });
       if (body.length > 300) {
 
@@ -68,7 +72,11 @@ export class QualityBackend implements EngineBackend {
     if (total > 0 && checked === total) {
       signals.push({ name: `checklist complete (${total}/${total})`, positive: true, weight: 1 });
     } else if (total > 0) {
-      signals.push({ name: `checklist incomplete (${checked}/${total})`, positive: false, weight: 0.8 });
+      signals.push({
+        name: `checklist incomplete (${checked}/${total})`,
+        positive: false,
+        weight: 0.8,
+      });
     }
   }
 
@@ -79,14 +87,22 @@ export class QualityBackend implements EngineBackend {
     if (body.length > 100 && BREAKING_PATTERN.test(body)) {
       signals.push({ name: 'breaking change documented', positive: true, weight: 0.8 });
     } else {
-      signals.push({ name: 'breaking change mentioned but not detailed', positive: false, weight: 0.8 });
+      signals.push({
+        name: 'breaking change mentioned but not detailed',
+        positive: false,
+        weight: 0.8,
+      });
     }
   }
 
     // TODOs/FIXMEs in description suggest unfinished work
     const todoMatches = body.match(TODO_PATTERN);
     if (todoMatches) {
-      signals.push({ name: `unfinished markers in description (${todoMatches.length})`, positive: false, weight: 0.6 });
+      signals.push({
+        name: `unfinished markers in description (${todoMatches.length})`,
+        positive: false,
+        weight: 0.6,
+      });
     }
 
     // --- Type-specific signals ---
 
@@ -100,7 +116,9 @@ export class QualityBackend implements EngineBackend {
       signals.push({ name: 'has expected/actual behavior', positive: true, weight: 1.2 });
     }
 
-    if (/\b(version|environment|os|platform|browser|node|python|java|rust|go)\s*[:\d]/i.test(body)) {
+    if (
+      /\b(version|environment|os|platform|browser|node|python|java|rust|go)\s*[:\d]/i.test(body)
+    ) {
       signals.push({ name: 'has environment details', positive: true, weight: 1 });
     }
 
@@ -140,7 +158,11 @@ export class QualityBackend implements EngineBackend {
     // Shared: references to other issues/PRs
     const refs = body.match(/#\d+/g);
     if (refs && refs.length > 0) {
-      signals.push({ name: `references ${refs.length} issue(s)/PR(s)`, positive: true, weight: 0.6 });
+      signals.push({
+        name: `references ${refs.length} issue(s)/PR(s)`,
+        positive: true,
+        weight: 0.6,
+      });
     }
 
     // Screenshots or images
 
@@ -169,7 +191,7 @@ export class QualityBackend implements EngineBackend {
     const totalWeight = positiveWeight + negativeWeight;
     const confidence = Math.min(
       1,
-      Math.abs(positiveWeight - negativeWeight) / Math.max(totalWeight, 1) * 0.5 + 0.2
+      (Math.abs(positiveWeight - negativeWeight) / Math.max(totalWeight, 1)) * 0.5 + 0.2
     );
 
     const reasoning = `Quality: ${signals.map((s) => `${s.positive ? '+' : '-'} ${s.name}`).join(', ')}.`;
```
```diff
@@ -16,8 +16,7 @@ export function isDryRun(): boolean {
   return octokit === null;
 }
 
-// --- Comment operations ---
+// Comment operations
 
 export async function postComment(
   owner: string,
   repo: string,
 
@@ -70,8 +69,7 @@ export async function updateComment(
   getLogger().info(`Updated comment ${commentId} on ${owner}/${repo}`);
 }
 
-// --- Data fetching for engine backends ---
+// Data fetching for engine backends
 
 export async function fetchCheckRuns(
   owner: string,
   repo: string,
 
@@ -146,8 +144,74 @@ export async function fetchPR(
   };
 }
 
-// --- Comment formatting ---
+export async function fetchIssue(
+  owner: string,
+  repo: string,
+  issueNumber: number
+): Promise<{
+  title: string;
+  body: string;
+  author: string;
+  labels: string[];
+} | null> {
+  if (!octokit) return null;
+
+  const { data } = await octokit.issues.get({ owner, repo, issue_number: issueNumber });
+  return {
+    title: data.title,
+    body: data.body || '',
+    author: data.user?.login || '',
+    labels: (data.labels || []).map((l) => (typeof l === 'string' ? l : l.name || '')),
+  };
+}
+
+export interface RecentComment {
+  id: number;
+  body: string;
+  author: string;
+  createdAt: string;
+  issueNumber: number;
+  isPullRequest: boolean;
+}
+
+export async function listRecentComments(
+  owner: string,
+  repo: string,
+  since: Date
+): Promise<RecentComment[]> {
+  if (!octokit) {
+    getLogger().debug('[dry-run] Cannot fetch comments without a token');
+    return [];
+  }
+
+  const sinceIso = since.toISOString();
+  const comments: RecentComment[] = [];
+
+  // Fetch recent issue comments
+  const issueComments = await octokit.paginate(octokit.issues.listCommentsForRepo, {
+    owner,
+    repo,
+    since: sinceIso,
+    per_page: 100,
+  });
+
+  for (const comment of issueComments) {
+    if (!comment.body) continue;
+
+    comments.push({
+      id: comment.id,
+      body: comment.body,
+      author: comment.user?.login || '',
+      createdAt: comment.created_at,
+      issueNumber: comment.issue_url ? parseInt(comment.issue_url.split('/').pop() || '0', 10) : 0,
+      isPullRequest: false, // we'll determine this by fetching the issue
+    });
+  }
+
+  return comments;
+}
+
+// Comment formatting
 function pickRandom(list: string[]): string {
   return list[Math.floor(Math.random() * list.length)];
 }
```
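`listRecentComments` above derives the issue number from each comment's `issue_url`, since the repo-level comments endpoint does not return the number directly. That parsing step, pulled out as a standalone helper (the `issueNumberFromUrl` name is illustrative):

```typescript
// The REST API returns issue comments with an issue_url like
// https://api.github.com/repos/OWNER/REPO/issues/42 — the number is the last path segment.
function issueNumberFromUrl(issueUrl: string | undefined): number {
  return issueUrl ? parseInt(issueUrl.split('/').pop() || '0', 10) : 0;
}

console.log(issueNumberFromUrl('https://api.github.com/repos/acme/trout/issues/42')); // 42
console.log(issueNumberFromUrl(undefined)); // 0
```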
```diff
--- a/src/index.ts
+++ b/src/index.ts
@@ -10,6 +10,7 @@ import {
 } from './github.js';
 import { createApp } from './server.js';
 import { createEngine } from './engine/index.js';
+import { startPolling } from './polling.js';
 import type { WebhookEvent } from './types.js';
 
 async function analyzeOne(target: string) {
 
@@ -95,7 +96,9 @@ function serve() {
   );
   }
   if (!process.env.WEBHOOK_SECRET) {
-    logger.warn('No WEBHOOK_SECRET - webhook signature verification is disabled');
+    logger.warn(
+      'No WEBHOOK_SECRET - webhook signature verification is disabled (not needed for polling-only mode)'
+    );
   }
 
   const app = createApp(config);
 
@@ -105,7 +108,7 @@ function serve() {
     .filter(([, v]) => v.enabled)
     .map(([k]) => k);
 
-  const server = app.listen(port, () => {
+  const server = app.listen(port, async () => {
     logger.info(`Troutbot listening on port ${port}`);
     logger.info(`Enabled backends: ${enabledBackends.join(', ')}`);
 
@@ -138,6 +141,9 @@ function serve() {
     logger.info(`Comment updates: ${config.response.allowUpdates ? 'enabled' : 'disabled'}`);
 
     logger.info(`Dashboard available at http://localhost:${port}/dashboard`);
 
+    // Start polling if enabled
+    await startPolling(config);
   });
 
 function shutdown(signal: string) {
```
```diff
--- /dev/null
+++ b/src/polling.ts
@@ -0,0 +1,223 @@
+import type { Config, WebhookEvent } from './types.js';
+import {
+  listRecentComments,
+  fetchPR,
+  fetchIssue,
+  hasExistingComment,
+  postComment,
+  updateComment,
+  formatComment,
+  type RecentComment,
+} from './github.js';
+import { createEngine } from './engine/index.js';
+import { getLogger } from './logger.js';
+import { recordEvent } from './events.js';
+
+interface ProcessedComment {
+  id: number;
+  timestamp: number;
+}
+
+const processedComments: Map<string, ProcessedComment> = new Map();
+const MAX_PROCESSED_CACHE = 1000;
+
+function getCacheKey(owner: string, repo: string, commentId: number): string {
+  return `${owner}/${repo}#${commentId}`;
+}
+
+function isProcessed(owner: string, repo: string, commentId: number): boolean {
+  return processedComments.has(getCacheKey(owner, repo, commentId));
+}
+
+function markProcessed(owner: string, repo: string, commentId: number): void {
+  const key = getCacheKey(owner, repo, commentId);
+  processedComments.set(key, { id: commentId, timestamp: Date.now() });
+
+  // Clean up old entries if cache is too large
+  if (processedComments.size > MAX_PROCESSED_CACHE) {
+    const entries = Array.from(processedComments.entries());
+    entries.sort((a, b) => a[1].timestamp - b[1].timestamp);
+    const toRemove = entries.slice(0, entries.length - MAX_PROCESSED_CACHE);
+    for (const [k] of toRemove) {
+      processedComments.delete(k);
+    }
+  }
+}
+
+function containsMention(body: string): boolean {
+  return body.includes('@troutbot');
+}
+
+async function analyzeAndComment(
+  event: WebhookEvent,
+  config: Config
+): Promise<Record<string, unknown>> {
+  const logger = getLogger();
+  const engine = createEngine(config.engine);
+
+  // Run analysis
+  const analysis = await engine.analyze(event);
+  logger.info(
+    `Analyzed ${event.owner}/${event.repo}#${event.number}: impact=${analysis.impact}, confidence=${analysis.confidence.toFixed(2)}`
+  );
+
+  // Check for existing comment
+  const { commentMarker, allowUpdates } = config.response;
+  const existing = await hasExistingComment(event.owner, event.repo, event.number, commentMarker);
+
+  if (existing.exists && !allowUpdates) {
+    logger.info(`Already commented on ${event.owner}/${event.repo}#${event.number}, skipping`);
+    const result = { skipped: true, reason: 'Already commented' };
+    recordEvent(event, result, analysis);
+    return result;
+  }
+
+  const body = formatComment(
+    config.response,
+    event.type,
+    analysis.impact,
+    analysis.confidence,
+    analysis.reasoning
+  );
+
+  if (existing.exists && allowUpdates && existing.commentId) {
+    logger.info(`Updating existing comment on ${event.owner}/${event.repo}#${event.number}`);
+    await updateComment(event.owner, event.repo, existing.commentId, body);
+  } else {
+    await postComment(event.owner, event.repo, event.number, body);
+  }
+
+  const result = { processed: true, impact: analysis.impact, confidence: analysis.confidence };
+  recordEvent(event, result, analysis);
+  return result;
+}
+
+async function processComment(
+  comment: RecentComment,
+  owner: string,
+  repo: string,
+  config: Config
+): Promise<void> {
+  const logger = getLogger();
+
+  if (!containsMention(comment.body)) {
+    return;
+  }
+
+  if (isProcessed(owner, repo, comment.id)) {
+    logger.debug(`Comment ${owner}/${repo}#${comment.id} already processed, skipping`);
+    return;
+  }
+
+  logger.info(`Found @troutbot mention in ${owner}/${repo}#${comment.issueNumber}`);
+
+  try {
+    // First, try to fetch as a PR to check if it's a pull request
+    const prData = await fetchPR(owner, repo, comment.issueNumber);
+
+    let event: WebhookEvent;
+
+    if (prData) {
+      // It's a pull request
+      event = {
+        action: 'on_demand',
+        type: 'pull_request',
+        number: comment.issueNumber,
+        title: prData.title,
+        body: prData.body,
+        owner,
+        repo,
+        author: prData.author,
+        labels: prData.labels,
+        branch: prData.branch,
+        sha: prData.sha,
+      };
+    } else {
+      // It's an issue
+      const issueData = await fetchIssue(owner, repo, comment.issueNumber);
+      if (!issueData) {
+        logger.warn(`Could not fetch issue ${owner}/${repo}#${comment.issueNumber}`);
+        return;
+      }
+
+      event = {
+        action: 'on_demand',
+        type: 'issue',
+        number: comment.issueNumber,
+        title: issueData.title,
+        body: issueData.body,
+        owner,
+        repo,
+        author: issueData.author,
+        labels: issueData.labels,
+      };
+    }
+
+    await analyzeAndComment(event, config);
+    markProcessed(owner, repo, comment.id);
+
+    logger.info(
+      `Successfully processed on-demand analysis for ${owner}/${repo}#${comment.issueNumber}`
+    );
+  } catch (err) {
+    logger.error(`Failed to process mention in ${owner}/${repo}#${comment.issueNumber}`, err);
+  }
+}
+
+async function pollRepository(
+  owner: string,
+  repo: string,
+  config: Config,
+  since: Date
+): Promise<void> {
+  const logger = getLogger();
+
+  try {
+    const comments = await listRecentComments(owner, repo, since);
+    logger.debug(`Fetched ${comments.length} recent comments from ${owner}/${repo}`);
+
+    for (const comment of comments) {
+      await processComment(comment, owner, repo, config);
+    }
+  } catch (err) {
+    logger.error(`Failed to poll ${owner}/${repo}`, err);
+  }
+}
+
+export async function startPolling(config: Config): Promise<void> {
+  const logger = getLogger();
+  const pollingConfig = config.polling;
+
+  if (!pollingConfig || !pollingConfig.enabled) {
+    logger.info('Polling is disabled');
+    return;
+  }
+
+  if (config.repositories.length === 0) {
+    logger.warn('Polling enabled but no repositories configured');
+    return;
+  }
+
+  const intervalMs = pollingConfig.intervalMinutes * 60 * 1000;
+  const lookbackMs = pollingConfig.lookbackMinutes * 60 * 1000;
+
+  logger.info(`Starting polling for ${config.repositories.length} repositories`);
+  logger.info(
+    `Poll interval: ${pollingConfig.intervalMinutes} minutes, lookback: ${pollingConfig.lookbackMinutes} minutes`
+  );
+
+  // Do an initial poll
+  const initialSince = new Date(Date.now() - lookbackMs);
+  for (const repo of config.repositories) {
+    await pollRepository(repo.owner, repo.repo, config, initialSince);
+  }
+
+  // Set up recurring polling
+  setInterval(async () => {
+    const since = new Date(Date.now() - lookbackMs);
+
+    for (const repo of config.repositories) {
+      await pollRepository(repo.owner, repo.repo, config, since);
+    }
+  }, intervalMs);
+}
```
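The `markProcessed` bookkeeping in `src/polling.ts` above evicts the oldest entries once the cache passes `MAX_PROCESSED_CACHE`, so the dedup map cannot grow without bound. The eviction step on its own, with a small cap for illustration (the `evictOldest` helper is a sketch of the cleanup branch, not an export of the file):

```typescript
interface ProcessedComment {
  id: number;
  timestamp: number;
}

// Drop the oldest entries until the map is back at the cap,
// mirroring the cleanup branch in markProcessed.
function evictOldest(cache: Map<string, ProcessedComment>, cap: number): void {
  if (cache.size <= cap) return;
  const entries = Array.from(cache.entries());
  entries.sort((a, b) => a[1].timestamp - b[1].timestamp);
  for (const [key] of entries.slice(0, entries.length - cap)) {
    cache.delete(key);
  }
}

const cache = new Map<string, ProcessedComment>([
  ['a/r#1', { id: 1, timestamp: 100 }],
  ['a/r#2', { id: 2, timestamp: 200 }],
  ['a/r#3', { id: 3, timestamp: 300 }],
]);
evictOldest(cache, 2);
console.log([...cache.keys()]); // ['a/r#2', 'a/r#3']
```

Because eviction is by timestamp rather than insertion order, a comment re-marked later survives longer than one processed once at startup.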
```diff
@@ -1,11 +1,12 @@
 import crypto from 'node:crypto';
 import express from 'express';
 import rateLimit from 'express-rate-limit';
-import type { Config, WebhookEvent, AnalysisResult } from './types.js';
+import type { Config, WebhookEvent } from './types.js';
 import { shouldProcess } from './filters.js';
 import { createEngine } from './engine/index.js';
 import {
   fetchPR,
+  fetchIssue,
   formatComment,
   hasExistingComment,
   postComment,
 
@@ -96,6 +97,21 @@ export function createApp(config: Config): express.Express {
       return;
     }
 
+    // Handle issue_comment with @troutbot mention - on-demand analysis
+    if (
+      eventType === 'issue_comment' &&
+      ['created', 'edited'].includes(payload.action as string)
+    ) {
+      const commentBody = (payload.comment as Record<string, unknown>).body as string;
+      if (commentBody && commentBody.includes('@troutbot')) {
+        const result = await handleOnDemandAnalysis(payload, config, engine);
+        res.json(result);
+        return;
+      }
+      res.json({ skipped: true, reason: 'Comment does not mention @troutbot' });
+      return;
+    }
+
     if (eventType !== 'issues' && eventType !== 'pull_request') {
       res.json({ skipped: true, reason: `Unhandled event: ${eventType}` });
       return;
 
@@ -241,6 +257,77 @@ async function handleCheckSuiteCompleted(
   }
 }
 
+async function handleOnDemandAnalysis(
+  payload: Record<string, unknown>,
+  config: Config,
+  engine: ReturnType<typeof createEngine>
+): Promise<Record<string, unknown>> {
+  const logger = getLogger();
+  const repo = payload.repository as Record<string, unknown>;
+  const owner = (repo.owner as Record<string, unknown>).login as string;
+  const repoName = repo.name as string;
+
+  const issue = payload.issue as Record<string, unknown>;
+  const issueNumber = issue.number as number;
+  const isPullRequest = issue.pull_request !== undefined;
+
+  logger.info(
+    `On-demand analysis triggered for ${owner}/${repoName}#${issueNumber} (${isPullRequest ? 'PR' : 'issue'})`
+  );
+
+  try {
+    let event: WebhookEvent;
+
+    if (isPullRequest) {
+      const prData = await fetchPR(owner, repoName, issueNumber);
+      if (!prData) {
+        logger.warn(`Could not fetch PR ${owner}/${repoName}#${issueNumber}`);
+        return { skipped: true, reason: 'Could not fetch PR data' };
+      }
+
+      event = {
+        action: 'on_demand',
+        type: 'pull_request',
+        number: issueNumber,
+        title: prData.title,
+        body: prData.body,
+        owner,
+        repo: repoName,
+        author: prData.author,
+        labels: prData.labels,
+        branch: prData.branch,
+        sha: prData.sha,
+      };
+    } else {
+      const issueData = await fetchIssue(owner, repoName, issueNumber);
+      if (!issueData) {
+        logger.warn(`Could not fetch issue ${owner}/${repoName}#${issueNumber}`);
+        return { skipped: true, reason: 'Could not fetch issue data' };
+      }
+
+      event = {
+        action: 'on_demand',
+        type: 'issue',
+        number: issueNumber,
+        title: issueData.title,
+        body: issueData.body,
+        owner,
+        repo: repoName,
+        author: issueData.author,
+        labels: issueData.labels,
+      };
+    }
+
+    return await analyzeAndComment(event, config, engine);
+  } catch (err) {
+    logger.error(
+      `Failed to process on-demand analysis for ${owner}/${repoName}#${issueNumber}`,
+      err
+    );
+    return { error: 'Internal server error' };
+  }
+}
+
 function parseEvent(eventType: string, payload: Record<string, unknown>): WebhookEvent | null {
   try {
     if (eventType === 'issues') {
```
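The guard added to `createApp` above only routes a webhook to on-demand analysis when the event type, action, and comment body all line up. That predicate, extracted as a pure function for clarity (the `shouldTriggerOnDemand` name is illustrative, not from the diff):

```typescript
// A comment triggers on-demand analysis only for issue_comment events
// with a created/edited action and an @troutbot mention in the body.
function shouldTriggerOnDemand(
  eventType: string,
  action: string,
  commentBody: string | undefined
): boolean {
  return (
    eventType === 'issue_comment' &&
    ['created', 'edited'].includes(action) &&
    !!commentBody &&
    commentBody.includes('@troutbot')
  );
}

console.log(shouldTriggerOnDemand('issue_comment', 'created', 'hey @troutbot, take a look')); // true
console.log(shouldTriggerOnDemand('issue_comment', 'deleted', '@troutbot')); // false
console.log(shouldTriggerOnDemand('issues', 'opened', '@troutbot')); // false
```

Matching on `edited` as well as `created` means editing a mention into an existing comment also triggers a run; the polling path dedupes repeats via its processed-comment cache, while the webhook path relies on the existing-comment check.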
```diff
@@ -5,6 +5,13 @@ export interface Config {
   engine: EngineConfig;
   response: ResponseConfig;
   logging: LoggingConfig;
+  polling?: PollingConfig;
+}
+
+export interface PollingConfig {
+  enabled: boolean;
+  intervalMinutes: number;
+  lookbackMinutes: number;
 }
 
 export interface ServerConfig {
```
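Given the `PollingConfig` shape above, a polling setup might look like the following sketch (the interface is re-declared locally so the snippet stands alone; the surrounding `Config` fields are elided):

```typescript
// Mirrors the PollingConfig interface added in the diff above.
interface PollingConfig {
  enabled: boolean;
  intervalMinutes: number;
  lookbackMinutes: number;
}

// A lookback longer than the interval avoids missing comments posted
// between polls; duplicates are filtered by the processed-comment cache.
const polling: PollingConfig = {
  enabled: true,
  intervalMinutes: 5,
  lookbackMinutes: 10,
};

const intervalMs = polling.intervalMinutes * 60 * 1000;
console.log(intervalMs); // 300000
```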