
# perf-results-db

import { Aside, Steps } from '@astrojs/starlight/components';

perf-results-db is a self-hosted store for performance test results from JMeter, k6, and Gatling. It provides trend analysis, baseline comparison, SLA tracking, and a web dashboard.

| Tier         | Projects  | API Keys  | Retention | Price                     |
| ------------ | --------- | --------- | --------- | ------------------------- |
| Community    | 3         | 5         | 90 days   | Free                      |
| Professional | Unlimited | Unlimited | 365 days  | £149/year                 |
| Enterprise   | Unlimited | Unlimited | Custom    | £499/year (with SSO/RBAC) |
  • Docker and Docker Compose
  • 1 GB RAM minimum (2 GB recommended for production)
  • PostgreSQL (provided via Compose, or bring your own)
  1. Download

    martkos-it.co.uk/store/perf-results-db-download — requires Community license key (free registration).

  2. Extract and configure

     ```sh
     unzip perf-results-db.zip && cd perf-results-db
     cp .env.example .env
     ```

     Edit `.env`:

     ```sh
     ENVIRONMENT=production
     DB_USER=perf
     DB_PASSWORD=change-me-strong-password
     DB_NAME=perfresults
     DATABASE_URL=postgres://perf:change-me-strong-password@postgres:5432/perfresults
     # Authentication (recommended)
     AUTH_ENABLED=true
     # Data retention in days (Community max: 90)
     RETENTION_DAYS=90
     ```
  3. Start

     ```sh
     docker compose up -d
     docker compose ps  # wait for all services to be healthy
     ```
  4. Verify

     ```sh
     curl http://localhost:4000/api/health
     # → {"status":"ok","version":"1.0.0"}
     ```

     Dashboard: http://localhost:4000
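If you script the setup, the verify step can be automated by polling the health endpoint until it reports healthy. A minimal sketch: the endpoint and response shape match the curl above, while the function names and the 60-second timeout are illustrative choices.

```python
import json
import time
import urllib.request

def is_healthy(body: bytes) -> bool:
    """True when /api/health reports {"status": "ok"}."""
    return json.loads(body).get("status") == "ok"

def wait_healthy(base_url: str, timeout_s: float = 60.0) -> None:
    """Poll GET {base_url}/api/health until healthy, or raise on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/api/health", timeout=5) as r:
                if is_healthy(r.read()):
                    return
        except OSError:
            pass  # service not accepting connections yet
        time.sleep(2)
    raise TimeoutError("perf-results-db did not become healthy in time")
```

Useful in CI, where `docker compose up -d` returns before the API is ready to accept uploads.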

Create a project via the dashboard (Projects → New) or the API:

```sh
curl -X POST http://localhost:4000/api/projects \
  -H "X-API-Key: prdb_your_key" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-app", "description": "API load tests"}'
```

Returns `{"id": "uuid-here", ...}`. Store this UUID; it's needed for all uploads.
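The same call from Python, using only the standard library. The endpoint, headers, and response field match the curl above; the function names `create_project` and `extract_project_id` are illustrative.

```python
import json
import urllib.request

def extract_project_id(body: str) -> str:
    """Pull the UUID out of the create-project response body."""
    return json.loads(body)["id"]

def create_project(base_url: str, api_key: str, name: str, description: str = "") -> str:
    """POST /api/projects and return the new project's UUID."""
    req = urllib.request.Request(
        f"{base_url}/api/projects",
        data=json.dumps({"name": name, "description": description}).encode(),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_project_id(resp.read().decode())
```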

Create an API key under Settings → API Keys → New Key. Keys are prefixed `prdb_`. Store the key immediately; the full value is shown only once.

Upload results with the CLI:

```sh
# k6 results
k6 run --out json=results.json my-script.js
npx perf-results-db-cli upload \
  --url http://localhost:4000 \
  --api-key prdb_your_key \
  --project-id your-project-uuid \
  --file results.json \
  --tool k6

# JMeter results
jmeter -n -t my-test.jmx -l results.jtl
npx perf-results-db-cli upload \
  --url http://localhost:4000 \
  --api-key prdb_your_key \
  --project-id your-project-uuid \
  --file results.jtl \
  --tool jmeter

# Gatling results
npx perf-results-db-cli upload \
  --url http://localhost:4000 \
  --api-key prdb_your_key \
  --project-id your-project-uuid \
  --file target/gatling/MySimulation-*/simulation.log \
  --tool gatling
```
In GitHub Actions:

```yaml
- uses: markslilley/perf-results-db-action@v1
  with:
    url: ${{ secrets.PERF_RESULTS_DB_URL }}
    api-key: ${{ secrets.PERF_RESULTS_DB_API_KEY }}
    project-id: ${{ vars.PERF_RESULTS_DB_PROJECT_ID }}
    file: results.json
    tool: k6
```
In GitLab CI:

```yaml
upload-results:
  script:
    - npx perf-results-db-cli upload
      --url $PERF_RESULTS_DB_URL
      --api-key $PERF_RESULTS_DB_API_KEY
      --project-id $PERF_RESULTS_DB_PROJECT_ID
      --file results.json --tool k6
```
You can also POST results directly to the API:

```sh
curl -X POST http://localhost:4000/api/test-runs \
  -H "X-API-Key: prdb_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "projectId": "your-uuid",
    "tool": "k6",
    "metrics": {
      "p50": 142, "p95": 387, "p99": 612,
      "avg": 158, "min": 12, "max": 2341,
      "errorRate": 0.2,
      "throughput": 312.4
    },
    "metadata": {"branch": "main", "commit": "abc123"}
  }'
```
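If you aggregate raw latencies yourself before posting, the metrics object can be built along these lines. A sketch: the field names follow the payload above, but the percentile method (`statistics.quantiles`, inclusive) is this example's choice, not necessarily what the load tools use.

```python
import statistics

def build_metrics(latencies_ms: list[float], errors: int, duration_s: float) -> dict:
    """Aggregate per-request latencies into the test-run metrics payload."""
    qs = statistics.quantiles(latencies_ms, n=100, method="inclusive")
    total = len(latencies_ms)
    return {
        "p50": qs[49], "p95": qs[94], "p99": qs[98],
        "avg": statistics.fmean(latencies_ms),
        "min": min(latencies_ms), "max": max(latencies_ms),
        "errorRate": errors / total * 100,   # percent, as in the example payload
        "throughput": total / duration_s,    # requests per second
    }
```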
Trend endpoints:

```sh
# Check for regression vs last 10 runs (>10% degradation = regression)
GET /api/trends/project/{projectId}/regression?threshold=10

# Compare two date ranges
GET /api/trends/project/{projectId}/compare?currentStart=2026-03-01&currentEnd=2026-03-15&baselineStart=2026-02-01&baselineEnd=2026-02-15
```

For statistical regression detection, use perf-compare.

Tag a run as a baseline:

```sh
curl -X POST http://localhost:4000/api/test-runs/{runId}/baseline \
  -H "X-API-Key: prdb_your_key" \
  -d '{"name": "v2.1.0 release"}'
```

Compare a run against the baseline:

```sh
curl -X POST http://localhost:4000/api/baselines/{baselineId}/compare \
  -H "X-API-Key: prdb_your_key" \
  -d '{"runId": "run-uuid"}'
```

Returns delta and percentage difference for each metric.
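The comparison's arithmetic can be sketched as follows; the response field names `delta` and `percent` are illustrative, since the exact response shape is not shown on this page.

```python
def compare_metrics(baseline: dict, run: dict) -> dict:
    """Delta and percentage difference for each metric present in both runs."""
    result = {}
    for name in sorted(baseline.keys() & run.keys()):
        delta = run[name] - baseline[name]
        percent = delta / baseline[name] * 100 if baseline[name] else None
        result[name] = {"delta": delta, "percent": percent}
    return result
```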

Define SLA thresholds per project:

```sh
curl -X POST http://localhost:4000/api/projects/{projectId}/sla \
  -H "X-API-Key: prdb_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "p95MaxMs": 1500,
    "errorRateMaxPercent": 1.0,
    "apdexMinScore": 0.85
  }'
```

Check SLA compliance:

```sh
GET /api/sla/summary?projectId={uuid}&days=30
```
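Per-run compliance against those thresholds amounts to three comparisons. A sketch; that an `apdex` value is stored with each run is an assumption, as this page's example payload does not include one.

```python
def check_sla(metrics: dict, sla: dict) -> dict:
    """Pass/fail per SLA rule, keyed by the threshold names defined above."""
    return {
        "p95MaxMs": metrics["p95"] <= sla["p95MaxMs"],
        "errorRateMaxPercent": metrics["errorRate"] <= sla["errorRateMaxPercent"],
        "apdexMinScore": metrics["apdex"] >= sla["apdexMinScore"],
    }
```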

Trigger external systems when a test run completes:

```sh
curl -X POST http://localhost:4000/api/webhooks \
  -H "X-API-Key: prdb_your_key" \
  -d '{
    "url": "https://hooks.slack.com/...",
    "events": ["test_run_completed", "sla_breached"]
  }'
```
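On the receiving side, a handler might dispatch on the event name. A minimal sketch: only the two event names are documented above; the rest of the payload shape (`projectId`, `runId`) is assumed for illustration.

```python
import json

def handle_webhook(body: bytes) -> str:
    """Dispatch on the subscribed event names from the example above."""
    payload = json.loads(body)
    event = payload.get("event")
    if event == "sla_breached":
        return f"ALERT: SLA breached for project {payload.get('projectId', '?')}"
    if event == "test_run_completed":
        return f"Test run {payload.get('runId', '?')} completed"
    return f"Ignoring unknown event: {event}"
```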

Override the port or use an external database via `docker-compose.override.yml`:

```yaml
services:
  api:
    ports:
      - "8080:4000" # expose on a different host port
    environment:
      DATABASE_URL: postgres://user:pass@external-host:5432/perfresults
```
Back up and restore the database:

```sh
# Backup
docker compose exec postgres pg_dump -U $DB_USER $DB_NAME > backup-$(date +%Y%m%d).sql

# Restore
docker compose exec -T postgres psql -U $DB_USER $DB_NAME < backup-20260301.sql
```
To upgrade, pull the latest images and restart:

```sh
docker compose pull
docker compose up -d

# Migrations run automatically on startup
docker compose logs api | grep -i migrat
```
To integrate with perf-compare, point it at the store in its configuration:

```yaml
services:
  perf_results_db:
    url: "http://localhost:4000"
    api_key: "${PERF_RESULTS_DB_API_KEY}"
    project_id: "${PERF_RESULTS_DB_PROJECT_ID}"

integrations:
  push_to_results_db: true
  compare_on_upload: true # auto-run perf-compare after each upload
```

Full REST API: perf-results-db API reference.
OpenAPI/Swagger: http://localhost:4000/api/docs