
Quick Start

import { Steps, Aside } from '@astrojs/starlight/components';

This guide takes you from zero to a linted script and to test results stored in perf-results-db — the two most common entry points into the ecosystem.

Prerequisites

  • Docker and Docker Compose installed
  • A JMeter (.jmx), k6 (.js), or Gatling (.scala) script to work with
  • Node.js 18+ (for the perf-results-db CLI uploader)

Option A — Lint your script
  1. Install perf-lint

    ```sh
    pip install perf-lint-tool
    ```
  2. Run the linter against your script

    ```sh
    # JMeter
    perf-lint check my-test.jmx

    # k6
    perf-lint check my-script.js

    # Gatling
    perf-lint check MySimulation.scala
    ```
  3. Review the output

    my-test.jmx
    ```txt
    line 12 WARN  [JMX001] Thread group uses "Forever" loop — set a finite loop count
    line 34 ERROR [JMX010] HTTP sampler missing explicit timeout — default is infinite
    line 67 INFO  [JMX042] Consider using CSV Data Set Config for parameterisation

    3 problems (1 error, 1 warning, 1 info)
    ```

    Errors block CI pipelines. Warnings and info are advisory.
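    The error/warning distinction can be wired straight into CI by gating on the linter's exit status. A minimal sketch, assuming perf-lint follows the common convention of exiting non-zero when ERROR-level findings exist (check the linter's docs for its actual exit codes):

    ```sh
    #!/usr/bin/env sh
    # Gate a CI step on a lint command's exit status.
    # Assumption: perf-lint exits non-zero when ERROR-level findings exist.
    lint_gate() {
      "$@"                # run the lint command passed as arguments
      status=$?
      if [ "$status" -ne 0 ]; then
        echo "lint failed (exit $status): fix ERROR-level findings" >&2
        return 1
      fi
      echo "lint passed"
    }

    # In CI:
    #   lint_gate perf-lint check my-test.jmx
    ```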

  4. Fix and re-run

    ```sh
    perf-lint check my-test.jmx --fix
    ```

    The --fix flag auto-corrects rules that support it.


Option B — Store results and track trends
  1. Start perf-results-db

    Download the Community release and start it:

    ```sh
    unzip perf-results-db.zip && cd perf-results-db
    cp .env.example .env
    docker compose up -d
    ```

    The dashboard is available at http://localhost:4000.
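    `docker compose up -d` returns before the containers are fully ready, so scripted setups benefit from a short wait loop. A sketch, assuming the dashboard answers plain HTTP on its root URL once the stack is up (the exact readiness endpoint is an assumption):

    ```sh
    #!/usr/bin/env sh
    # Poll a URL until it responds, or give up after a number of tries.
    wait_for_url() {
      url=$1
      tries=${2:-30}     # default: roughly 30 seconds
      i=0
      while [ "$i" -lt "$tries" ]; do
        # -s: silent, -f: fail on HTTP errors, -o: discard the body
        if curl -sf -o /dev/null "$url"; then
          echo "up: $url"
          return 0
        fi
        i=$((i + 1))
        sleep 1
      done
      echo "timed out waiting for $url" >&2
      return 1
    }

    # wait_for_url http://localhost:4000
    ```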

  2. Create a project

    Open the dashboard → Projects → New Project. Copy the project UUID.

  3. Create an API key

    Settings → API Keys → New Key. Copy the key — you won’t see it again.
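    Because the key is shown only once, it usually goes straight into a CI secret or an environment variable rather than into a script. A sketch using the POSIX `${var:?}` check; `PERF_RESULTS_DB_API_KEY` is a name chosen for this example, not one the CLI reads automatically:

    ```sh
    #!/usr/bin/env sh
    # Fail fast if the key is missing from the environment.
    # PERF_RESULTS_DB_API_KEY is an example name; pass it to the CLI explicitly.
    require_api_key() {
      : "${PERF_RESULTS_DB_API_KEY:?export PERF_RESULTS_DB_API_KEY before uploading}"
    }

    # require_api_key
    # npx perf-results-db-cli upload --api-key "$PERF_RESULTS_DB_API_KEY" ...
    ```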

  4. Run your test and export results

    ```sh
    # k6 — export JSON results
    k6 run --out json=results.json my-script.js

    # JMeter — export CSV (default listener output)
    jmeter -n -t my-test.jmx -l results.jtl

    # Gatling — results are in simulation.log by default
    ```
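    Before uploading, it's worth checking that the export actually contains samples. For k6, `--out json` writes one JSON object per line, and data samples carry `"type":"Point"` (if that format changes, adjust the pattern). A sketch:

    ```sh
    #!/usr/bin/env sh
    # Sanity-check a k6 JSON export: count the data-sample lines.
    check_export() {
      file=$1
      # grep -c prints the number of matching lines (0 when none match)
      points=$(grep -c '"type":"Point"' "$file" || true)
      if [ "$points" -eq 0 ]; then
        echo "no samples in $file, did the test run?" >&2
        return 1
      fi
      echo "$points data points in $file"
    }

    # check_export results.json
    ```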
  5. Upload results

    ```sh
    npx perf-results-db-cli upload \
      --url http://localhost:4000 \
      --api-key prdb_your_key_here \
      --project-id your-project-uuid \
      --file results.json
    ```
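    Uploads from CI can fail transiently (the stack restarting, a network blip), so wrapping the CLI in a small retry helps. The retry logic below is generic shell; the upload command is the one from this step:

    ```sh
    #!/usr/bin/env sh
    # Retry a command up to N times, sleeping between attempts.
    retry() {
      attempts=$1
      delay=$2
      shift 2
      n=1
      while true; do
        if "$@"; then
          return 0
        fi
        if [ "$n" -ge "$attempts" ]; then
          echo "giving up after $attempts attempts: $*" >&2
          return 1
        fi
        n=$((n + 1))
        sleep "$delay"
      done
    }

    # retry 3 5 npx perf-results-db-cli upload \
    #   --url http://localhost:4000 \
    #   --api-key prdb_your_key_here \
    #   --project-id your-project-uuid \
    #   --file results.json
    ```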
  6. View trends

    Open the dashboard. Run your test a few more times — the trend graph will show response time and error rate over time.
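    Seeding the trend graph is easier with a scripted run-and-upload cycle. A sketch with a small repeat helper; the commands passed to it are the ones from steps 4 and 5:

    ```sh
    #!/usr/bin/env sh
    # Run a command N times, stopping on the first failure.
    repeat_runs() {
      n=$1
      shift
      i=0
      while [ "$i" -lt "$n" ]; do
        "$@" || return 1
        i=$((i + 1))
      done
    }

    # Three runs to give the trend graph some points:
    #   repeat_runs 3 sh -c '
    #     k6 run --out json=results.json my-script.js &&
    #     npx perf-results-db-cli upload \
    #       --url http://localhost:4000 \
    #       --api-key prdb_your_key_here \
    #       --project-id your-project-uuid \
    #       --file results.json
    #   '
    ```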


See the CI/CD Integration guide for a complete GitHub Actions pipeline that:

  1. Generates test data with dummydatagenpro
  2. Lints the script with perf-lint
  3. Runs the test with the perf-containers Docker image
  4. Uploads results to perf-results-db
  5. Fails the build on statistical regression (perf-compare)
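As a rough sketch, those five stages have the shape of the script below. The `dummydatagenpro`, `perf-containers`, and `perf-compare` invocations are placeholders (their real flags are in the CI/CD Integration guide); only the `perf-lint` and uploader calls mirror the commands shown earlier:

```sh
#!/usr/bin/env sh
# Placeholder sketch of the five pipeline stages; see the CI/CD
# Integration guide for the real dummydatagenpro / perf-compare flags.
pipeline() {
  set -e
  dummydatagenpro generate > testdata.csv            # 1. test data (placeholder)
  perf-lint check my-test.jmx                        # 2. lint; errors fail the build
  docker run --rm -v "$PWD:/work" perf-containers \
    -n -t /work/my-test.jmx -l /work/results.jtl     # 3. run (image and flags are placeholders)
  npx perf-results-db-cli upload \
    --url "$PRDB_URL" --api-key "$PRDB_API_KEY" \
    --project-id "$PRDB_PROJECT" --file results.jtl  # 4. upload
  perf-compare --baseline main                       # 5. regression gate (placeholder)
}

# pipeline
```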