# Quick Start
import { Steps, Aside } from '@astrojs/starlight/components';
This guide takes you from zero to a linted script and to results stored in perf-results-db, the two most common entry points into the ecosystem.
## Prerequisites

- Docker and Docker Compose installed
- A JMeter (`.jmx`), k6 (`.js`), or Gatling (`.scala`) script to work with
- Node.js 18+ (for the perf-results-db CLI uploader)
## Option A — Lint a script

<Steps>

1. Install perf-lint

   ```sh
   pip install perf-lint-tool
   ```

2. Run the linter against your script

   ```sh
   # JMeter
   perf-lint check my-test.jmx

   # k6
   perf-lint check my-script.js

   # Gatling
   perf-lint check MySimulation.scala
   ```

3. Review the output

   ```
   my-test.jmx
     line 12 WARN  [JMX001] Thread group uses "Forever" loop — set a finite loop count
     line 34 ERROR [JMX010] HTTP sampler missing explicit timeout — default is infinite
     line 67 INFO  [JMX042] Consider using CSV Data Set Config for parameterisation

   3 problems (1 error, 1 warning, 1 info)
   ```

   Errors block CI pipelines. Warnings and info are advisory.

4. Fix and re-run

   ```sh
   perf-lint check my-test.jmx --fix
   ```

   The `--fix` flag auto-corrects rules that support it.

</Steps>
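Because errors fail CI, a pipeline sometimes needs to read the report programmatically rather than eyeball it. The Python sketch below tallies findings by severity; the line format is assumed from the sample report above and may differ between perf-lint versions.

```python
import re

# Matches report lines like:
#   line 34 ERROR [JMX010] HTTP sampler missing explicit timeout
# (format assumed from the sample output above)
LINE_RE = re.compile(r"line\s+(\d+)\s+(ERROR|WARN|INFO)\s+\[(\w+)\]\s+(.*)")

def count_problems(report: str) -> dict:
    """Tally ERROR/WARN/INFO findings in a perf-lint-style report."""
    counts = {"ERROR": 0, "WARN": 0, "INFO": 0}
    for line in report.splitlines():
        match = LINE_RE.search(line)
        if match:
            counts[match.group(2)] += 1
    return counts

sample = """my-test.jmx
  line 12 WARN  [JMX001] Thread group uses "Forever" loop
  line 34 ERROR [JMX010] HTTP sampler missing explicit timeout
  line 67 INFO  [JMX042] Consider using CSV Data Set Config
"""

print(count_problems(sample))  # {'ERROR': 1, 'WARN': 1, 'INFO': 1}
```

A CI step could then fail the job whenever the `ERROR` count is non-zero, mirroring perf-lint's own exit behaviour.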
## Option B — Store results and track trends

<Steps>

1. Start perf-results-db

   Download the Community release and start it:

   ```sh
   unzip perf-results-db.zip && cd perf-results-db
   cp .env.example .env
   docker compose up -d
   ```

   The dashboard is available at `http://localhost:4000`.

2. Create a project

   Open the dashboard → Projects → New Project. Copy the project UUID.

3. Create an API key

   Settings → API Keys → New Key. Copy the key — you won't see it again.

4. Run your test and export results

   ```sh
   # k6 — export JSON results
   k6 run --out json=results.json my-script.js

   # JMeter — export CSV (default listener output)
   jmeter -n -t my-test.jmx -l results.jtl

   # Gatling — results are in simulation.log by default
   ```

5. Upload results

   ```sh
   npx perf-results-db-cli upload \
     --url http://localhost:4000 \
     --api-key prdb_your_key_here \
     --project-id your-project-uuid \
     --file results.json
   ```

6. View trends

   Open the dashboard. Run your test a few more times — the trend graph will show response time and error rate over time.

</Steps>
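Before uploading, it can help to sanity-check an export locally. The sketch below summarises k6's JSON-lines output; the `Point`/`http_req_duration` record shape is an assumption based on k6's `--out json` writer, so adjust the field names if your export differs.

```python
import json
import math
import statistics

def summarise_k6(lines):
    """Summarise http_req_duration samples from k6 NDJSON output.

    Each input line is one JSON object; metric samples are assumed to
    look like {"type": "Point", "metric": "http_req_duration",
    "data": {"value": <ms>}}.
    """
    durations = [
        rec["data"]["value"]
        for rec in map(json.loads, lines)
        if rec.get("type") == "Point" and rec.get("metric") == "http_req_duration"
    ]
    ordered = sorted(durations)
    if not ordered:
        return {"count": 0, "mean_ms": None, "p95_ms": None}
    p95_idx = max(0, math.ceil(0.95 * len(ordered)) - 1)  # nearest-rank p95
    return {
        "count": len(ordered),
        "mean_ms": statistics.mean(ordered),
        "p95_ms": ordered[p95_idx],
    }

# Usage:
#   with open("results.json") as f:
#       print(summarise_k6(f))
```

If the mean or p95 here looks wildly off, fix the test before uploading — garbage runs pollute the trend graph.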
## Option C — Full pipeline in CI

See the CI/CD Integration guide for a complete GitHub Actions pipeline that:
- Generates test data with dummydatagenpro
- Lints the script with perf-lint
- Runs the test with the perf-containers Docker image
- Uploads results to perf-results-db
- Fails the build on statistical regression (perf-compare)
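The pass/fail decision in that last step boils down to comparing the new run against a baseline and failing when latency degrades beyond a tolerance. The sketch below is an illustrative threshold check only, not perf-compare's actual algorithm, which applies proper statistical tests:

```python
import statistics

def regressed(baseline_ms, current_ms, tolerance=0.10):
    """Return True if the current run's mean latency exceeds the
    baseline mean by more than `tolerance` (10% by default).

    Simplified: a real gate would test significance across many
    samples rather than compare two means.
    """
    base = statistics.mean(baseline_ms)
    cur = statistics.mean(current_ms)
    return cur > base * (1 + tolerance)

# A CI job would exit non-zero when regressed(...) is True.
```

A plain threshold like this is noisy on small sample counts, which is exactly why a dedicated tool is worth wiring into the pipeline.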
## Next Steps

- perf-ecosystem.yml — set up shared config once, all tools pick it up
- perf-lint — full rule reference and configuration
- perf-results-db — baselines, SLA tracking, and webhooks