
perf-ci-pipelines

perf-ci-pipelines provides ready-made CI/CD pipeline templates for running performance tests in GitHub Actions, GitLab CI, Jenkins, and Azure DevOps. Templates cover single-tool runs through to multi-tool orchestration with quality gates.

| Tier | Templates | Price |
| --- | --- | --- |
| Community | Basic single-tool templates (1 tool) | Free |
| Paid | Multi-tool + enterprise templates + priority support | £500/year |
.github/workflows/perf.yml
name: Performance Test
on:
  push:
    branches: [main]
  schedule:
    - cron: '0 2 * * *' # nightly
jobs:
  perf-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Lint scripts
        uses: markslilley/perf-lint-action@v1
        with:
          path: tests/performance/
          tool: k6
      - name: Run k6
        uses: grafana/k6-action@v0.3
        with:
          filename: tests/performance/load-test.js
        env:
          TARGET_HOST: ${{ vars.TARGET_HOST }}
          K6_OUT: json=results.json
      - name: Upload results
        uses: markslilley/perf-results-db-action@v1
        with:
          url: ${{ vars.PERF_RESULTS_DB_URL }}
          api-key: ${{ secrets.PERF_RESULTS_DB_API_KEY }}
          project-id: ${{ vars.PERF_RESULTS_DB_PROJECT_ID }}
          file: results.json
          tool: k6
      - name: Check for regressions
        run: >
          npx @martkos-it/perf-compare
          --url ${{ vars.PERF_RESULTS_DB_URL }}
          --project ${{ vars.PERF_RESULTS_DB_PROJECT_ID }}
          --method statistical
          --baseline 10 --current 3
        env:
          PERF_RESULTS_DB_API_KEY: ${{ secrets.PERF_RESULTS_DB_API_KEY }}
          PERF_COMPARE_LICENSE_KEY: ${{ secrets.PERF_COMPARE_LICENSE_KEY }}
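The `--method statistical` flag compares the latest 3 runs against a 10-run baseline window. perf-compare's exact algorithm isn't documented on this page; as a rough illustration only, a gate that flags the current window when its mean exceeds the baseline mean by a standard-deviation margin (function name and `sigma` threshold are hypothetical) could look like:

```python
from statistics import mean, stdev

def is_regression(samples, baseline_n=10, current_n=3, sigma=2.0):
    """Flag a regression when the mean of the newest `current_n`
    response-time samples exceeds the baseline mean by more than
    `sigma` baseline standard deviations.

    `samples` is ordered oldest-first, one value per historical run.
    """
    baseline = samples[-(baseline_n + current_n):-current_n]
    current = samples[-current_n:]
    threshold = mean(baseline) + sigma * stdev(baseline)
    return mean(current) > threshold
```

A stable baseline around 200 ms with three new runs around 400 ms trips the gate; three new runs around 200 ms does not.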

Multi-stage pipeline (smoke → load → stress)

jobs:
  smoke:
    runs-on: ubuntu-latest
    outputs:
      passed: ${{ steps.smoke.outcome == 'success' }}
    steps:
      - uses: actions/checkout@v4
      - name: Smoke test
        id: smoke
        run: |
          docker run --rm \
            -e TARGET_HOST \
            -v ${{ github.workspace }}/tests:/tests \
            ghcr.io/markslilley/perf-k6:latest \
            k6 run /tests/smoke.js -e VUS=5 -e DURATION=30s
        env:
          TARGET_HOST: ${{ vars.TARGET_HOST }}
  load:
    needs: smoke
    if: needs.smoke.outputs.passed == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Load test
        run: |
          docker run --rm \
            -e TARGET_HOST \
            -v ${{ github.workspace }}/tests:/tests \
            ghcr.io/markslilley/perf-k6:latest \
            k6 run /tests/load.js \
              -e VUS=${{ vars.LOAD_VUS }} \
              -e DURATION=${{ vars.LOAD_DURATION }}
        env:
          TARGET_HOST: ${{ vars.TARGET_HOST }}
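The fail-fast gating above (each stage runs only if the previous one succeeded) is easy to reproduce locally when debugging a pipeline. A minimal sketch using plain subprocess calls, with the stage commands left as placeholders for your own k6 invocations:

```python
import subprocess

def run_stages(stages):
    """Run (name, argv) stages in order, stopping at the first failure,
    mirroring the needs/if gating in the multi-stage workflow.

    Returns the names of the stages that completed successfully.
    """
    completed = []
    for name, argv in stages:
        result = subprocess.run(argv)
        if result.returncode != 0:
            break  # downstream stages are skipped, as in CI
        completed.append(name)
    return completed
```

For example, if the load stage fails, only the smoke stage is recorded as passed and the stress stage never runs.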
.gitlab-ci.yml
stages:
  - lint
  - test
  - analyse

perf-lint:
  stage: lint
  image: python:3.12-alpine
  script:
    - pip install perf-lint-tool
    - perf-lint check tests/performance/*.jmx

jmeter-load:
  stage: test
  image: ghcr.io/markslilley/perf-jmeter:latest
  script:
    - >
      jmeter -n
      -t tests/performance/load-test.jmx
      -l results.jtl
      -JTARGET_HOST=${TARGET_HOST}
      -JVUS=${VUS:-100}
  artifacts:
    paths:
      - results.jtl
    expire_in: 7 days

upload-results:
  stage: analyse
  image: node:20-alpine
  script:
    - >
      npx perf-results-db-cli upload
      --url ${PERF_RESULTS_DB_URL}
      --api-key ${PERF_RESULTS_DB_API_KEY}
      --project-id ${PERF_RESULTS_DB_PROJECT_ID}
      --file results.jtl --tool jmeter
  dependencies:
    - jmeter-load
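The `results.jtl` artifact is JMeter's CSV results log. If you want a quick sanity check before (or instead of) uploading, the response-time percentiles can be pulled straight from the file; a minimal sketch, assuming the default CSV format with a header row and an `elapsed` column in milliseconds:

```python
import csv
import math

def p95_elapsed(jtl_path):
    """Return the 95th-percentile response time (ms) from a JMeter
    CSV results log, using the nearest-rank percentile definition."""
    with open(jtl_path, newline="") as fh:
        elapsed = sorted(int(row["elapsed"]) for row in csv.DictReader(fh))
    if not elapsed:
        raise ValueError("no samples in " + jtl_path)
    # nearest-rank: the smallest value covering at least 95% of samples
    rank = math.ceil(0.95 * len(elapsed))
    return elapsed[rank - 1]
```

This reads the whole file into memory, which is fine for typical CI-sized result logs; very long soak tests would want a streaming approach.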
Jenkinsfile
pipeline {
    agent any
    environment {
        TARGET_HOST = credentials('target-host')
        PERF_RESULTS_DB_API_KEY = credentials('perf-results-db-api-key')
    }
    stages {
        stage('Lint') {
            steps {
                sh 'pip install perf-lint-tool && perf-lint check tests/performance/load-test.jmx'
            }
        }
        stage('Load Test') {
            agent {
                docker {
                    image 'ghcr.io/markslilley/perf-jmeter:latest'
                    args '-v ${WORKSPACE}/tests:/tests'
                }
            }
            steps {
                sh '''
                    jmeter -n \
                        -t /tests/load-test.jmx \
                        -l /tests/results.jtl \
                        -JTARGET_HOST=${TARGET_HOST}
                '''
            }
        }
        stage('Upload Results') {
            steps {
                sh '''
                    npx perf-results-db-cli upload \
                        --url ${PERF_RESULTS_DB_URL} \
                        --api-key ${PERF_RESULTS_DB_API_KEY} \
                        --project-id ${PERF_RESULTS_DB_PROJECT_ID} \
                        --file tests/results.jtl --tool jmeter
                '''
            }
        }
    }
}
azure-pipelines.yml
trigger:
  - main
pool:
  vmImage: ubuntu-latest
steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '20.x'
  - script: pip install perf-lint-tool && perf-lint check tests/performance/
    displayName: 'Lint performance scripts'
  - script: |
      docker run --rm \
        -v $(Build.SourcesDirectory)/tests:/tests \
        ghcr.io/markslilley/perf-k6:latest \
        k6 run /tests/load-test.js \
        -e TARGET_HOST=$(TARGET_HOST) \
        --out json=/tests/results.json  # write into the mounted volume so the upload step can read it
    displayName: 'Run k6 load test'
  - script: |
      npx perf-results-db-cli upload \
        --url $(PERF_RESULTS_DB_URL) \
        --api-key $(PERF_RESULTS_DB_API_KEY) \
        --project-id $(PERF_RESULTS_DB_PROJECT_ID) \
        --file tests/results.json --tool k6
    displayName: 'Upload results'
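k6's `--out json` output is newline-delimited JSON: each `Point` record carries one sample of one metric. If you want to inspect the raw samples before uploading, a rough sketch of collecting `http_req_duration` values (record shape assumed from k6's documented JSON output format):

```python
import json

def http_req_durations(path):
    """Collect http_req_duration samples (ms) from a k6
    `--out json=...` results file (one JSON object per line)."""
    values = []
    with open(path) as fh:
        for line in fh:
            if not line.strip():
                continue
            record = json.loads(line)
            if record.get("type") == "Point" and record.get("metric") == "http_req_duration":
                values.append(record["data"]["value"])
    return values
```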

Set these in your CI/CD secrets/variables store:

| Variable | Used By |
| --- | --- |
| TARGET_HOST | Test scripts (target URL) |
| PERF_RESULTS_DB_URL | perf-results-db-cli, perf-compare |
| PERF_RESULTS_DB_API_KEY | perf-results-db-cli, perf-compare |
| PERF_RESULTS_DB_PROJECT_ID | perf-results-db-cli, perf-compare |
| PERF_COMPARE_LICENSE_KEY | perf-compare (Pro) |
| PERF_LINT_API_KEY | perf-lint (Pro/Team API sync) |
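A missing variable tends to surface late as a confusing mid-pipeline failure. A small preflight check that fails fast instead (variable names taken from the table above; the script itself is a sketch, not part of the templates) can be run as the first pipeline step:

```python
import os

# Variables every template needs; the Pro-only keys are checked
# by the tools themselves, so they are not listed here.
REQUIRED = [
    "TARGET_HOST",
    "PERF_RESULTS_DB_URL",
    "PERF_RESULTS_DB_API_KEY",
    "PERF_RESULTS_DB_PROJECT_ID",
]

def missing_variables(env=os.environ):
    """Return the names of required CI variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_variables()
    if missing:
        raise SystemExit("missing CI variables: " + ", ".join(missing))
```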