Introduction
API performance testing is critical for ensuring reliability under load. Traditionally, engineers run JMeter locally, interpret results manually, and only test periodically. But in a DevOps world, performance testing should be continuous, automated, and part of your CI/CD pipeline.
In this post, I’ll share how I built a framework that:
- Runs JMeter tests inside Azure DevOps pipelines
- Supports progressive load testing (incrementally increasing users)
- Performs automatic SLA validation on latency, response time, and throughput
- Publishes JUnit XML & HTML reports directly into the pipeline
  
  
Architecture
PerformanceTestFramework/
├── JMeter/
│   └── {Module}/
│       ├── testplan/    # JMeter test plans (.jmx)
│       └── SLA/         # SLA configs (.json)
├── Pipelines/
│   └── loadtest.yaml    # Azure DevOps pipeline config
└── scripts/             # PowerShell automation scripts
Tech Stack
- Apache JMeter (load testing engine)
- Azure DevOps Pipelines (orchestration & reporting)
- PowerShell (setup & execution)
- Python (JTL → JUnit conversion)
Pipeline Configuration
The pipeline is parameterized, making it easy to select test plans, SLA files, and environments at runtime.
parameters:
  - name: MAX_THREADS
    type: number
    default: 10
  - name: THREAD_START
    type: number
    default: 5
  - name: THREAD_STEP
    type: number
    default: 5
  - name: RAMPUP
    type: number
    default: 1
  - name: TEST_PLAN
    type: string
    values:
      - 'JMeter/HomePage/testplan/HomePageFeatures.jmx'
      - 'JMeter/DataExploration/testplan/DataExplorationAssetsMe.jmx'
  - name: SLA_FILE
    type: string
    values:
      - 'JMeter/HomePage/SLA/sla_HomePage.json'
      - 'JMeter/DataExploration/SLA/sla_DataExploration.json'
This way, testers can run different APIs under different loads without editing code.
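For these parameters to take effect at run time, they typically have to reach JMeter as properties. Here is a minimal sketch of how the Thread Group fields in the .jmx could read them with JMeter’s __P() function (the property names THREADS and RAMPUP are illustrative assumptions):

Number of Threads (users): ${__P(THREADS,5)}
Ramp-up Period (seconds):  ${__P(RAMPUP,1)}

The pipeline then passes the selected values through with JMeter’s -J flags, as shown in the loop sketch in the next section.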
Progressive Load Testing
The test scales load gradually:
- Start with THREAD_START users
- Increase by THREAD_STEP until MAX_THREADS is reached
- Use RAMPUP for smooth scaling
Example:
THREAD_START = 5
THREAD_STEP  = 5
MAX_THREADS  = 20
Runs 4 iterations: 5 → 10 → 15 → 20 users
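Here is a minimal Python sketch of that stepping loop (property names, paths, and the hard-coded values are illustrative assumptions; in this framework the equivalent logic lives in 5_run_jmeter_tests.ps1):

import os
import subprocess

# Assumed values; in the pipeline these come from the parameters above.
THREAD_START, THREAD_STEP, MAX_THREADS, RAMPUP = 5, 5, 20, 1
TEST_PLAN = "JMeter/HomePage/testplan/HomePageFeatures.jmx"

os.makedirs("results", exist_ok=True)  # ensure the output directory exists

for threads in range(THREAD_START, MAX_THREADS + 1, THREAD_STEP):
    result_file = f"results/run_{threads}users.jtl"
    # -n: non-GUI mode, -t: test plan, -J: set a JMeter property, -l: result log
    subprocess.run(
        ["jmeter", "-n", "-t", TEST_PLAN,
         f"-JTHREADS={threads}", f"-JRAMPUP={RAMPUP}",
         "-l", result_file],
        check=True,  # abort the run if JMeter exits non-zero
    )
    print(f"Completed iteration with {threads} users -> {result_file}")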
SLA Validation
Each test has an SLA JSON file, e.g.:
{
  "response_time_ms": 2000,
  "latency_ms": 1500,
  "throughput_rps": 50,
  "violation_threshold_pct": 30
}
The pipeline validates:
- Response Time ≤ response_time_ms
- Latency ≤ latency_ms
- Throughput ≥ throughput_rps
- SLA health classification → Excellent / Moderate / Poor
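A minimal Python sketch of that validation (the classification bands, and reading violation_threshold_pct as the allowed share of violating samples, are my assumptions rather than the framework’s exact rules):

# SLA values from the JSON example above.
sla = {"response_time_ms": 2000, "latency_ms": 1500,
       "throughput_rps": 50, "violation_threshold_pct": 30}

def check_sla(samples, duration_s, sla):
    """samples: per-request dicts with 'elapsed_ms' and 'latency_ms'."""
    violations = sum(
        1 for s in samples
        if s["elapsed_ms"] > sla["response_time_ms"]
        or s["latency_ms"] > sla["latency_ms"]
    )
    violation_pct = 100.0 * violations / max(len(samples), 1)
    throughput_ok = len(samples) / duration_s >= sla["throughput_rps"]

    # Assumed bands: comfortably inside the threshold is Excellent,
    # inside it is Moderate, beyond violation_threshold_pct is Poor.
    if not throughput_ok or violation_pct > sla["violation_threshold_pct"]:
        return "Poor", violation_pct
    if violation_pct > sla["violation_threshold_pct"] / 2:
        return "Moderate", violation_pct
    return "Excellent", violation_pct

samples = [{"elapsed_ms": 1800, "latency_ms": 1200},
           {"elapsed_ms": 2600, "latency_ms": 1900}]
print(check_sla(samples, duration_s=1, sla=sla))  # -> ('Poor', 50.0)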
JTL → JUnit Conversion
JMeter produces .jtl result files, which aren’t CI/CD friendly. We use a Python script to convert JTL into JUnit XML so Azure DevOps can show pass/fail status in the Tests tab.
Key snippet from jtl_to_junit.py:
if elapsed > sla_response_time:
    message += f"Response time {elapsed}ms exceeded SLA."
if latency > sla_latency:
    message += f"Latency {latency}ms exceeded SLA."
Generates JUnit results per request + SLA health checks
Failures appear just like unit test failures
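For context, here is a condensed sketch of what such a converter can look like (column names follow JMeter’s default CSV .jtl schema; the file paths and overall structure are illustrative, not the exact jtl_to_junit.py):

import csv
import xml.etree.ElementTree as ET

SLA_RESPONSE_MS, SLA_LATENCY_MS = 2000, 1500  # would come from the SLA JSON

suite = ET.Element("testsuite", name="JMeter SLA checks")
with open("results/run_10users.jtl", newline="") as f:
    # Default JTL CSV columns include: label, elapsed, Latency, success, responseMessage
    for row in csv.DictReader(f):
        elapsed, latency = int(row["elapsed"]), int(row["Latency"])
        case = ET.SubElement(suite, "testcase",
                             name=row["label"], time=str(elapsed / 1000))
        message = ""
        if row["success"] != "true":
            message += f"Request failed: {row.get('responseMessage', '')}. "
        if elapsed > SLA_RESPONSE_MS:
            message += f"Response time {elapsed}ms exceeded SLA. "
        if latency > SLA_LATENCY_MS:
            message += f"Latency {latency}ms exceeded SLA. "
        if message:
            ET.SubElement(case, "failure", message=message.strip())

ET.ElementTree(suite).write("results/junit_results.xml", encoding="utf-8")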
Automation with PowerShell
PowerShell scripts handle setup & execution:
1_install_jdk.ps1 → Install OpenJDK
2_install_jmeter.ps1 → Install Apache JMeter
3_clean_results.ps1 → Clean artifacts
4_install_python.ps1 → Ensure Python is available
5_run_jmeter_tests.ps1 → Run JMeter, collect results, call the Python converter
This keeps the pipeline clean and modular.
  
  
Reporting
JUnit Results → Published to the pipeline Tests tab
HTML Reports → JMeter’s native HTML report uploaded as artifacts
Raw JTL Files → Saved for debugging
Example step that publishes the HTML report as a pipeline artifact:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(RESULTS_DIR)\html_reports'
    artifact: jmeter-html-reports
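And the JUnit side, publishing the converter’s output to the Tests tab (the file pattern assumes the converter writes its XML into $(RESULTS_DIR); adjust to your layout):

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '$(RESULTS_DIR)\*.xml'
    testRunTitle: 'JMeter SLA Validation'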
  
  
Lessons Learned
- Make SLA validation automatic: no more manual log parsing
- Tokens & correlation IDs must be refreshed before runs
- Always store artifacts (JTL + JUnit + HTML) for traceability
- Progressive load testing exposes degradation early
  
  
Conclusion
With this setup, API performance testing became:
1. Repeatable: Any tester can trigger tests with a few clicks
2. Automated: Runs in CI/CD, no manual effort
3. Actionable: Failures appear directly in pipeline results
4. Scalable: Easy to add new APIs & SLAs
This framework turns performance testing from a one-time activity into a continuous quality gate for APIs.
Have you tried integrating performance testing into CI/CD pipelines? I’d love to hear how you approached SLA validation and reporting!
