Introduction
In this article, I'd like to share a template for performance testing with k6.
When you first introduce performance testing and start writing scripts, you may struggle with how to implement test scenarios with the tool and what the best practices are:
- How to manage and structure each scenario set in the repository
- How to easily switch the environment that receives the load
- How to pass each scenario's configuration easily
- How to keep the code format consistent across the repository
Having run performance tests myself, I'd like to share in this article a practical way to implement the scripts, along with a template.
Intended reader
- People who plan to conduct performance testing or load testing
- People who want a ready-made template for implementing k6 scripts easily
Steps for quick start
This template has the following prerequisites:
- The scripts run in a local environment or on an EC2 instance (not k6 Cloud)
- Go and Homebrew are already set up
The template of k6 can be downloaded from the following GitHub repository.
https://github.com/gonkunkun/k6-template
Now, the first step is to download the template and execute k6 with a smoke-test scenario.
# Clone the repository
git clone https://github.com/gonkunkun/k6-template
cd ./k6-template

# Installation
npm install

# Install xk6
go install go.k6.io/xk6/cmd/xk6@latest
xk6 build \
  --with github.com/LeonAdato/xk6-output-statsd@latest \
  --with github.com/grafana/xk6-dashboard@latest \
  --with github.com/szkiba/xk6-enhanced@latest \
  --with github.com/szkiba/xk6-dotenv@latest

# Set environment variables
cp .env.sample .env
# Install Redis
brew install redis

## Start Redis (if Homebrew is installed under /usr/local)
redis-server /usr/local/etc/redis.conf

## Start Redis (if Homebrew is installed under /opt/homebrew)
redis-server /opt/homebrew/etc/redis.conf
Once the above steps are done, the environment setup is complete.
Let's execute the smoke-test scenario.
# bundle
npm run bundle
# Execute smoke test scenario set.
./k6 run ./dist/loadTest.js --config ./src/sample-product/configs/smoke.json -e ENV=local
The output of the run command will look like the following.
~/D/g/template-of-k6 ❯❯❯ ./k6 run ./dist/loadTest.js --config ./src/sample-product/configs/smoke.json -e ENV=local

          /\      |‾‾| /‾‾/   /‾‾/
     /\  /  \     |  |/  /   /  /
    /  \/    \    |     (   /   ‾‾\
   /          \   |  |\  \ |  (‾)  |
  / __________ \  |__| \__\ \_____/ .io
execution: local
script: ./dist/loadTest.js
output: -
scenarios: (100.00%) 2 scenarios, 2 max VUs, 1m30s max duration (incl. graceful stop):
* sampleScenario1: 1 iterations for each of 1 VUs (maxDuration: 1m0s, exec: sampleScenario1, gracefulStop: 30s)
* sampleScenario2: 1 iterations for each of 1 VUs (maxDuration: 1m0s, exec: sampleScenario2, gracefulStop: 30s)
INFO[0001] 0se: == setup() BEGIN =========================================================== source=console
INFO[0001] 0se: Start of test: 2024-03-20 17:54:57 source=console
INFO[0001] 0se: Test environment: local source=console
INFO[0001] 0se: == Check scenario configurations ====================================================== source=console
INFO[0001] 0se: Scenario: sampleScenario1() source=console
INFO[0001] 0se: Scenario: sampleScenario2() source=console
INFO[0001] 0se: == Check scenario configurations FINISHED =============================================== source=console
INFO[0001] 0se: == Initialize Redis ====================================================== source=console
INFO[0001] 0se: == setup() END =========================================================== source=console
INFO[0003] 2se: Scenario sampleScenario2 is initialized. Lens is 10000 source=console
INFO[0003] 2se: Scenario sampleScenario1 is initialized. Lens is 10000 source=console
INFO[0003] 2se: == Initialize Redis FNISHED =============================================== source=console
INFO[0003] 2se: sampleScenario1() start ID: 2, vu iterations: 1, total iterations: 0 source=console
INFO[0003] 2se: sampleScenario2() start ID: 2, vu iterations: 1, total iterations: 0 source=console
INFO[0004] 3se: sampleScenario1() end ID: 2, vu iterations: 1, total iterations: 0 source=console
INFO[0004] 3se: sampleScenario2() end ID: 2, vu iterations: 1, total iterations: 0 source=console
INFO[0004] 0se: == All scenarios FINISHED =========================================================== source=console
INFO[0004] 0se: == Teardown() STARTED =========================================================== source=console
INFO[0004] 0se: == Initialize Redis ====================================================== source=console
INFO[0004] 0se: == Teardown() FINISHED =========================================================== source=console
INFO[0004] 0se: == Initialize Redis FINISHED =============================================== source=console
█ setup
█ sampleScenario1
✓ Status is 200
█ sampleScenario2
✓ Status is 200
█ teardown
checks.........................: 100.00% ✓ 2 ✗ 0
data_received..................: 152 kB 43 kB/s
data_sent......................: 939 kB 268 kB/s
group_duration.................: avg=1.26s min=1.23s med=1.26s max=1.28s p(90)=1.28s p(95)=1.28s
http_req_blocked...............: avg=507.05ms min=505.29ms med=507.05ms max=508.82ms p(90)=508.46ms p(95)=508.64ms
http_req_connecting............: avg=233.66ms min=233.63ms med=233.66ms max=233.69ms p(90)=233.69ms p(95)=233.69ms
http_req_duration..............: avg=754.63ms min=733.82ms med=754.63ms max=775.44ms p(90)=771.28ms p(95)=773.36ms
{ expected_response:true }...: avg=754.63ms min=733.82ms med=754.63ms max=775.44ms p(90)=771.28ms p(95)=773.36ms
http_req_failed................: 0.00% ✓ 0 ✗ 2
http_req_receiving.............: avg=259µs min=216µs med=259µs max=302µs p(90)=293.39µs p(95)=297.7µs
http_req_sending...............: avg=186.5µs min=117µs med=186.5µs max=256µs p(90)=242.1µs p(95)=249.05µs
http_req_tls_handshaking.......: avg=272.21ms min=270.45ms med=272.21ms max=273.96ms p(90)=273.61ms p(95)=273.79ms
http_req_waiting...............: avg=754.18ms min=733.48ms med=754.18ms max=774.88ms p(90)=770.74ms p(95)=772.81ms
http_reqs......................: 2 0.570468/s
iteration_duration.............: avg=1.18s min=901.98µs med=1.26s max=2.21s p(90)=1.93s p(95)=2.07s
iterations.....................: 2 0.570468/s
vus............................: 2 min=0 max=2
vus_max........................: 2 min=2 max=2
running (0m03.5s), 0/2 VUs, 2 complete and 0 interrupted iterations
sampleScenario1 ✓ [======================================] 1 VUs 0m01.2s/1m0s 1/1 iters, 1 per VU
sampleScenario2 ✓ [======================================] 1 VUs 0m01.3s/1m0s 1/1 iters, 1 per VU
Introduce the repository structure and settings
The steps above were just a command walkthrough, so the logic and intent behind the implementation may not be clear yet.
Don't worry, this chapter explains them.
What I want to achieve in this repository is the following:
- Manage each scenario and scenario set cleanly in the repository
- Switch between environment variable groups (local, stg, prod) easily
- Read the user data used by scenarios from CSV files
- Output convenient debug logs during development
- Check and format the code automatically
These features are probably sufficient when you start implementing tests.
Note that this repository does not contain the following:
- Test code
- Redis settings such as user/password configuration
The directory structure of this repository is as follows.
~/D/g/template-of-k6 ❯❯❯ tree -a
.
├── .env
├── .github
│   └── workflows
│       └── lint.yml
├── README.md
├── assets
│   └── datas
│       └── sample-product
│           └── local
│               ├── sampleScenario1.csv
│               └── sampleScenario2.csv
├── dist
├── k6
├── src
│   └── sample-product
│       ├── common
│       │   ├── common.ts
│       │   ├── env.ts
│       │   └── redis.ts
│       ├── configs
│       │   └── smoke.json
│       ├── loadTest.ts
│       └── scenarios
│           ├── sampleScenario1.ts
│           └── sampleScenario2.ts
Now let’s break down the settings.
Smartly manage each scenario and scenario set
A "scenario set" is a group of scenarios used for a smoke test, load test, spike test, and so on.
In daily development, we usually run the smoke-test scenario set, but sometimes we run the load-test scenarios instead, so it is convenient to be able to switch scenario sets easily.
To that end, this repository uses the following directory structure.
├── src
│   └── sample-product
│       ├── common
│       │   ├── common.ts
│       │   ├── env.ts
│       │   └── redis.ts
│       ├── configs
│       │   └── smoke.json
│       ├── loadTest.ts
│       └── scenarios
│           ├── sampleScenario1.ts
│           └── sampleScenario2.ts
Each scenario lives under the scenarios directory, and the scenario set is defined in the parent directory (loadTest.ts).
To run scenarios, you call them via loadTest.ts.
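As a rough illustration, here is a hypothetical, heavily simplified sketch of what such an entry point looks like (the real loadTest.ts re-exports scenario functions from the scenarios directory; the bodies below are placeholders). k6 resolves the "exec" name from the JSON config against the named exports of the bundled entry script:

```typescript
// Hypothetical, simplified sketch of a loadTest.ts entry point.
// A scenario set is just a module that exports one function per
// scenario; k6 matches each "exec" value in the JSON config to a
// named export of this file.
export function sampleScenario1(): void {
  // a real scenario would issue HTTP requests via k6/http here
  console.log('sampleScenario1 executed')
}

export function sampleScenario2(): void {
  console.log('sampleScenario2 executed')
}
```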
The configuration of each scenario set to be executed is managed under the configs directory.
If you use several scenario-set patterns, prepare one JSON configuration file per scenario set.
The configuration of a scenario set looks like the following.
It defines which scenarios run and how much load (VUs, iterations, etc.) to apply.
{
"scenarios": {
"sampleScenario1": {
"exec": "sampleScenario1",
"executor": "per-vu-iterations",
"startTime": "0s",
"vus": 1,
"iterations": 1,
"maxDuration": "60s"
},
"sampleScenario2": {
"exec": "sampleScenario2",
"executor": "per-vu-iterations",
"startTime": "0s",
"vus": 1,
"iterations": 1,
"maxDuration": "60s"
}
}
}
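For contrast, a heavier scenario set could live in its own file (e.g. a hypothetical configs/load.json, not part of the template) using a different executor, such as k6's constant-vus:

```json
{
  "scenarios": {
    "sampleScenario1": {
      "exec": "sampleScenario1",
      "executor": "constant-vus",
      "startTime": "0s",
      "vus": 50,
      "duration": "10m"
    }
  }
}
```

Switching between smoke and load runs is then only a matter of passing a different file to --config.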
Finally, you just pass the path of the JSON file to the k6 command line.
./k6 run ./dist/loadTest.js --config ./src/sample-product/configs/smoke.json -e ENV=local
In this way, you can manage many scenarios in a structured way and easily switch which scenario set (smoke, load, spike, etc.) is executed.
Switch environment variable groups easily
As we keep developing performance-testing scripts, we need to switch the target environment (local, stg, prod) many times.
Sending load to the production environment, even by mistake, would be disastrous, so the mechanism for switching environments must be simple.
In this repository, environment variables are therefore imported from the .env file, and the values exported from .env are passed to env.ts.
~/D/g/template-of-k6 ❯❯❯ tree -a
.
├── .env
├── assets
│   └── datas
│       └── sample-product
│           └── local
│               ├── sampleScenario1.csv
│               └── sampleScenario2.csv
├── src
│   └── sample-product
│       ├── common
│       │   ├── common.ts
│       │   ├── env.ts
│       │   └── redis.ts
env.ts then applies default values for any environment variables that were not provided, and exports them so that other scripts can use them.
// @ts-ignore
import { parse } from "k6/x/dotenv"
import { toBoolean } from './common'
// eslint-disable-next-line @typescript-eslint/naming-convention
const _ENV = parse(open("../.env"))
export const DEBUG = toBoolean(_ENV.DEBUG) || false
export const ENV = _ENV.ENV || 'local'
export const REDIS_ENDPOINT = _ENV.REDIS_ENDPOINT || 'redis://localhost:6379'
export const AMOUNT_OF_INDEX_SIZE_FOR_TEST_DATA = _ENV.AMOUNT_OF_INDEX_SIZE_FOR_TEST_DATA || 10000
export const SAMPLE_PRODUCT_ENDPOINT = _ENV.SAMPLE_PRODUCT_ENDPOINT || 'localhost'
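Note that dotenv values arrive as strings, so `Boolean('false')` would be truthy; that is why env.ts runs DEBUG through a toBoolean helper from common.ts. The following is a hypothetical sketch of such a helper (the actual implementation in the template may differ):

```typescript
// Hypothetical sketch of the toBoolean helper imported by env.ts.
// Converts the string values found in .env into real booleans:
// "true"/"1" (case-insensitive) become true, everything else false.
export function toBoolean(value: string | undefined): boolean {
  if (value === undefined) return false
  return ['true', '1'].includes(value.trim().toLowerCase())
}
```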
Also, the test data used by each scenario is stored per environment.
├── assets
│   └── datas
│       └── sample-product
│           └── local
│               ├── sampleScenario1.csv
│               └── sampleScenario2.csv
Then you pass the environment flag to the k6 execution command.
./k6 run ./dist/loadTest.js --config ./src/sample-product/configs/smoke.json -e ENV=local
Read user data from CSV file
While creating scenarios, you will often want to prepare user data in advance and use it for the load, for example when the application under test has an authentication feature.
In this repository, each scenario can read CSV files from the assets directory.
├── assets
│   └── datas
│       └── sample-product
│           └── local
│               ├── sampleScenario1.csv
│               └── sampleScenario2.csv
We also need Redis in the local environment to store and track which user data has already been used by which VU (virtual user).
The following code is part of sampleScenario1.ts.
It reads user data from a CSV file, and the index used to pick a row from the CSV array is fetched from Redis with a pop operation.
const users = new SharedArray('sampleScenario1', function () {
  return papaparse.parse(open(`${SCENARIO_FILES_DIR()}/sampleScenario1.csv`), { header: true }).data
})

export default async function sampleScenario1(): Promise<void> {
  if (loopCounterPerVU === 0) {
    const index = Number(await client.lpop('sampleScenario1'))
    user = users[index]
  }
  loopCounterPerVU++
  // ... rest of the scenario body omitted
}
You may be wondering why such a complicated implementation is needed.
The reason is described in the following article; if you are interested, please check it out.
https://gonkunblog.com/en/k6-share-variable-across-multipul-vus/1826/
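The core idea can be sketched in plain TypeScript, with the Redis list replaced by an in-memory queue (a hypothetical simplification; the real template keeps the queue in Redis via client.lpop, because k6 VUs run in isolated JS runtimes and cannot share mutable state directly):

```typescript
// Simplified sketch of the index-distribution idea: a shared queue of
// row indices is popped once per VU, so no two VUs ever read the same
// CSV row. Here the queue is a local array; in the template it is a
// Redis list seeded during setup().
type User = { ID: string; name: string }

const users: User[] = [
  { ID: '1', name: 'alice' },
  { ID: '2', name: 'bob' },
]

// Stand-in for the Redis list: one index per CSV row.
const indexQueue: number[] = users.map((_, i) => i)

// Stand-in for `await client.lpop('sampleScenario1')`.
function popUser(): User | undefined {
  const index = indexQueue.shift()
  return index === undefined ? undefined : users[index]
}
```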
Set logging for debug during development
You usually want debug logs only during development; I think this is a common requirement.
In this repository, the following function is prepared and used by the other scripts.
export function debugOrLog(textToLog: string): void {
if (env.DEBUG) {
const millis = Date.now() - start
const time = Math.floor(millis / 1000)
console.log(`${time}se: ${textToLog}`)
}
}
Callers just execute the following code.
debugOrLog(
`sampleScenario1() start ID: ${user.ID}, vu iterations: ${loopCounterPerVU}, total iterations: ${exec.scenario.iterationInTest}`,
)
In this way, whether debug logs are emitted can be controlled through an environment variable.
If meaningless logs keep flowing during a production-scale load test, the logging itself can become a bottleneck.
Align code formats and check automatically
I don't have much to say about this one, as it doesn't do much.
The repository has a minimal CI setup on the GitHub Actions side, with lint and Prettier configurations included.
Please see the repository for details.
Conclusion
In this article, I have introduced an example of a base project structure for developing load-testing scripts with k6.
This structure has its own advantages and disadvantages, so feel free to customize it to suit your purposes.
I hope this article is of some help to you.