Using Docker containers as a source of load for load testing
In May 2024 I needed to test a Node.js REST API that accepted, processed and stored (in MariaDB) data from many remote sensors. Each payload was small, but there were many of them.
The remote sensor was itself a Node.js implementation, half of which aggregated readings and POSTed them every 10 minutes, triggered by a systemd timer. I figured I could simulate sensor load by "Dockerising" that component of the sensor with sample data.
My Dockerfile
FROM node
# Sample aggregated sensor data, staged where post.js expects to find it
COPY live.1714561273.csv /run/user/1000/
WORKDIR /usr/src/XXXX
COPY package.json ./
COPY post.js ./
RUN npm install
# POST once, wait 10 seconds, then POST again in case the first attempt failed
ENTRYPOINT node /usr/src/XXXX/post.js ; sleep 10 ; node /usr/src/XXXX/post.js
If you're curious about the node, sleep, node sequence, see my previous post about having been burned by DDoSing a C&C server. If the first attempt is unsuccessful, the files are left as they were for the next attempt to try.
# docker build -t loadtest .
[+] Building 1.6s (11/11) FINISHED docker:default
=> [internal] load build definition from Dockerfile 0.2s
=> => transferring dockerfile: 285B 0.0s
=> [internal] load metadata for docker.io/library/node:latest 0.0s
=> [internal] load .dockerignore 0.2s
=> => transferring context: 2B 0.0s
=> [1/6] FROM docker.io/library/node:latest 0.0s
=> [internal] load build context 0.2s
=> => transferring context: 100B 0.0s
=> CACHED [2/6] COPY live.1714561273.csv /run/user/1000/ 0.0s
=> CACHED [3/6] WORKDIR /usr/src/XXXX 0.0s
=> CACHED [4/6] COPY package.json ./ 0.0s
=> CACHED [5/6] COPY post.js ./ 0.0s
=> CACHED [6/6] RUN npm install 0.0s
=> exporting to image 0.1s
=> => exporting layers 0.0s
=> => writing image sha256:a43a89e5863d7f48bcc6b245cd36a42ab5212fbd003fe6659f5e9c27fb651f63 0.0s
=> => naming to docker.io/library/loadtest
My image list
# docker image list -a
REPOSITORY    TAG      IMAGE ID       CREATED         SIZE
loadtest      latest   a43a89e5863d   18 hours ago    1.16GB
node          latest   a1c1026b1a58   5 days ago      1.11GB
hello-world   latest   d2c94e258dcb   12 months ago   13.3kB
Running the load test
runloadtest.sh
#!/bin/bash
# Clear out stopped containers from previous runs
docker container prune -f
date
# Launch 100 simulated sensors in parallel
for i in {1..100}; do
    docker container run loadtest &
done
wait # block until every container has exited
Outcomes
Well... for one, the mariadb Node.js npm package is faster than mysql or mysql2. BUT I was using transactions to ensure I either rejected or completely accepted each batch of sensor data. I found that MariaDB with connection pools AND transactions resulted in partial data INSERTion under load. When I switched to opening a connection per request, the problem went away.
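The fix can be sketched as a fresh connection per request wrapping a single transaction. This is an illustrative sketch, not the API's actual code: the `readings` table, its column names, and the injected `createConnection` factory (in production, something like `() => require('mariadb').createConnection(config)`) are all assumptions.

```javascript
// Hypothetical sketch of the connection-per-request pattern that replaced
// the pool. `createConnection` is injected so the mariadb config stays
// with the caller; the table/columns are made up for illustration.
async function storeReadings(createConnection, readings) {
  const conn = await createConnection(); // fresh connection, no pool
  try {
    await conn.beginTransaction();
    for (const r of readings) {
      await conn.query(
        'INSERT INTO readings (ts, sensor_id, value) VALUES (?, ?, ?)',
        [r.timestamp, r.sensorId, r.value]
      );
    }
    await conn.commit(); // all rows land, or...
  } catch (err) {
    await conn.rollback(); // ...the whole batch is rejected
    throw err;
  } finally {
    await conn.end(); // connection lives only as long as the request
  }
}

module.exports = { storeReadings };
```

Opening a connection per request costs a TCP handshake each time, but for small, frequent sensor payloads that proved a fair trade for transactions that actually held.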