Part 1 - Getting Gatling to run in Azure
In the last post, I went over a way to run Gatling in Azure Container Instances by mounting Azure Storage and editing configuration files. It was a somewhat hacky way to get it running and could certainly be improved.
In this article I will go into one way of automating it: adding an ASP.NET API on top.
Step one - Creating a custom Docker runtime container
The first step was to take the original Gatling Docker image (denvazh/gatling) and use it as a base to create an image that has the .NET dependencies baked in.
This was a fairly simple process of adding the parts from the .NET Core runtime-deps image and then adding on the parts needed for ASP.NET Core to run.
You end up with this Dockerfile:
FROM denvazh/gatling:3.0.3
RUN apk add --no-cache \
    ca-certificates \
    \
    # .NET Core dependencies
    krb5-libs \
    libgcc \
    libintl \
    libssl1.1 \
    libstdc++ \
    lttng-ust \
    tzdata \
    userspace-rcu \
    zlib
# Configure web servers to bind to port 80 when present
ENV ASPNETCORE_URLS=http://+:80 \
    # Enable detection of running in a container
    DOTNET_RUNNING_IN_CONTAINER=true \
    # Set the invariant mode since icu_libs isn't included (see https://github.com/dotnet/announcements/issues/20)
    DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=true
# Install ASP.NET Core
ENV ASPNETCORE_VERSION 2.2.7
RUN wget -O aspnetcore.tar.gz https://dotnetcli.blob.core.windows.net/dotnet/aspnetcore/Runtime/$ASPNETCORE_VERSION/aspnetcore-runtime-$ASPNETCORE_VERSION-linux-musl-x64.tar.gz \
    && aspnetcore_sha512='d3c1cc27998fc8e45fbf0c652a8d8694e999a3cd5909f83fb11b1e5cf713b93f4e7614c4b74c92d6c04f0b0759373b6b6ff7218d9d143d36bb9b261ef8161574' \
    && echo "$aspnetcore_sha512  aspnetcore.tar.gz" | sha512sum -c - \
    && mkdir -p /usr/share/dotnet \
    && tar -zxf aspnetcore.tar.gz -C /usr/share/dotnet \
    && rm aspnetcore.tar.gz \
    && ln -s /usr/share/dotnet/dotnet /usr/bin/dotnet
I currently have this published on Docker Hub here.
With that built, we have our runtime image.
Step two - Creating the API
Visual Studio comes with some decent Docker templates already, so I started with a blank API with Docker support. After that's spun up, the only change I needed to make in the Dockerfile it created was changing the base image to my own.
# Change the base image
FROM dantheman999/gatling-aspnet AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/core/sdk:2.2 AS build
WORKDIR /src
COPY ["Gatling.Runner.csproj", ""]
RUN dotnet restore "./Gatling.Runner.csproj"
COPY . .
WORKDIR "/src/."
RUN dotnet build "Gatling.Runner.csproj" -c Release -o /app
FROM build AS publish
RUN dotnet publish "Gatling.Runner.csproj" -c Release -o /app
FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "Gatling.Runner.dll"]
Then it was a case of creating the API itself. There were two approaches I was considering for this:
- Continuing on with my original plan: mounting Azure Storage when the container is created, leaving open the possibility of it being shared across multiple containers.
- Adding the simulation into the container directly when starting a new test.
I ended up opting for option 2. It gives much more flexibility: for example, you could keep the container running whilst you run multiple different tests before tearing it down, or have tests running concurrently if you so desired.
Below I'm going to walk through some of the code used to get this to work. You can either follow along here or have a little look around the repo.
Azure + Gatling - Running Gatling in the Cloud
The projects in the repo contain applications that help to run Gatling in an automated fashion in Azure. Currently there are:
- base-image: The folder containing a Gatling Docker container that can also run .NET Core applications.
- src/Gatling.Runner: An API for running Gatling tests on a machine. Combined with the Gatling container above, this allows remote running of Gatling tests in the cloud.
- src/Gatling.Orchestrator: A Durable Function for orchestrating Gatling test runs across multiple Azure Container Instances, which returns a single combined result.
The Controller (GatlingController.cs)
The controller has three endpoints:
- Start
- StartAsync
- GetResults
Start will take a test upload, store the files, run the test and return the results as one operation. StartAsync and GetResults do the same thing, except StartAsync will just return the runId, which can be used to query the results whilst the test is running in the background.
Starting with the simplest method, Start.
It is basically as explained above. We take the runId and create the folders necessary for Gatling to run. If the full report is required, it will run the simulation and return a zip of the result; otherwise it will return just the console output Gatling produced during the run.
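The snippet itself isn't reproduced here, but a minimal sketch of what that action could look like is below. The route and the returnReport query parameter match the description above; the service method names (CreateRunFolders, RunSimulation, GetResultsAsStream) are illustrative rather than the exact ones in the repo.

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[Route("api/gatling")]
public class GatlingController : ControllerBase
{
    private readonly FileService _fileService;
    private readonly GatlingService _gatlingService;

    public GatlingController(FileService fileService, GatlingService gatlingService)
    {
        _fileService = fileService;
        _gatlingService = gatlingService;
    }

    [HttpPost("start/{runId}")]
    public async Task<IActionResult> Start(Guid runId, IFormFile simulation, [FromQuery] bool returnReport = false)
    {
        // Unpack the uploaded zip into the folder layout Gatling expects
        var settings = await _fileService.CreateRunFolders(runId, simulation);

        // Run the simulation and wait for it to finish
        var result = await _gatlingService.RunSimulation(settings, HttpContext.RequestAborted);

        if (returnReport)
        {
            // Stream the zipped results folder back to the caller
            return File(_fileService.GetResultsAsStream(runId), "application/zip", $"{runId}.zip");
        }

        // Otherwise just hand back Gatling's console output
        return Ok(result.ConsoleOutput);
    }
}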
Then we have the async version of this, which is a little more complicated.
It takes the same format of request as Start, but all it does is create the folders and queue up a job to be run in the background, before returning the location of the runId.
You can then poll the GetResults endpoint for the run, which will return the zipped-up Gatling results when completed (or report that there was a failure).
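Again a rough sketch, assuming the same hypothetical services as above plus the JobStatusService and BackgroundTaskQueue covered below; the results route is illustrative.

// These would sit in the same controller, with two extra services injected:
// private readonly JobStatusService _jobStatusService;
// private readonly BackgroundTaskQueue _taskQueue;

[HttpPost("startasync/{runId}")]
public async Task<IActionResult> StartAsync(Guid runId, IFormFile simulation)
{
    var settings = await _fileService.CreateRunFolders(runId, simulation);

    // Mark the run as started and hand the work off to the background queue
    _jobStatusService.SetStatus(runId, JobStatus.Started);
    _taskQueue.QueueBackgroundWorkItem((runId, token => _gatlingService.RunSimulation(settings, token)));

    // Point the caller at where the results will appear
    return Accepted($"api/gatling/results/{runId}");
}

[HttpGet("results/{runId}")]
public IActionResult GetResults(Guid runId)
{
    switch (_jobStatusService.GetStatus(runId))
    {
        case JobStatus.Finished:
            return File(_fileService.GetResultsAsStream(runId), "application/zip", $"{runId}.zip");
        case JobStatus.Failed:
            return StatusCode(500, "The Gatling run failed.");
        case JobStatus.Started:
            return Accepted(); // still queued or running
        default:
            return NotFound();
    }
}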
The Services
File Service (FileService.cs)
Two very simple methods here: one creates the necessary folder structure for Gatling to run, and the other creates a FileStream to send back the zipped results folder.
One thing to notice here is that it reads a file called run.json out of the zip provided through the controller. All this has in it at the moment is the name of the simulation to run:
{
"SimulationClassName": "apimsimulations.ApiMSimulation"
}
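That gets read into the RunSettings object the rest of the code works with. Something like the sketch below would do it; only SimulationClassName is confirmed by the run.json above, and the other properties are assumptions about what the services need.

public class RunSettings
{
    // Fully qualified name of the Gatling simulation class to run (from run.json)
    public string SimulationClassName { get; set; }

    // Illustrative extras: where this run's files live on disk
    public Guid RunId { get; set; }
    public string SimulationFolder { get; set; }
    public string ResultsFolder { get; set; }
}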
Gatling Service (GatlingService.cs)
The only job this has so far is to take the RunSettings produced by the FileService and run that test against Gatling, which is done by just running the process and waiting for it to finish. You can see what each of the command line arguments for Gatling does here, but for the most part we are just telling it where things are and what simulation we wish to run.
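A trimmed-down sketch of that call, assuming the RunSettings shape from earlier; the GatlingResult type is hypothetical, and the argument list (-sf for the simulations folder, -rf for the results folder, -s for the simulation class) is only a subset of what the real service might pass.

using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

public class GatlingResult
{
    public string ConsoleOutput { get; set; }
    public bool Succeeded { get; set; }
}

public class GatlingService
{
    public async Task<GatlingResult> RunSimulation(RunSettings settings, CancellationToken token)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "gatling.sh",
            // Tell Gatling where things are and which simulation to run
            Arguments = $"-sf {settings.SimulationFolder} -rf {settings.ResultsFolder} -s {settings.SimulationClassName}",
            RedirectStandardOutput = true,
            UseShellExecute = false
        };

        using (var process = Process.Start(startInfo))
        {
            // Capture the console output so it can be returned to the caller
            var consoleOutput = await process.StandardOutput.ReadToEndAsync();
            process.WaitForExit();

            return new GatlingResult
            {
                ConsoleOutput = consoleOutput,
                Succeeded = process.ExitCode == 0
            };
        }
    }
}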
The Background Services
For the asynchronous methods, I took the skeleton of the code from the ASP.NET Core documentation and changed a few pieces of it. The main change was to attach a jobId, with an accompanying state, to each item when it is queued. When the job is finished, its state is updated so the GetResults endpoint can pick it up.
Job Status Service (JobStatusService.cs)
This service keeps track of the state for a run. A ConcurrentDictionary is obviously not ideal as a data store, but given the container should only be spun up for one or two jobs before being destroyed, it works fine.
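Something along these lines; the enum values are illustrative rather than copied from the repo.

using System;
using System.Collections.Concurrent;

public enum JobStatus { NotFound, Started, Finished, Failed }

public class JobStatusService
{
    // In-memory store: lost on restart, but fine for a short-lived container
    private readonly ConcurrentDictionary<Guid, JobStatus> _statuses =
        new ConcurrentDictionary<Guid, JobStatus>();

    public void SetStatus(Guid runId, JobStatus status) => _statuses[runId] = status;

    public JobStatus GetStatus(Guid runId) =>
        _statuses.TryGetValue(runId, out var status) ? status : JobStatus.NotFound;
}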
Background Task Queue (BackgroundTaskQueue.cs)
This is pretty much a copy of the documentation example, except we now have it taking a tuple of the job id and the job to run. When the job gets queued, its State is set to Started, which is a little wrong in all honesty, as at that point it has only been queued.
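The shape of it, give or take; this follows the queue from the ASP.NET Core docs, with the tuple swapped in.

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class BackgroundTaskQueue
{
    private readonly ConcurrentQueue<(Guid JobId, Func<CancellationToken, Task> WorkItem)> _workItems =
        new ConcurrentQueue<(Guid JobId, Func<CancellationToken, Task> WorkItem)>();
    private readonly SemaphoreSlim _signal = new SemaphoreSlim(0);

    public void QueueBackgroundWorkItem((Guid JobId, Func<CancellationToken, Task> WorkItem) workItem)
    {
        _workItems.Enqueue(workItem);
        _signal.Release(); // wake up the hosted service
    }

    public async Task<(Guid JobId, Func<CancellationToken, Task> WorkItem)> DequeueAsync(CancellationToken cancellationToken)
    {
        await _signal.WaitAsync(cancellationToken); // block until something is queued
        _workItems.TryDequeue(out var workItem);
        return workItem;
    }
}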
Queue Hosted Service (QueueHostedService.cs)
Finally, the task is dequeued and run here. Once it finishes, its State is updated so that the controller knows it can pick up the results.
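Its core loop is roughly this, again assuming the sketched queue and status service from above.

using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class QueueHostedService : BackgroundService
{
    private readonly BackgroundTaskQueue _taskQueue;
    private readonly JobStatusService _jobStatusService;

    public QueueHostedService(BackgroundTaskQueue taskQueue, JobStatusService jobStatusService)
    {
        _taskQueue = taskQueue;
        _jobStatusService = jobStatusService;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // Wait for the next queued run
            var (jobId, workItem) = await _taskQueue.DequeueAsync(stoppingToken);

            try
            {
                await workItem(stoppingToken);
                _jobStatusService.SetStatus(jobId, JobStatus.Finished);
            }
            catch
            {
                // Surface the failure through GetResults rather than crashing the service
                _jobStatusService.SetStatus(jobId, JobStatus.Failed);
            }
        }
    }
}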
Step three - Running a request
With all of this set up, we can now run a test!
Create a zip with your simulation (and resources if you have them) laid out in the same format as they would be in the Gatling folders, along with the run.json explained earlier.
Send a request to the API at api/gatling/start/{uuid}?returnReport=true (easiest to start with the synchronous method) with your zip file attached. The request will take as long as your simulation does, plus around 10 seconds of initial start-up time.
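If you want to script that request, here's a quick client sketch; the route and query string come straight from above, while the form field name, host and port are assumptions.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        var runId = Guid.NewGuid();

        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:80/") })
        using (var content = new MultipartFormDataContent())
        using (var zip = File.OpenRead("simulation.zip"))
        {
            // The zip contains the simulation, any resources, and run.json
            content.Add(new StreamContent(zip), "simulation", "simulation.zip");

            var response = await client.PostAsync($"api/gatling/start/{runId}?returnReport=true", content);
            response.EnsureSuccessStatusCode();

            // Save the returned report zip
            File.WriteAllBytes("report.zip", await response.Content.ReadAsByteArrayAsync());
        }
    }
}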
And that's it 🎉 You now have a remotely controlled Gatling instance that can be hosted in Azure once the image is built.
If you just want to play around with the container directly without compiling it yourself, you can grab it here.
Next steps
In part three, I will be creating a wrapper around this in Azure that will:
- Create a set number of Azure Container Instances of the container, possibly in multiple regions.
- Set them all running a test using a similar API to the one in the container (with more checking that the zip looks "correct").
- Wait for them all to finish and collect all the simulation.log files.
- Merge the log files and create a master report.