Let me set up a scenario for you...
You've deployed your new Kentico 12 MVC application out to production and everything's going great. Your boss is happy and you're happy too.
A new feature request comes to you and to add this functionality you have to make some changes to existing parts of the site in addition to the new code you write for the feature.
After testing locally and then deploying the new feature to production, some people notice a couple of problems - not with the new feature you built, but with unrelated parts of the site.
You don't have any automated tests, so there's no easy way to test for regressions in site functionality when a new feature is added.
Instead you (and a couple other kind souls, if you are lucky) go through the site, clicking around hoping to figure out what is causing these problems.
After some searching, the cause of the problems is found! It was a change you made, in already existing code, to get the new feature working.
The Case for Automated Tests
Instead of just wiping the sweat off your brow and shipping the fix straight out to production, you decide to try writing an automated unit test, which will exercise the previously broken code, to ensure it behaves how you expect.
If you are new to automated testing in Kentico, check out my post on Writing Unit Tests.
But there's a problem. One of the issues was related to data in the CMS, not code. That data had to be changed for the new feature while an existing one depended on the previous state of that data.
Unit tests need to execute quickly and not depend on functionality in other parts of the code base or the state of data in a database, so unit tests can't help us here.
You decide to go read Kentico's documentation and you find the section on creating integration tests.
Wahoo!
Now you can write automated tests that interact with the data in the CMS and ensure it is configured how your application expects.
But what's the best way to write integration tests? How do we write tests that can target different environments or consistently test the correct data?
Let's take a look at how we might create an integration test project. It will test our email templates, which are stored in the database, making sure they are written correctly and match what our code expects.
Check out the blog post Sending Emails With the Kentico API by Dustin Christians over at BizStream to learn how to programmatically send emails from your code.
Creating an Integration Test Project
In a previous post I wrote about how I'm a fan of using newer .NET Core tools and features in our classic .NET Framework projects.
Check out that post here
Let's leverage some of that here.
First, we add a new test project, in Visual Studio, to the solution that has our Kentico projects.
We will want to select "NUnit Test Project (.NET Core)".
To make sure our test project works correctly with Kentico's testing tools, we'll want to name it with a .Tests suffix. My project is named Sandbox, so I'll name my test project Sandbox.Tests.
If we double click on the project node in the Solution Explorer, the .csproj file will open up for editing.
One of the benefits of using the new .NET Core Common Project System is live editing of the project without needing to unload it first.
We're going to make a couple of changes to this file to help with .NET Framework compatibility.
- Change the <TargetFramework> element value to net472 (or whatever version of the .NET Framework you are working with).
- Find the <ItemGroup> that contains <PackageReference> elements and add the Kentico testing package (specify the version that matches the hotfix version of your project): <PackageReference Include="Kentico.Libraries.Tests" Version="12.0.30" />.
- Add my favorite assertion library, FluentAssertions: <PackageReference Include="FluentAssertions" Version="5.7.0" />.
- Either change the nunit package version to 3.8.1 (this is the version supported by Kentico) or remove this package, as Kentico.Libraries.Tests already includes it.
- Add the SlowCheetah package, which will help us run our tests against different environments:

<PackageReference Include="Microsoft.VisualStudio.SlowCheetah" Version="3.2.20">
  <PrivateAssets>all</PrivateAssets>
  <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
</PackageReference>
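Putting those changes together, the edited project file might end up looking roughly like this. This is a sketch: the exact versions are illustrative (match Kentico.Libraries.Tests to your hotfix), and the test adapter/SDK package references come from the NUnit project template, so yours may differ.

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- .NET Framework target instead of the template's .NET Core default -->
    <TargetFramework>net472</TargetFramework>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <!-- Version should match your project's hotfix version -->
    <PackageReference Include="Kentico.Libraries.Tests" Version="12.0.30" />
    <PackageReference Include="FluentAssertions" Version="5.7.0" />
    <!-- These two typically come from the NUnit template; nunit itself is
         removed because Kentico.Libraries.Tests already includes it -->
    <PackageReference Include="NUnit3TestAdapter" Version="3.13.0" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.2.0" />
    <PackageReference Include="Microsoft.VisualStudio.SlowCheetah" Version="3.2.20">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
    </PackageReference>
  </ItemGroup>

</Project>
```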
Now, we can save and close the project file.
Writing Integration Tests
Let's begin writing our tests.
First, we rename UnitTest1.cs to EmailTemplateTests.cs and replace its contents with the following:
using CMS.EmailEngine;
using CMS.MacroEngine;
using CMS.Tests;
using FluentAssertions;
using NUnit.Framework;
using System.Collections.Generic;
namespace Sandbox.Tests
{
    [TestFixture]
    public class EmailTemplateTests : IntegrationTests
    {
        [Test]
        public void Welcome_EmailTemplate_Will_Exist_And_Have_The_Correct_Macro_Expressions()
        {
        }
    }
}
The IntegrationTests base class lets us connect to a real database during test execution, and any of Kentico's data access classes will use that connection to retrieve data.
Let's create a class to represent all the values we want to populate our email template with using Kentico's macro expressions.
public class WelcomeUserEmailModel
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string UserName { get; set; }
    public string LoginUrl { get; set; }
}
Now let's set up the context for our test, specifying the template, site, and model data we want to test against.
[Test]
public void Welcome_EmailTemplate_Will_Exist_And_Have_The_Correct_Macro_Expressions()
{
    // You can fill in these values to fit your needs.
    var templateName = "WELCOME_NEW_USER";
    var siteId = 1;

    var model = new WelcomeUserEmailModel
    {
        FirstName = "Test",
        LastName = "User",
        UserName = "test-user@some-site.com",
        LoginUrl = "https://test.dev.site.com"
    };

    // ... start testing here
}
We can now write the test code that makes sure our email template is in the database, and its macro expressions match up with our email template model class, WelcomeUserEmailModel.
var template = EmailTemplateProvider.GetEmailTemplate(templateName, siteId);
template.Should().NotBeNull();
These two lines will query the template from the database and assert that it's not null (EmailTemplateProvider.GetEmailTemplate returns null if the template can't be found).
What about the macro expressions? Let's test those next!
var resolver = MacroResolver.GetInstance();
var values = new List<string>
{
    model.FirstName,
    model.LastName,
    model.UserName,
    model.LoginUrl
};
resolver.SetNamedSourceData(nameof(WelcomeUserEmailModel.FirstName), model.FirstName);
resolver.SetNamedSourceData(nameof(WelcomeUserEmailModel.LastName), model.LastName);
resolver.SetNamedSourceData(nameof(WelcomeUserEmailModel.UserName), model.UserName);
resolver.SetNamedSourceData(nameof(WelcomeUserEmailModel.LoginUrl), model.LoginUrl);
var hydratedTemplateText = resolver.ResolveMacros(template.TemplateText);
hydratedTemplateText.Should().ContainAll(values);
This bit of code creates a new macro resolver, adds the keys and values to it from our model class, and resolves the macros in our email template using those keys and values.
We then assert that all of the values in our model should show up in the "hydrated" email template string that the macro resolver generates.
Hot sauce!
Would you believe me if I told you there was a more general purpose way to do this?
Let's take a look!
var resolver = MacroResolver.GetInstance();
var properties = model.GetType().GetProperties();
var values = new List<string>();
foreach (var property in properties)
{
    var value = property.GetValue(model, null);

    values.Add(value?.ToString());
    resolver.SetNamedSourceData(property.Name, value);
}
var hydratedTemplateText = resolver.ResolveMacros(template.TemplateText);
hydratedTemplateText.Should().ContainAll(values);
Here, instead of referencing the model's properties explicitly, we use reflection.
The code gathers up all the properties and values of the model, adds them to the macro resolver, and uses the collection of values to assert what the contents of the hydratedTemplateText should contain.
If our email template in the database is missing one of the macro expressions that would insert the model's value, or if there is a typo in a macro expression, our test will fail, letting us know about our mistake.
Pretty cool!
Using SlowCheetah for Configuration
Before we can run our tests, we need to configure the project to store our configuration for different environments.
This configuration includes different connection strings, app settings, or special files. If you use a Continuous Integration / Deployment (CI/CD) setup for building and deploying your application, the steps below can be especially helpful.
If we right click on our test project node in the Solution Explorer and select "Add New Item" we will see a dialog appear with many file choices.
The old project system (.NET Framework) used XML .config files to store configuration for projects, whereas the new project system (.NET Core) uses .json files.
We are still working with .NET Framework and not .NET Core, even though we are using the new project system, so we want to use the XML based configuration files.
The closest thing to these files in the dialog before us is the XML File, so we select that and name it app.config.
If we open the new app.config file, it's going to be empty except for the root XML node.
Replace its contents with the following:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings></appSettings>
  <connectionStrings configSource="App_Config\ConnectionStrings.config" />
</configuration>
Now we need to add the App_Config folder to our project and add two files to this folder - ConnectionStrings.config and ConnectionStrings.Local.config.
We can add these the same way we added the app.config file above.
The contents of each of these "ConnectionStrings" files should be identical.
<connectionStrings>
  <add name="CMSTestConnectionString" connectionString="" />
</connectionStrings>
Notice we use the name CMSTestConnectionString because Kentico's integration test infrastructure will look for this specific connection string when we run our tests.
We add ConnectionStrings.Local.config to our .gitignore file because it will contain a real connection string, with credentials for local testing, and we don't want that in source control.
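The ignore entry might look like this - the path assumes the test project folder sits at the repository root, so adjust it to match your solution layout:

```
# Local-only connection strings for integration tests - never commit credentials
Sandbox.Tests/App_Config/ConnectionStrings.Local.config
```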
We will now use SlowCheetah to help us add "transforms" to our app.config by right clicking on the app.config file and selecting "Add Transform".
What we should end up with is an app.Debug.config and an app.Release.config (one for each Configuration in our project).
The following XML should have been added to our .csproj file.
<ItemGroup>
  <None Update="app.config">
    <TransformOnBuild>true</TransformOnBuild>
  </None>
  <None Update="app.Debug.config">
    <IsTransformFile>true</IsTransformFile>
    <DependentUpon>app.config</DependentUpon>
  </None>
  <None Update="app.Release.config">
    <IsTransformFile>true</IsTransformFile>
    <DependentUpon>app.config</DependentUpon>
  </None>
</ItemGroup>
We now update our app.Debug.config so that when we build our .Tests project using the "Debug" configuration, the app.config is transformed to use our ConnectionStrings.Local.config instead of the ConnectionStrings.config.
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings configSource="App_Config\ConnectionStrings.Local.config"
                     xdt:Transform="SetAttributes(configSource)" />
</configuration>
Finally, we add our real database connection string (local or dev, not prod!) to ConnectionStrings.Local.config.
When you want to test "Production", put a connection string in the ConnectionStrings.config
file, preferably using some CI/CD process, or in the file on your machine. Then build your test project in "Release" mode and run your tests.
You can create more Configurations for your solution (Development, QA, Staging, etc.) and create an app.config transform for each of them.
If you have different settings for each environment that you want to use for testing, you can take the approach we used above for connection strings and apply it to app settings.
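As a sketch, an app settings transform in app.Debug.config might look like the following. The TestBaseUrl key and its value are made up for illustration - substitute whatever settings your tests actually read:

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- "TestBaseUrl" is a hypothetical setting name used only as an example -->
    <add key="TestBaseUrl" value="https://localhost:44312"
         xdt:Transform="SetAttributes(value)" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>
```

The xdt:Locator="Match(key)" attribute finds the matching <add> element in app.config by its key, and SetAttributes(value) overwrites just the value attribute for that Configuration's build.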
If some of these settings are not security sensitive then you can commit them to source control and the ones that are more sensitive in nature can be added by a CI/CD process.
Running The Integration Tests!
We should now be able to run our tests and see the results.
For the test code I wrote above, I would need to have an email template named WELCOME_NEW_USER defined in the CMS that looked roughly like this:
<p>Hello {% FirstName %} {% LastName %},</p>
<p>
  Welcome to our site! Your account has been created successfully
  and you can now login by <a href="{% LoginUrl %}" target="_blank">
  clicking this link</a> and entering the username {% UserName %}
  along with the password you already entered.
</p>
You might notice that running the integration test takes a lot longer than running unit tests (1-2 seconds for integration vs 10-20 milliseconds for unit).
This is part of the reason why we need to be judicious in where we focus our efforts when writing automated tests.
Integration tests aren't better or worse than unit tests - they provide more confidence in our code compared to unit tests, but they are also more brittle (easier to produce false failures) and take longer to run.
We need them to help ensure that our code matches the data in the CMS but we shouldn't use them to test everything.
A common approach is to have the number of different kinds of tests in our application match the Automated Test Pyramid.
It's also common to run unit tests with every code change, but only run integration tests before deploying changes to a new environment.
As developers, these choices are ours to make - let's use our new powers for good, helping to reduce regressions and improve the stability of the applications we work on.
Summary
My goal here was to set up a scenario that would show where automated tests, and specifically integration tests, could provide a real benefit, by placing the cause of the issue in the database.
We walked through:
- How to create a test project for testing a Kentico 12 application.
- What writing an integration test would look like when testing email templates.
- How to use test project configuration transforms to set configuration values per-environment and keep sensitive values out of source control.
- The pros and cons of unit tests compared to integration tests, along with the purpose and position integration tests should have in your overall testing strategy.
If you have any thoughts on integration testing or cool ways you've used integration tests to help ensure confidence in your applications, let me know in the comments.
Thanks for reading!
If you are looking for additional Kentico content, check out the Kentico tag here on DEV:
Or my Kentico blog series: