End-to-End Testing Using xUnit and Docker on Azure Pipelines

The comprehensive guide for a custom integration testing solution using xUnit and Docker in Azure Pipelines

Agustinus Theodorus
Better Programming

--

Illustration by author, Background by Annie Spratt on Unsplash

Functioning end-to-end tests are one of the most consequential things you can have for your application. But making end-to-end tests is not as simple as writing unit tests, because they can be an amalgamation of unit tests and integration tests. You need to spin up your testing environment, and more importantly, you need to do all of this in the CI pipeline.

Having end-to-end tests ready to run in your CI pipeline can shave off massive amounts of time otherwise spent testing by hand, and best of all, because they are automated you can take advantage of them every time you want to check a build.

What Is End-to-End Testing?

End-to-end tests are meant to simulate the entire application flow, making the entire app testable. Using end-to-end tests in your daily workflow creates a more robust app experience, guaranteeing quality on each release build.

In this article, I am going to show you how to end-to-end test your APIs using xUnit and Azure Pipelines. You will learn how to create a mock database using Docker and how to mock your interfaces using NSubstitute and AutoFixture.

Writing Unit Tests Using xUnit

Before starting, I need to inform you that in this section I will be utilizing four main packages for unit testing:

  • xUnit
  • NSubstitute
  • AutoFixture
  • FluentAssertions

Part of the end-to-end testing system would be the unit tests. Unit tests are small bits of code used to test tiny parts of your application components. To be able to guarantee the business logic output, you can create a unit test to mock inputs and check the outputs accordingly. Here’s code to assist you:

Unit test template

For example, you want to create a testing suite that checks if a method returns a list of user roles when supplied with empty parameters. Before you initiate the test, you need to create the subject under test (SUT) with the corresponding dependency.

The handler will be the SUT, and the repository will be mocked using NSubstitute. Lastly, we need to create an IFixture instance to help generate fake data.
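
Here is a minimal sketch of how that setup might look. The names GetUserRolesHandler, IUserRoleRepository, and UserRole are illustrative, not taken from a real project:

using AutoFixture;
using NSubstitute;
using Xunit;

public class GetUserRolesHandlerTests
{
    // Illustrative types: swap in your own handler and repository interfaces.
    private readonly IUserRoleRepository _repository;
    private readonly GetUserRolesHandler _sut;
    private readonly IFixture _fixture;

    public GetUserRolesHandlerTests()
    {
        _repository = Substitute.For<IUserRoleRepository>(); // mocked dependency
        _sut = new GetUserRolesHandler(_repository);          // subject under test
        _fixture = new Fixture();                             // fake data generator
    }
}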

Arrange unit test

Next, you need to arrange the test by creating mock objects that would be used via dependency injection. In this case, you would want to create fake results and attach them to the mocked dependency.
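
Continuing the sketch, the arrange step could look like the fragment below, which (together with the act and assert fragments that follow) forms the body of a single async [Fact] test method. GetRolesAsync is an assumed repository method:

// Arrange: generate fake roles and return them from the mocked repository.
var expectedRoles = _fixture.CreateMany<UserRole>(3);
_repository.GetRolesAsync(Arg.Any<string>()).Returns(expectedRoles);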

Execute unit test

After you have created the fake results, you need to execute the method under test.
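
In the same sketch, the act step is simply the handler call (Handle is an assumed method name):

// Act: call the method under test with empty parameters.
var result = await _sut.Handle(string.Empty);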

Assert unit test

To validate the results, you may use fluent assertion methods. FluentAssertions is not part of Microsoft's default testing suites; it is a separate package that you can install. If you are getting "method does not exist" errors, try installing the FluentAssertions package first.
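
Closing out the sketch, the assert step could use FluentAssertions like this (a using FluentAssertions; directive is assumed at the top of the file):

// Assert: the handler should return exactly the roles the repository supplied.
result.Should().NotBeNull();
result.Should().BeEquivalentTo(expectedRoles);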

Spin Up a Standalone Database Using Docker

Before you start writing end-to-end tests using xUnit, you would need to connect to a testing database. The easiest way to create a testing database is to use Docker. By using Docker you can create your database image populated with testing data. Best of all, you can integrate it with your CI pipeline to create functional tests.

First, you would need to create a script that will populate the testing data (in this example I will name the script init.sql). Here’s the code:

Query to inject new AccessRoles data for integration test
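
A minimal sketch of such an init.sql might look like this; the database name, table definition, and seed values are illustrative:

-- Create a test database and seed it with AccessRoles rows.
CREATE DATABASE TestDb;
GO

USE TestDb;
GO

CREATE TABLE AccessRoles (
    Id   INT PRIMARY KEY,
    Name NVARCHAR(50) NOT NULL
);
GO

INSERT INTO AccessRoles (Id, Name)
VALUES (1, 'Administrator'),
       (2, 'Editor'),
       (3, 'Viewer');
GO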

Next, create a Dockerfile that will build your custom image. Assuming you are using SQL Server, the Dockerfile will be based on the official SQL Server image, as shown below:

Dockerfile to create a custom SQL Server image
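
Here is a minimal sketch of what that Dockerfile could look like; the base image tag, SA password, and startup wait time are placeholders you should adjust for your own setup:

# Custom SQL Server image seeded with test data via init.sql.
FROM mcr.microsoft.com/mssql/server:2017-latest

ENV ACCEPT_EULA=Y
ENV SA_PASSWORD=YourStrong!Passw0rd

WORKDIR /usr/src/app
COPY init.sql .

# Start SQL Server in the background, wait for it to accept connections,
# run the seed script, then keep the container alive on the server process.
CMD /opt/mssql/bin/sqlservr & \
    sleep 30 && \
    /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -i init.sql && \
    wait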

Writing Integration Tests Using xUnit

Writing integration tests is quite easy, but before starting, I need to inform you that in this section I will be utilizing two main packages for integration testing:

  • xUnit
  • Microsoft.AspNetCore.Mvc.Testing

The Microsoft.AspNetCore.Mvc.Testing package supplied by Microsoft is required for this to work. This package supplies everything you need for in-memory API testing. Assuming that you already have your database running in the background, all you need to do is create the tests.

For example, let’s make a test that will check if the /users endpoint will return a success status code.

Integration tests template

Each test class implements the IClassFixture interface, which is supplied with the WebApplicationFactory class. The WebApplicationFactory is used to create an in-memory client that behaves very similarly to production. You only need to execute a GET request on the /users endpoint, then retrieve and assert the status code.
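
A minimal sketch of such a test is shown below; Startup is assumed to be the application's startup class:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class UsersEndpointTests : IClassFixture<WebApplicationFactory<Startup>>
{
    private readonly HttpClient _client;

    public UsersEndpointTests(WebApplicationFactory<Startup> factory)
    {
        // Creates an in-memory test server and an HttpClient pointed at it.
        _client = factory.CreateClient();
    }

    [Fact]
    public async Task GetUsers_ReturnsSuccessStatusCode()
    {
        var response = await _client.GetAsync("/users");

        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
}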

Because the response will be a regular .NET HttpClient response, you can send the request and assert the status code using:

var response = await client.GetAsync(url);
response.EnsureSuccessStatusCode();

Alternatively, you can access the response content using:

await response.Content.ReadAsStringAsync();

Creating integration tests isn’t so hard now, is it? The example above might look simple, but it can be expanded to an enormous degree if you are willing to put in the work.

Automating the Tests Inside the CI Build Using Azure Pipeline

For security purposes, Azure Pipelines requires you to host a private build agent to run a Linux Docker image inside your CI pipeline. And to run xUnit tests, you need to run them on a machine that has Visual Studio installed.

To install Visual Studio for your build agent, you first need to install a build agent on a Windows machine as the baseline OS.

Installing a Visual Studio Build Agent

To do this, you need to download the Windows-specific build agent; to get the most recent agent version, download it from your Agent pools page.

First, go to your Project Settings:

Image by Author

Then go to the Agent pools section, and click on your Default agent pool.

Image by Author

Click on New agent.

Image by Author

Then a pop-up appears and you will have to:

  1. Click on the Windows tab.
  2. Copy the agent download URL.
Image by Author

TL;DR To help you get through all this faster, you can download the agent here (the link points to agent version 2.193.1) and update it later.

Then, go to your C: directory and create a new directory named agent:

mkdir agent
cd agent

Then, create the agent by unpacking the downloaded package:

Add-Type -AssemblyName System.IO.Compression.FileSystem ; [System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\vsts-agent-win-x64-2.193.1.zip", "$PWD")

Next, configure the agent by running the configuration script:

.\config.cmd

Finally, run the agent using the run.cmd script:

.\run.cmd

To enable Visual Studio on your Windows agent, you only need to download and install Visual Studio on your machine. Because my example will use VS2017, you can retrieve the download URL for VS2017 here.

Creating the CI Pipeline

First, create a new CI pipeline:

  1. Go into your pipeline directory.
  2. Click Create Pipelines.
Image by Author

When trying to create the CI template, opt in to use the classic editor.

Image by Author

Next, determine which project you will be working on:

  1. Select the Project.
  2. Pick your designated repository.
  3. Choose the branch that will trigger the CI build.
  4. When all is done, double-check and click Continue.
Image by Author

To streamline the creation of your new template:

  1. Search for the predefined ASP.NET Core template.
  2. Apply the template.
Image by Author

Next, use the Visual Studio Agent that you installed:

  1. Click on the Pipeline tab.
  2. Click on the Agent pool dropdown.
  3. Select your self-hosted agent pool.
Image by Author

In this specific ASP.NET Core solution, I separated it into three projects, namely the Application, UnitTest, and IntegrationTest projects. The restore and build stages will also be separated for the three projects.

Now, you are going to set up a restore for the Application project:

  1. Click on the Restore task.
  2. Unlink the project by clicking the chain symbol.
  3. Then click Unlink.
Image by Author

Then change the Path to project(s) to your Application project file.

Image by Author

Next, do the same thing to the Build task:

  1. Click on the Build task.
  2. Change the project path to the Application project file.
  3. Add a configuration to prevent the build from restoring the dependencies, since that was already done by the previous task (see the example arguments below the screenshot).
Image by Author
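
For reference, the build arguments could look something like this, assuming the BuildConfiguration variable that the ASP.NET Core template defines:

--configuration $(BuildConfiguration) --no-restore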

Clear up the unused tasks by:

  1. Clicking on the task you want to remove.
  2. Clicking the Remove button.
Image by Author

Because my example will require an appsettings.json:

  1. Click on the + button.
  2. Because you’re on Windows, search for the PowerShell task.
  3. Click Add.
Image by Author

The PowerShell task will be used to insert the appsettings.json into the Application project folder using a PowerShell script.

PowerShell script to copy an appsettings.json
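
A minimal sketch of such a copy step is shown below; the source path under the repository is an assumption about where the file is committed:

# Copy a committed appsettings.json into the Application project folder.
Copy-Item -Path "$(Build.SourcesDirectory)\Pipeline\appsettings.json" `
          -Destination "$(Build.SourcesDirectory)\src\Application\appsettings.json" `
          -Force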

To insert the PowerShell script:

  1. Click on the PowerShell task.
  2. Change the task display title to Copy appsettings.json.
  3. Change the script type to inline.
  4. Add the script.
Image by Author

Next, you would need to restore and build the unit test projects. You can easily do this by cloning the two previous tasks for the Application project:

  1. Left-click on the restore project task.
  2. Click on the Clone task option.
Image by Author

After cloning the restore task:

  1. Click on cloned task.
  2. Change the display name.
  3. Change the project file location.
Image by Author

Then repeat the step for the build task, and afterward:

  1. Click on cloned task.
  2. Change the display name.
  3. Change the project file location.
  4. Check for the configuration that prevents a restore during the build; add it if it doesn’t exist.
  5. Do this again for the integration test project.
Image by Author

To run a Docker database, you need to add a Docker task:

  1. Click on the + button.
  2. Search for the Docker task.
  3. Click Add.
Image by Author

Before you can run your Docker container, you need to build it first:

  1. Click on the new Docker task.
  2. Change the task version to 0.*.
  3. Change the Action to Build an Image.
  4. Update the container registry type to use the regular Container Registry.
  5. Insert the location of your Dockerfile.
Image by Author

Add another Docker task to run the image:

  1. Click on the new Docker task.
  2. Change the task version to 0.*.
  3. Change the Action to Run an Image.
  4. Uncheck the Qualify image name option.
Image by Author

Add a Visual Studio Test task:

  1. Click on the + button.
  2. Search for the Visual Studio Test task.
  3. Click Add.
Image by Author

Set up the Visual Studio Test task:

  1. Click on the task.
  2. Change the display name.
  3. Define the test files.

The test file pattern is as follows:

**\bin\$(BuildConfiguration)\**\*test*.dll
!**\obj\**
!**\xunit.runner.visualstudio.testadapter.dll
!**\xunit.runner.visualstudio.dotnetcore.testadapter.dll
Image by Author

Add another task to publish the test results:

  1. Click on the + button.
  2. Search for the Publish Test Results task.
  3. Click Add.
Image by Author

Finally, set up the publish test results task:

  1. Click on the task.
  2. Change the display name.
  3. Change the expected test format to XUnit.
Image by Author

TL;DR If by any chance the classic editor goes out of date, you can copy the YAML version of the pipeline here:

Pipeline for end-to-end tests using Azure Pipelines
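
A minimal sketch of an equivalent YAML pipeline is shown below; the project paths, Dockerfile location, image name, and agent pool name are placeholders, and the task versions may need adjusting for your organization:

# Sketch of the end-to-end test pipeline in YAML form.
trigger:
  - master

pool:
  name: Default   # self-hosted Windows agent pool with Visual Studio installed

variables:
  BuildConfiguration: 'Release'

steps:
  - task: DotNetCoreCLI@2
    displayName: Restore Application
    inputs:
      command: restore
      projects: 'src/Application/Application.csproj'

  - task: DotNetCoreCLI@2
    displayName: Build Application
    inputs:
      command: build
      projects: 'src/Application/Application.csproj'
      arguments: '--configuration $(BuildConfiguration) --no-restore'

  - task: DotNetCoreCLI@2
    displayName: Restore and build test projects
    inputs:
      command: build
      projects: 'test/**/*.csproj'
      arguments: '--configuration $(BuildConfiguration)'

  - task: PowerShell@2
    displayName: Copy appsettings.json
    inputs:
      targetType: inline
      script: |
        Copy-Item "$(Build.SourcesDirectory)\Pipeline\appsettings.json" `
          "$(Build.SourcesDirectory)\src\Application\appsettings.json" -Force

  - task: Docker@0
    displayName: Build test database image
    inputs:
      containerregistrytype: 'Container Registry'
      action: 'Build an image'
      dockerFile: 'docker/Dockerfile'
      imageName: 'testdb:latest'

  - task: Docker@0
    displayName: Run test database container
    inputs:
      containerregistrytype: 'Container Registry'
      action: 'Run an image'
      imageName: 'testdb:latest'
      qualifyImageName: false
      ports: '1433:1433'

  - task: VSTest@2
    displayName: Run unit and integration tests
    inputs:
      testAssemblyVer2: |
        **\bin\$(BuildConfiguration)\**\*test*.dll
        !**\obj\**
        !**\xunit.runner.visualstudio.testadapter.dll
        !**\xunit.runner.visualstudio.dotnetcore.testadapter.dll

  - task: PublishTestResults@2
    displayName: Publish test results
    inputs:
      testResultsFormat: 'XUnit'
      testResultsFiles: '**/TEST-*.xml'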

Conclusion

In this tutorial, you have learned how to create basic end-to-end tests using xUnit and how to run them in your CI pipeline using Azure Pipelines. The purpose of having end-to-end tests is to make sure your application is running perfectly, in this case, to make sure your APIs are flawless.

With this tutorial, I hope you can seamlessly integrate end-to-end tests into your CI pipeline, and you now have an open cookbook to experiment with.
