ColaFlow Testing Guide
This document explains the testing strategy, setup, and best practices for the ColaFlow project.
Table of Contents
- Testing Philosophy
- Test Structure
- Getting Started
- Running Tests
- Writing Tests
- Test Coverage
- CI/CD Integration
- Best Practices
Testing Philosophy
ColaFlow follows the Test Pyramid approach:
        /\
       /  \        E2E Tests (5%)
      /    \       - Critical user flows
     /------\
    /        \     Integration Tests (15%)
   /          \    - API endpoints
  /            \   - Database operations
 /--------------\
/                \ Unit Tests (80%)
------------------ - Domain logic
                   - Application services
                   - Business rules
Quality Standards
- Minimum Code Coverage: 80%
- Target Code Coverage: 90%+
- Critical Path Coverage: 100%
- All tests must:
- Be repeatable and deterministic
- Run independently (no order dependency)
- Clean up after themselves (see the sketch after this list)
- Have clear assertions and error messages
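As a hedged sketch of the repeatability and cleanup rules: xUnit's IAsyncLifetime runs setup and teardown around every test in a class, so each test can claim fresh state and release it even on failure. The temp-directory resource here is only an illustration; substitute whatever state your tests touch.
using System.IO;
using System.Threading.Tasks;
using Xunit;

public class SelfCleaningTests : IAsyncLifetime
{
    private string _tempDir = null!;

    // Runs before each test: every test starts from a fresh, private directory.
    public Task InitializeAsync()
    {
        _tempDir = Directory.CreateTempSubdirectory("colaflow-tests-").FullName;
        return Task.CompletedTask;
    }

    // Runs after each test, pass or fail: no leftover state for the next test.
    public Task DisposeAsync()
    {
        Directory.Delete(_tempDir, recursive: true);
        return Task.CompletedTask;
    }

    [Fact]
    public void Test_WritesOnlyToItsOwnWorkspace()
    {
        var file = Path.Combine(_tempDir, "data.txt");
        File.WriteAllText(file, "isolated");
        Assert.True(File.Exists(file));
    }
}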
Test Structure
tests/
├── ColaFlow.Domain.Tests/ # Domain unit tests
│ ├── Aggregates/
│ │ ├── ProjectTests.cs
│ │ ├── EpicTests.cs
│ │ └── TaskTests.cs
│ ├── ValueObjects/
│ │ ├── ProjectIdTests.cs
│ │ └── TaskPriorityTests.cs
│ └── DomainEvents/
│ └── EventHandlerTests.cs
│
├── ColaFlow.Application.Tests/ # Application layer tests
│ ├── Commands/
│ │ ├── CreateProjectCommandTests.cs
│ │ └── UpdateProjectCommandTests.cs
│ ├── Queries/
│ │ ├── GetProjectByIdQueryTests.cs
│ │ └── GetKanbanBoardQueryTests.cs
│ └── Behaviors/
│ ├── ValidationBehaviorTests.cs
│ └── TransactionBehaviorTests.cs
│
├── ColaFlow.IntegrationTests/ # Integration tests
│ ├── API/
│ │ ├── ProjectsApiTests.cs
│ │ ├── TasksApiTests.cs
│ │ └── WorkflowsApiTests.cs
│ ├── Infrastructure/
│ │ ├── IntegrationTestBase.cs
│ │ └── WebApplicationFactoryBase.cs
│ └── Database/
│ ├── RepositoryTests.cs
│ └── MigrationTests.cs
│
├── ExampleDomainTest.cs # Template domain test
├── ExampleIntegrationTest.cs # Template integration test
├── IntegrationTestBase.cs # Base class for integration tests
├── WebApplicationFactoryBase.cs # WebApplicationFactory setup
└── README.md # This file
Getting Started
Prerequisites
- .NET 9 SDK (includes testing tools)
- Docker Desktop (for Testcontainers)
- IDE: Visual Studio 2022, JetBrains Rider, or VS Code
Initial Setup
1. Ensure Docker Desktop is running:
docker --version
docker ps
2. Restore NuGet packages (if not already done):
cd tests
dotnet restore
3. Verify the test projects build:
dotnet build
Creating Test Projects
If test projects don't exist yet, use the provided templates:
# Domain Tests
cd tests
dotnet new xunit -n ColaFlow.Domain.Tests
cp ColaFlow.Domain.Tests.csproj.template ColaFlow.Domain.Tests/ColaFlow.Domain.Tests.csproj
# Application Tests
dotnet new xunit -n ColaFlow.Application.Tests
cp ColaFlow.Application.Tests.csproj.template ColaFlow.Application.Tests/ColaFlow.Application.Tests.csproj
# Integration Tests
dotnet new xunit -n ColaFlow.IntegrationTests
cp ColaFlow.IntegrationTests.csproj.template ColaFlow.IntegrationTests/ColaFlow.IntegrationTests.csproj
# Restore packages
dotnet restore
Running Tests
Run All Tests
# From repository root
dotnet test
# From tests directory
cd tests
dotnet test
Run Specific Test Project
# Domain tests only
dotnet test ColaFlow.Domain.Tests/ColaFlow.Domain.Tests.csproj
# Integration tests only
dotnet test ColaFlow.IntegrationTests/ColaFlow.IntegrationTests.csproj
Run Specific Test Class
dotnet test --filter FullyQualifiedName~ProjectTests
Run Specific Test Method
dotnet test --filter FullyQualifiedName~ProjectTests.Create_ValidData_ShouldCreateProject
Run Tests by Category
# Run only unit tests
dotnet test --filter Category=Unit
# Run only integration tests
dotnet test --filter Category=Integration
# Exclude slow tests
dotnet test --filter Category!=Slow
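These filters match xUnit's [Trait] attribute, so tests must be tagged accordingly; for example:
// --filter Category=Unit matches this trait
[Fact]
[Trait("Category", "Unit")]
public void Create_ValidData_ShouldCreateProject()
{
    // ...
}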
Parallel Execution
Note: dotnet test has no --parallel flag. xUnit runs test collections in parallel by default; to run them sequentially (useful when debugging), disable parallelism via RunSettings:
# Run test collections sequentially (for debugging)
dotnet test -- xUnit.ParallelizeTestCollections=false
The same setting can be made permanent with "parallelizeTestCollections": false in xunit.runner.json.
Verbose Output
# Detailed output
dotnet test --logger "console;verbosity=detailed"
# Minimal output
dotnet test --logger "console;verbosity=minimal"
Writing Tests
Unit Tests (Domain Layer)
Example: Testing Project Aggregate
using FluentAssertions;
using Xunit;
namespace ColaFlow.Domain.Tests.Aggregates;
public class ProjectTests
{
[Fact]
public void Create_ValidData_ShouldCreateProject()
{
// Arrange
var name = "Test Project";
var description = "Test Description";
var key = "TEST";
var ownerId = UserId.Create(Guid.NewGuid());
// Act
var project = Project.Create(name, description, key, ownerId);
// Assert
project.Should().NotBeNull();
project.Name.Should().Be(name);
project.Key.Value.Should().Be(key);
project.Status.Should().Be(ProjectStatus.Active);
project.DomainEvents.Should().ContainSingle(e => e is ProjectCreatedEvent);
}
[Theory]
[InlineData("")]
[InlineData(null)]
[InlineData(" ")]
public void Create_InvalidName_ShouldThrowException(string invalidName)
{
// Arrange
var key = "TEST";
var ownerId = UserId.Create(Guid.NewGuid());
// Act
Action act = () => Project.Create(invalidName, "", key, ownerId);
// Assert
act.Should().Throw<DomainException>()
.WithMessage("Project name cannot be empty");
}
}
Application Layer Tests (CQRS)
Example: Testing Command Handler
using AutoMapper;
using FluentAssertions;
using Microsoft.Extensions.Logging;
using Moq;
using Xunit;
namespace ColaFlow.Application.Tests.Commands;
public class CreateProjectCommandHandlerTests
{
private readonly Mock<IProjectRepository> _projectRepositoryMock;
private readonly Mock<IUnitOfWork> _unitOfWorkMock;
private readonly Mock<ICurrentUserService> _currentUserServiceMock;
private readonly CreateProjectCommandHandler _handler;
public CreateProjectCommandHandlerTests()
{
_projectRepositoryMock = new Mock<IProjectRepository>();
_unitOfWorkMock = new Mock<IUnitOfWork>();
_currentUserServiceMock = new Mock<ICurrentUserService>();
_handler = new CreateProjectCommandHandler(
_projectRepositoryMock.Object,
_unitOfWorkMock.Object,
_currentUserServiceMock.Object,
Mock.Of<IMapper>(),
Mock.Of<ILogger<CreateProjectCommandHandler>>()
);
}
[Fact]
public async Task Handle_ValidCommand_CreatesProject()
{
// Arrange
var command = new CreateProjectCommand
{
Name = "Test Project",
Description = "Description",
Key = "TEST"
};
_currentUserServiceMock.Setup(x => x.UserId).Returns(Guid.NewGuid());
_projectRepositoryMock.Setup(x => x.GetByKeyAsync(It.IsAny<string>(), default))
.ReturnsAsync((Project?)null);
// Act
var result = await _handler.Handle(command, default);
// Assert
result.Should().NotBeNull();
_projectRepositoryMock.Verify(x => x.AddAsync(It.IsAny<Project>(), default), Times.Once);
_unitOfWorkMock.Verify(x => x.CommitAsync(default), Times.Once);
}
[Fact]
public async Task Handle_DuplicateKey_ThrowsException()
{
// Arrange
var command = new CreateProjectCommand { Name = "Test", Key = "TEST" };
var existingProject = Project.Create("Existing", "", "TEST", UserId.Create(Guid.NewGuid()));
_projectRepositoryMock.Setup(x => x.GetByKeyAsync("TEST", default))
.ReturnsAsync(existingProject);
// Act
Func<Task> act = async () => await _handler.Handle(command, default);
// Assert
await act.Should().ThrowAsync<DomainException>()
.WithMessage("*already exists*");
}
}
Integration Tests (API)
Example: Testing API Endpoint
using System.Net;
using System.Net.Http.Json;
using FluentAssertions;
using Xunit;
namespace ColaFlow.IntegrationTests.API;
[Collection("IntegrationTests")]
public class ProjectsApiTests : IClassFixture<ColaFlowWebApplicationFactory<Program, ColaFlowDbContext>>
{
private readonly HttpClient _client;
private readonly ColaFlowWebApplicationFactory<Program, ColaFlowDbContext> _factory;
public ProjectsApiTests(ColaFlowWebApplicationFactory<Program, ColaFlowDbContext> factory)
{
_factory = factory;
_client = factory.CreateClient();
}
[Fact]
public async Task CreateProject_ValidData_ReturnsCreated()
{
// Arrange
var createRequest = new CreateProjectDto
{
Name = "Test Project",
Description = "Test Description",
Key = "TEST"
};
// Act
var response = await _client.PostAsJsonAsync("/api/v1/projects", createRequest);
// Assert
response.StatusCode.Should().Be(HttpStatusCode.Created);
var project = await response.Content.ReadFromJsonAsync<ProjectDto>();
project.Should().NotBeNull();
project!.Name.Should().Be("Test Project");
}
[Fact]
public async Task GetProject_ExistingId_ReturnsProject()
{
// Arrange - Seed data
using var scope = _factory.CreateScope();
var dbContext = _factory.GetDbContext(scope);
var project = Project.Create("Test", "Description", "TEST", UserId.Create(Guid.NewGuid()));
await dbContext.Projects.AddAsync(project);
await dbContext.SaveChangesAsync();
// Act
var response = await _client.GetAsync($"/api/v1/projects/{project.Id.Value}");
// Assert
response.StatusCode.Should().Be(HttpStatusCode.OK);
var returnedProject = await response.Content.ReadFromJsonAsync<ProjectDto>();
returnedProject.Should().NotBeNull();
returnedProject!.Name.Should().Be("Test");
}
}
Test Coverage
Generate Coverage Report
# Run tests with coverage
dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover
# Generate HTML report (requires ReportGenerator)
dotnet tool install -g dotnet-reportgenerator-globaltool
reportgenerator -reports:coverage.opencover.xml -targetdir:coveragereport -reporttypes:Html
# Open report
start coveragereport/index.html # Windows
open coveragereport/index.html # Mac
Coverage Thresholds
Configure in test project .csproj:
<PropertyGroup>
<CoverletOutputFormat>opencover</CoverletOutputFormat>
<Threshold>80</Threshold>
<ThresholdType>line,branch,method</ThresholdType>
<ThresholdStat>total</ThresholdStat>
</PropertyGroup>
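These coverlet.msbuild properties can also be supplied per run on the command line, e.g. dotnet test /p:CollectCoverage=true /p:Threshold=80, which fails the run when coverage drops below the threshold.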
Exclude from Coverage
using System.Diagnostics.CodeAnalysis;

[ExcludeFromCodeCoverage]
public class Startup { }
CI/CD Integration
GitHub Actions
Tests run automatically on every push and pull request. See .github/workflows/test.yml.
Local CI Simulation
# Simulate CI environment
dotnet clean
dotnet restore
dotnet build --no-restore
dotnet test --no-build --verbosity normal
Best Practices
General Principles
1. Arrange-Act-Assert (AAA) Pattern
[Fact]
public void TestMethod()
{
    // Arrange - Set up test data and dependencies
    var input = "test";
    // Act - Execute the method under test
    var result = MethodUnderTest(input);
    // Assert - Verify the result
    result.Should().Be("expected");
}
2. One Assertion Per Test (when practical)
- Makes failures easier to diagnose
- Exception: related assertions (e.g., checking several properties of one object)
3. Test Naming Convention
MethodName_StateUnderTest_ExpectedBehavior
Examples:
- Create_ValidData_ShouldCreateProject
- Create_EmptyName_ShouldThrowException
- GetById_NonExistentId_ReturnsNotFound
4. Test Independence
- Tests should not depend on execution order
- Each test should clean up after itself
- Use test fixtures for shared setup (see the sketch after this list)
5. Avoid Test Logic
- No loops, conditionals, or complex logic in tests
- Tests should be simple and readable
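A minimal sketch of point 4's fixture advice; Calculator here is a stand-in for any shared dependency that is expensive to create:
using Xunit;

// Hypothetical shared dependency.
public sealed class Calculator
{
    public int Add(int a, int b) => a + b;
}

// xUnit builds the fixture once per test class and injects it into every test,
// sharing setup cost without creating order dependencies between tests.
public sealed class CalculatorFixture
{
    public Calculator Calculator { get; } = new();
}

public class CalculatorTests : IClassFixture<CalculatorFixture>
{
    private readonly Calculator _calculator;

    public CalculatorTests(CalculatorFixture fixture) => _calculator = fixture.Calculator;

    [Fact]
    public void Add_TwoNumbers_ReturnsSum() => Assert.Equal(5, _calculator.Add(2, 3));
}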
Domain Tests
- Test business rules and invariants
- Test domain events are raised
- Test value object equality (see the sketch after this list)
- No mocking (pure unit tests)
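For the value-object equality point, a hedged sketch assuming ProjectId exposes a Create factory (mirroring UserId.Create in the examples above) and overloads the equality operator:
using System;
using FluentAssertions;
using Xunit;

public class ProjectIdTests
{
    [Fact]
    public void Equals_SameUnderlyingGuid_ShouldBeEqual()
    {
        var guid = Guid.NewGuid();

        // Value objects compare by value, not reference: two instances
        // wrapping the same Guid must be equal.
        var first = ProjectId.Create(guid);
        var second = ProjectId.Create(guid);

        first.Should().Be(second);
        (first == second).Should().BeTrue(); // assumes ProjectId overloads ==
    }
}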
Application Tests
- Mock infrastructure dependencies (repositories, services)
- Test command/query handlers
- Test validation logic (see the sketch after this list)
- Test MediatR pipeline behaviors
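For the validation point, a sketch assuming the validators are written with FluentValidation (CreateProjectCommandValidator is assumed to exist alongside the command):
using FluentValidation.TestHelper;
using Xunit;

public class CreateProjectCommandValidatorTests
{
    private readonly CreateProjectCommandValidator _validator = new();

    [Fact]
    public void Validate_EmptyName_ShouldHaveValidationError()
    {
        var command = new CreateProjectCommand { Name = "", Key = "TEST" };

        // TestValidate (FluentValidation.TestHelper) exposes per-property assertions.
        var result = _validator.TestValidate(command);

        result.ShouldHaveValidationErrorFor(x => x.Name);
    }
}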
Integration Tests
- Use Testcontainers for real databases (see the fixture sketch after this list)
- Test complete request/response flows
- Test database operations
- Test authentication/authorization
- Clean database between tests
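A hedged sketch of a database fixture using the Testcontainers.PostgreSql package (assuming PostgreSQL is the project's database; adjust the image and wiring to match WebApplicationFactoryBase):
using System.Threading.Tasks;
using Testcontainers.PostgreSql;
using Xunit;

// Starts a throwaway PostgreSQL container before the tests and disposes it after,
// so every run gets a real, isolated database with automatic cleanup.
public sealed class PostgresContainerFixture : IAsyncLifetime
{
    public PostgreSqlContainer Container { get; } = new PostgreSqlBuilder()
        .WithImage("postgres:16-alpine")
        .Build();

    public string ConnectionString => Container.GetConnectionString();

    public Task InitializeAsync() => Container.StartAsync();

    public Task DisposeAsync() => Container.DisposeAsync().AsTask();
}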
Performance Considerations
- Keep unit tests fast (< 100ms each)
- Integration tests can be slower (< 5s each)
- Use [Fact(Skip = "Reason")] for slow tests during development
- Run slow tests in CI only
Data Builders
Use builder pattern for complex test data:
public class ProjectBuilder
{
private string _name = "Test Project";
private string _key = "TEST";
public ProjectBuilder WithName(string name)
{
_name = name;
return this;
}
public ProjectBuilder WithKey(string key)
{
_key = key;
return this;
}
public Project Build()
{
return Project.Create(_name, "Description", _key, UserId.Create(Guid.NewGuid()));
}
}
// Usage
var project = new ProjectBuilder()
.WithName("Custom Project")
.WithKey("CUSTOM")
.Build();
Troubleshooting
Docker Not Running
Error: Unable to connect to Docker daemon
Solution: Start Docker Desktop and ensure it's fully initialized.
Port Conflicts
Error: Address already in use
Solution: Stop conflicting services or use different ports in docker-compose.yml.
Test Database Not Clean
Issue: Tests fail due to leftover data
Solution: Use CleanDatabaseAsync() in test setup or use Testcontainers (auto-cleanup).
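The real CleanDatabaseAsync() lives in IntegrationTestBase; a minimal sketch of what such a helper might look like, assuming EF Core 7+ and DbSets named after the aggregates (Tasks and Epics are assumptions; Projects appears in the examples above):
using Microsoft.EntityFrameworkCore;

// Hypothetical sketch - the actual helper is defined in IntegrationTestBase.
// ExecuteDeleteAsync (EF Core 7+) issues one DELETE statement per set without
// tracking entities; delete children before parents to satisfy foreign keys.
protected async Task CleanDatabaseAsync(ColaFlowDbContext dbContext)
{
    await dbContext.Tasks.ExecuteDeleteAsync();    // assumed DbSet
    await dbContext.Epics.ExecuteDeleteAsync();    // assumed DbSet
    await dbContext.Projects.ExecuteDeleteAsync();
}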
Slow Tests
Issue: Integration tests taking too long
Solutions:
- Use Testcontainers' shared fixture (reuse containers)
- Optimize database queries
- Use in-memory database for simple tests
- Run integration tests selectively
Flaky Tests
Issue: Tests pass/fail intermittently
Common causes:
- Race conditions (async/await issues)
- Time-dependent assertions
- External service dependencies
- Database transaction issues
Solutions:
- Use proper async/await
- Mock time-dependent code
- Use Testcontainers for isolation
- Ensure proper transaction handling
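For the "mock time-dependent code" point, one option on .NET 8+ is TimeProvider together with FakeTimeProvider from the Microsoft.Extensions.TimeProvider.Testing package; a hedged sketch:
using System;
using FluentAssertions;
using Microsoft.Extensions.Time.Testing; // package: Microsoft.Extensions.TimeProvider.Testing
using Xunit;

public class DeadlineTests
{
    [Fact]
    public void Deadline_IsOneWeekFromNow_Deterministically()
    {
        // Pin the clock to a known instant so the assertion never depends on wall time.
        var clock = new FakeTimeProvider(new DateTimeOffset(2025, 1, 1, 0, 0, 0, TimeSpan.Zero));

        // Code under test should accept a TimeProvider instead of calling DateTime.UtcNow.
        var deadline = clock.GetUtcNow().AddDays(7);

        deadline.Should().Be(new DateTimeOffset(2025, 1, 8, 0, 0, 0, TimeSpan.Zero));
    }
}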
Resources
- xUnit Documentation
- FluentAssertions Documentation
- Testcontainers Documentation
- Architecture Design
- Docker Setup
Support
For testing issues:
- Check this README
- Review test examples in this directory
- Consult architecture documentation
- Ask team for help
Last Updated: 2025-11-02
Maintained By: QA Team
Quality Standard: 80%+ Coverage, All Tests Green