Bloggo, Buildo's blog
Coding

From “It Works on My Machine” to Real Testing Confidence with Testcontainers

We've happily used Testcontainers on several projects, making our integration testing easier with real Docker-based services. We share our experience and show why it has become our go-to solution and why it should be yours too.

Alessio Pazzani
Full Stack Software Engineer
June 6, 2025
18 minutes read

As developers, project managers, CTOs, or IT professionals, we’ve all experienced the frustration of testing. We write perfect code, create comprehensive test suites, and everything works flawlessly in our local environment. Then, we deploy to a different environment, and suddenly:

“But it works on my machine!”

And yet, like magic, the code that ran beautifully on your laptop implodes the moment it hits a different environment. Database connections mysteriously fail, storage services randomly time out, message brokers refuse to cooperate… and your once “perfect” test suite morphs into a debugging nightmare.

Through long, painful experience, we can confirm that inconsistency issues between environments are often the cause of unexpected errors during testing.

The Root of the Problem: Testing Reality vs. Test Environments

While Docker provided a revolutionary step forward by enabling containerization, many teams still struggle with testing challenges because:

  1. Mocks and stubs create false realities: they simply don’t behave like real systems
  2. In-memory databases lack many real-world constraints and behaviors
  3. Manually configured test environments drift over time, creating inconsistency
  4. Environment-specific configurations are hard to track and reproduce

Docker already solves some of these problems by containerizing dependencies, creating consistent environments that can run anywhere. But Docker alone introduces a new set of challenges:

  • Developers need to become Dockerfile experts
  • Container lifecycle management becomes complex
  • Test data persists between test runs, leading to potential state pollution
  • Setting up specialized test cases requires custom container configurations

This is where Testcontainers comes into play, not to replace Docker, but to leverage its power while addressing these additional challenges.

Testcontainers: Docker’s Perfect Testing Companion

According to the official website, "Testcontainers is an open source library for providing throwaway, lightweight instances of databases, message brokers, web browsers, or just about anything that can run in a Docker container."

But that definition understates what makes Testcontainers truly special. While Docker provides the containerization foundation, Testcontainers adds the test-specific orchestration layer that transforms how we approach integration testing.

How Testcontainers Elevates Docker for Testing

Testcontainers doesn’t just use Docker containers: it creates a testing-specific abstraction around them that solves key challenges:

  1. Code-Driven Configuration: Instead of managing Dockerfiles and command-line parameters, everything is configured directly in your test code
  2. Ephemeral Test Environments: Containers are created and destroyed for each test or test suite, ensuring clean state
  3. Test-Specific Data Initialization: Load exactly the data needed for your test case via scripts, migrations, or seed data
  4. Automatic Lifecycle Management: No need to manually start/stop containers or handle cleanup
  5. Test-Focused Interface: Designed specifically for testing scenarios, not general containerization

Let's look at how this works in practice with a PostgreSQL example:

import { PostgreSQLContainer } from "testcontainers";

const postgresContainer = new PostgreSQLContainer()
  .withUsername("user")
  .withPassword("password")
  // This is where Testcontainers shines: loading test-specific schema
  .withInitScript("src/test/resources/init-test-db.sql");

beforeAll(async () => {
  await postgresContainer.start();
  // Container is ready with exact test data loaded
});

afterAll(async () => {
  await postgresContainer.stop();
  // All test data is automatically cleaned up
});

See the difference? With just a few lines of code, we've created a fully functioning PostgreSQL instance, pre-loaded with exactly the schema and data needed for our tests, all without needing to manage Docker directly.

The Testcontainers Lifecycle: Perfect Testing Environments on Demand

To understand why Testcontainers is such a powerful testing tool, let’s walk through its lifecycle:

1. Test Initialization

When your test begins, Testcontainers:

  • Identifies required dependencies
  • Pulls appropriate Docker images if not available locally
  • Creates a unique container just for your test

2. Container Configuration

Unlike manually managing Docker containers, Testcontainers allows you to:

  • Define environment variables, ports, volumes, and networks programmatically
  • Load test-specific initialization data (SQL scripts, seed data, etc.)
  • Set container parameters through a clean, fluent API
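This fluent style is plain method chaining: each `with…` method records some configuration and returns the builder itself. A toy builder of our own (not the library's actual classes) shows the shape:

```typescript
// Toy fluent builder in the style of Testcontainers' configuration API.
// Purely illustrative: these are not the real library classes.
class ContainerConfig {
  private env: Record<string, string> = {};
  private ports: number[] = [];

  withEnvironment(vars: Record<string, string>): this {
    this.env = { ...this.env, ...vars };
    return this; // returning `this` is what makes the chaining work
  }

  withExposedPorts(...ports: number[]): this {
    this.ports.push(...ports);
    return this;
  }

  build(): { env: Record<string, string>; ports: number[] } {
    return { env: { ...this.env }, ports: [...this.ports] };
  }
}

const config = new ContainerConfig()
  .withEnvironment({ POSTGRES_USER: "user" })
  .withExposedPorts(5432)
  .build();

console.log(config.ports); // [ 5432 ]
```

Returning `this` from every configuration method is the whole trick: calls can be stacked in any order, and the result reads like a declarative description of the container.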

3. Container Startup and Health Checks

Before your test runs, Testcontainers:

  • Starts the container with your configuration
  • Runs health checks to ensure services are ready
  • Provides connection details to your code
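Internally, a readiness check of this kind boils down to a polling loop: probe the service, back off, retry until a deadline. The sketch below is our own illustration of that idea, not the actual Testcontainers wait-strategy API:

```typescript
// Illustrative readiness poll: retry a check until it succeeds or we give up.
// The helper and its parameters are our sketch, not the Testcontainers API.
async function waitUntilReady(
  check: () => Promise<boolean>, // e.g. "can I open a TCP connection / run SELECT 1?"
  { retries = 10, delayMs = 100 }: { retries?: number; delayMs?: number } = {}
): Promise<number> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    if (await check()) return attempt; // ready: report how many probes it took
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`Service not ready after ${retries} attempts`);
}

// Example: a fake service that only becomes ready on the third probe
let probes = 0;
const fakeCheck = async () => ++probes >= 3;

waitUntilReady(fakeCheck, { delayMs: 1 }).then((attempts) =>
  console.log(`ready after ${attempts} attempts`)
);
```

Real Testcontainers wait strategies are richer (log patterns, health commands, HTTP probes), but they all reduce to this retry-until-ready contract, which is why your test code never has to `sleep` and hope.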

4. Test Execution Against Real Services

During test execution:

  • Your code interacts with actual implementations, not mocks
  • Test assertions run against real service behaviors
  • Dependencies behave exactly as they would in production

5. Thorough Cleanup

After the tests are complete:

  • Containers are automatically stopped and removed
  • All test data is purged
  • Resources are released, leaving no trace or state pollution
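The lifecycle above amounts to a single guarantee: however the test body ends, the dependency is stopped. The runnable sketch below captures that contract with a fake dependency so no Docker is needed; the `TestDependency` interface and `withDependency` helper are ours, purely illustrative:

```typescript
// A minimal stand-in for "anything that can be started and stopped".
interface TestDependency {
  start(): Promise<void>;
  stop(): Promise<void>;
}

// Run a test body against a dependency, guaranteeing cleanup even on failure:
// the same contract Testcontainers enforces for real Docker containers.
async function withDependency<T>(
  dep: TestDependency,
  body: () => Promise<T>
): Promise<T> {
  await dep.start();
  try {
    return await body();
  } finally {
    await dep.stop(); // runs whether the body resolved or threw
  }
}

// Fake "container" that records its lifecycle
const events: string[] = [];
const fake: TestDependency = {
  start: async () => { events.push("started"); },
  stop: async () => { events.push("stopped"); },
};

withDependency(fake, async () => { events.push("test ran"); }).then(() =>
  console.log(events.join(" -> "))
);
```

The `try/finally` is the important part: a failing assertion cannot leak a running container, which is exactly why state pollution between runs disappears.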

Our journey with Testcontainers revealed nuanced strategies for more intelligent testing. We didn't just adopt a new tool; we discovered a more precise way of thinking about integration testing infrastructures.

The real breakthrough came when we started treating our test environments not as simulacra of production, but as near-perfect replicas. Testcontainers enabled us to embed our testing infrastructure directly in our test code, resulting in a radical shift in how we approach test reliability.

The effective use of Testcontainers isn't about the tool itself, but about embracing a set of architectural principles that go beyond traditional containerization:

  • Dynamic Service Initialization: The ability to load specific schemas, seed data, or configuration scripts directly during container startup
  • Granular Test Environment Configuration: Programmatically defining exact runtime conditions for each specific test scenario
  • Reduced Infrastructure Boilerplate: Transforming complex setup logic into declarative, readable test code
  • Ephemeral, Context-Aware Test Dependencies: Creating services that are not just isolated, but intelligently configured for each test's unique requirements

These principles transformed our testing from a mechanical process into a flexible, contextual approach that provides genuine confidence in our software's behavior.

To implement these architectural principles in practice, it's crucial to understand the different levels of abstraction offered by Testcontainers. This enables us to select the most suitable approach for each testing scenario, striking a balance between flexibility and ease of use.

Understanding Generic vs. Technology-Specific Modules

Testcontainers offers two main ways to work with Docker containers:

  1. GenericContainer: The foundational class that can run any Docker image
  2. Technology-specific modules: Pre-configured implementations for common services (PostgreSQL, MySQL, Redis, etc.)

Let's see how this works in practice with PostgreSQL as an example:

// Basic approach with GenericContainer
const postgresContainer = new GenericContainer("postgres:14")
  .withEnvironment({
    "POSTGRES_USER": "user",
    "POSTGRES_PASSWORD": "password",
    "POSTGRES_DB": "testdb"
  })
  .withExposedPorts(5432)
  // Initialization script to run on first startup
  .withCopyFilesToContainer([
    { source: "src/test/resources/init-test-db.sql", target: "/docker-entrypoint-initdb.d/init.sql" }
  ]);

This generic approach provides complete control, but it requires an understanding of Docker-specific details, such as environment variables and initialization paths.

To simplify this process, Testcontainers provides technology-specific modules like PostgreSQLContainer that handle these details for you:

// Simplified approach with PostgreSQLContainer
const postgresContainer = new PostgreSQLContainer("postgres:14")
  .withUsername("user")
  .withPassword("password")
  .withDatabase("testdb")
  .withInitScript("src/test/resources/init-test-db.sql");

Both approaches produce equivalent results, but the specialized container dramatically reduces boilerplate code while automatically handling best practices.

Practical Testcontainers Strategies Worth Adopting

Our experience at Buildo with Testcontainers has led us to develop and refine several effective strategies for integration testing. Through continuous iteration and real-world application, we've identified approaches that consistently deliver robust and maintainable tests. Below, we'll share these battle-tested practices that have proven particularly valuable in our development workflow.

Test-Specific Initialization

One of Testcontainers' most powerful features is the ability to initialize services with test-specific data:

// Database with specific test schema and data
const dbContainer = new PostgreSQLContainer()
  .withInitScript("src/test/resources/init-test-db.sql");

// S3-compatible storage with predefined test files
const minioContainer = new GenericContainer("minio/minio:latest")
  .withExposedPorts(9000)
  .withCopyFilesToContainer([
    { source: "src/test/resources/test-files", target: "/data" }
  ]);

This best practice ensures your tests interact with services precisely configured for your specific test scenario – something that's much harder to achieve with Docker alone.

Reusable Container Configurations

Instead of repeating container setups across test files, create reusable configurations:

export function createTestDatabase(schema = "default-schema.sql") {
  return new PostgreSQLContainer()
    .withDatabase("testdb")
    .withUsername("testuser")
    .withPassword("testpass")
    .withInitScript(`src/test/resources/schemas/${schema}`);
}

// Later in tests
const dbContainer = createTestDatabase("user-service-schema.sql");

This approach ensures consistent configuration while allowing test-specific customization.

Container Networks for Testing Microservices

Testing microservices often requires multiple interdependent containers. Testcontainers makes this manageable:

// Create a network for the test environment
const network = await new Network().start();

// Create services on the same network
const dbContainer = await new PostgreSQLContainer()
  .withNetwork(network)
  .withNetworkAliases("database")
  .start();

const redisContainer = await new GenericContainer("redis:6")
  .withNetwork(network)
  .withNetworkAliases("cache")
  .start();

const appContainer = await new GenericContainer("my-service:latest")
  .withNetwork(network)
  .withEnvironment({
    "DB_HOST": "database",
    "REDIS_HOST": "cache"
  })
  .withExposedPorts(8080)
  .start();

// Now test the service via appContainer.getMappedPort(8080)

This common practice creates a realistic multi-service environment that mimics production topology – all managed programmatically through your test code.

Real Examples, Real Benefits

To understand how Testcontainers transforms integration testing in practice, let's walk through a complete example. We'll test a user service that handles database operations – exactly the kind of integration scenario where environment inconsistencies typically cause problems.

Step 1: Setting Up Test Dependencies

First, we establish the test dependencies and declare our container:

import { Client } from "pg";
import { PostgreSQLContainer } from "testcontainers";
import { UserService } from "../src/services/user-service";

let postgresContainer: PostgreSQLContainer;
let pgDbClient: Client;
let userService: UserService;

This setup is straightforward: we import our database client, the Testcontainers PostgreSQL container, and the service we want to test.

Step 2: Creating and Configuring the Test Environment

The magic happens in the test setup, where we create a real PostgreSQL instance:

beforeAll(async () => {
  // Create and start container with test-specific schema
  postgresContainer = new PostgreSQLContainer()
    .withInitScript("src/test/resources/user-service-schema.sql");

  await postgresContainer.start();

Here's what makes this powerful: the .withInitScript() method automatically loads our test schema when the container starts. No manual database setup, no shared test databases that might have stale data from previous runs.

Step 3: Connecting Your Service to Real Infrastructure

Once the container is running, we connect our service to the actual PostgreSQL instance:

  // Create a PostgreSQL client connected to our Testcontainer
  pgDbClient = new Client({
    host: postgresContainer.getHost(),
    port: postgresContainer.getMappedPort(5432),
    user: postgresContainer.getUsername(),
    password: postgresContainer.getPassword(),
    database: postgresContainer.getDatabase(),
  });
  await pgDbClient.connect();

  // Initialize our UserService with a real database connection
  userService = new UserService(pgDbClient);
});

Notice how Testcontainers provides all the connection details dynamically. The container might be running on port 32768 or 45231 – you don't need to know or care about this. The service connects to the port assigned by Docker, just like it would in production.
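One convenient way to consume these dynamic details is to fold them into a single connection URI. The helper below is a small utility of our own (not part of Testcontainers); with a real container you would feed it `getHost()`, `getMappedPort(5432)`, and friends:

```typescript
// Build a postgres:// URI from the dynamic connection details a started
// container exposes (host and mapped port are only known at runtime).
function buildPostgresUri(opts: {
  host: string;
  port: number;
  user: string;
  password: string;
  database: string;
}): string {
  const { host, port, user, password, database } = opts;
  const auth = `${encodeURIComponent(user)}:${encodeURIComponent(password)}`;
  return `postgres://${auth}@${host}:${port}/${database}`;
}

// With a real container: buildPostgresUri({ host: postgresContainer.getHost(), ... })
console.log(
  buildPostgresUri({
    host: "localhost",
    port: 32768, // whatever port Docker happened to assign
    user: "testuser",
    password: "testpass",
    database: "testdb",
  })
);
// postgres://testuser:testpass@localhost:32768/testdb
```

Because every value comes from the started container, the same test code works on a laptop, in CI, or anywhere else Docker runs.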

Step 4: Testing Against Real Behavior

Now comes the actual test, which runs against genuine PostgreSQL behavior:

test("should create and retrieve a user", async () => {
  // Create a user
  const userId = await userService.createUser({
    name: "Nicolas Flamel",
    email: "nicolas@flamel.com"
  });

  // Retrieve the user
  const user = await userService.getUserById(userId);

  // Verify correct data
  expect(user.name).toBe("Nicolas Flamel");
  expect(user.email).toBe("nicolas@flamel.com");
});

This test validates actual database transactions, constraint enforcement, data type handling, and all the nuances that an in-memory database or mock might miss.
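To make that gap concrete, here is a naive in-memory stand-in for the users table, entirely our own sketch. Assuming the real schema declares `UNIQUE(email)`, this fake cheerfully accepts a duplicate email that actual PostgreSQL would reject, which is precisely the false confidence mocks produce:

```typescript
// A naive in-memory "users table": no UNIQUE(email) constraint,
// unlike the real PostgreSQL schema it pretends to replace.
class InMemoryUserStore {
  private users: { id: number; name: string; email: string }[] = [];

  createUser(name: string, email: string): number {
    const id = this.users.length + 1;
    this.users.push({ id, name, email }); // silently accepts duplicates
    return id;
  }

  countByEmail(email: string): number {
    return this.users.filter((u) => u.email === email).length;
  }
}

const store = new InMemoryUserStore();
store.createUser("Nicolas Flamel", "nicolas@flamel.com");
store.createUser("Perenelle Flamel", "nicolas@flamel.com"); // real DB: unique violation

console.log(store.countByEmail("nicolas@flamel.com")); // 2: the fake never noticed
```

A test suite built on this fake would pass today and fail in production the first time two signups share an address; the containerized test surfaces the violation immediately.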

Step 5: Automatic Cleanup

Finally, everything is cleaned up automatically:

afterAll(async () => {
  await pgDbClient.end(); // Close database connection
  await postgresContainer.stop(); // Stop and remove the container
});

The container and all its data disappear completely, leaving no trace for subsequent test runs.

Beyond Databases: Testing Complete Systems

Testcontainers isn't limited to databases. You can test against any containerized service, enabling comprehensive integration testing across your entire technology stack.

Message Brokers: Testing Event-Driven Architecture

Testing against real message brokers like Kafka helps validate event serialization/deserialization, message ordering, and error handling scenarios that mocks simply can't reproduce:

const kafkaContainer = new KafkaContainer("confluentinc/cp-kafka:latest")
  .withNetwork(/* shared network */)
  .withExposedPorts(9092);

Search Engines: Testing Query Performance and Indexing

Real search engines like Elasticsearch allow you to test complex query behaviors, indexing strategies, and performance characteristics under various data loads:

const elasticsearchContainer = new ElasticsearchContainer("elasticsearch:7.14.0")
  .withStartupTimeout(/* startup timeout in milliseconds */)
  .withExposedPorts(9200);

Caching Systems: Testing Rate Limits and Edge Cases

Testing against real caching systems like Redis helps validate cache invalidation strategies, expiration policies, rate limiting behaviors, and performance characteristics that are difficult to simulate:

const redisContainer = new RedisContainer("redis:6")
  .withEnvironment({ /* redis configuration */ })
  .withExposedPorts(6379);
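Rate limiting is a good example of logic that sits on top of such a cache. The sketch below defines a fixed-window limiter over a pluggable counter store; the in-memory store makes it runnable anywhere, while an integration test would swap in a Redis-backed implementation (e.g. using `INCR`) pointed at the container above. The `CounterStore` interface and limiter are our illustration, not a library API:

```typescript
// Minimal fixed-window rate limiter over a pluggable counter store.
interface CounterStore {
  increment(key: string): Promise<number>; // returns the new count
}

class FixedWindowLimiter {
  constructor(private store: CounterStore, private limit: number) {}

  // true if the request is allowed within the current window
  async allow(clientId: string, windowStart: number): Promise<boolean> {
    const count = await this.store.increment(`${clientId}:${windowStart}`);
    return count <= this.limit;
  }
}

// In-memory stand-in; an integration test would use a Redis-backed
// implementation talking to the Testcontainers Redis instance.
class InMemoryCounter implements CounterStore {
  private counts = new Map<string, number>();
  async increment(key: string): Promise<number> {
    const next = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, next);
    return next;
  }
}

(async () => {
  const limiter = new FixedWindowLimiter(new InMemoryCounter(), 2);
  console.log(await limiter.allow("client-1", 0)); // true  (1st request)
  console.log(await limiter.allow("client-1", 0)); // true  (2nd request)
  console.log(await limiter.allow("client-1", 0)); // false (over the limit)
})();
```

The point of the interface boundary is that the same assertions run against both stores: the in-memory one for fast unit feedback, the real Redis container for behaviors (expiry, atomicity under concurrency) the fake cannot reproduce.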

Custom Containers: Creating Your Own Container

When standard images aren't enough, you can create custom containers by extending GenericContainer. This lets you test against your own proprietary services or specific configurations while maintaining the same authentic testing approach:

class MyCustomServiceContainer extends GenericContainer {
  constructor() {
    super(IMAGE);
  }

  public withCustomMethod(): this {
    return this;
  }

  public override async start(): Promise<StartedCustomContainer> {
    return new StartedCustomContainer(await super.start());
  }
}

Combining Services for System-Level Testing

The real power emerges when combining these containers to test complete system behaviors. Instead of mocking the interactions between your application, database, cache, and message broker, you can test the entire data flow:

// Start all services
await Promise.all([
  kafkaContainer.start(),
  elasticsearchContainer.start(),
  redisContainer.start(),
  postgresContainer.start(),
  myCustomServiceContainer.start()
]);

// Your application connects to all real services
// Test complex workflows like:
// 1. User registration triggers Kafka event
// 2. Event processor updates Elasticsearch index
// 3. Cache layer stores frequently accessed data
// 4. Database persists transactional data

This approach eliminates the gap between testing and reality, ensuring your integration tests accurately reflect how your system behaves when all components work together. You're no longer testing assumptions about how external services behave – you're testing against their actual implementations.

From "It Works On My Machine" to "It Works Everywhere"

Moving from “It works on my machine” to “It works everywhere” is not just a dream for software developers — it’s an essential step toward building reliable and scalable products.

Testcontainers is a paradigm shift: it encourages us to treat our tests as real, dynamic, and isolated environments, tailored to the specific needs of each scenario.

By embracing this philosophy, we transform infrastructure complexity into readable and maintainable code, eliminating hidden errors and environment-related instability.

In an increasingly distributed and complex software world, Testcontainers is the key to restoring confidence and speed to the development process, turning our tests into a proper foundation for product success.

Alessio Pazzani
Full Stack Software Engineer

Alessio is a full-stack Software Engineer at Buildo. He's interested in efficient development practices, continuously exploring new tools and methodologies to enhance software development. His primary focus lies in frontend development, crafting visually refined user interfaces.
