Asaduzzaman Pavel

Notes on Testing: Why I Prefer Testcontainers Over Mocks

I’ve wasted entire Fridays writing "perfect" mocks for my database layer. I’d spend hours defining exactly what GetByID should return, only to have the app crash in production because of a missing comma in my SQL or a misunderstood Postgres constraint. That’s the problem: mocks don't test your code, they test your assumptions about your code. And usually, your assumptions are wrong.

...And that's why I've mostly moved to Testcontainers. I’d rather wait ten seconds for a real Docker container to spin up than spend ten minutes faking a database behavior that I’m only 80% sure about anyway.

Mocks are just lies we tell ourselves

When you mock a database repository, you're essentially saying: "When I call GetByID, return this static struct." It’s fast, sure, but it completely ignores reality. It doesn't catch syntax errors, it doesn't catch unique constraint violations, and it definitely doesn't catch the weird way your specific version of Postgres handles JSONB columns.

I’ve seen too many projects where the tests were 100% green but the application was broken at its core because the mocks didn't account for a simple foreign key constraint. With a real container, the database does that work for you.
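To make that concrete, here's a sketch of the kind of hand-rolled mock I mean (the type and method names are made up for illustration): it "inserts" the same email twice without complaint, while a real Postgres UNIQUE constraint would reject the second row with SQLSTATE 23505.

```go
package main

import "fmt"

// mockUserRepo is a hypothetical hand-rolled mock of a user repository.
// It remembers whatever you hand it and enforces none of the database's rules.
type mockUserRepo struct {
	emails []string
}

// Create always "succeeds". A real Postgres UNIQUE constraint on the
// email column would reject a duplicate with SQLSTATE 23505; the mock
// never will, so a green test here proves very little.
func (m *mockUserRepo) Create(email string) error {
	m.emails = append(m.emails, email)
	return nil
}

func main() {
	repo := &mockUserRepo{}
	_ = repo.Create("test@example.com")
	err := repo.Create("test@example.com") // the exact same email again
	fmt.Println(err)                       // prints <nil>; the mock never objects
}
```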

The CI Headache (The Real Gripe)

Look, I love Testcontainers, but setting them up in a CI environment like GitHub Actions is an absolute pain in the neck. You end up down a rabbit hole of Docker-in-Docker (DinD) configurations, permission issues, and mounting /var/run/docker.sock into a runner that really doesn't want you to have that much power.

There’s always that one morning where the CI pipeline just hangs indefinitely because the runner ran out of disk space while trying to pull the postgres:16-alpine image for the hundredth time. It’s a trade-off. I’m trading "clean" CI for "reliable" code, but don't let anyone tell you it's a "seamless" setup. It’s a battle.
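For what it's worth, the minimal happy path on GitHub's hosted runners is short, because ubuntu-latest ships with a Docker daemon; the DinD and docker.sock misery mostly starts when your runner itself is a container. A sketch of the simple case (workflow name, Go version, and timeout are just placeholders):

```yaml
name: test
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest # hosted runners come with Docker preinstalled
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: "1.22"
      # Testcontainers talks to the local Docker daemon; no DinD needed here
      - run: go test ./... -timeout 10m
```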

How I actually use it in Go

I don't restart the container for every single test. That would be insane. I spin up one instance of Postgres at the start of the test suite, and then I use a fresh database or a schema migration for each test.

import (
    "context"
    "database/sql"
    "testing"
    "time"

    _ "github.com/lib/pq" // registers the "postgres" driver for database/sql
    "github.com/stretchr/testify/assert"
    "github.com/testcontainers/testcontainers-go"
    "github.com/testcontainers/testcontainers-go/modules/postgres"
    "github.com/testcontainers/testcontainers-go/wait"
    // plus your own repository and domain packages
)

func TestRepository_CreateUser(t *testing.T) {
    ctx := context.Background()

    // Real Postgres 16 container
    pgContainer, err := postgres.RunContainer(ctx,
        testcontainers.WithImage("postgres:16-alpine"),
        testcontainers.WithWaitStrategy(
            wait.ForLog("database system is ready to accept connections").
                WithOccurrence(2).
                WithStartupTimeout(5*time.Second),
        ),
    )
    if err != nil {
        t.Fatal(err)
    }
    defer pgContainer.Terminate(ctx)

    // Now we test against a REAL database
    connStr, err := pgContainer.ConnectionString(ctx, "sslmode=disable")
    if err != nil {
        t.Fatal(err)
    }
    db, err := sql.Open("postgres", connStr)
    if err != nil {
        t.Fatal(err)
    }
    defer db.Close()

    repo := repository.NewUserRepository(db)

    // This will ACTUALLY fail if my SQL is broken
    err = repo.Create(ctx, &domain.User{Email: "test@example.com"})
    assert.NoError(t, err)
}
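The one-container-per-suite setup I mentioned above looks roughly like this. Treat it as a sketch, not a drop-in: the `freshSchema` helper is hypothetical, schema-per-test is only one of several isolation options, and it assumes a Docker daemon is available when the suite runs.

```go
package repository_test

import (
	"context"
	"database/sql"
	"fmt"
	"log"
	"os"
	"testing"
	"time"

	_ "github.com/lib/pq"
	"github.com/testcontainers/testcontainers-go"
	"github.com/testcontainers/testcontainers-go/modules/postgres"
	"github.com/testcontainers/testcontainers-go/wait"
)

var db *sql.DB // shared by every test in this package

// TestMain spins up ONE Postgres container for the whole suite,
// instead of paying the startup cost in every single test.
func TestMain(m *testing.M) {
	ctx := context.Background()

	pgContainer, err := postgres.RunContainer(ctx,
		testcontainers.WithImage("postgres:16-alpine"),
		testcontainers.WithWaitStrategy(
			wait.ForLog("database system is ready to accept connections").
				WithOccurrence(2).
				WithStartupTimeout(30*time.Second),
		),
	)
	if err != nil {
		log.Fatalf("starting postgres: %v", err)
	}

	connStr, err := pgContainer.ConnectionString(ctx, "sslmode=disable")
	if err != nil {
		log.Fatalf("connection string: %v", err)
	}
	if db, err = sql.Open("postgres", connStr); err != nil {
		log.Fatalf("opening db: %v", err)
	}

	code := m.Run()

	_ = pgContainer.Terminate(ctx) // tear down the single container
	os.Exit(code)
}

// freshSchema gives each test an isolated namespace in the shared
// database, so tests can run without stepping on each other.
func freshSchema(t *testing.T, name string) {
	t.Helper()
	if _, err := db.Exec(fmt.Sprintf("CREATE SCHEMA %s", name)); err != nil {
		t.Fatal(err)
	}
	t.Cleanup(func() {
		db.Exec(fmt.Sprintf("DROP SCHEMA %s CASCADE", name))
	})
}
```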

Confidence Over Speed

Yes, it’s slower. But I’d rather have a test suite that takes two minutes and actually tells me the truth than a suite that takes two seconds and lies to my face. Since I hand-craft my SQL, I need to know that my queries are actually valid. Testcontainers is the only way I've found to get that confidence without deploying to a staging environment.

It’s not perfect. The local Docker-on-Mac/Windows slowness is real, and the CI setup is a constant fight, but I’m never going back to heavy mocking for my data layer. It’s just not worth the risk of a 3 AM production page.


About the Author

Asaduzzaman Pavel is a Software Engineer who actually enjoys the friction of a well-architected system. He has over 15 years of experience building high-performance backends and infrastructure that can actually handle the real-world chaos of scale.

Currently looking for new opportunities to build something amazing.