Running Database Integration Tests in Go

I’m working through writing a Go app backed by a PostgreSQL database (using sqlx). Whenever I’m writing code that interacts with a database, I want to make sure that I’m testing against the real thing — especially when using plain old SQL statements where dialects can differ between databases.

One thing I do in PHP apps is set up schemas out of band (e.g., before tests start), then wrap each test in a transaction that is rolled back at the end of the test. This makes each individual test pretty quick and gives us a database that’s always in a clean (or at least known) state. This post talks about how to do that in Go.

Setting up the Schema Before Tests Start

The idea here is to run postgres (or whatever) in docker compose. Before tests start, drop the test database then re-create it. Then create the schemas on that database (via migrations or other tools).

Docker Compose File

services:
  postgres:
    image: postgres:14.4
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=app
      - PGDATA=/var/lib/postgresql/data/mnt/pg
    volumes:
      - ./var/postgres:/var/lib/postgresql/data/mnt
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U app"]
      retries: 1

Setting Up the Database

See also: how to use health checks to wait for dev environment services. That’s what wait_for_services is doing below.

./wait_for_services

docker compose exec postgres psql app app -c 'DROP DATABASE IF EXISTS app_test'
docker compose exec postgres psql app app -c 'CREATE DATABASE app_test'

# run migrations against the test database
migrate -source file://migrations -database 'postgres://app:app@localhost/app_test?sslmode=disable' up
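The linked post covers health checks in detail, but as a rough sketch: wait_for_services can be a small retry loop around each service’s health command. The wait_healthy helper and the pg_isready invocation below are assumptions based on the compose file above, not the actual script.

```shell
#!/usr/bin/env bash
# Sketch of wait_for_services: retry a health command until it succeeds
# or we run out of attempts.

wait_healthy() {
  local attempts="$1"
  shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
  done
  echo "service never became healthy: $*" >&2
  return 1
}

# usage, commented out so the sketch stands alone:
# wait_healthy 30 docker compose exec -T postgres pg_isready -U app
```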

The Database Interface

In order to make this work, our code needs to not care if it’s talking to a database connection pool, a transaction, or something else. To accomplish that, an interface:

type Database interface {
	GetContext(ctx context.Context, dest interface{}, query string, args ...interface{}) error

	SelectContext(ctx context.Context, dest interface{}, query string, args ...interface{}) error

	QueryContext(ctx context.Context, query string, args ...interface{}) (*sql.Rows, error)

	QueryRowContext(ctx context.Context, query string, args ...interface{}) *sql.Row

	ExecContext(ctx context.Context, query string, args ...interface{}) (sql.Result, error)
}

This is a collection of methods we want to use from sql.DB and sqlx.DB, and they all also happen to exist on sqlx.Tx. A compile-time assertion like var _ Database = (*sqlx.Tx)(nil) will catch any drift.

All code that interacts with the database should use this interface rather than depend on the concrete types from sqlx or sql.

The Database Test Case

Go doesn’t have the xUnit-style setUp and tearDown methods, but one can mimic them with helper functions and t.Cleanup(...).

I like to use a struct named {Thing}TestCase for this, so DBTestCase in this case:

import (
	"context"
	"testing"

	"github.com/jmoiron/sqlx"
	_ "github.com/lib/pq"
)

type DBTestCase struct {
	Database Database
	TX       *sqlx.Tx
	Context  context.Context
}

func StartDBTestCase(t *testing.T) *DBTestCase {
	t.Helper()

	// may want to pull this DB URL from the environment or something
	db, err := sqlx.Connect("postgres", "postgres://app:app@localhost:5432/app_test?sslmode=disable")
	if err != nil {
		t.Fatalf("failed to connect to database: %v", err)
	}

	// only using this connection for a single test case so only need a max of one connection
	db.SetMaxIdleConns(1)
	db.SetMaxOpenConns(1)

	tx, err := db.Beginx()
	if err != nil {
		t.Fatalf("failed to start transaction: %v", err)
	}

	// when the test is done, rollback then close
	t.Cleanup(func() {
		tx.Rollback()
		db.Close()
	})

	return &DBTestCase{
		Database: tx,
		TX:       tx,
		Context:  context.TODO(),
	}
}

Nothing too crazy here: we connect to the database, using t.Fatalf(...) to bail if the connection fails, then start the transaction and use the transaction itself as the Database implementation. I’m including a context.Context here just so test cases don’t have to call context.TODO() everywhere.

The only other thing to note here is t.Cleanup(...) which rolls the transaction back then closes the connection after the test is complete.

One could probably improve this by sharing a connection across all tests and starting a transaction for each, but the overhead of managing that across tests is probably not worth it. This is easy enough to understand and quick enough (until it’s not, then optimize).

Using the Test Case

Here’s a struct that uses the database interface.

type PostgresThingStorage struct {
	db Database
}

// etc

To test it:

func TestPostgresThingStorage_Example(t *testing.T) {
	tc := StartDBTestCase(t)
	thingStorage := &PostgresThingStorage{db: tc.Database}

	// test things, etc
}

Everything in the test will be rolled back when the test is done, leaving the database in a known state for other tests.

Splitting Out Database Tests with Build Tags

In order to make sure that the DB is set up before these tests run, it can be useful to hide them behind a build tag (in the thing_db_test.go file):

//go:build db

package thing_test

// ...

func TestPostgresThingStorage_Example(t *testing.T) {
	tc := StartDBTestCase(t)
	thingStorage := &PostgresThingStorage{db: tc.Database}

	// ...
}

Then run those tests after the DB setup:

#!/usr/bin/env bash

./wait_for_services

docker compose exec postgres psql app app -c 'DROP DATABASE IF EXISTS app_test'
docker compose exec postgres psql app app -c 'CREATE DATABASE app_test'

# run migrations against the test database
migrate -source file://migrations -database 'postgres://app:app@localhost/app_test?sslmode=disable' up

exec go test ./... -tags=db

A Warning for MySQL Users

MySQL has some statements that cause an implicit commit, namely DDL statements (CREATE/DROP/ALTER TABLE, etc.). Beware of the approach above if tests include DDL on MySQL.
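For instance, a sequence like this on MySQL will not leave the database clean, because the CREATE TABLE commits the transaction before the rollback ever runs (the table name is illustrative):

START TRANSACTION;
CREATE TABLE widgets (id INT);  -- implicit COMMIT happens here on MySQL
ROLLBACK;                       -- too late: widgets still exists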