Using Hasura GraphQL Engine with a CI/CD system

Applications using Postgres and Hasura GraphQL Engine (HGE) might need to run them in a CI/CD environment to conduct tests. Let’s look at a general workflow for getting Postgres and HGE running with some test data already in place.

Using Hasura with GitLab, Jenkins, Circle CI, Travis CI, Drone CI

Postgres

The first step is to run Postgres (≥9.5), since HGE depends on it. The easiest way is to spin up a Docker container, if your CI/CD platform supports it.

docker run -d -p 5432:5432 postgres

This will start Postgres with a default database called postgres. The database url will be postgres://postgres:@localhost:5432/postgres.

To wait until Postgres starts running, we can use a simple bash function using netcat:
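Here is a sketch of such a helper, assuming netcat (`nc`) is available on the runner; the retry budget and messages are illustrative, not part of the original:

```shell
# Sketch of a wait_for_port helper using netcat (nc).
# Polls once a second until the port accepts TCP connections,
# giving up after a retry budget (default 30 attempts).
wait_for_port() {
    local port="$1"
    local retries="${2:-30}"
    echo "waiting for port $port to be ready..."
    while ! nc -z localhost "$port" >/dev/null 2>&1; do
        retries=$((retries - 1))
        if [ "$retries" -le 0 ]; then
            echo "timed out waiting for port $port" >&2
            return 1
        fi
        sleep 1
    done
    echo "port $port is ready"
}
```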

So, after starting Postgres, we wait for it to be ready:

wait_for_port 5432

If you have Postgres running elsewhere, you can use that database url directly.

GraphQL Engine

HGE is released as a Docker container. Set the database url as an environment variable and run the container:

docker run -d -p 8080:8080 \
  -e HASURA_GRAPHQL_DATABASE_URL=postgres://postgres:@localhost:5432/postgres \
  hasura/graphql-engine

Wait for port 8080 to be ready:

wait_for_port 8080

The GraphQL Engine endpoint will be http://localhost:8080/v1alpha1/graphql.

Migrations

By default, when you run HGE, you can use the console served at the /console endpoint to make changes to your schema. However, if you have multiple environments, or you are adding GraphQL to an existing database and want to replicate its schema, you can make use of the Migrations feature.

Hasura GraphQL Engine comes with powerful Rails-inspired migration tooling to help you keep track of the changes you make to your schema. As you use the Hasura console from the CLI, the CLI will spit out migration files for the changes you make, which you can put in version control and even edit manually.

Schema

You can follow the guides on the Migrations docs page to set up migrations for your existing or new project.

Fixtures

If you need to add sample test data, you can add it as SQL files along with your migrations.

While preparing your migration files (not on the CI/CD system), create a new migration:

hasura migrate create sample_data

This will create files of the form <version>_sample_data.{up|down}.{sql|yaml}. Edit <version>_sample_data.up.sql and add SQL statements to insert the required data. Remove the other generated files and commit the migration.
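For illustration, the up migration might be populated like this. The version number, file path, and the users table are placeholders, not part of the original; your generated filename and schema will differ:

```shell
# Hypothetical example: append seed INSERTs to the generated up migration.
# 1550000000000 and the users table are placeholders -- substitute the
# version and tables your project actually has.
mkdir -p migrations
cat >> migrations/1550000000000_sample_data.up.sql <<'SQL'
INSERT INTO users (name, email) VALUES
  ('Alice', 'alice@example.com'),
  ('Bob', 'bob@example.com');
SQL
```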

Applying migrations during the CI step

Once you have your migrations, get them onto your CI/CD system, say using git clone, and apply them to the HGE instance we’re running here.

First, install the Hasura CLI:

curl -L https://cli.hasura.io/install/linux-amd64 -o hasura
chmod +x hasura

Let’s say the migrations and config.yaml are in a directory called hge; we can apply the migrations by executing the following commands:

cd hge
HASURA_GRAPHQL_ENDPOINT=http://localhost:8080 hasura migrate apply

This will restore the schema on the database.

Metadata

If you’re using another migration system like knex, you might be using Hasura with migration mode turned off [docs here]. In that case, you’ll only need to apply the HGE metadata:

cd hge
HASURA_GRAPHQL_ENDPOINT=http://localhost:8080 hasura metadata apply

Metadata is already included in the Hasura migrations if you’re using them.

Your application

Once the migrations/metadata are applied, you can connect your application to HGE and run your tests. For example, to run npm tests:

HASURA_GRAPHQL_ENGINE_ENDPOINT=http://localhost:8080/v1alpha1/graphql npm run test

This assumes your application is configured to take the endpoint from the environment variable HASURA_GRAPHQL_ENGINE_ENDPOINT.
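Putting the steps above together, a CI job script might look like the following sketch. It assumes Docker is available on the runner, the wait_for_port helper from earlier is defined, migrations and config.yaml live in ./hge, and your package.json defines a test script; adapt the paths and syntax to your CI system:

```shell
#!/usr/bin/env bash
# Sketch of a complete CI job for running tests against Postgres + HGE.
# Assumes Docker on the runner and a wait_for_port helper (defined earlier).
set -euo pipefail

# Start Postgres and HGE
docker run -d -p 5432:5432 postgres
wait_for_port 5432

docker run -d -p 8080:8080 \
  -e HASURA_GRAPHQL_DATABASE_URL=postgres://postgres:@localhost:5432/postgres \
  hasura/graphql-engine
wait_for_port 8080

# Install the Hasura CLI
curl -L https://cli.hasura.io/install/linux-amd64 -o hasura
chmod +x hasura

# Apply migrations (and metadata) from the hge directory
(cd hge && HASURA_GRAPHQL_ENDPOINT=http://localhost:8080 ../hasura migrate apply)

# Run the application's tests against the GraphQL endpoint
HASURA_GRAPHQL_ENGINE_ENDPOINT=http://localhost:8080/v1alpha1/graphql npm run test
```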

Example applications

Here are some projects that run HGE in their tests:


Hasura is an open-source engine that gives you realtime GraphQL APIs on new or existing Postgres databases, with built-in support for stitching custom GraphQL APIs and triggering web hooks on database changes.


Shahidh K Muhammed


Design Engineer by training, Polyglot (machine & human) by day, Cook by night, #GraphQL #Kubernetes #Biriyani
