A basic reference for APIs in Spark on Hasura

This post is intended to serve as a basic guide to deploying Spark-based APIs on Hasura.

By the end of this post, our codebase will have the following features: 
1. API endpoint that handles a JSON request and responds in JSON 
2. API endpoint that makes an external API request (useful for, say, push notifications) 
3. API endpoint that makes a HasuraDB request using the Hasura SDK 
4. A Dockerized API server

Setup Spark

Instructions on setting up Spark with IntelliJ

Add the following dependency to the pom.xml file of your project.    
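The dependency block itself appears to be missing from this post. The standard Maven coordinates for Spark are shown below; the version number is an assumption, so check for the latest release.

```xml
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.7.2</version>
</dependency>
```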


Then add the following import to your Java file.

import static spark.Spark.*;

Making a JSON API endpoint

For JSON handling, we use Gson. 
Instructions on using Gson

Add the following dependency to the pom.xml file of your project.    
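The dependency block appears to be missing here as well. The standard Maven coordinates for Gson are shown below; the version number is an assumption, so check for the latest release.

```xml
<dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>2.8.9</version>
</dependency>
```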


Then add the following import to your Java file.

import com.google.gson.Gson;

Object to JSON:

Gson gson = new Gson();  
String countryInJSON = gson.toJson(country);    // country is an instance of the Country class.

JSON to Object :

Suppose we have a JSON string that needs to be converted back into a Country object.

Gson gson = new Gson();  
Country countryObj = gson.fromJson(countryInJSON, Country.class);

Find the sample code for the JSON API endpoint.
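In place of the linked sample, here is a minimal sketch of a JSON endpoint in Spark. The Country class and its fields, and the /country route, are hypothetical placeholders.

```java
import static spark.Spark.*;

import com.google.gson.Gson;

public class JsonApi {

    // Hypothetical payload class; Gson maps JSON keys to these fields by name.
    static class Country {
        String name;
        String capital;
    }

    public static void main(String[] args) {
        Gson gson = new Gson();

        // POST /country: parse the JSON request body into a Country object,
        // then serialize it back as the JSON response.
        post("/country", (request, response) -> {
            Country country = gson.fromJson(request.body(), Country.class);
            response.type("application/json");
            return gson.toJson(country);
        });
    }
}
```

By default, Spark starts the embedded server on port 4567, so this endpoint would be reachable at http://localhost:4567/country.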

Making an API endpoint that queries an external API

Instructions on using okhttp

Add the following dependency to your pom.xml file.    
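The dependency block appears to be missing here too. The standard Maven coordinates for OkHttp 3 are shown below; the version number is an assumption, so check for the latest release.

```xml
<dependency>
    <groupId>com.squareup.okhttp3</groupId>
    <artifactId>okhttp</artifactId>
    <version>3.14.9</version>
</dependency>
```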


Then add the following imports to the respective Java file.

import java.io.IOException;  
import okhttp3.OkHttpClient;  
import okhttp3.Request;  
import okhttp3.Response;

Refer to this code.

In the code:

Calling run(url) on a GetExample object returns the body of the response; if the URL is invalid or the request fails, it throws an exception.
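The referenced code follows OkHttp's standard synchronous GET pattern; a sketch along those lines (the URL in main is a hypothetical placeholder):

```java
import java.io.IOException;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class GetExample {
    private final OkHttpClient client = new OkHttpClient();

    // Executes a GET request and returns the response body as a string.
    // Throws IOException if the request cannot be executed.
    String run(String url) throws IOException {
        Request request = new Request.Builder()
                .url(url)
                .build();
        try (Response response = client.newCall(request).execute()) {
            return response.body().string();
        }
    }

    public static void main(String[] args) throws IOException {
        GetExample example = new GetExample();
        // Hypothetical URL; substitute the external API you want to call.
        String body = example.run("https://api.example.com/notify");
        System.out.println(body);
    }
}
```

Note that Request.Builder.url() throws an IllegalArgumentException for a malformed URL, while a failed request surfaces as an IOException.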

Making an API endpoint that uses HasuraDB

Instructions on using HasuraDB SDK

Include the dependencies of bass-sdk-java in your build.gradle (or any other build tool). Also add a hasura.java file to your project. Instructions for this can be found here

Customize the hasura.properties with the respective values.

  • url : https://data.your-project-name.hasura-app.io
  • adminAPIKey : You can find it after logging in at https://console.your-project-name.hasura-app.io
  • package : Your ‘db’ package where you want all the table classes.
  • dbprefix : /v1/query
  • dir : Path of your ‘db’ directory.

Run the ‘generate’ task. This will create all the table classes according to the schema.

  • To run the ‘generate’ task, go to the build.gradle file.
  • Select the code block of the task.
  • Right-click and then build it.

Refer to this sample code, which makes insert and select queries.

Dockerizing your Spark server

Instructions on using Docker build

Create a jar file of your Java project. In the IntelliJ IDE, you can create a jar file as follows:

  • Select File > Project Structure > Artifacts > (click ‘+’) > JAR > From Modules with dependencies.
  • Set the main class to the file where your main class lies and click ‘OK’.
  • Now select Build > Build Artifacts… > Click on build.
  • This will create a directory with the jar file in the src/out/artifacts directory.

Write the Dockerfile; you can find a sample one here.
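If you need a starting point, here is a minimal sketch. The base image, jar path, and jar name are assumptions; adjust them to match your project's artifact.

```dockerfile
# Base image with a Java runtime; the tag is an assumption, pick one matching your JDK.
FROM openjdk:8-jre-alpine

# Copy the jar built by IntelliJ; adjust the path and name to your artifact.
COPY out/artifacts/myapp_jar/myapp.jar /app/myapp.jar

# Spark listens on port 4567 by default.
EXPOSE 4567

CMD ["java", "-jar", "/app/myapp.jar"]
```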

Go to the directory containing the dockerfile and build your docker image.

docker build -t <image-name> .

Run your docker image.
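For example, assuming your Spark app listens on the default port 4567:

```shell
# Map Spark's default port 4567 in the container to the same port on the host.
docker run -p 4567:4567 <image-name>
```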

Hasura is an open-source engine that gives you realtime GraphQL APIs on new or existing Postgres databases, with built-in support for stitching custom GraphQL APIs and triggering webhooks on database changes.


