Creating Protobuf/JSON Services

In a previous article we showed how to create a service with a simple GET and PUT interface for easily importing and exporting large amounts of data. These are useful for data integration purposes, but not for building web front-ends. What if we want to build a web application that talks to a LogicBlox workspace using a JSON protocol over HTTP?

That’s where LogicBlox protocol buffer/JSON services come in.

We’ll start with the end result of this tutorial: a service callable via a simple curl call or web page to add a new “store” to our workspace.

Here’s how to call the service via curl:

$ curl -X POST -H "Content-Type: application/json" \
               -H "Accept: application/json" \
               -d '{"name": "My Store", "city": "My City"}' \
               http://localhost:8080/stores/add
{"message": "Added"}

We’ll build our service in 4 steps:

  1. Define the service request and response formats
  2. Define the service logic
  3. Map the service to a URL
  4. Test the service

As usual, the full source code to this article is available from Bitbucket.

Define service request and response formats

LogicBlox uses Google’s Protocol Buffers format to define the request and response formats for web services. Protocol buffers (or simply protobufs for short) consists of a simple data description language and a compact binary encoding of these messages. The latter isn’t very useful for our use case (although protobufs over HTTP are supported) — in practice most LogicBlox services are called with JSON encodings of protobufs (at least when building web apps). To write logic against these messages there’s also a mapping from protobuf to LogiQL predicates, which we’ll be using later on.

Let’s start with defining a protobuf protocol for our add store service. For this we’ll create a proto/stores.proto file:

package stores;

message AddStoreRequest {
    required string name = 1;
    required string city = 2;
}

message AddStoreResponse {
    required string message = 1;
}
The first line declares these messages to be in the stores package, which is used for namespacing purposes. For each service we have to define (at least) two “messages” (think of a message as a class/struct-like data structure): one describing the format of the request and one for the response. More complicated services usually require additional auxiliary messages.

A message has a name and a set of attributes. Each attribute is declared as either optional, required or repeated (we'll get to repeated in a future article; the meaning of optional and required should be self-explanatory), and in addition has a type, a name and an index. The index defines the order in which attributes appear in the binary encoding and makes it possible to add attributes later without breaking backwards compatibility. These concerns only matter when actually using the binary protobuf encoding, which we won't. So all you have to remember is: give each attribute an index, for sanity's sake in ascending order.
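One practical consequence: the JSON encoding of a protobuf message uses the field names as object keys; the indices only matter on the binary wire. Here's a small Python sketch of what the web server conceptually does when checking a JSON-encoded AddStoreRequest against the required fields (the `REQUIRED_FIELDS` table and `validate_request` helper are illustrative, not part of LogicBlox):

```python
import json

# Required fields of stores.AddStoreRequest, mirroring the .proto definition.
REQUIRED_FIELDS = {"name", "city"}

def validate_request(payload):
    """Parse a JSON-encoded AddStoreRequest and check its required fields."""
    msg = json.loads(payload)
    missing = REQUIRED_FIELDS - msg.keys()
    if missing:
        raise ValueError("missing required fields: %s" % ", ".join(sorted(missing)))
    return msg

msg = validate_request('{"name": "My Store", "city": "My City"}')
# Note: field names (not indices) appear as the JSON keys.
```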

For these protobuf definitions to be used inside of LogicBlox we have to add them to a project, so we add the following entry in our .project file:

proto/stores.proto, proto, descName=stores lifetime=transaction

The first part is the relative path to the .proto file, the second signifies that this is a protocol buffer, and the last sets some options for the protobuf-to-LogiQL compiler, telling it to put the resulting message predicates in the stores namespace and give them a transaction lifetime (that is, to make them pulse predicates).

To compile the protocol buffers run:

$ lb config
$ make

We can now find an interesting file buried deeply in our build directory: build/sepcom/project/proto-gen/stores.module/stores.logic. This file contains the result of compiling our .proto file to LogiQL. The interesting part of the file is the following:

AddStoreRequest(x) -> .
AddStoreRequestConstructor[i] = x -> int(i), AddStoreRequest(x).
AddStoreRequest_name[x] = y -> AddStoreRequest(x), string(y).
AddStoreRequest_city[x] = y -> AddStoreRequest(x), string(y).

AddStoreResponse(x) -> .
AddStoreResponseConstructor[i] = x -> int(i), AddStoreResponse(x).
AddStoreResponse_message[x] = y -> AddStoreResponse(x), string(y).

These declarations define a predicate version of the protocol buffer messages we defined:

  • An AddStoreRequest entity and constructor for the AddStoreRequest message
  • An AddStoreRequest_name functional predicate for the name attribute
  • An AddStoreRequest_city functional predicate for the city attribute
  • An AddStoreResponse entity and constructor for the AddStoreResponse message
  • An AddStoreResponse_message functional predicate for the message attribute of our response message

Pretty obvious, right?

So, let’s see what we can do with these predicates.

Service logic

Here’s what happens when an HTTP request for a protobuf/JSON web service comes in:

  1. A new transaction is started.
  2. The JSON (or protocol buffer encoded) message in the body of the request is translated into delta updates that pulse the predicates generated from the protobuf specification. Note that since we specified these predicates have transaction lifetime they are specific to the transaction and are not persisted after the transaction ends.
  3. The LogicBlox workspace (presumably) has rules that do something with the pulsed request predicate data and eventually pulses predicates generated for the protobuf messages that represent the response message.
  4. The pulsed response predicate data is translated into a protobuf message (or JSON object) and sent back to the client.
  5. The transaction ends/commits.

Naturally, if any constraints are violated during the handling of the request the transaction is rolled back to keep the workspace consistent.
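The transactional behavior can be sketched in Python. A real LogicBlox transaction is managed by the database, but the observable effect is as if all updates were applied to a copy of the workspace that is only adopted on success (the workspace layout, the `handle_request` helper, and the example constraint here are all made up for illustration):

```python
import copy

class ConstraintViolation(Exception):
    pass

def handle_request(workspace, request):
    """Apply a request inside a 'transaction': work on a copy, commit on success."""
    txn = copy.deepcopy(workspace)          # start of transaction
    name, city = request["name"], request["city"]
    if not name:                            # an example constraint
        raise ConstraintViolation("store name must be non-empty")
    txn.setdefault("stores", {})[name] = city
    response = {"message": "Added"}
    return txn, response                    # commit: caller adopts txn

ws = {"stores": {}}
try:
    ws, resp = handle_request(ws, {"name": "", "city": "My City"})
except ConstraintViolation:
    pass                                    # rollback: ws is unchanged
```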

So let’s see how we can implement our add store service using this model. Like any type of logic, web service logic is defined in a .logic file, so we create a file named services/add_store.logic for this purpose.

A common pattern that is often used to build services using LogiQL is to define a functional predicate that maps a request to a response:

answer[req] = resp -> stores:AddStoreRequest(req), stores:AddStoreResponse(resp).

This declares a predicate answer. You can think of it as a way to look up what request messages result in responses. This is particularly useful to send error messages. Many services will contain logic where the pulsing of a response message depends on various conditions. If those conditions are not met, no response message will be pulsed. As a result the user will get an empty response. To avoid this, you can write a rule that produces an error response if there is not already a response in the answer predicate.
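This fall-through pattern is easy to picture outside LogiQL. In Python terms, `answer` is a mapping from requests to responses, and a final pass fills in an error for any request that no rule answered (the function name and the error text are made up for illustration):

```python
def run_service(requests):
    """Simulate the answer-predicate pattern over a batch of requests."""
    answer = {}
    # "Rules": only well-formed requests get a real response.
    for i, req in enumerate(requests):
        if "name" in req and "city" in req:
            answer[i] = {"message": "Added"}
    # Fallback rule: any request still without a response gets an error
    # message instead of an empty reply.
    for i, _ in enumerate(requests):
        if i not in answer:
            answer[i] = {"message": "Error: malformed request"}
    return answer
```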

The broad stroke behavior of our web service (or any web service for that matter) can be expressed as follows using LogiQL:

+answer[req] = resp,
+stores:AddStoreResponse(resp) <-
  +stores:AddStoreRequest(req).

That is: if an AddStoreRequest is pulsed (that is: an HTTP request comes in), we will pulse an AddStoreResponse (that is: we send an HTTP response) and track this request/response combo in the answer predicate.

That’s great, but this doesn’t do anything useful yet.

The relevant part of the data model for our service looks as follows (located in the hierarchy:location module):

city(c) -> .
city_by_name[name] = city -> city(city), string(name).

store(s) -> .
store_by_name[name] = store -> store(store), string(name).
store_in_city[s] = c -> store(s), city(c).

That is, we have two entities: city, with a city_by_name constructor that maps names to cities, and store, with a store_by_name constructor that maps store names to stores, plus a store_in_city attribute predicate that records what city a store is located in.

What we want to do is create a new store entity with the given name, and ensure there’s also a city entity and map our new store to that city in the store_in_city predicate.

When do we want to do this? When a request comes in that we need to create a response for, i.e. when a new fact was pulsed into the answer predicate:

+location:city_by_name[city_name] = city,
+location:store_by_name[name] = store,
+location:store_in_city[store] = city,
+stores:AddStoreResponse_message[resp] = "Added" <-
  +answer[req] = resp,
  +stores:AddStoreRequest_name[req] = name,
  +stores:AddStoreRequest_city[req] = city_name.

Read this rule backwards: when a request req comes in that contains a name and city (which should be the case for every AddStoreRequest), create a store store and a city city with the respective store name and city name from the request. In addition, we pulse the message attribute of the response message saying we successfully added the store.

This works, but is not perfect yet. When you call the service with an already existing store, it will effectively do nothing (because the store was already there) and pretend it added the store by returning the “Added” message. If, instead, we would like to receive an error message saying the store already exists, we have to add an additional condition to our rule:

+location:city_by_name[city_name] = city,
+location:store_by_name[name] = store,
+location:store_in_city[store] = city,
+stores:AddStoreResponse_message[resp] = "Added" <-
  +answer[req] = resp,
  +stores:AddStoreRequest_name[req] = name,
  +stores:AddStoreRequest_city[req] = city_name,
  // New condition:
  !location:store_by_name@prev[name] = _.

This ensures that the rule only executes when there is not already a store with this name at the beginning of the transaction. So what happens if there is? Without doing anything more, you’d now get an empty response. To send back a nice message instead, we create an extra rule:

+stores:AddStoreResponse_message[resp] = "Store already exists" <-
  +answer[req] = resp,
  +stores:AddStoreRequest_name[req] = name,
  location:store_by_name@prev[name] = _.

So: when there exists a store with the name listed in the request, set the response message to “Store already exists”.
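Taken together, the two rules behave like this small Python simulation (a sketch of the semantics only; the predicates become plain dicts and sets here, and `add_store` is an invented helper name):

```python
def add_store(stores, cities, request):
    """Simulate the two LogiQL rules. The check against `stores` plays the
    role of store_by_name@prev: it sees the state at the start of the
    transaction, before any of this request's updates."""
    name, city_name = request["name"], request["city"]
    if name in stores:                      # !store_by_name@prev failed
        return {"message": "Store already exists"}
    cities.add(city_name)                   # +city_by_name[city_name] = city
    stores[name] = city_name                # +store_by_name, +store_in_city
    return {"message": "Added"}

stores, cities = {}, set()
first = add_store(stores, cities, {"name": "My Store", "city": "My City"})
second = add_store(stores, cities, {"name": "My Store", "city": "My City"})
```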

Phew. Alright. Here’s the complete version. Let’s finish this up.

Map the service to a URI

Thus far:

  1. We created a definition of the request and response messages for our service
  2. We defined the logic to generate a response for an add store request message.

What’s left to do is define at what URI this service should live at and what protobuf messages should be used to decode the incoming JSON requests. For this we create the /services/service_config.logic file:

block(`service_config) {

  service_by_prefix["/stores/add"] = x,
  default_protobuf_service(x) {
    protobuf_protocol[] = "stores",
    protobuf_request_message[] = "AddStoreRequest",
    protobuf_response_message[] = "AddStoreResponse"
  }
} <-- .

This maps our service to /stores/add (usually http://localhost:8080/stores/add) and defines that the message to be used for the request is AddStoreRequest in the stores protobuf definition, and AddStoreResponse for the response.

Similar to the setup for our TDX example, we have a start-services target in our build configuration:

rule('start-services', 'check-lb-workspaces', [
  '$(logicblox)/bin/lb web-server load-services -w lb-example'
], True)

To compile and run:

$ lb config
$ make start-services

To test:

$ curl -X POST -H "Content-Type: application/json" \
       -H "Accept: application/json" \
       -d '{"name": "My Store", "city": "My City"}' \
       http://localhost:8080/stores/add

Note that we need to set the Accept and Content-Type headers to ensure LogicBlox interprets our request as JSON (instead of the binary protobuf format, which is the default) and returns us a response in JSON as well.
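The same call can be made from Python with just the standard library. Here's a sketch that prepares the request; the URL assumes the default local web server from the service mapping above, and `build_add_store_request` is an invented helper name:

```python
import json
import urllib.request

def build_add_store_request(name, city):
    """Prepare (but do not send) a JSON POST for the /stores/add service."""
    body = json.dumps({"name": name, "city": city}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:8080/stores/add",
        data=body,
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST",
    )

req = build_add_store_request("My Store", "My City")
# To actually send it (requires a running lb web-server):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"])
```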

The first time we run this curl command we’ll get back:

{"message": "Added"}

When we run it again:

{"message": "Store already exists"}


Clearly, this is a simple example of a web service. In future articles we’ll build somewhat more sophisticated services and look at how to call these services from a web page. To get more in-depth information on creating protobuf services, read the chapter in our reference manual.

