Custom functions for your Serverless Workflow service

The Cloud Native Computing Foundation (CNCF) specification supports the custom function type, which enables implementations to extend the function definition capability.

Kogito supports the java and sysout custom types.

The CNCF specification does not support java and sysout functions. Therefore, these functions might not be portable across other implementations.

sysout custom function

You can use the sysout function for debugging or for quick demonstrations as shown in the following example:

Example of sysout function definition
{
  "functions": [
    {
      "name": "printMessage",
      "type": "custom",
      "operation": "sysout"
    }
  ]
}

In the state definition, you can call the same sysout function as shown in the following example:

Example of a sysout function reference within a state
{
  "states": [
    {
      "name": "myState",
      "type": "operation",
      "actions": [
        {
          "name": "printAction",
          "functionRef": {
            "refName": "printMessage",
            "arguments": {
              "message": "."
            }
          }
        }
      ]
    }
  ]
}

Avoid using the sysout function in a production environment, since it can flood the application log with unnecessary data.

java custom function

Kogito supports java functions that invoke Java code defined within the same Apache Maven project in which you define your workflow service.

Function definition

The following example shows the declaration of a java function:

Example of a java function declaration
{
  "functions": [
    {
      "name": "myFunction", (1)
      "type": "custom", (2)
      "operation": "service:java:com.acme.MyInterfaceOrClass::myMethod" (3)
    }
  ]
}
1 myFunction is the function name
2 custom is the function type
3 service:java:com.acme.MyInterfaceOrClass::myMethod is the custom operation definition. In the custom operation definition, service is the reserved operation keyword, followed by the java keyword. com.acme.MyInterfaceOrClass is the FQCN (fully qualified class name) of the interface or implementation class, followed by the method name (myMethod).
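
For illustration, the following is a minimal sketch of a class that could back the declaration above. The package, class, and method names simply mirror the placeholder operation shown in the declaration, and the parameter list depends on the function arguments described in the next section:

Example of a class backing the declaration (sketch)
package com.acme;

public class MyInterfaceOrClass {

    // The method name must match the part of the operation after "::"
    public void myMethod(int number) {
        // business logic for the function goes here
    }
}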

Function arguments

The signature of your method must match the arguments passed by the workflow.

For example, if you invoke a function with a single argument as follows, the corresponding method signature expects the number model variable to be an integer:

Example of a java function reference with one argument
{
  "functionRef": {
    "refName": "myFunction",
    "arguments": {
      "number": "${.number}"
    }
  }
}
Example of a java function implementation
public class MyInterfaceOrClass {

  public void myMethod(int number) {
        if (number % 2 != 0) {
            throw new IllegalArgumentException("Odd situation");
        }
    }
}

As a special case, if you do not provide any arguments in the workflow definition, the signature of the Java method can include a Jackson JsonNode parameter. In that case, the Java method receives the entire workflow model as input.

In the following example, the function reference has no arguments and the method signature declares a JsonNode parameter, so the entire workflow model is passed when the method is called.

Example of a java function reference with no arguments
{
  "functionRef": {
    "refName": "myFunction"
   }
}
Example of a java function implementation
public class MyInterfaceOrClass {

    public JsonNode myMethod(JsonNode workflowData) {
        // do whatever I want with the Workflow model
        ......
        // return the modified content:
        return workflowData;
    }
}

Function return values

If your method returns a JsonNode, the content of that node is merged into the workflow model (you can use an action data filter to control what is merged).

The same occurs if your method returns any Java object that is not a primitive wrapper: the object is recursively converted to a JSON object and the result is merged into the workflow model (you can use an action data filter to control what is merged).

If your method returns a primitive type or its corresponding wrapper object (int, boolean, long, and so on), the value is added to the workflow model under the name response (you can change that name using an action data filter).

If your method returns a Java collection, it is converted to a JSON array and added to the workflow model under the name response (you can change that name using an action data filter).
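
The following sketch illustrates these rules; the class, the method names, and the MyResult type are hypothetical and only show where each kind of return value ends up in the workflow model:

Example of return types and how they affect the workflow model (sketch)
import com.fasterxml.jackson.databind.JsonNode;
import java.util.List;

public class MyReturnTypes {

    // Hypothetical POJO, used only to illustrate object conversion
    public static class MyResult {
        public String status = "ok";
    }

    // The content of the returned JsonNode is merged into the workflow model
    public JsonNode asNode(JsonNode workflowData) {
        return workflowData;
    }

    // The returned object is recursively converted to a JSON object and merged into the workflow model
    public MyResult asObject() {
        return new MyResult();
    }

    // The primitive value is added to the workflow model under the name "response"
    public int asPrimitive() {
        return 42;
    }

    // The collection is converted to a JSON array and added under the name "response"
    public List<String> asCollection() {
        return List.of("a", "b");
    }
}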

Function accessing Kogito context

If you need access to process contextual information (for example, Kogito process instance ID) inside your Java service, you can add a KogitoProcessContext parameter as the last one in the method signature.

For example, you can update the signatures of the methods from the previous sections as follows:

Example of a function accessing Kogito context
public class MyInterfaceOrClass {
    public JsonNode myMethod(JsonNode workflowData, KogitoProcessContext context) {
        // do whatever I want with the JsonNode and the Kogito process context
        ......
        // return the modified content:
        return workflowData;
    }
}
Example of a function accessing Kogito context
public class MyInterfaceOrClass {

    public void myMethod(int number, KogitoProcessContext context) {
        if (number % 2 != 0) {
            throw new IllegalArgumentException("Odd situation");
        }
    }
}
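
For example, a sketch of reading the process instance ID from the context might look like the following; the accessor names are assumptions based on the Kogito API and might vary across versions:

Example of reading the process instance ID (sketch)
public class MyInterfaceOrClass {

    public JsonNode myMethod(JsonNode workflowData, KogitoProcessContext context) {
        // Assumption: the context exposes the current process instance and its String identifier
        String processInstanceId = context.getProcessInstance().getStringId();
        // use the identifier, for example for logging or correlation
        return workflowData;
    }
}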

Avoid using java functions to call external services. Instead, use the service orchestration features.

Custom function types

You can add your own custom types by using the Kogito add-on mechanism. As with the predefined custom types sysout and java, the custom type identifier is the prefix of the operation field of the function definition.

Kogito add-ons rely on the Quarkus extension mechanism. An add-on consists of at least two Maven projects:

  • The deployment module, which is responsible for generating the code required for the extension to work.

  • The runtime module, which includes the non-generated classes that are required for the extension to work.

In the case of a Serverless Workflow custom type, the roles of these modules are as follows:

  • The deployment project

    The deployment project is expected to configure the work item handler used at runtime to perform the logic associated with the custom type. It must contain a Java class that inherits from WorkItemTypeHandler. Its responsibilities are to indicate the custom type identifier (the operation prefix, as indicated earlier) and to set up the WorkItemNodeFactory instance passed as a parameter of the fillWorkItemHandler method. That instance is included in the Kogito process definition for that workflow. As part of this setup, you must set the work item handler name on the WorkItemNodeFactory. You can also provide any relevant metadata for that handler if needed.

  • The runtime project

    The runtime project consists of a WorkflowWorkItemHandler implementation, whose name must match the one provided to the WorkItemNodeFactory during the deployment phase, and a WorkItemHandlerConfig bean that registers that handler under that name.

    When a Serverless Workflow function is called, Kogito identifies the proper WorkflowWorkItemHandler instance to be used for that function type (using the handler name associated with that type by the deployment project) and then invokes the internalExecute method. The Map parameter contains the function arguments defined in the workflow, and the WorkItem parameter contains the metadata added to the handler by the deployment project. Hence, the internalExecute implementation has access to all the information needed to perform the computational logic intended for that custom type, as sketched below.
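
The following is a minimal skeleton of such a runtime handler; the class name and the handler name are hypothetical, and a complete, concrete implementation is shown in the next section:

Example of a runtime WorkflowWorkItemHandler skeleton (sketch)
public class MyCustomWorkItemHandler extends WorkflowWorkItemHandler {

    // Must match the work name set on the WorkItemNodeFactory by the deployment module
    public static final String NAME = "MyCustomHandler";

    public String getName() {
        return NAME;
    }

    @Override
    protected Object internalExecute(KogitoWorkItem workItem, Map<String, Object> parameters) {
        // parameters contains the function arguments defined in the workflow;
        // workItem carries the metadata added by the deployment module
        return null;
    }
}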

Custom function type example

Assume that you want to interact, from a workflow file, with a legacy RPC server such as the one defined in this project. This legacy server supports four simple arithmetic operations: add, minus, multiply, and divide, which can be invoked using a custom RPC protocol.

Since this is an uncommon protocol, the workflow cannot invoke these operations by using any of the predefined Serverless Workflow function types. The available options are to use a java function, which invokes a Java class that knows how to interact with the server, or to define a custom type that knows how to interact with the service.

Using the latter approach, you can write a workflow file that defines this function as follows:

RPC Custom function definition example
 "functions": [
    {
      "name": "division",
      "type": "custom",
      "operation": "rpc:division"
    }
  ],

The operation starts with rpc, which is the custom type identifier, and continues with division, which denotes the operation that will be executed in the legacy server.

A Kogito add-on that defines the rpc custom type must be developed for this function definition to be recognized. It consists of a deployment project and a runtime project.

The deployment project is responsible for extending WorkItemTypeHandler and setting up the WorkItemNodeFactory as follows:

Example of the RPC function Java implementation
import static org.kie.kogito.examples.sw.custom.RPCCustomWorkItemHandler.NAME;
import static org.kie.kogito.examples.sw.custom.RPCCustomWorkItemHandler.OPERATION;

public class RPCCustomTypeHandler extends WorkItemTypeHandler {

    @Override
    public String type() {
        return "rpc";
    }

    @Override
    protected <T extends RuleFlowNodeContainerFactory<T, ?>> WorkItemNodeFactory<T> fillWorkItemHandler(Workflow workflow,
                                                                                                        ParserContext context,
                                                                                                        WorkItemNodeFactory<T> node,
                                                                                                        FunctionDefinition functionDef) {
        return node.workName(NAME).metaData(OPERATION, trimCustomOperation(functionDef));
    }
}

This example sets the name of the KogitoWorkItemHandler, adds a metadata key with the name of the remote operation (extracted from the operation property of the Serverless Workflow function definition), and declares that the custom type is named rpc.

The runtime project contains the KogitoWorkItemHandler and WorkItemHandlerConfig implementations.

As expected, RPCCustomWorkItemHandler implements the internalExecute method as follows:

Example of implementation of the internalExecute method
@Override
protected Object internalExecute(KogitoWorkItem workItem, Map<String, Object> parameters) {
    try {
        Iterator<?> iter = parameters.values().iterator();
        Map<String, Object> metadata = workItem.getNodeInstance().getNode().getMetaData();
        String operationId = (String) metadata.get(OPERATION);
        if (operationId == null) {
            throw new IllegalArgumentException("Operation is a mandatory parameter");
        }
        return CalculatorClient.invokeOperation((String) metadata.getOrDefault(HOST, "localhost"), (int) metadata.getOrDefault(PORT, 8082),
                OperationId.valueOf(operationId.toUpperCase()), (Integer) iter.next(), (Integer) iter.next());
    } catch (IOException io) {
        throw new UncheckedIOException(io);
    }
}

The implementation invokes CalculatorClient.invokeOperation, a static Java method that knows how to interact with the legacy service. The operation parameter is obtained from the WorkItem metadata, while the dividend and divisor parameters are obtained from the Map parameter, which contains the function arguments defined in the workflow file.

Example of the custom function call from the workflow definition
"actions": [
       {
          "functionRef": {
             "refName": "division",
             "arguments": {
                 "dividend": ".dividend",
                 "divisor" : ".divisor"
             }
           }

       }

The RPCCustomWorkItemHandlerConfig is a bean that registers the handler under its name.

Example of injecting the custom `WorkItemHandler`
@Inject
RPCCustomWorkItemHandler handler;

@PostConstruct
void init() {
    register(handler.getName(), handler);
}
