Red Hat OpenShift Application and Data Services integration

Some Serverless Logic Web Tools features require integration with Red Hat OpenShift Application and Data Services. Examples include uploading OpenAPI specifications to a Service Registry and deploying Serverless Workflows that require an Apache Kafka stream.

On this page, you’ll configure all the settings needed for complete integration.

Setting up a Service Account

Create or use a Service Account from your Red Hat OpenShift Application and Data Services console and add it to the Serverless Logic Web Tools settings tab.

Prerequisites
  • Access to Red Hat OpenShift Application and Data Services console.

Procedure
  1. Create a Service Account on the Red Hat OpenShift Application and Data Services console (if you already have one, skip this step):

    • Go to Service Accounts | Red Hat OpenShift Application and Data Services;

    • Click on Create service account;

    • In the window that opens, type a Service account name;

    • Click on Create;

    • A modal will show up with your Client ID and Client Secret; copy them and save them somewhere safe;

    • Check the I have copied the client ID and secret checkbox and click on Close.

  2. If you skipped the previous step, find your saved Client ID and Client Secret as they will be necessary for the next steps;

  3. In the Serverless Logic Web Tools, click on the Cog wheel (⚙️) on the top right corner and go to the Service Account tab;

  4. Paste your Client ID and Client Secret in the respective fields;

  5. Click on Apply.

  6. The tab contents should be updated, showing Your Service Account information is set.
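Behind the scenes, a Service Account authenticates with the OAuth2 client-credentials grant. As a minimal sketch (the environment variable names are illustrative, not part of the product), this is the kind of token-request body your Client ID and Client Secret feed into; it also shows why the credentials are best kept out of source code:

```python
import os
from urllib.parse import urlencode

# Illustrative: read the Service Account credentials from the
# environment rather than hard-coding them (variable names are
# hypothetical; the fallbacks are placeholders).
client_id = os.environ.get("RHOAS_CLIENT_ID", "srvc-acct-example")
client_secret = os.environ.get("RHOAS_CLIENT_SECRET", "example-secret")

# Service Accounts use the OAuth2 client-credentials grant; this
# builds the form body such a token request would carry.
token_request_body = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
})

print(token_request_body)
```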

Setting up a Service Registry

Create or use a Service Registry instance from your Red Hat OpenShift Application and Data Services console and add it to the Serverless Logic Web Tools settings tab.

Prerequisites
  • Access to Red Hat OpenShift Application and Data Services console;

  • A Service Account.

Procedure
  1. Create a Service Registry instance on the Red Hat OpenShift Application and Data Services console (if you already have one, skip this step):

    • Go to Service Registry | Red Hat OpenShift Application and Data Services;

    • Click on Create Service Registry instance;

    • In the window that opens, type a Service Registry instance name;

    • Click on Create;

    • The list of instances will be updated with your new instance;

    • Find it in the list and click on its name;

    • Go to the Settings tab and click on Grant access;

    • From the dropdown, select your desired Service Account (the same one you configured in the Serverless Logic Web Tools);

    • Select a role for that Service Account (it must be Manager or Administrator to grant read and write access);

    • Click on Save;

    • In the top right-hand corner, click the triple-dotted menu, and then click Connection;

    • A drawer should open with all the connection and authentication information you’ll need;

    • Copy the Core Registry API value.

  2. If you skipped the previous step, find your Service Registry instance's Core Registry API value, as it will be necessary for the next steps;

  3. In the Serverless Logic Web Tools, click on the Cog wheel (⚙️) on the top right corner and go to the Service Registry tab;

  4. Input a name for your registry, preferably the same one you used when creating the Service Registry instance;

  5. Paste your Core Registry API in the respective field;

  6. Click on Apply.

  7. The tab contents should be updated, showing Your Service Registry information is set.
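Once the Core Registry API value is set, the web tools talk to the registry's REST API on your behalf. As a rough sketch (the base URL is a placeholder, and the `/groups/{groupId}/artifacts` path assumes the Apicurio Registry v2 REST API that Service Registry exposes), this is how a request URL for listing artifacts could be composed:

```python
# Placeholder base URL: substitute the Core Registry API value you
# copied from the Connection drawer.
base_url = "https://example.registry.rhcloud.com/t/abc123/apis/registry/v2"
group_id = "default"

# List the artifacts in a group (assumed Apicurio Registry v2 path).
# An OAuth access token obtained with your Service Account
# credentials would go in the Authorization header.
list_url = f"{base_url}/groups/{group_id}/artifacts"
headers = {"Authorization": "Bearer <access-token>"}

print(list_url)
```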

Setting up Streams for Apache Kafka

Create or use a Kafka instance from your Red Hat OpenShift Application and Data Services console and add it to the Serverless Logic Web Tools settings tab.

Prerequisites
  • Access to Red Hat OpenShift Application and Data Services console.

Procedure
  1. Create a Kafka instance on the Red Hat OpenShift Application and Data Services console (if you already have one, skip this step):

    • Go to Streams for Apache Kafka | Red Hat OpenShift Application and Data Services;

    • Click on Create Kafka instance;

    • In the window that opens, type a Kafka instance name;

    • Fill in the other fields as needed, or leave them with the default values;

    • Click on Create instance;

    • Reload the page for the list of instances to be updated with your new instance;

    • Wait for the status to be updated to Ready;

    • Find it in the list and click on its name;

    • Go to the Topics tab and create a new topic; you’ll need its name later;

    • Go to the Access tab;

    • Click on Manage Access and select All Accounts or your Service Account;

    • Add the following permissions:

      • Consumer group is " * " | Allow All | All Accounts;

      • Topic is " * " | Allow All | All Accounts;

    • In the top right-hand corner, click the triple-dotted menu, and then click Connection;

    • Copy the Bootstrap server value.

  2. If you skipped the previous step, find your Kafka instance's Bootstrap server value, as it will be necessary for the next steps;

  3. In the Serverless Logic Web Tools, click on the Cog wheel (⚙️) on the top right corner and go to the Streams for Apache Kafka tab;

  4. Paste the Bootstrap server value you copied before into the Bootstrap Server field;

  5. Type the name of the topic you created in the Topic field;

  6. Click on Apply.

  7. The tab contents should be updated, showing Your Streams for Apache Kafka information is set.
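To connect a client of your own to the same instance, the Bootstrap server and the Service Account credentials come together in the client configuration. A minimal sketch follows; the bootstrap server and topic are placeholders, and SASL_SSL with the PLAIN mechanism is an assumption to verify against your instance's Connection drawer:

```python
import os

# Placeholder connection settings for a Kafka client; substitute the
# values you configured in the Serverless Logic Web Tools.
# Assumption: the managed instance accepts SASL_SSL with SASL/PLAIN
# using the Service Account credentials.
kafka_config = {
    "bootstrap.servers": "my-instance.kafka.rhcloud.com:443",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": os.environ.get("RHOAS_CLIENT_ID", "srvc-acct-example"),
    "sasl.password": os.environ.get("RHOAS_CLIENT_SECRET", "example-secret"),
}
topic = "my-topic"  # the topic you created in the Topics tab

for key, value in sorted(kafka_config.items()):
    print(f"{key}={value}")
```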

Note: These broad permissions are meant to make configuration easy, but you can also set up specific Service Accounts, Topics, and Consumer Groups.

Found an issue?

If you find an issue or any misleading information, please feel free to report it here. We really appreciate it!