Cloud Sink with Azure Event Hub

In this Jumpstart we will create a new Microsoft Azure Event Hub with Kafka support and configure it. We will then use the Cloud Sink object as a producer to send a message to Azure Event Hub using the Kafka protocol.

Prerequisites

To complete the examples in this Jumpstart, the following are required:

  • Azure account

The Azure account to be used should already be configured, meaning there is a valid Azure subscription and all the necessary permissions to add and manage resources. There should also be at least one Resource Group. More information about Resource Groups is available on the official Microsoft Documentation webpage
  • system:inmation installation (v1.54 and above) including DataStudio

Preparing Azure Infrastructure

  1. Log in to the Azure Portal and click the Create a resource button

  2. Find and add a new Event Hubs namespace

    Event Hubs namespace
    Figure 1. Adding a new Event Hubs namespace
  3. Make sure the following parameters are set to the suggested values, or use the screenshot below as a guide:

    The Kafka protocol is supported only in the Standard pricing tier
    • Enable Kafka - Checked

    Create Namespace
    Figure 2. Event Hubs Namespace configuration
  4. Click Create and wait until the resource is deployed

  5. When the Event Hubs namespace is up and running, select it under All Resources.

  6. Create a new Event Hub ( Event Hub Namespace > Event Hubs > + Event Hub ). Give the object a name and make a note of it (it will also be used in the configuration of the Cloud Sink object, as the Kafka topic). In this example the Event Hub is named inhub.

    A new Event Hub
    Figure 3. Adding a new Event Hub
    A new Event Hub Configuration
    Figure 4. Configuring a new Event Hub
  7. Select the Event Hub Namespace object you just created and find the connection string under the RootManageSharedAccessKey policy ( Event Hub Namespace > Shared access policies > RootManageSharedAccessKey > Connection string-primary key ). It should be similar to this:

    Endpoint=sb://inmationKafkaTest.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=cHJpbWFyeSBvciBzZWNvbmRhcnkga2V5Ow==

    Make a note of it for later, when configuring the Cloud Sink object.

    Connection String
    Figure 5. Obtaining Connection String
  8. Download the cacert.pem certificate file available here

    Simple copy-and-paste may not work, as extra line-break characters can appear and corrupt the certificate. It is recommended to use the browser's Save as… command or any download tool that accepts a URL as a file source.
If you encounter any trouble, the official Microsoft Quickstart can be used as an alternative example of how to create a new Event Hub

Cloud Sink configuration

  1. In DataStudio create a new Cloud Sink object by right-clicking a Connector and selecting Admin > New > History > Cloud Sink from the context menu. Give the object a name and click Create to create it in the I/O Model.

  2. Select the Cloud Sink object and in the Object Properties panel, open the Configuration property section.

  3. Under Kafka Producer Parameters, enter the Event Hub name from Preparing Azure Infrastructure: Step 6 as the Topic property.

    Table 1. Kafka Producer Parameters

    Parameter | Value              | Reference | Example
    Topic     | [ Event Hub name ] | S2:S6     | inhub

    Cloud Sink Topic configuration
    Figure 6. Cloud Sink Kafka configuration
  4. Under Global Configuration Properties, enter the following values for the parameters:

    Table 2. Global Configuration Properties

    Parameter         | Value                                                         | Reference | Example
    Bootstrap Servers | [ Event Hubs Namespace name ] + .servicebus.windows.net:9093 | S2:S3     | inmationKafkaTest.servicebus.windows.net:9093
    Security Protocol | SASL-SSL                                                      |           | SASL-SSL
    SASL Mechanism    | PLAIN                                                         |           | PLAIN
    SASL Username     | $ConnectionString                                             |           | $ConnectionString
    SASL Password     | [ Connection string ]                                         | S2:S7     | Endpoint=sb://inmationKafkaTest.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=cHJpbWFyeSBvciBzZWNvbmRhcnkga2V5Ow==

    The Azure Event Hubs Kafka endpoint listens on port 9093
    Cloud Sink Global configuration
    Figure 7. Cloud Sink Global configuration
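    For orientation, the same parameters can be expressed as a librdkafka-style configuration dictionary (the property-name format used by common Kafka clients such as confluent-kafka). This is only an illustrative sketch: the namespace name and connection string are the placeholder values from this guide, and the property names follow librdkafka conventions (e.g. SASL_SSL) rather than the Cloud Sink labels:

    ```python
    # Sketch: Table 2 expressed as librdkafka-style properties.
    # Placeholder values from this guide, not real credentials.
    namespace = "inmationKafkaTest"  # [ Event Hubs Namespace name ]
    connection_string = (
        "Endpoint=sb://inmationKafkaTest.servicebus.windows.net/;"
        "SharedAccessKeyName=RootManageSharedAccessKey;"
        "SharedAccessKey=cHJpbWFyeSBvciBzZWNvbmRhcnkga2V5Ow==")

    producer_config = {
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",  # Kafka endpoint, port 9093
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",  # literal string, not a variable
        "sasl.password": connection_string,    # the full connection string from Step 7
        "ssl.ca.location": "cacert.pem",       # the CA file downloaded in Step 8
    }
    print(producer_config["bootstrap.servers"])  # inmationKafkaTest.servicebus.windows.net:9093
    ```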
  5. Import Certificate Authority file

    Click Import File on the SSL Certificate Authority property and select the cacert.pem file downloaded in Preparing Azure Infrastructure: Step 8

    Cloud Sink Certificate Authority property
    Figure 8. Certificate Authority property
    Selecting Certificate Authority file
    Figure 9. Selecting Certificate Authority file

    If everything is done correctly, it should be possible to view the certificate details by clicking View

    Certificate details
    Figure 10. Certificate details
  6. Click Apply in the Object Properties panel to save the configuration.

Transferring data with Cloud Sink

  1. To check that the interface is working, double-click the Cloud Sink object's Faceplate and enter a string in the Write Value dialog.

    Writing to Cloud Sink
    Figure 11. Writing a value to Cloud Sink Item Value
  2. Now check the Azure Event Hub metrics to see that the message has been captured.

    Metric Result
    Figure 12. Azure Event Hub metric with the number of incoming messages
  3. Alternatively, you can install the Azure Event Hub Explorer extension for Visual Studio Code and check the result there.

    Explorer Result
    Figure 13. An incoming message reported by Azure Event Hub Explorer extension
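    As another cross-check outside of system:inmation, a test message can be sent to the same Event Hub directly over the Kafka protocol. The sketch below assumes the third-party confluent-kafka Python package (an assumption, not part of the product) and uses the placeholder namespace, topic, and connection string from this guide:

    ```python
    # Sketch: send a test message over the Kafka protocol, assuming the
    # third-party confluent-kafka package. Placeholder credentials only.
    try:
        from confluent_kafka import Producer
    except ImportError:
        Producer = None  # package not installed; the config is still illustrative

    config = {
        "bootstrap.servers": "inmationKafkaTest.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",
        "sasl.password": (
            "Endpoint=sb://inmationKafkaTest.servicebus.windows.net/;"
            "SharedAccessKeyName=RootManageSharedAccessKey;"
            "SharedAccessKey=cHJpbWFyeSBvciBzZWNvbmRhcnkga2V5Ow=="),
        "ssl.ca.location": "cacert.pem",  # CA file from Step 8
    }

    message = b"Hello from the Kafka protocol"

    if Producer is not None:
        producer = Producer(config)
        # The topic name is the Event Hub name from Step 6.
        producer.produce("inhub", value=message)
        producer.flush(10)  # wait up to 10 s for delivery
    ```

    If the message arrives, it should show up in the same Event Hub metrics and in the Event Hub Explorer extension as the message written via the Cloud Sink Faceplate.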

    After deleting a Cloud Sink object, some certificates might be left behind in the corresponding certificate stores. To remove old certificates, run the following command in a console:

    return require("esi-cloud-interface-initializer"):CLEAN()

    Executing this command will not affect any certificates that are currently in use by other Cloud Sink objects