
KAFKA - DATA SINK

Description

The KAFKA data sink distributes data by publishing it to a Kafka topic. It is particularly useful for integrating with Kafka-based systems, enabling real-time data processing and analysis. The sink supports several configuration options for customizing distribution, including the Kafka broker addresses, the topic name, and additional Kafka producer properties.


Config

REQUIRED


Config Parameters

Name        Description
clientId    A unique identifier for the Kafka producer. Required.
brokers     A comma-separated list of Kafka broker addresses. Required.
topic       The name of the Kafka topic to which the data will be published. Required.
properties  Additional Kafka producer properties. Optional.

Config Example

<?xml version="1.0" encoding="UTF-8"?>
<apiroConf version="1" xmlns="http://apiro.com/apiro/v1/root">
    <loadOrder>10</loadOrder>
    <dataSinks>
        <dataSink definition="KAFKA" execPriority="10" name="ASX20_PRICES_KAFKA">
            <config>
                <![CDATA[
                    {
                        "clientId" : "dip.com.abcitconsulting",
                        "brokers" : "192.168.11.10, 192.168.12.11",
                        "topic" : "ASX20_PRICES",
                        "properties" : {}   
                    }
                ]]>
            </config>
        </dataSink>
    </dataSinks>
</apiroConf>
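
The example above leaves properties empty. Assuming entries in properties are passed through to the underlying Kafka producer as standard producer settings (an assumption about this sink, not confirmed by the documentation), a populated block might look like the following sketch, using well-known Kafka producer options:

    <?xml version="1.0" encoding="UTF-8"?>
    <apiroConf version="1" xmlns="http://apiro.com/apiro/v1/root">
        <loadOrder>10</loadOrder>
        <dataSinks>
            <dataSink definition="KAFKA" execPriority="10" name="ASX20_PRICES_KAFKA">
                <config>
                    <![CDATA[
                        {
                            "clientId" : "dip.com.abcitconsulting",
                            "brokers" : "192.168.11.10, 192.168.12.11",
                            "topic" : "ASX20_PRICES",
                            "properties" : {
                                "acks" : "all",
                                "compression.type" : "gzip",
                                "linger.ms" : "5"
                            }
                        }
                    ]]>
                </config>
            </dataSink>
        </dataSinks>
    </apiroConf>

Here acks, compression.type, and linger.ms are standard Kafka producer settings; the specific values shown are illustrative only.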

Common Mistakes

  • Incorrect Broker Addresses: Ensure that the brokers parameter contains the correct addresses of your Kafka brokers.
  • Invalid Topic Name: Verify that the topic parameter matches the name of an existing Kafka topic.
  • Producer Configuration: If data is not being delivered, review the settings in the properties parameter; misconfigured producer options (for example, authentication or timeout settings) can silently prevent delivery.
  • Network Issues: If you encounter connectivity issues, check the network configuration and ensure that the Kafka brokers are accessible from the machine running the data sink.
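
On broker addresses: Kafka brokers are conventionally addressed as host:port pairs, and the default broker port is 9092. If the sink does not assume a default port (the documentation above does not say), the brokers value would be written as:

    "brokers" : "192.168.11.10:9092, 192.168.12.11:9092"

The port-less form shown in the config example works only if this sink supplies a default port itself.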