Siddhi IO Kafka

The siddhi-io-kafka extension is an extension to Siddhi that receives events from and publishes events to Kafka.

For information on Siddhi and its features, refer to the Siddhi Documentation.

Download

  • Versions 5.x and above with group id io.siddhi.extension.* from here.
  • Versions 4.x and lower with group id org.wso2.extension.siddhi.* from here.

Latest API Docs

The latest API Docs are for version 5.0.6.

Features

  • kafka (Sink)

    A Kafka sink publishes events processed by WSO2 SP to a topic with a partition for a Kafka cluster. The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink to use the Kafka transport, the type parameter should have kafka as its value (see the sample Siddhi application after this list).

  • kafkaMultiDC (Sink)

    A Kafka sink publishes events processed by WSO2 SP to a topic with a partition for a Kafka cluster. The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink that publishes events via the Kafka transport to the same topic on two Kafka brokers, the type parameter must have kafkaMultiDC as its value.

  • kafka (Source)

    A Kafka source receives events to be processed by WSO2 SP from a topic with a partition for a Kafka cluster. The events received can be in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka source creates the default partition for the given topic.

  • kafkaMultiDC (Source)

    The Kafka Multi-Datacenter (DC) source receives records from the same topic in brokers deployed in two different Kafka clusters. It filters out all the duplicate messages and ensures that the events are received in the correct order using sequential numbering. It receives events in the TEXT, XML, JSON, or Binary format. The Kafka source creates the default partition '0' for a given topic if the topic has not yet been created in the Kafka cluster.
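
As a quick illustration, the sketch below wires a Kafka source to a Kafka sink in one Siddhi application. The broker address (localhost:9092), topic names, and stream definitions are illustrative assumptions, the JSON mapping assumes the corresponding siddhi-map extension is available, and the kafkaMultiDC variants are configured the same way with kafkaMultiDC as the type value.

    @App:name('KafkaSample')

    -- Receive JSON events from 'kafka_input_topic' (topic, group, and server values are assumptions).
    @source(type='kafka',
            topic.list='kafka_input_topic',
            group.id='sample_group',
            threading.option='single.thread',
            bootstrap.servers='localhost:9092',
            @map(type='json'))
    define stream InputStream (symbol string, price float, volume long);

    -- Publish the same events as JSON to partition 0 of 'kafka_output_topic'.
    @sink(type='kafka',
          topic='kafka_output_topic',
          partition.no='0',
          bootstrap.servers='localhost:9092',
          @map(type='json'))
    define stream OutputStream (symbol string, price float, volume long);

    @info(name = 'PassThrough')
    from InputStream
    select symbol, price, volume
    insert into OutputStream;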

Dependencies

The following JARs are needed from the <KAFKA_HOME>/libs directory.

  • kafka_2.11-*.jar
  • kafka-clients-*.jar
  • metrics-core-*.jar
  • scala-library-2.11.*.jar
  • scala-parser-combinators_2.11.*.jar (if exists)
  • zkclient-*.jar
  • zookeeper-*.jar

Installation

To install this extension and add the dependent JARs to various Siddhi execution environments, refer to the Siddhi documentation section on adding extensions and jars.

Setup Kafka

As a prerequisite, you have to start the Kafka message broker. Please follow the steps below.

  1. Download the Kafka distribution.
  2. Unzip the distribution and go to its root directory.
  3. Start ZooKeeper by executing the command below:
    bin/zookeeper-server-start.sh config/zookeeper.properties
  4. Start the Kafka broker by executing the command below:
    bin/kafka-server-start.sh config/server.properties

Refer to the Kafka documentation for more details: https://kafka.apache.org/quickstart

Then, you have to add the necessary client JARs (from the <KAFKA_HOME>/libs directory) to the Siddhi distribution as given below (sample copy commands are sketched after the note).

  • Copy the client libs below to the <SIDDHI_HOME>/bundles directory

    • scala-library-2.12.8.jar
    • zkclient-0.11.jar
    • zookeeper-3.4.14.jar
  • Copy the client libs below to the <SIDDHI_HOME>/jars directory

    • kafka-clients-2.3.0.jar
    • kafka_2.12-2.3.0.jar
    • metrics-core-2.2.0.jar

!!! info "Use the bundles directory to add OSGi bundles and the jars directory to add non-OSGi jars."
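
As a rough sketch, the commands below copy the libraries listed above into the Siddhi distribution. KAFKA_HOME and SIDDHI_HOME stand for the respective installation directories, and the exact JAR versions depend on the Kafka distribution you downloaded, so adjust the file names to match what is in <KAFKA_HOME>/libs.

    # Versions shown are the ones listed above; adjust to match <KAFKA_HOME>/libs.
    cp $KAFKA_HOME/libs/scala-library-2.12.8.jar $SIDDHI_HOME/bundles/
    cp $KAFKA_HOME/libs/zkclient-0.11.jar        $SIDDHI_HOME/bundles/
    cp $KAFKA_HOME/libs/zookeeper-3.4.14.jar     $SIDDHI_HOME/bundles/
    cp $KAFKA_HOME/libs/kafka-clients-2.3.0.jar  $SIDDHI_HOME/jars/
    cp $KAFKA_HOME/libs/kafka_2.12-2.3.0.jar     $SIDDHI_HOME/jars/
    cp $KAFKA_HOME/libs/metrics-core-2.2.0.jar   $SIDDHI_HOME/jars/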

Support and Contribution

  • We encourage users to ask questions and get support via StackOverflow; make sure to add the siddhi tag to the question for a better response.

  • If you find any issues related to the extension, please report them on the issue tracker.

  • For production support and other contribution-related information, refer to the Siddhi Community documentation.