Home
Kafka Service is a Spring Boot application built to simplify testing Kafka-based applications. It provides a configurable test producer and the ability to post templated payloads (inspired by Handlebars.java and WireMock's templating helpers). See the HOW section for more details, and see Response Templating for the available templating options and examples.
Most testing tools (licensed, such as LoadRunner and Cavisson NetStorm, or open source, such as JMeter and Locust) are built with a primary focus on testing HTTP-based applications. Although all of these tools can be used to send payloads to Kafka, they often don't provide convenient APIs for doing so. Kafka Service was built to bridge that gap.
Using Kafka Service, you can build your scripts the same way you would if you were testing an HTTP web/microservices-based application. Your scripts call Kafka Service APIs, which in turn process your requests and post the messages to Kafka for you.
Additionally, you can use Handlebars expressions in your payload, and Kafka Service will automatically generate random numbers, strings, UUIDs, timestamps, dates, etc. for you.
Example:
curl -X POST "http://localhost:8080/kafka/publish?topic=Test" -H "accept: */*" -H "Content-Type: application/json" -d "{\"message\":\"Hello World\"}"
This posts the message "Hello World" to a Kafka topic named "Test" on a Kafka cluster hosted locally on your machine. See the HOW section for more details about configuring the application to post messages to a remote Kafka cluster, an SSL-secured Kafka cluster, with templated messages, and more.
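For illustration, a templated payload might look like the sketch below. The helper names used here (randomValue, now) are assumptions modelled on WireMock-style templating helpers; consult the Response Templating page for the helpers Kafka Service actually supports.

```bash
# Hypothetical templated payload: helper names are placeholders, not confirmed API
curl -X POST "http://localhost:8080/kafka/publish?topic=Test" \
  -H "accept: */*" -H "Content-Type: application/json" \
  -d "{\"message\":\"Order {{randomValue type='UUID'}} created at {{now}}\"}"
```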
- Begin by cloning this repository:
git clone https://github.com/fauxauldrich/kafka-service.git
- There are several ways to run the Kafka Service application on your local machine. One way is to execute the main method in the com.fauxauldrich.kafkaservice.KafkaServiceApplication class from your IDE.
- Alternatively, you can use the Spring Boot Maven plugin like so:
mvn spring-boot:run
- Navigate to http://localhost:8080/swagger-ui.
The Kafka Service application can be deployed in several ways:
- Build the application first.
mvn clean install package
- Simply run it as a background service on a Linux server:
java -jar target/kafka-service-${VERSION}.jar
- On Windows, use utilities such as NSSM to create a service.
- Or deploy as a Docker container using the sample Dockerfile provided in the repository.
  - Build the image with:
docker build -t kafka-service:0.0.1 .
  - Run as a container using:
docker run -d --name kafka-service -p 8080:8080 -v /dir/containing/truststore.jks/files:/app/certs kafka-service:0.0.1
  - Or use the docker-compose.yml to bring up your container. (Modify docker-compose.yml to reflect the directory path for your truststore files; a minimal compose sketch is shown after this deployment section.)
- The generated Docker image can also be pushed to a local Docker registry and deployed on cloud platforms or on-premises Kubernetes clusters.
- To deploy on a Tomcat server, package the application as a war file.
  - Update the line <packaging>jar</packaging> in pom.xml to <packaging>war</packaging>
  - Build the application again:
mvn clean package
  - Deploy the war file to a Tomcat server by:
    - Placing the war file in the tomcat/webapps directory; OR
    - Using the Tomcat Manager GUI
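As referenced in the Docker deployment steps above, a minimal docker-compose.yml could look like the sketch below. The service name, image tag, and host volume path are assumptions; compare with the docker-compose.yml shipped in the repository and adapt them to your environment.

```yaml
# Sketch only: adjust image tag and truststore directory to match your setup
services:
  kafka-service:
    image: kafka-service:0.0.1
    ports:
      - "8080:8080"
    volumes:
      # Mount the directory containing truststore.jks so the secure endpoints can find it
      - /dir/containing/truststore.jks/files:/app/certs
```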
The following configurable properties can be updated as per your requirements (src/main/resources/application.properties):
- spring.kafka.bootstrap-servers=127.0.0.1:9092
- kafkaservice.truststore.location=~/truststore.jks
- kafkaservice.truststore.password=Password@123
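Because this is a standard Spring Boot application, these properties can also typically be overridden at launch time with command-line arguments instead of editing the file, assuming the kafkaservice.* properties are resolved through the Spring Environment. The broker address and paths below are placeholders.

```bash
java -jar target/kafka-service-${VERSION}.jar \
  --spring.kafka.bootstrap-servers=remote-kafka.example.com:9092 \
  --kafkaservice.truststore.location=/app/certs/truststore.jks \
  --kafkaservice.truststore.password=Password@123
```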
- /kafka/publish - Posts a simple message to a Kafka topic with the following details.
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "message": "string" }
- /kafka/publish/with-headers - Posts a message along with additional headers.
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "message": "string", "headers": { "additionalProp1": "string", "additionalProp2": "string", "additionalProp3": "string" } }
- /kafka/publish/secure - Posts a simple message to a Kafka topic with the following details. Use this if connecting to the Kafka cluster requires SSL authentication. Needs the properties kafkaservice.truststore.location and kafkaservice.truststore.password to be updated with the relevant details before deployment.
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "message": "string" }
- /kafka/publish/secure/with-headers - Same as /kafka/publish/secure but accepts additional headers.
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "message": "string", "headers": { "additionalProp1": "string", "additionalProp2": "string", "additionalProp3": "string" } }
Kafka Service uses Spring Kafka templates to send messages to a cluster. However, the templates are not created at runtime when the APIs are invoked; instead, they are created during application initialization. This helps with performance but compromises modularity. Imagine a scenario where you want to post a message to a different cluster or update the truststore details: the only solution would be to update application.properties and redeploy the application.
If you don't want to redeploy the application every time you need to test against a new cluster, the following "Dynamic" APIs allow you to send a message to any cluster and any topic without updating the configs. They do so by creating a new producer every time the API is invoked and destroying the producer once the message is sent. Please note that this might not be very scalable, but if you are testing with a small TPS, go nuts. (An example call is shown after the endpoint list below.)
- /kafka/dynamic/publish
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "brokers": "string", "message": "string" }
- /kafka/dynamic/publish/with-headers
  - Query Parameters: topic, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "brokers": "string", "message": "string", "headers": { "additionalProp1": "string", "additionalProp2": "string", "additionalProp3": "string" } }
- /kafka/dynamic/publish/secure
  - Query Parameters: topic, TRUSTSTORE_LOCATION, TRUSTSTORE_PASSWORD, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "brokers": "string", "message": "string" }
- /kafka/dynamic/publish/secure/with-headers
  - Query Parameters: topic, TRUSTSTORE_LOCATION, TRUSTSTORE_PASSWORD, KEY (optional), PARTITION_ID (optional)
  - Body:
    { "brokers": "string", "message": "string", "headers": { "additionalProp1": "string", "additionalProp2": "string", "additionalProp3": "string" } }