Kafka Messaging Modular Input

Splunk Kafka Messaging Modular Input v1.0

Overview

This is a Splunk Modular Input Add-On for indexing messages from an Apache Kafka broker or cluster of brokers that are managed by Zookeeper.
Kafka version 0.8.1.1 is used for the consumer client and for testing this Modular Input.

What is Kafka?

Apache Kafka is a distributed publish/subscribe messaging system:
https://kafka.apache.org/

Dependencies

  • Splunk 5.0+
  • Java Runtime 1.7+
  • Supported on Windows, Linux, MacOS, Solaris, FreeBSD, HP-UX, AIX
  • Kafka version 0.8+

Setup

  • Optionally set your JAVA_HOME environment variable to the root directory of your JRE installation. If you don't set this, the input will look for a java executable installed on the PATH.
  • Untar the release to your $SPLUNK_HOME/etc/apps directory (example commands below)
  • Restart Splunk
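
For example, on a Linux host (the release filename below is illustrative; use the file you actually downloaded):

tar -zxvf kafka-messaging-modular-input_10.tgz -C $SPLUNK_HOME/etc/apps
$SPLUNK_HOME/bin/splunk restart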

Configuration

As this is a Modular Input, you can configure your Kafka inputs via Manager -> Data Inputs -> Kafka. The fields should be straightforward for anyone with basic experience of Kafka and Zookeeper. Like any Modular Input, the stanzas can also be declared directly in an inputs.conf file (see the sketch below).
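
A minimal sketch of what such a stanza might look like. The stanza scheme and attribute names here are illustrative assumptions only and are not taken from the add-on's inputs.conf.spec; check the spec file shipped with the app for the real attribute names:

[kafka://my_kafka_input]
zookeeper_connect_host = localhost
zookeeper_connect_port = 2181
topic_name = my_topic
group_id = splunk_kafka_consumer
index = main
sourcetype = kafka_message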

Logging

Any log entries/errors will get written to $SPLUNK_HOME/var/log/splunk/splunkd.log

JVM Heap Size

The default heap maximum is 64MB.
If you require a larger heap, you can alter this in $SPLUNK_HOME/etc/apps/kafka_ta/bin/kafka.py on line 95 (see the example below).
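
The maximum heap is passed to the JVM via the standard -Xmx argument; assuming line 95 sets it that way, raising the heap just means changing, for example:

-Xmx64m (the default 64MB maximum)

to

-Xmx512m (a 512MB maximum)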

JVM System Properties

You can declare custom JVM System Properties when setting up new input stanzas.
Note: these JVM System Properties apply to the entire JVM context and therefore to all stanzas you have set up (example below).
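
For example, standard JVM properties such as proxy settings or a default charset can be supplied as a space-delimited string of -D options (the exact setup field to enter them in is whichever field the input provides for additional JVM arguments):

-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dfile.encoding=UTF-8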

Customized Message Handling

The way in which the Modular Input processes the received Kafka messages is entirely pluggable with custom implementations should you wish.

To do this you code an implementation of the com.splunk.modinput.kafka.AbstractMessageHandler class and package it in a jar.

Ensure that the necessary jars are in the $SPLUNK_HOME/etc/apps/kafka_ta/bin/lib directory.

If you don't need a custom handler then the default handler com.splunk.modinput.kafka.DefaultMessageHandler will be used.

This handler simply tries to convert the received byte array into a textual string for indexing in Splunk.

Code examples are on GitHub : https://github.com/damiendallimore/SplunkModularInputsJavaFramework/tree/master/kafka/src/com/splunk/modinput/kafka
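
As a rough standalone sketch of the kind of logic a handler performs (the class and method names below are illustrative only and do not show the actual AbstractMessageHandler API; refer to the GitHub examples above for the real abstract methods to override):

import java.nio.charset.StandardCharsets;

// Standalone illustration only: a real handler extends
// com.splunk.modinput.kafka.AbstractMessageHandler rather than this class.
public class ExampleDecodeLogic {

    // Convert the received message body into a text event,
    // which is what the default handler effectively does.
    public static String decode(byte[] messageBody) {
        return new String(messageBody, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] raw = "hello from kafka".getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(raw)); // prints: hello from kafka
    }
}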

Troubleshooting

  • Check that the JAVA_HOME environment variable is set, or that "java" is on the PATH, for the user environment that Splunk runs as
  • You are using Splunk 5+
  • You are using a 1.7+ Java Runtime
  • You are targeting Kafka version 0.8+
  • You are running on a supported operating system
  • Look for any errors in $SPLUNK_HOME/var/log/splunk/splunkd.log
  • Run this command as the same user that you are running Splunk as and observe console output : "$SPLUNK_HOME/bin/splunk cmd python ../etc/apps/kafka_ta/bin/kafka.py --scheme"

Contact

This project was initiated by Damien Dallimore, damien@baboonbones.com

Release Notes

Version 1.0
April 21, 2016

Added a new custom handler: com.splunk.modinput.kafka.CSVWithHeaderDecoderHandler
This allows you to expand CSV data (with or without a header row) into KV or JSON before indexing.

Example config you could pass to the custom message handler when you declare it:

headers=header1:header2:header3,outputFormat=json,hasHeaderRow=false

Version 0.9.2
Nov. 3, 2015

Better JSON handling for HEC output (hat tip to Tivo)

Version 0.9.1
Oct. 22, 2015

Better logging around HEC success/failure
Can now add a custom timestamp into the HEC payload
New custom handler (JSONBodyWithTimeExtraction) for pulling the timestamp out of JSON messages from Kafka and adding it to the HEC payload

Version 0.9
Sept. 22, 2015

Added support for optionally outputting to Splunk via an HTTP Event Collector (HEC) endpoint

Version 0.8.1
June 20, 2015

Added support for a raw connection string format so that multiple Zookeeper hosts
can be provided in a comma-delimited manner,
i.e.: hostname1:port1,hostname2:port2,hostname3:port3/chroot/path

Version 0.8
June 17, 2015

Added chroot support for zookeeper connection strings

Version 0.7
Feb. 11, 2015

Enabled TLS1.2 support by default.
Made the core Modular Input Framework compatible with the latest Splunk Java SDK.
Please use a Java Runtime version 7+.
If you need to use SSLv3, you can turn this on in bin/kafka.py by setting SECURE_TRANSPORT = "ssl" in place of the default:
SECURE_TRANSPORT = "tls"
#SECURE_TRANSPORT = "ssl"

Version 0.6
Feb. 5, 2015

You can now pass a charset name to the DefaultHandler

Version 0.5
June 30, 2014

Initial beta release
