


SHA256 checksum (hadoop-monitoring_10.tgz): ee852ec0e9ae04058f8e4925e341e727609b254dc71f55338cb5672ad14ad66d

Hadoop Monitoring

This app is NOT supported by Splunk.
The Hadoop Monitoring Add-on allows a Splunk software administrator to collect Yarn and Hadoop log files, as well as OS metrics from the Hadoop nodes. The app was tested with the Hortonworks, Cloudera, and MapR distributions. After the Splunk platform indexes the events, you can analyze the data by building searches and dashboards. The add-on includes a few sample prebuilt dashboard panels and reports.

Steps to deploy:

1) In the app, go to the examples directory
2) Copy the example inputs file that matches your Hadoop distribution (Hortonworks, Cloudera, or MapR) to the local directory
3) Edit the file and make sure all the stanzas point to the correct Hadoop log directories
4) Rename the file to inputs.conf
5) Deploy the forwarder and inputs.conf to all the Hadoop nodes
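
After the forwarders are deployed, a quick sanity check is to search across the app's indexes and confirm events are arriving from every node. This search is a sketch, not one of the app's shipped samples; it uses only default Splunk fields:

```
index=hadoopmon_* | stats count by index, sourcetype, host
```

If a node or sourcetype is missing from the results, recheck the corresponding stanza paths in its inputs.conf.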

Included in the App:

1) The app creates three indexes: hadoopmon_metrics, hadoopmon_os, and hadoopmon_configs
2) The index hadoopmon_metrics includes sourcetypes to monitor Yarn and HDFS: namenode, datanode, secondarynamenode, nodemanager, and resourcemanager. The index hadoopmon_os includes sourcetypes to monitor disk, CPU, and memory usage.
3) The app includes sample searches, reports, and dashboards. However, the assumption is that you will add many more reports based on the collected indexed data.
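
For reference, the three indexes named above would be declared in an indexes.conf along these lines. This is a hedged sketch: the path settings are generic placeholders, not the app's actual shipped values.

```
# Hypothetical indexes.conf sketch for the app's three indexes;
# homePath/coldPath/thawedPath values are placeholders.
[hadoopmon_metrics]
homePath   = $SPLUNK_DB/hadoopmon_metrics/db
coldPath   = $SPLUNK_DB/hadoopmon_metrics/colddb
thawedPath = $SPLUNK_DB/hadoopmon_metrics/thaweddb

[hadoopmon_os]
homePath   = $SPLUNK_DB/hadoopmon_os/db
coldPath   = $SPLUNK_DB/hadoopmon_os/colddb
thawedPath = $SPLUNK_DB/hadoopmon_os/thaweddb

[hadoopmon_configs]
homePath   = $SPLUNK_DB/hadoopmon_configs/db
coldPath   = $SPLUNK_DB/hadoopmon_configs/colddb
thawedPath = $SPLUNK_DB/hadoopmon_configs/thaweddb
```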

You can also modify the inputs.conf file to monitor Spark, HBase, Flume, Sqoop, Oozie, or any other component shipped with Hadoop.
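
Adding another component follows the same monitor-stanza pattern used by the sourcetype/index pairs below. The following is a sketch only: the Oozie log path and sourcetype name are assumptions, not values shipped with the add-on.

```
# Hypothetical stanza for monitoring an extra component (Oozie here);
# point the monitor path at wherever your distribution writes its logs.
[monitor:///var/log/oozie]
sourcetype = hadoop_oozie
index = hadoopmon_metrics
disabled = 0
```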

Sample inputs.conf to Monitor Hortonworks Yarn, Hadoop, Hbase, and Spark

Hortonworks Yarn Log Files

sourcetype = hadoop_historyserver
index = hadoopmon_metrics

sourcetype = hadoop_nodemanager
index = hadoopmon_metrics

sourcetype = hadoop_resourcemanager
index = hadoopmon_metrics

Hortonworks Hadoop Log Files

sourcetype = hadoop_datanode
index = hadoopmon_metrics

sourcetype = hadoop_namenode
index = hadoopmon_metrics

Hbase Monitoring

sourcetype = hbase_master
index = hbase_metrics

sourcetype = hbase_regionserver
index = hbase_metrics

Spark Monitoring

disabled = 0
sourcetype = hdfs_spark_metrics
index = spark_metricsdata

sourcetype = spark_historyserver
index = spark_metricsdata

sourcetype = spark_hivethrift
index = spark_metricsdata

Sample searches for Yarn, Hbase, and Spark

[Yarn Top User]
index=hadoopmon_metrics sourcetype=hadoop_resourcemanager APPID=* | top limit=20 USER

[Yarn Success Rate]
index=hadoopmon_metrics sourcetype=hadoop_resourcemanager | top finalState

[Yarn All Applications]
index=hadoopmon_metrics sourcetype=hadoop_historyserver user=* | eval elapsedtime = finishTime - submitTime | table jobName queue user numMaps numReduces status elapsedtime

index="hbase_metrics" sourcetype=hbase_regionserver totalSize=* | table totalSize freeSize max blockCount accesses hits hitRatio cachingAccesses cachingHits cachingHitsRatio evictions

index="spark_metricsdata" sourcetype=hdfs_spark_metrics | top limit=3 "App Name"

index="spark_metricsdata" sourcetype="hdfs_spark_metrics" "App ID"=* | table "App ID" "App Name" User Timestamp | dedup "App ID"

index="spark_metricsdata" sourcetype="hdfs_spark_metrics" "Stage IDs{}"="23" | table Properties.spark.job.description Properties.spark.jobGroup.id Properties.spark.rdd.scope | dedup Properties.spark.jobGroup.id
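
The samples above cover the Yarn, Hbase, and Spark indexes; a comparable search over the OS index might look like the following. This is a sketch: the pctUsed field name is an assumption, since the fields of the OS sourcetypes are not listed here.

```
index=hadoopmon_os | timechart avg(pctUsed) by host
```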

Release Notes

Version 1.0
April 20, 2016
