



Downloading Cisco Umbrella Add-On for Splunk

SHA256 checksums:
cisco-umbrella-add-on-for-splunk_104.tgz  761bdb35a6a5401d22595c1f89399af3a0c2b10558f30685092436973aa9c894
cisco-umbrella-add-on-for-splunk_103.tgz  143b685b7d94cc87ff068ed0a022739fa3ff12d42a00b0d83e59a5efb84087f5
cisco-umbrella-add-on-for-splunk_102.tgz  e15b3f21549d176c9d0b36cf3a4da9aeaac4152267aa15a6f42defcfb7b1ccbf
cisco-umbrella-add-on-for-splunk_101.tgz  cb674496e33a619460d87f7a0cd9b59b9943c951d6ce96448c5462a9f8f1276c
cisco-umbrella-add-on-for-splunk_100.tgz  c04172ef092d23a2a36ecc25e920295bf19c267ad34c701562e49679fbcd113e
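To verify a download, compare its SHA-256 hash against the matching value above. A minimal sketch using sha256sum (a stand-in file is generated here purely so the example is self-contained; in practice substitute the real .tgz and its published hash):

```shell
# Put "<hash>  <filename>" on one line and let sha256sum -c do the comparison.
# The stand-in file below only keeps the example runnable end to end.
printf 'stand-in payload' > cisco-umbrella-demo.tgz
echo "$(sha256sum cisco-umbrella-demo.tgz | cut -d' ' -f1)  cisco-umbrella-demo.tgz" > demo.sha256
sha256sum -c demo.sha256   # prints: cisco-umbrella-demo.tgz: OK
```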

Cisco Umbrella Add-On for Splunk

Splunk AppInspect Passed
Admins: Please read about Splunk Enterprise 8.0 and the Python 2.7 end-of-life changes and their impact on apps and upgrades here.
The purpose of this add-on is to provide CIM-compliant field extractions for Cisco Umbrella (OpenDNS) logs delivered to AWS S3 buckets.

This add-on requires the Splunk Add-on for Amazon Web Services as the means of data on-boarding.

+Built for Splunk Enterprise 6.x.x or higher
+CIM Compliance (CIM 4.0.0 or higher)
+Ready for Enterprise Security
+Requires Splunk Add-on for Amazon Web Services (unless using Cisco Managed S3)
++https://splunkbase.splunk.com/app/1876/
+Supports Cisco Umbrella Log Management Version 1-4
+Supports Cisco Managed S3 buckets via awscli and a simple shell script (See Cisco Managed Buckets Instructions)

Cisco Umbrella Add-on for Splunk

Add-on Homepage:
Author: Hurricane Labs
Version: 1.0.4


INSTALLATION AND CONFIGURATION

Search Head: Required
Heavy Forwarder: Optional
Indexer: Optional
Universal Forwarder: Not Supported
Light Forwarder: Not Supported

This add-on must be installed on your Search Head and wherever the AWS inputs run. The recommended layout is to configure the AWS inputs on a Heavy Forwarder and install this add-on on both the Heavy Forwarder and the Search Head. If your Search Head is also the AWS collection point, the add-on only needs to go on the Search Head. If the AWS collection point is an Indexer, install the add-on on both the Indexer and the Search Head.

Setting up AWS Input

Reference Link: https://support.umbrella.com/hc/en-us/articles/230650987-Configuring-Splunk-for-use-with-Cisco-Umbrella-Log-Management-in-AWS-S3
1. Refer to the reference link to configure AWS and set up the account in Splunk; pick this guide back up when you reach the "Configuring Data Inputs for Splunk" step.
2. Navigate to the Splunk Add-on for AWS
3. Create a new input (Custom Data Type > Generic S3)
4. Select the appropriate AWS Account/Role/S3 Bucket
5. (Recommended) Do not set an S3 Key Prefix, and change the sourcetype to "opendns:s3".
6. (Optional) If you want more control over the data you bring in, create separate inputs using the S3 Key Prefixes "/proxylogs/", "/dnslogs/", and "/iplogs/". Each of these uses its own sourcetype: "opendns:proxylogs", "opendns:dnslogs", and "opendns:iplogs".
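For reference, an input created through the AWS add-on UI ends up as a stanza in that add-on's local/inputs.conf. A rough sketch of what the recommended whole-bucket input produces (the stanza name, account, bucket, and index are placeholders, and attribute names can vary between AWS add-on versions, so treat this as illustrative rather than something to paste in by hand):

```ini
[aws_s3://umbrella-s3]
aws_account = my-aws-account
bucket_name = my-umbrella-bucket
sourcetype = opendns:s3
index = awsindexyouchose
```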

Heavy Forwarder & Search Head (RECOMMENDED)

  1. Install the add-on on the Heavy Forwarder and Search Head. A Splunk restart may be required; you may also attempt a debug refresh.
  2. Configure the AWS Input to scrape your OpenDNS S3 bucket:
    2a. One Input: If you scrape the entire bucket, you MUST use sourcetype "opendns:s3".
    2b. Multiple Inputs: If you want more control over the buckets being scraped, this add-on supports multiple inputs. Umbrella OpenDNS sends three types of logs, under the prefixes "/proxylogs/", "/dnslogs/", and "/iplogs/". Each input should use exactly one of those prefixes, with the matching sourcetype: "opendns:proxylogs", "opendns:dnslogs", or "opendns:iplogs".
  3. Verify data is coming in and that you see the proper field extractions by searching the data.
  4. Example Search: index=awsindexyouchose sourcetype=opendns:* | dedup sourcetype

Search Head Only

  1. Install the add-on on the Search Head. A Splunk restart may be required; you may also attempt a debug refresh.
  2. Configure the AWS Input to scrape your OpenDNS S3 bucket:
    2a. One Input: If you scrape the entire bucket, you MUST use sourcetype "opendns:s3".
    2b. Multiple Inputs: If you want more control over the buckets being scraped, this add-on supports multiple inputs. Umbrella OpenDNS sends three types of logs, under the prefixes "/proxylogs/", "/dnslogs/", and "/iplogs/". Each input should use exactly one of those prefixes, with the matching sourcetype: "opendns:proxylogs", "opendns:dnslogs", or "opendns:iplogs".
  3. Verify data is coming in and that you see the proper field extractions by searching the data.
  4. Example Search: index=awsindexyouchose sourcetype=opendns:* | dedup sourcetype

Indexer and Search Head (NOT RECOMMENDED)

  1. Install the add-on on the Indexer and Search Head. A Splunk restart may be required; you may also attempt a debug refresh.
  2. Configure the AWS Input to scrape your OpenDNS S3 bucket:
    2a. One Input: If you scrape the entire bucket, you MUST use sourcetype "opendns:s3".
    2b. Multiple Inputs: If you want more control over the buckets being scraped, this add-on supports multiple inputs. Umbrella OpenDNS sends three types of logs, under the prefixes "/proxylogs/", "/dnslogs/", and "/iplogs/". Each input should use exactly one of those prefixes, with the matching sourcetype: "opendns:proxylogs", "opendns:dnslogs", or "opendns:iplogs".
  3. Verify data is coming in and that you see the proper field extractions by searching the data.
  4. Example Search: index=awsindexyouchose sourcetype=opendns:* | dedup sourcetype

Cisco Managed Buckets Instructions

Reference Links:
https://support.umbrella.com/hc/en-us/articles/360000739983-How-to-Downloading-logs-from-Cisco-Umbrella-Log-Management-using-the-AWS-CLI
https://support.umbrella.com/hc/en-us/articles/360001388406-Configuring-Splunk-with-a-Cisco-managed-S3-Bucket
The AWS Add-on can NOT be used to pull data from Cisco managed S3 buckets. The only way to get this data in is with the awscli tool, set up on a Splunk Enterprise system meant for data ingestion (such as a Heavy Forwarder). Install this add-on on your Search Heads and on the system handling data ingestion. The steps below apply ONLY to the data ingestion box, NOT to your Search Heads. These steps have only been tested on Ubuntu and are provided as is, with no further support or guidance:

-1. Install awscli via apt-get (apt-get install awscli)

-2. In $SPLUNK_HOME/etc/apps/TA-cisco_umbrella/bin create two shell scripts. If you want the shell scripts to write to a different location, make sure you change the paths in both shell scripts!

pull-umbrella-logs.sh

#!/bin/sh
# Clear Splunk's library environment so the system awscli runs cleanly
unset LD_LIBRARY_PATH
unset PYTHONPATH

AWS_ACCESS_KEY_ID=<KEY_HERE> AWS_SECRET_ACCESS_KEY=<KEY_HERE> AWS_DEFAULT_REGION=<REGION_HERE> aws s3 sync s3://<CISCO_BUCKET_PATH_HERE> /opt/splunk/etc/apps/TA-cisco_umbrella/data
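On large buckets the sync can run longer than the 300-second script interval, and overlapping runs may then trip over each other. A sketch of a lock guard using flock (the echo line is a placeholder for the aws s3 sync command above):

```shell
#!/bin/sh
# Skip this cycle if a previous sync still holds the lock.
LOCK=/tmp/pull-umbrella-logs.lock
(
  flock -n 9 || exit 0            # lock is busy: another sync is running
  echo "sync would run here"      # replace with the aws s3 sync line above
) 9>"$LOCK"
```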

delete-old-umbrella-logs.sh

#!/bin/bash

# Removes old data depending on the "Retention Duration" configured in the
# Umbrella dashboard --> Admin --> Log Management. Uncomment exactly one of
# the find lines below to match your retention setting:

# a) 7 days:
#find /opt/splunk/etc/apps/TA-cisco_umbrella/data -type f -name "*.csv.gz" -mmin +11520 -delete >/dev/null 2>&1

# b) 14 days:
#find /opt/splunk/etc/apps/TA-cisco_umbrella/data -type f -name "*.csv.gz" -mmin +21600 -delete >/dev/null 2>&1

# c) 30 days:
#find /opt/splunk/etc/apps/TA-cisco_umbrella/data -type f -name "*.csv.gz" -mmin +44640 -delete >/dev/null 2>&1

The suggested values leave a one day buffer for safety.
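The -mmin thresholds are simply (retention days + 1) * 1440 minutes per day, i.e. the retention period plus the one-day buffer; a quick check:

```shell
# minutes = (retention days + 1-day buffer) * 24 hours * 60 minutes
for days in 7 14 30; do
  echo "$days days -> -mmin +$(( (days + 1) * 24 * 60 ))"
done
# prints:
# 7 days -> -mmin +11520
# 14 days -> -mmin +21600
# 30 days -> -mmin +44640
```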

# Removes old directories with no data (errors for still-populated directories are silenced)
rmdir /opt/splunk/etc/apps/TA-cisco_umbrella/data/dnslogs/* 2>/dev/null

-3. Verify that Splunk is actually able to run the shell scripts without issue by running '$SPLUNK_HOME/bin/splunk cmd sh $SPLUNK_HOME/etc/apps/TA-cisco_umbrella/bin/pull-umbrella-logs.sh'. If this does not work, there is a problem with your setup or keys.

-4. If you are using a different path than the example above, you will need to create $SPLUNK_HOME/etc/apps/TA-cisco_umbrella/local/props.conf and alter this stanza to match:
[source::/opt/splunk/etc/apps/TA-cisco_umbrella/data/dnslogs/...]
TRANSFORMS-umbrella-logs_source = remove_umbrella_date_from_source

-5. In $SPLUNK_HOME/etc/apps/TA-cisco_umbrella/local/inputs.conf create the following stanzas. Make sure you change the path and index in the monitor stanza if necessary!

[script://./bin/pull-umbrella-logs.sh]
disabled = 0
interval = 300
index = _internal
sourcetype = cisco:umbrella:input
start_by_shell = false

[script://./bin/delete-old-umbrella-logs.sh]
disabled = 0
interval = 600
index = _internal
sourcetype = cisco:umbrella:cleanup
start_by_shell = false

[monitor:///opt/splunk/etc/apps/TA-cisco_umbrella/data/dnslogs/*/*.csv.gz]
disabled = 0
index = opendns
sourcetype = opendns:dnslogs

-6. Verify data is coming in and you are seeing the proper field extractions by searching the data.
----Example Search: index=opendns sourcetype=opendns:dnslogs (use the index from your monitor stanza)
----Note: You can look for script output by searching: index=_internal sourcetype=cisco:umbrella:*

New features

  • 1.0.3: Added support for Cisco Managed Bucket

Fixed issues

  • 1.0.1: Added timezone setting as logs are requested in UTC by default.
  • 1.0.2: Fixed "category" field as it was being split into "categories" field which broke lookup table functionality. Removed trailing dot at the end of "query" field.
  • 1.0.3: Added vendor_product field, lowered action field for CIM compliance, altered eventtype to account for managed buckets.
  • 1.0.4: Fixes an issue in the README under delete-old-umbrella-logs.sh

Known issues

Third-party software attributions

DEV SUPPORT

Contact: splunk-app@hurricanelabs.com

Release Notes

Version 1.0.4
July 4, 2020

Updated the README.

Version 1.0.3
Nov. 15, 2019

### New features
+ 1.0.3: Added support for Cisco Managed Bucket

### Fixed issues
+ 1.0.1: Added timezone setting as logs are requested in UTC by default.
+ 1.0.2: Fixed "category" field as it was being split into "categories" field which broke lookup table functionality. Removed trailing dot at the end of "query" field.
+ 1.0.3: Added vendor_product field, lowered action field for CIM compliance, altered eventtype to account for managed buckets.

Version 1.0.2
Jan. 22, 2019

### New features
+ 1.0.2: Added support for Version 3 and 4 which adds additional fields granular_identity_type, identity_type, and blocked_category.

### Fixed issues
+ 1.0.2: Fixed "category" field as it was being split into "categories" field which broke lookup table functionality. Removed trailing dot at the end of "query" field.

Version 1.0.1
July 19, 2018

Version 1.0.1 released.
+ Added timezone setting as logs are requested in UTC by default.
+ Added better instructions for setting up the AWS inputs, alongside a reference link to setup instructions on Umbrella's website.

The purpose of this add-on is to provide CIM-compliant field extractions for Cisco Umbrella (OpenDNS) logs delivered to AWS S3 buckets.

This add-on requires the Splunk Add-on for Amazon Web Services as the means of data on-boarding.

+Built for Splunk Enterprise 6.x.x or higher
+CIM Compliance (CIM 4.0.0 or higher)
+Ready for Enterprise Security
+Requires Splunk Add-on for Amazon Web Services
++https://splunkbase.splunk.com/app/1876/
+Supports Cisco Umbrella Log Management Version 1.1

Version 1.0.0
March 29, 2018

The purpose of this add-on is to provide CIM-compliant field extractions for Cisco Umbrella (OpenDNS) logs delivered to AWS S3 buckets.

This add-on requires the Splunk Add-on for Amazon Web Services as the means of data on-boarding.

+Built for Splunk Enterprise 6.x.x or higher
+CIM Compliance (CIM 4.0.0 or higher)
+Ready for Enterprise Security
+Requires Splunk Add-on for Amazon Web Services
++https://splunkbase.splunk.com/app/1876/
+Supports Cisco Umbrella Log Management Version 1.1

1,937 installs, 3,028 downloads

