Deep Learning Toolkit for Splunk

The Deep Learning Toolkit for Splunk allows you to integrate advanced custom machine learning systems with the Splunk platform. It extends Splunk’s Machine Learning Toolkit with prebuilt Docker containers for TensorFlow, PyTorch and a collection of NLP and classical machine learning libraries. Using predefined workflows for rapid development with Jupyter Lab notebooks, the app enables you to build, test (e.g. using TensorBoard) and operationalise your customised models with Splunk. You can leverage GPUs for compute-intensive training tasks and flexibly deploy models on CPU- or GPU-enabled containers. The app ships with various examples that showcase different deep learning and machine learning algorithms for classification, regression, forecasting, clustering, graph analytics and NLP. This allows you to tackle advanced data science use cases in Splunk’s main areas of IT Operations, Security, Application Development, IoT, Business Analytics and beyond.

Deep Learning Toolkit for Splunk (DLTK)

Prerequisites

Quick start guide

  • Ensure Splunk Machine Learning Toolkit is installed and configured properly for your Splunk deployment.
  • Restart your Splunk instance after installing the Machine Learning Toolkit and the Deep Learning Toolkit App for Splunk.
  • You need an internet-connected Docker environment that is accessible with permissions to pull the prebuilt MLTK container images from Docker Hub and start containers (see the connectivity sketch after this list). If you are running Docker in an air-gapped environment, read the description in the app.
  • Setup the Deep Learning Toolkit App for Splunk by connecting it to your Docker environment using the setup page in the app.
  • Start a development container from the container management dashboard and, depending on your selected image, run one of the examples to verify that the Deep Learning Toolkit app works:
  • Neural Network Classifier Example for Tensorflow
  • Logistic Regression Classifier Example for PyTorch
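
Before working through the setup page, a quick connectivity check against your Docker environment can save time. The following is a minimal sketch using the Docker SDK for Python (the "docker" package); the host URL and the commented-out image name are illustrative placeholders, not values prescribed by the app.

    # Minimal Docker connectivity sketch (assumes the "docker" Python package is installed).
    # The base_url and image name below are placeholders -- adjust to your environment.
    import docker

    client = docker.DockerClient(base_url="tcp://your-docker-host:2375")  # or docker.from_env()

    print("Daemon reachable:", client.ping())                      # True if the Docker API answers
    print("Local images:", [img.tags for img in client.images.list()])

    # Optional: pre-pull a container image so the first launch from the app is faster.
    # The repository/tag shown here is illustrative only.
    # client.images.pull("phdrieger/mltk-container-golden-image-cpu", tag="3.2.0")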

Build your own containers

Extend the app with custom MLTK containers: if you want to rebuild the existing MLTK container images or build your own custom images, navigate to the GitHub repo.
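
If you prefer to script a build rather than use the repo's build scripts, a hedged sketch with the Docker SDK for Python might look like the following; the directory path, tag and registry are placeholders, and the authoritative build process is documented in the GitHub repo.

    # Hypothetical sketch: build and tag a custom MLTK container image from a local
    # checkout of the container repo. Path, tag and registry are placeholders.
    import docker

    client = docker.from_env()

    image, build_log = client.images.build(
        path="./mltk-container",                      # directory containing your Dockerfile (placeholder)
        tag="my-registry/mltk-container-custom:dev",
    )
    for chunk in build_log:
        if "stream" in chunk:
            print(chunk["stream"], end="")

    # Push to a registry your Splunk/Docker environment can reach (optional):
    # client.images.push("my-registry/mltk-container-custom", tag="dev")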

Further information, blogs and resources

FAQ

Q: When I launch a container the first time, I cannot access Jupyter Lab.
A: The selected container image is downloaded automatically from Docker Hub in the background when you launch a container for the first time. Depending on your network, this initial download can take a while, as image sizes range from 2-6 GB. Please allow for some time for the images to be pulled initially from Docker Hub. You can check which Docker images are available locally by running docker images on your CLI.
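
As an alternative to the CLI, the same check can be scripted; this is a small sketch with the Docker SDK for Python that lists whatever images happen to be pulled locally, together with their approximate size.

    # List locally available Docker images with their approximate size in GB,
    # to confirm that a DLTK image has finished downloading.
    import docker

    client = docker.from_env()
    for img in client.images.list():
        size_gb = img.attrs["Size"] / 1e9   # "Size" is reported in bytes
        print(f"{img.tags}: {size_gb:.1f} GB")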

Q: The example dashboards show no results or throw errors.
A: First, ensure that the right container image is downloaded and up and running for the specific example (e.g. TensorFlow examples require a TensorFlow container). Second, verify that the associated notebook code exists in Jupyter Lab and that you have explicitly saved the notebook again (hit the save button). Saving automatically generates a Python module (located in the /app/model folder in Jupyter) that is needed to run the examples and populate the dashboards.
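
To illustrate what that generated module roughly contains, here is a hedged outline; the exact function names and signatures are defined by the app's notebook templates in Jupyter Lab, so treat this as a sketch rather than the authoritative interface.

    # Hypothetical outline of a module saved from a DLTK notebook into /app/model.
    # Consult the notebook template shipped in Jupyter Lab for the real interface.
    import pandas as pd

    def init(df, param):
        """Create and return the model object from the data and parameters Splunk sends."""
        model = {}          # placeholder for e.g. a TensorFlow or PyTorch model
        return model

    def fit(model, df, param):
        """Train the model on the dataframe handed over by the | fit search command."""
        # ... training code ...
        return {"message": "model trained"}

    def apply(model, df, param):
        """Score the dataframe and return results that Splunk renders as fields."""
        return pd.DataFrame({"prediction": [0] * len(df)})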

Q: Where are my notebooks stored in the docker environment?
A: By default, two Docker volumes are automatically mounted for persistence in your Docker environment. Those volumes are named "mltk-container-app" and "mltk-container-notebooks". You can verify this by running docker volume ls on your CLI. Important note: from DLTK version 3.1 onwards there is a new default volume called "mltk-container-data" - see the migration notes below.
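
The equivalent check can also be done from Python; a minimal sketch with the Docker SDK (the volume name prefix is taken from the names above):

    # List Docker volumes and flag the DLTK ones mentioned above.
    import docker

    client = docker.from_env()
    names = [v.name for v in client.volumes.list()]
    print("All volumes:", names)
    print("DLTK volumes present:", [n for n in names if n.startswith("mltk-container-")])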

Q: What is the password for Jupyter Lab?
A: Please have a look at the Model Development Guide page in the Deep Learning Toolkit app.

Notebook migration note for version 3.1

With the addition of Kubernetes support, the way volumes behave has changed. From 3.1 on, and with the use of the golden image, the container directory /srv is the default location, and notebooks and app code live there. In earlier versions, two Docker volumes were mounted into /srv/app and /srv/notebooks; from version 3.1 on these are mapped into a backup folder in Jupyter. To migrate, simply copy your notebooks from the backup folder back into the notebooks and app folders in case those are empty.
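
If you prefer to script the copy instead of using the Jupyter file browser, a minimal sketch run inside the container might look like the following; the backup path is a placeholder, since the exact folder name depends on your container image, so check what Jupyter Lab actually shows before running it.

    # Hypothetical migration helper, run inside the DLTK container (e.g. from a Jupyter
    # terminal or notebook). The backup path below is a placeholder.
    import shutil
    from pathlib import Path

    backup_dir = Path("/srv/backup/notebooks")     # placeholder: your backup folder
    target_dir = Path("/srv/notebooks")

    if target_dir.exists() and not any(target_dir.iterdir()):
        for item in backup_dir.glob("*.ipynb"):
            shutil.copy2(item, target_dir / item.name)
            print(f"restored {item.name}")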

Release Notes

Version 3.2.0
June 22, 2020

Graphics:
- Background graphics in content overview page
- Docker + Kubernetes status green highlight in setup dashboard
- Content UI icons refresh

New docker images for Spark and Rapids:
- Spark Image (Experimental)
- Rapids Image (Experimental)

Content updates:
- Correlation Matrix and seaborn plot embedding
- DGA datashader example
- Spark XGBoost (non distributed, local client only)
- Spark Hello World / Barebone / Pi
- Spark FP Growth
- Spark ALS Recommender System
- Rapids Graph example
- Rapids UMAP example

Other:
- Passwords.conf add for missing kubernetes field
- Search head cluster config replication (Thank you Martin!)
- Return dataframe of arbitrary/changed shapes

Version 3.1.1
May 11, 2020

- Setup options for Kubernetes and Openshift environment
- Refresh of the container image ("golden image") with added Jupyter Lab Extensions for integrated Tensorboard and DASK management
- New example for forecasting with Prophet
- New example for distributed machine learning with DASK
- New example for graph related algorithms with NetworkX
- New examples for device-agnostic PyTorch CPU/GPU
- New example for Japanese language NLP library Ginza
- Fix of splunk_server=local in | rest calls
- Several UI updates on dashboards
- Bugfix with auth_mode in sync handler
- Several other bug fixes

Version 3.0.0
Nov. 29, 2019

This version is only compatible with Splunk 8.0+ and MLTK 5.0+.
If you run on Splunk 7.x and MLTK 4.x, please select, download and install Deep Learning Toolkit version 2.3.0.

Version 2.3.0
Oct. 14, 2019

This version is compatible with Splunk 7.x and MLTK 4.x only.
If you run on Splunk 8.0+ and MLTK 5.0+, please select, download and install Deep Learning Toolkit version 3.0.0.
