


SHA256 checksums
  • deep-learning-toolkit-for-splunk_360.tgz: dd89d21e6b53dc27b2432c6d0e04845a5b67082478876ee5ebe1dc0636ccf968
  • deep-learning-toolkit-for-splunk_350.tgz: 95339a205ca795c43b0e03ab45e394955b66a6119b805b2291e0223e4c1a3761
  • deep-learning-toolkit-for-splunk_340.tgz: 92ec463d6257ade26202be39392586f6f92dbd097d529e7a8d5acb56966398db
  • deep-learning-toolkit-for-splunk_330.tgz: 4216443f36aba411b9b5df210d20a8270b0bf3ef8164861dffef337e56a9ee2f
  • deep-learning-toolkit-for-splunk_320.tgz: 872bb2acf8fa01060d0f422ef19e6d12ae2a1c0b9a4156815f9e6bf6709da211
  • deep-learning-toolkit-for-splunk_311.tgz: 4b9f89497aa87f6c02116bf54e5a7e93294cda5c3013ae00117ef2fa5b40bca7
  • deep-learning-toolkit-for-splunk_300.tgz: 76989eea40a618ab170c1b3a882434c0ca06029e2032f601523611f17d2f5537
  • deep-learning-toolkit-for-splunk_230.tgz: 6a2b82e3bb0b63e80dc33d840dd5a80b1ded3edf4440350a9b14903869359fe8

Deep Learning Toolkit for Splunk

Splunk Cloud
Splunk Labs
This app is NOT supported by Splunk.
The Deep Learning Toolkit App for Splunk (DLTK) lets you integrate advanced custom machine learning systems with the Splunk platform. It extends Splunk's Machine Learning Toolkit (MLTK) with prebuilt Docker containers for TensorFlow, PyTorch and a collection of NLP and classical machine learning libraries. Using predefined workflows for rapid development with Jupyter Lab notebooks, the app enables you to build, test (for example with TensorBoard) and operationalise your customised models with Splunk. You can leverage GPUs for compute-intensive training tasks and flexibly deploy models on CPU- or GPU-enabled containers. The app ships with various examples that showcase deep learning and machine learning algorithms for classification, regression, forecasting, clustering, graph analytics and NLP, letting you tackle advanced data science use cases in Splunk's main areas of IT Operations, Security, Application Development, IoT, Business Analytics and beyond.

Deep Learning Toolkit for Splunk (DLTK)

The latest version of DLTK, 4.x, is available as open source on GitHub: https://github.com/splunk/deep-learning-toolkit - feel free to open issues or join the community and contribute actively. For DLTK 3.x, open issues, report bugs or raise feature requests at https://github.com/splunk/splunk-mltk-container-docker. This app is community supported; please also post your questions on community.splunk.com and engage with the answers there. Thanks for your collaboration!


Quick start guide

  • Ensure Splunk Machine Learning Toolkit is installed and configured properly for your Splunk deployment.
  • Restart your Splunk instance after installing the Machine Learning Toolkit and the Deep Learning Toolkit App for Splunk.
  • You need an internet-connected Docker environment that is accessible with permissions to pull the prebuilt MLTK container images from Docker Hub and to start containers. If you are running Docker in an air-gapped environment, read the description in the app.
  • Set up the Deep Learning Toolkit App for Splunk by connecting it to your Docker environment using the setup page in the app.
  • Start a development container from the container management dashboard and, depending on your selected image, run one of the examples to verify that the Deep Learning Toolkit app works:
    -- Neural Network Classifier Example for TensorFlow
    -- Logistic Regression Classifier Example for PyTorch
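
The logistic-regression example above runs as PyTorch code inside a DLTK container, but the underlying technique can be sanity-checked in a few lines of dependency-free Python. The following is a minimal illustrative sketch (plain gradient descent on a 1-D toy dataset), not the code the app actually ships:

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=1000):
    """Fit a 1-D logistic regression y ~ sigmoid(w*x + b) by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            grad_w += (p - y) * x                     # gradient of log loss w.r.t. w
            grad_b += (p - y)                         # gradient of log loss w.r.t. b
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

def predict(w, b, x):
    """Classify x as 1 if the predicted probability is at least 0.5."""
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5 else 0

# Separable toy data: class 0 below zero, class 1 above zero.
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = train_logistic(xs, ys)
```

The DLTK example does the same kind of fit, only with a PyTorch model inside the container and Splunk search results as the training data.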

Build your own containers

Extend the app with custom MLTK containers: if you want to rebuild the existing MLTK container images or build your own custom images, see https://github.com/splunk/splunk-mltk-container-docker


List of available algorithm examples (35)

  • Binary neural network classifier built on Keras and TensorFlow
  • Logistic regression using PyTorch
  • Multi class neural network classifier using PyTorch with GPU
  • Multi class neural network classifier using PyTorch for DGA
  • Gradient boosting model with Spark's MLLib applied to the DGA dataset
  • XGBoost classifier with SHAP explainability
  • Automated machine learning with auto-sklearn
  • Linear regression using the TensorFlow™ estimator class
  • Regression using the TensorFlow™ Deep Neural Network (DNN) estimator class
  • XGBoost regressor
  • Support vector regressor with grid search
  • Multivariate LSTM regressor
  • Forecasting time series using TensorFlow (CNN)
  • Forecasting time series using TensorFlow (LSTM)
  • Forecasting time series using the Prophet library
  • Basic auto encoder using TensorFlow™
  • Distributed algorithm execution with DASK for KMeans
  • Clustering with UMAP and DBSCAN
  • Named Entity Recognition using spaCy for NLP tasks
  • Named Entity Recognition using spaCy Ginza (Japanese)
  • Sentiment Analysis using spaCy
  • Graph centrality algorithms using NetworkX
  • Graph community detection with Rapids (GPU accelerated)
  • Causal Inference with causalnex
  • Frequent itemset mining with Spark FP Growth
  • Recommender system with Spark Collaborative Filtering
  • Rapids UMAP (GPU accelerated)
  • Process Mining with PM4Py
  • Time series analysis with STUMPY
  • Changepoint Detection
  • Robust Random Cut Forest for anomaly detection
  • Time series decomposition (STL)
  • DGA Datashader Visualization Example
  • Correlation Matrix and Pair Plot
  • Spark Pi Hello World example
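
The time series decomposition example above uses an STL implementation inside the container; the underlying idea — splitting a series into trend, seasonal and residual parts — can be sketched in plain Python. This is a simplified additive decomposition with a centered moving-average trend, not the LOESS-based STL algorithm itself:

```python
def decompose_additive(series, period):
    """Naive additive decomposition: trend via a centered moving average,
    seasonal component via per-phase means of the detrended series."""
    n = len(series)
    half = period // 2
    # Centered moving average as the trend estimate (None at the edges).
    trend = [None] * n
    for i in range(half, n - half):
        window = series[i - half:i + half + 1]
        trend[i] = sum(window) / len(window)
    # Seasonal component: average detrended value for each phase of the period.
    detrended = [s - t for s, t in zip(series, trend) if t is not None]
    phases = [i % period for i, t in enumerate(trend) if t is not None]
    by_phase = {}
    for phase, d in zip(phases, detrended):
        by_phase.setdefault(phase, []).append(d)
    seasonal = {p: sum(v) / len(v) for p, v in by_phase.items()}
    # Residual = observed - trend - seasonal (None where the trend is undefined).
    residual = [
        s - t - seasonal[i % period] if t is not None else None
        for i, (s, t) in enumerate(zip(series, trend))
    ]
    return trend, seasonal, residual
```

On a purely synthetic series (constant trend plus a repeating pattern) the residuals come out near zero, which is a handy sanity check before trusting a decomposition on real data.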


Q: When I launch a container the first time, I cannot access Jupyter Lab.
A: The selected container image is downloaded automatically from Docker Hub in the background when you launch a container for the first time. Depending on your network, this initial download can take a while, as image sizes range from 2-12 GB. Please allow some time for the images to be pulled from Docker Hub initially. You can check which Docker images are available locally by running docker images on your CLI.

Q: When I run DLTK 3.5 locally, my browser shows an insecure connection warning.
A: From DLTK 3.5 onwards, the container images use HTTPS by default with self-signed certificates for the data-transfer-related API endpoints and Jupyter Lab. Many browsers show "insecure connection" warnings, and some allow you to suppress them for the localhost connections used during development. For production use, please work with your administrators to secure your setup and build containers with your own certificates as needed, or use more advanced container environment setups.

Q: The example dashboards show no results or throw errors.
A: First, ensure that the right container image is downloaded and up and running for the specific example (e.g. TensorFlow examples require a TensorFlow container). Second, verify that the associated notebook code exists in Jupyter Lab and that you have explicitly saved the notebook again (hit the save button). Doing so automatically generates a Python module (located in the /app/model folder in Jupyter) which is needed to run the examples and populate the dashboards. Lastly, check that MLTK's app permissions are set to global so that DLTK can use the lookup files used in most of the examples.
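
The module generated from a saved notebook exposes a small set of stage functions that the container endpoint calls when Splunk runs a search. The sketch below is illustrative only — the stage names follow the DLTK 3.x "barebone" template, and the exact signatures in your version may differ; the toy "model" here is just a dict, not a real learner:

```python
import pandas as pd

def init(df, param):
    # Build and return the model object; a plain dict for illustration.
    return {"feature_means": None}

def fit(model, df, param):
    # "Training": remember the per-column means of the numeric features.
    model["feature_means"] = df.mean(numeric_only=True)
    return {"message": "model trained"}

def apply(model, df, param):
    # "Inference": flag rows whose first numeric feature exceeds its training mean.
    col = df.select_dtypes("number").columns[0]
    threshold = model["feature_means"][col]
    return pd.DataFrame({"above_mean": (df[col] > threshold).astype(int)})
```

Splunk's search results arrive in `df` as a pandas DataFrame, and whatever DataFrame `apply` returns is handed back to the search pipeline — which is why the notebook must be saved so this module exists before a dashboard can populate.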

Q: Containers suddenly get stopped in about 1 minute after start.
A: Most likely you have two or more DLTK apps installed and configured to use the same container environment. DLTK 3.x versions include a scheduled search (MLTK Container Sync) that keeps running containers and their associated models in sync with the DLTK app. If more than one DLTK app is running, this can cause synchronization collisions, and containers get stopped as a result. When using DLTK 3.x, please connect each DLTK app in a 1:1 relationship with your Docker and/or Kubernetes environment.

Q: Where are my notebooks stored in the docker environment?
A: By default, two Docker volumes are mounted automatically for persistence in your Docker environment, named "mltk-container-app" and "mltk-container-notebooks". You can verify this by running docker volume ls on your CLI. Important note: from DLTK version 3.1 onwards there is a new default volume called "mltk-container-data" - see the migration notes below.

Q: What is the password for Jupyter Lab?
A: Please have a look at the Model Development Guide page in the Deep Learning Toolkit app.

Notebooks Migration note for change to 3.1 and later versions

The addition of Kubernetes support changed how volumes behave. From version 3.1 onwards, and with the use of the golden image, the container directory /srv is the default location, and notebooks and app code live there. In earlier versions, two Docker volumes were mounted into /srv/app and /srv/notebooks; from version 3.1 on, their contents are mapped into a backup folder in Jupyter. To migrate, simply copy your notebooks from the backup folder back into the notebooks and app folders in case those are empty.
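
If you have many notebooks, the copy step can be scripted. A hedged sketch using Python's shutil — the source and target paths are placeholders you must adjust to the actual backup and notebook folders in your Jupyter environment — that only copies when the target is empty, matching the migration advice above:

```python
import shutil
from pathlib import Path

def restore_from_backup(backup_dir, target_dir):
    """Copy all files from backup_dir into target_dir, preserving the folder
    layout, but only if target_dir is empty so existing work is never clobbered.
    Returns the number of files copied."""
    backup, target = Path(backup_dir), Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    if any(target.iterdir()):
        return 0  # target not empty: leave it alone
    copied = 0
    for src in backup.rglob("*"):
        if src.is_file():
            dest = target / src.relative_to(backup)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy file with metadata
            copied += 1
    return copied
```

For example, something like `restore_from_backup("/srv/backup/notebooks", "/srv/notebooks")` run inside a Jupyter terminal would restore the notebooks folder; the exact backup path depends on your DLTK version and volume setup.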

Release Notes

Version 3.6.0
July 14, 2021
  • Bug fixes, updates and performance improvements
  • Operational overview dashboard
  • Drill-down links on the container management dashboard
  • New examples:
    -- Automated Machine Learning with auto-sklearn
    -- Robust Random Cut Forest for anomaly detection
    -- Time series decomposition for seasonality and trend (STL)
    -- Sentiment Analysis with spaCy
  • Image update: Golden Image CPU 3.6
Version 3.5.0
Feb. 22, 2021

Updated container images for:
- Golden Image CPU
- Golden Image GPU
- Rapids 0.17
- Spark 3.0.1

With improvements for:
- Model management with MLflow
- Integrated GIT version control in Jupyter Lab
- HTTPS for api and jupyter endpoints

Newly added algorithm examples:
- Matrix profiles with STUMPY
- Changepoint Detection
- Multivariate LSTM Regressor

Version 3.4.0
Dec. 21, 2020

Added algorithm examples for
- Support Vector Regressor with grid search in scikit-learn
- Causal inference
Updated Golden Image GPU to version 3.4.1
Example for Process Mining with PM4Py
Other minor fixes

Version 3.3.0
Oct. 6, 2020
  • Added XGBoost Classifier with SHAP for Explainability
  • Added XGBoost Regressor
  • Bug fixes in setup UI
Version 3.2.0
June 22, 2020

- Background graphics in content overview page
- Docker + Kubernetes status green highlight in setup dashboard
- Content UI icons refresh

New docker images for Spark and Rapids:
- Spark Image (Experimental)
- Rapids Image (Experimental)

Content updates:
- Correlation Matrix and seaborn plot embedding
- DGA datashader example
- Spark GradientBoosting (non distributed, local client only)
- Spark Hello World / Barebone / Pi
- Spark FP Growth
- Spark ALS Recommender System
- Rapids Graph example
- Rapids UMAP example

- Passwords.conf add for missing kubernetes field
- Search head cluster config replication (Thank you Martin!)
- Return dataframe of arbitrary/changed shapes

Version 3.1.1
May 11, 2020
  • Setup options for Kubernetes and Openshift environment
  • Refresh of the container image ("golden image") with added Jupyter Lab Extensions for integrated Tensorboard and DASK management
  • New example for forecasting with Prophet
  • New example for distributed machine learning with DASK
  • New example for graph related algorithms with NetworkX
  • New examples for device-agnostic PyTorch CPU/GPU
  • New example for Japanese language NLP library Ginza
  • Fix of splunk_server=local in | rest calls
  • Several UI updates on dashboards
  • Bugfix with auth_mode in sync handler
  • Several other bug fixes
Version 3.0.0
Nov. 29, 2019

This version is only compatible with Splunk 8.0+ and MLTK 5.0+
If you run on Splunk 7.x and MLTK 4.x please select, download and install Deep Learning Toolkit version 2.3.0

Version 2.3.0
Oct. 14, 2019

This version is only compatible with Splunk 7.x and MLTK 4.x.
If you run on Splunk 8.0+ and MLTK 5.0+ please select, download and install Deep Learning Toolkit version 3.0.0



