The latest version of DLTK 4.x is available open source on GitHub: https://github.com/splunk/deep-learning-toolkit - feel free to open issues or join the community and actively contribute. For DLTK 3.x, feel free to open issues, report bugs or raise feature requests on https://github.com/splunk/splunk-mltk-container-docker. This app is community supported: please also refer to community.splunk.com to post your questions and engage with answers. Thanks for your collaboration!
Extend the app with custom MLTK Containers: if you want to rebuild the existing MLTK Container images or build your own custom images, navigate to https://github.com/splunk/splunk-mltk-container-docker
Q: When I launch a container the first time, I cannot access Jupyter Lab.
A: The selected container image is downloaded automatically from Docker Hub in the background when you launch a container for the first time. Depending on your network, this initial download can take a while, as the image sizes range from 2-12 GB. Please allow some time for the images to be pulled. You can check which docker images are available locally by running docker images on your CLI.
Q: The example dashboards show no results or throw errors.
A: First, ensure that the right container image is downloaded and up and running for the specific example (e.g. TensorFlow examples require a TensorFlow container). Secondly, verify that the associated notebook code exists in Jupyter Lab and that you have explicitly saved the notebook again (hit the save button). Saving the notebook automatically generates a Python module (located in the /app/model folder in Jupyter) which is needed to run the examples and populate the dashboards.
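For orientation, the generated module typically exposes the staging functions defined in the notebook template. The following is a minimal sketch only; the exact function names, signatures and folder layout depend on your DLTK version and the notebook template you start from, so treat everything below as an illustrative assumption rather than the literal generated code.

import pandas as pd

def init(df, param):
    # build and return the model object (e.g. a scikit-learn or TensorFlow model)
    model = {}
    return model

def fit(model, df, param):
    # train the model on the dataframe handed over by the | fit command
    return {"message": "model trained"}

def apply(model, df, param):
    # score the dataframe handed over by the | apply command and return results to Splunk
    return pd.DataFrame({"prediction": [0] * len(df)})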
Q: Where are my notebooks stored in the docker environment?
A: By default, two docker volumes are automatically mounted for persistence in your docker environment. Those volumes are named "mltk-container-app" and "mltk-container-notebooks". You can verify this by running docker volume ls on your CLI. Important note: from DLTK version 3.1 onwards there is a new default volume called "mltk-container-data" - see the migration notes below.
Q: What is the password for Jupyter Lab?
A: Please have a look at the Model Development Guide page in the Deep Learning Toolkit app.
With the addition of Kubernetes support, the way volumes behave has changed. From version 3.1 on, and with the use of the golden image, the container directory /srv is the default location and notebooks and app code live there. In earlier versions there were two docker volumes mounted into /srv/app and /srv/notebooks; from version 3.1 on these are mapped into a backup folder in Jupyter. To migrate, simply copy your notebooks from the backup folder back into the notebooks and app folders in case those are empty.
Added algorithm examples for
- Support Vector Regressor with grid search in scikit-learn (see the sketch after this list)
- Causal inference
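To illustrate the Support Vector Regressor item above, here is a minimal, self-contained scikit-learn sketch of a grid search over SVR hyperparameters. It uses toy random data and an assumed parameter grid; the actual example notebook shipped with the app may differ.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# toy data standing in for the dataframe DLTK would hand to the notebook
X = np.random.rand(100, 3)
y = np.random.rand(100)

param_grid = {"C": [0.1, 1, 10], "kernel": ["rbf", "linear"]}  # assumed grid for illustration
search = GridSearchCV(SVR(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
predictions = search.best_estimator_.predict(X)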
Updated Golden Image GPU to version 3.4.1
Example for Process Mining with PM4Py
Other minor fixes
- Added XGBoost Classifier with SHAP for Explainability (see the sketch after this list)
- Added XGBoost Regressor
- Bug fixes in setup UI
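As a rough illustration of the XGBoost Classifier with SHAP item above, here is a minimal sketch using the public xgboost and shap APIs on toy data; it is not the code of the shipped example notebook.

import numpy as np
import xgboost
import shap

# toy binary classification data
X = np.random.rand(200, 4)
y = (X[:, 0] + X[:, 1] > 1).astype(int)

model = xgboost.XGBClassifier().fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # per-row, per-feature contributions used for explainability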
Graphics:
- Background graphics in content overview page
- Docker + Kubernetes status green highlight in setup dashboard
- Content UI icons refresh
New docker images for Spark and Rapids:
- Spark Image (Experimental)
- Rapids Image (Experimental)
Content updates:
- Correlation Matrix and seaborn plot embedding
- DGA datashader example
- Spark GradientBoosting (non distributed, local client only)
- Spark Hello World / Barebone / Pi
- Spark FP Growth
- Spark ALS Recommender System
- Rapids Graph example
- Rapids UMAP example
Other:
- Added missing kubernetes field to passwords.conf
- Search head cluster config replication (Thank you Martin!)
- Return dataframe of arbitrary/changed shapes
- Setup options for Kubernetes and Openshift environment
- Refresh of the container image ("golden image") with added Jupyter Lab Extensions for integrated Tensorboard and DASK management
- New example for forecasting with Prophet (see the sketch after this list)
- New example for distributed machine learning with DASK
- New example for graph related algorithms with NetworkX
- New examples for device-agnostic PyTorch CPU/GPU
- New example for Japanese language NLP library Ginza
- Fix of splunk_server=local in | rest calls
- Several UI updates on dashboards
- Bugfix with auth_mode in sync handler
- Several other bug fixes
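To illustrate the Prophet forecasting item above, here is a minimal sketch of the public Prophet API on synthetic data; depending on your installation the package is imported as prophet or fbprophet, and the shipped example notebook may differ.

import pandas as pd
from prophet import Prophet  # older installations ship this as fbprophet

# Prophet expects a datestamp column 'ds' and a value column 'y'
df = pd.DataFrame({
    "ds": pd.date_range("2020-01-01", periods=90, freq="D"),
    "y": range(90),
})

m = Prophet()
m.fit(df)
future = m.make_future_dataframe(periods=30)
forecast = m.predict(future)  # contains yhat, yhat_lower, yhat_upper per date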
This version is only compatible with Splunk 8.0+ and MLTK 5.0+
If you run on Splunk 7.x and MLTK 4.x please select, download and install Deep Learning Toolkit version 2.3.0
This version is compatible with Splunk 7.x and MLTK 4.x only.
If you run on Splunk 8.0+ and MLTK 5.0+ please select, download and install Deep Learning Toolkit version 3.0.0