This add-on exports your Splunk search results to remote destinations so you can do more with your Splunk data. It provides search commands and alert actions to export, push, upload, or share your data to multiple destinations of each type. Configure the app via the Setup dashboard before using it. The Setup dashboard includes a connection test feature in the form of a Browse action for all file-based destinations.
Use the Credentials tab to manage usernames, passwords, and passphrases (used for private keys) within the Splunk secret store. Certain use cases (such as private key logins) may not require a password, but Splunk requires one to be entered anyway. For passphrases, enter any description in the username field. For credential pairs such as AWS access keys, enter the access key ID in the username field and the secret access key in the password field. Due to the way Splunk manages credentials, the username field cannot be changed once it is saved.
Grant the read capability for each command to users who need to run the search command or alert action, and the write capability to users who need to change its configuration. By default, admin has full access and power has read-only access. Credential permissions must be granted separately, but are required to use each command that depends on them.
Export Splunk search results to AWS S3-compatible object storage. Connections can be configured to authenticate using OAuth credentials or the assumed role of the search head EC2 instance.
<search> | epawss3
target=<target name/alias>
bucket=<bucket>
outputfile=<output path/filename>
outputformat=[json|raw|kv|csv|tsv|pipe]
fields="<comma-delimited fields list>"
compress=[true|false]
Syntax: target=<target name/alias>
Description: The name/alias of the destination connection
Default: The target specified as the default within the setup dashboard
Syntax: bucket=<bucket name>
Description: The name of the destination S3 bucket
Default: Specified within the target configuration
Syntax: outputfile=<[folder/]file name>
Description: The name of the file to be written to the destination. If compress=true, a .gz extension will be appended. If compress is not specified and the filename ends in .gz, compression will be applied automatically.
Default: app_username_epoch.ext (e.g. search_admin_1588000000.log). Extension by format: json=.json, csv=.csv, tsv=.tsv, pipe=.log, kv=.log, raw=.log
Keywords: __now__ = epoch timestamp, __today__ = date in yyyy-mm-dd format, __nowft__ = timestamp in yyyy-mm-dd_hhmmss format
Syntax: outputformat=[json|raw|kv|csv|tsv|pipe]
Description: The format for the exported search results
Default: csv
Syntax: fields="field1, field2, field3"
Description: Limit the fields to be written to the exported file. Wildcards are supported.
Default: All (*)
Syntax: compress=[true|false]
Description: Create the file as a .gz compressed archive
Default: Specified within the target configuration
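For example, the following search exports a summary of recent errors to S3 as a compressed, timestamped JSON file. The target alias my_s3 and bucket my_bucket are placeholders for your own Setup dashboard configuration:

```
index=_internal log_level=ERROR earliest=-24h
| stats count by component
| epawss3 target=my_s3 bucket=my_bucket outputfile="errors/splunkd_errors___nowft__.json" outputformat=json compress=true
```

Because compress=true, the written object receives a .gz extension automatically.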
Export Splunk search results to Box cloud storage. Box must be configured with a Custom App that uses Server Authentication (with JWT), and a keypair certificate must be generated. The app must then be submitted to the Box administrator for approval. The administrator should create a folder within the app's account and share it with the appropriate users.
<search> | epbox
target=<target name/alias>
outputfile=<output path/filename>
outputformat=[json|raw|kv|csv|tsv|pipe]
fields="<comma-delimited fields list>"
compress=[true|false]
Syntax: target=<target name/alias>
Description: The name/alias of the destination connection
Default: The target specified as the default within the setup dashboard
Syntax: outputfile=<[folder/]file name>
Description: The name of the file to be written to the destination. If compress=true, a .gz extension will be appended. If compress is not specified and the filename ends in .gz, compression will be applied automatically.
Default: app_username_epoch.ext (e.g. search_admin_1588000000.log). Extension by format: json=.json, csv=.csv, tsv=.tsv, pipe=.log, kv=.log, raw=.log
Keywords: __now__ = epoch timestamp, __today__ = date in yyyy-mm-dd format, __nowft__ = timestamp in yyyy-mm-dd_hhmmss format
Syntax: outputformat=[json|raw|kv|csv|tsv|pipe]
Description: The format for the exported search results
Default: csv
Syntax: fields="field1, field2, field3"
Description: Limit the fields to be written to the exported file. Wildcards are supported.
Default: All (*)
Syntax: compress=[true|false]
Description: Create the file as a .gz compressed archive
Default: Specified within the target configuration
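For example, the following search writes a daily CSV report to Box, restricted to two fields and named with the __today__ token. The target alias my_box and folder path are placeholders for your own configuration:

```
index=sales sourcetype=transactions earliest=-1d@d latest=@d
| stats sum(amount) AS total by region
| epbox target=my_box outputfile="reports/daily_sales___today__.csv" fields="region, total"
```

Since outputformat is omitted, the default csv format applies.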
Export Splunk search results to SMB file shares.
<search> | epsmb
target=<target name/alias>
outputfile=<output path/filename>
outputformat=[json|raw|kv|csv|tsv|pipe]
fields="<comma-delimited fields list>"
compress=[true|false]
Syntax: target=<target name/alias>
Description: The name/alias of the destination connection
Default: The target specified as the default within the setup dashboard
Syntax: outputfile=<[folder/]file name>
Description: The name of the file to be written to the destination. If compress=true, a .gz extension will be appended. If compress is not specified and the filename ends in .gz, compression will be applied automatically.
Default: app_username_epoch.ext (e.g. search_admin_1588000000.log). Extension by format: json=.json, csv=.csv, tsv=.tsv, pipe=.log, kv=.log, raw=.log
Keywords: __now__ = epoch timestamp, __today__ = date in yyyy-mm-dd format, __nowft__ = timestamp in yyyy-mm-dd_hhmmss format
Syntax: outputformat=[json|raw|kv|csv|tsv|pipe]
Description: The format for the exported search results
Default: csv
Syntax: fields="field1, field2, field3"
Description: Limit the fields to be written to the exported file. Wildcards are supported.
Default: All (*)
Syntax: compress=[true|false]
Description: Create the file as a .gz compressed archive
Default: Specified within the target configuration
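For example, the following search writes key-value formatted results to an SMB share, using the __now__ token for a unique epoch-stamped filename. The target alias my_share and path are placeholders for your own configuration:

```
index=_internal sourcetype=splunkd component=Metrics earliest=-1h
| epsmb target=my_share outputfile="exports/metrics___now__.log" outputformat=kv compress=false
```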
Export Splunk search results to SFTP servers.
<search> | epsftp
target=<target name/alias>
outputfile=<output path/filename>
outputformat=[json|raw|kv|csv|tsv|pipe]
fields="<comma-delimited fields list>"
compress=[true|false]
Syntax: target=<target name/alias>
Description: The name/alias of the destination connection
Default: The target specified as the default within the setup dashboard
Syntax: outputfile=<[folder/]file name>
Description: The name of the file to be written to the destination. If compress=true, a .gz extension will be appended. If compress is not specified and the filename ends in .gz, compression will be applied automatically.
Default: app_username_epoch.ext (e.g. search_admin_1588000000.log). Extension by format: json=.json, csv=.csv, tsv=.tsv, pipe=.log, kv=.log, raw=.log
Keywords: __now__ = epoch timestamp, __today__ = date in yyyy-mm-dd format, __nowft__ = timestamp in yyyy-mm-dd_hhmmss format
Syntax: outputformat=[json|raw|kv|csv|tsv|pipe]
Description: The format for the exported search results
Default: csv
Syntax: fields="field1, field2, field3"
Description: Limit the fields to be written to the exported file. Wildcards are supported.
Default: All (*)
Syntax: compress=[true|false]
Description: Create the file as a .gz compressed archive
Default: Specified within the target configuration
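For example, the following search uploads a weekly audit extract to an SFTP server. The target alias my_sftp and path are placeholders for your own configuration; because the filename ends in .gz, compression is applied automatically even though compress is not specified:

```
index=_audit action=search earliest=-7d
| table _time user search
| epsftp target=my_sftp outputfile="audits/search_audit___today__.csv.gz" outputformat=csv
```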
Push Splunk search results to a Splunk HTTP Event Collector (HEC) listener.
<search> | ephec
target=<target name/alias>
host=[host_value|$host_field$]
source=[source_value|$source_field$]
sourcetype=[sourcetype_value|$sourcetype_field$]
index=[index_value|$index_field$]
Syntax: host=[host_value|$host_field$]
Description: Field or string to be assigned to the host field on the pushed event
Default: $host$, or if not defined, the hostname of the sending host (from inputs.conf)
Syntax: source=[source_value|$source_field$]
Description: Field or string to be assigned to the source field on the pushed event
Default: $source$, or if not defined, it is omitted
Syntax: sourcetype=[sourcetype_value|$sourcetype_field$]
Description: Field or string to be assigned to the sourcetype field on the pushed event
Default: $sourcetype$, or if not defined, json
Syntax: index=[index_value|$index_field$]
Description: The remote index in which to store the pushed event
Default: $index$, or if not defined, the remote endpoint's default.
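For example, the following search pushes server error events to a remote HEC endpoint. The target alias central_hec and index name are placeholders for your own configuration; $host$ passes each event's own host field through, while the literal sourcetype and index values apply to every pushed event:

```
index=web status>=500 earliest=-15m
| ephec target=central_hec host=$host$ sourcetype=web:errors index=remote_web
```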
Having trouble with the app? Feel free to email us and we’ll help you sort it out. You can also reach the author on the Splunk Community Slack.
We welcome your input on our app feature roadmap, which can be found on Trello.
Fixed SHC configuration replication.
Fixed issues with the EC2 Assumed Role functionality.
Fixed alert action issues (compensating for blank inputs).
Fixed issues with CSS overriding the Splunk default styles.
Fixed automated React dependency vulnerabilities identified by GitHub.
Fixed AM/PM time format in destination Browse file listing interface.
Fixed parsing of output file paths (relative/combined with default_path, or absolute).
Updated token replacements logic to support combined tokens and strings.
Updated alert action configuration REST searches to use splunk_server=local.
Updated the Splunk SDK.
Removed Python 2 libraries.
Cleaned up imports.
Fixed app setup issue on Splunk 8.2.
Updated dashboard jQuery version.
Initial release