This app allows you to push Splunk search results to remote destinations. Currently supports:
search | s3ep credential=<credential name> bucket=<bucket> outputfile=<[folder/]filename> outputformat=[json|raw|kv|csv|tsv|pipe] compression=[true|false] fields=<fields list>
Push Splunk events to AWS S3 as JSON or raw text. The S3 bucket can be configured on the app setup page or directly in hep.conf. The command can be configured to use the assumed role of the search head EC2 instance, or up to 20 OAuth credentials can be configured.
Syntax: credential=<credential name>
Description: The name/alias of the configured credential
Default: The credential specified in the "default_credential" setting in hep.conf will be used. This can be specified in the file by ID (e.g. credential1) or alias. If none is set and use_arn is enabled in the configuration, the assumed role of the local EC2 instance will be used.
Syntax: bucket=<bucket name>
Description: The name of the destination S3 bucket
Default: The bucket name defined in hep.conf, aws stanza
Syntax: outputfile=<[folder/]file name>
Description: The name of the file to be written to the S3 bucket. If compression=true, a .gz extension will be appended. If compression is not specified and the filename ends in .gz, compression will automatically be applied.
Default: <app>_<user>_<epoch time>.<ext> (e.g. search_admin_1588000000.log), where the extension depends on the output format: json=.json, csv=.csv, tsv=.tsv, pipe=.log, kv=.log, raw=.log
The following keywords are substituted in the file name:
__today__ = date in yyyy-mm-dd format
__nowft__ = timestamp in yyyy-mm-dd_hhmmss format
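For example, the date and timestamp keywords can be combined with a folder path (the folder and file names here are placeholders):

```
outputfile="backups/__today__/search_results___nowft__.json"
```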
Syntax: outputformat=[json|raw|kv|csv|tsv|pipe]
Description: The format written for the output events/search results
Syntax: fields="field1, field2, field3"
Description: Limit the fields to be written to the S3 file
Default: All (Unspecified)
Syntax: compression=[true|false]
Description: Compress the output into a .gz file before uploading to S3
Default: false, unless outputfile ends in .gz
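Putting the arguments together, a typical invocation might look like the following (the credential alias, bucket name, and field list are placeholders; since compression=true, a .gz extension is appended to the uploaded file):

```
index=_internal sourcetype=splunkd log_level=ERROR
| s3ep credential=my_aws_cred bucket=my-splunk-exports outputfile="exports/errors___nowft__.csv" outputformat=csv compression=true fields="_time, host, component, message"
```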
search | hep host=[host_value|$host_field$] source=[source_value|$source_field$] sourcetype=[sourcetype_value|$sourcetype_field$] index=[index_value|$index_field$]
Push Splunk events to an HTTP listener (such as Splunk HEC) over JSON.
Syntax: host=[host_value|$host_field$]
Description: Field or string to be assigned to the host field on the pushed event
Default: $host$, or if not defined, the hostname of the sending host (from inputs.conf)
Syntax: source=[source_value|$source_field$]
Description: Field or string to be assigned to the source field on the pushed event
Default: $source$, or if not defined, the field is omitted
Syntax: sourcetype=[sourcetype_value|$sourcetype_field$]
Description: Field or string to be assigned to the sourcetype field on the pushed event
Default: $sourcetype$, or if not defined, json
Syntax: index=[index_value|$index_field$]
Description: The remote index in which to store the pushed event
Default: $index$, or if not defined, the remote endpoint's default index
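For example, to forward matching events to the configured HEC endpoint while overriding the event metadata (the remote index name here is a placeholder), you could run:

```
index=web status=500
| hep host=$host$ source="proxy:hep" sourcetype=access_combined index=remote_web
```

Here host=$host$ reuses each event's existing host field, while source and sourcetype are set to literal string values.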
We welcome your input on our app feature roadmap, which can be found on Trello.
- New setup page (removed legacy setup.xml)
- Refactored the setup process for S3 credentials.
- Removed S3 "region" setting and s3ep search command parameter.
- Added "default credential" option for AWS.
- Added "Use ARN" functionality to leverage the Search Head EC2 instance's assumed role (as returned by get-caller-identity).
- Added compression option for S3 file output as a search command argument.
- Added native credential encryption and decryption using splunk.secret, so that regular users can invoke the search commands with no special permissions.
- Python version 3 explicitly set for all scripts.
- Better Python 2/3 cross-compatibility.
** Special thanks to Steve McMaster for his help with the AES credential encryption/decryption via his splunksecrets code. He made special changes for us so it would work as a module within Splunk's Python environment.
Moved configuration from alert_actions.conf to hep.conf
Added search command for HEC Event Push (hep)
Added search command for S3 Event Push (s3ep)