This search command is packaged with the following external library:
+ Splunk SDK for Python version 1.6.2 (http://dev.splunk.com/python)
Nothing further is required for this add-on to function.
This application adds a Luhn algorithm checking capability to Splunk Enterprise so you can easily search logs coming from PCI sources for potential credit card numbers. The goal is for administrators to find this data so it can be properly masked before it comes into Splunk. This command was created because regex checking alone isn't enough to reliably identify credit card numbers.
Possible use cases for this command:
Using this command will result in two new fields in your data:

+ `ta_luhn_matches` - all values matched in the input data
+ `ta_luhn_check` - `True` or `False`
The command can be controlled with the following options:

| Option | Default | Values | Description |
| --- | --- | --- | --- |
| disable_extraction | no | yes, no | Disables data extraction. Should be set to 'yes' when you do not require extraction of values from inside other text |
| input_field | _raw | string value | The field to evaluate for possible credit card numbers |
| output_prefix | ta_luhn_ | string value | Prefix for fields added to events |
| regex | | regular expression | When extraction is enabled, this looks for all matches within the given data and extracts them for further analysis |
| ccpattern_regex | see below | regular expression | A credit card number is considered a valid match if it matches this regex and passes the Luhn check. |
Search raw data, extracting any sequence of 13-30 characters comprised of digits, spaces, and/or dashes as potential credit card numbers:

```
sourcetype = pci_log_source | luhn
```

Look at a single field in a datasource, extraction not required:

```
sourcetype = pci_log_source | luhn input_field=cc_num_field disable_extraction="yes"
```

Specify your own regex for extraction:

```
sourcetype = pci_log_source | luhn disable_extraction="no" input_field=_raw regex="(\\d+)"
```

NOTE: You have to escape the regex when providing it via Splunk search.
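Outside of Splunk, the extraction step can be sketched in plain Python. The pattern below is an assumption based on the description above (runs of 13-30 characters of digits, spaces, and dashes), not the command's actual source:

```python
import re

# Assumed default extraction pattern, per the description above:
# runs of 13-30 characters made up of digits, spaces, and dashes.
EXTRACT_RE = re.compile(r"[\d\- ]{13,30}")

raw = "user=jdoe paid with card 4111-1111-1111-1111 at checkout"

# Pull candidate sequences, then strip surrounding whitespace.
# These candidates would still need to pass the Luhn check.
candidates = [m.strip() for m in EXTRACT_RE.findall(raw)]
print(candidates)
```

Note that such a broad character class will also pick up timestamps and other digit runs, which is exactly why the Luhn check exists as a second filter.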
It basically works like this:

1. The command verifies that `input_field` exists in the data.
2. If extraction is enabled, the extraction regex pulls candidate values out of the field; otherwise the field value itself is evaluated.
3. Each candidate that matches `ccpattern_regex` is compared against the Luhn algorithm.
4. If any match is found, `ta_luhn_check` will be set to `True`, and all found matches are stored in `ta_luhn_matches`.
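The Luhn checksum itself is a standard algorithm and easy to reproduce. The following is a minimal standalone sketch of the check applied to each candidate, not the app's actual code:

```python
def luhn_check(candidate: str) -> bool:
    """Return True if the digits in candidate pass the Luhn checksum."""
    digits = [int(c) for c in candidate if c.isdigit()]
    if not digits:
        return False
    total = 0
    # Walk the digits right to left, doubling every second one;
    # doubled values over 9 have 9 subtracted (digit-sum of the product).
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_check("4111-1111-1111-1111"))  # True: passes the checksum
print(luhn_check("4111-1111-1111-1112"))  # False: fails the checksum
```

A sequence that matches `ccpattern_regex` but fails this checksum is discarded, which is what keeps false positives lower than regex matching alone.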
If support is required or you would like to contribute to this project, please reference: https://gitlab.com/johnfromthefuture/TA-luhn. This app is supported by the developer as time allows.
+ Tested compat with Splunk 8.1
+ Upgraded Splunk SDK to 1.6.14
+ Confirmed compatibility with Splunk 8 / Python 3
+ Small code changes to better support py3 in the future
+ Tested compatibility with Splunk 7.3
+ Removed configuration to force command to run locally to support distributed streaming
+ Tested compatibility with Splunk 7.2
+ Refactored to use Splunk SDK for Python instead of intersplunk
+ Updated README to a markdown file better suited for the git repository
+ Extraction regex can be bypassed to speed up checking, generally improving the command's performance. (Set `disable_extraction=yes`. Best used when looking at individual fields of data.)
+ Added option `cc_regex` to allow control over this secondary regex (Reference Section: How it Works)
+ Option `output_field` was replaced with `output_prefix` allowing better/more consistent control over output fields.
+ Output fields are now `output_prefix`+`luhn_check` and `output_prefix`+`luhn_matches`
+ Fixed default metadata.
+ Added appicon images for compatibility with certification.
1.0 Initial release