Splunk New Technology Add-ons: SyncKVStore and SendToHEC

I recently updated and replaced some older repositories in my GitHub account that were hand-made modular alerts for sending search results to other Splunk instances. The first one sends search results to a Splunk HTTP Event Collector (HEC) receiver. The second one came from our Splunk .conf 2016 talk on the KVStore. It is useful for sending search results (typically an inputlookup of a table) to a remote KVStore lookup table.

TA-Send_to_HEC

You can find the updated Send To HEC TA on Splunkbase: TA-Send_to_HEC or in my GitHub repository: TA-Send_to_HEC.

This is useful for taking search results and sending them to another Splunk instance using HEC. If you choose JSON mode it will send the results as a JSON payload of all the fields after stripping any hidden fields (hidden fields start with an underscore). RAW mode is a new option that takes the _raw field and sends ONLY that field to the remote HEC receiver.
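The JSON versus RAW logic boils down to something like this sketch (the function name is mine, not the TA's literal code):

def build_payload(result, mode="json"):
    # RAW mode sends only the _raw field to the remote HEC receiver.
    if mode == "raw":
        return result.get("_raw", "")
    # JSON mode drops hidden fields (those starting with "_") and sends the rest.
    return {field: value for field, value in result.items()
            if not field.startswith("_")}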

TA-SyncKVStore

This has been completely redone. I have submitted it to Splunkbase, but for the moment you can get it from my GitHub repository: TA-SyncKVStore.

Originally it only sent search results to a remote KVStore. Now it also has two modular inputs. The first pulls a remote KVStore collection (table) into a local KVStore collection. The second pulls the remote KVStore collection but indexes it locally in JSON format, stripping the hidden fields before forming the JSON payload to index. You are responsible for making sure all the appropriate and matching KVStore collections exist.

If you look in the code you will notice an unusual hybrid: the Splunk SDK for Python handles the KVStore actions, while my own Python class does the batch saving of data to the collection. I could not get the batch_save method from the SDK to work at all. My own class already existed, threaded for performance, from the old version of the modular input. So I use the SDK to clear data when you want the replace option, and my own code to save the new or updated data.
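For illustration, here is a single-threaded sketch of that hybrid. Hostnames, credentials, and the app namespace are placeholders, and the TA's real saving class is threaded:

import json
import requests
import splunklib.client as client

# Connect to the remote and local instances (placeholder credentials).
remote = client.connect(host="remote.example.com", port=8089,
                        username="admin", password="changeme")
local = client.connect(host="localhost", port=8089,
                       username="admin", password="changeme")

# Pull every row from the remote collection using the SDK.
rows = remote.kvstore["mycollection"].data.query()

# Replace option: clear the local collection first, also via the SDK.
local.kvstore["mycollection"].data.delete()

# Save the rows with a direct call to the batch_save REST endpoint,
# since the SDK's batch_save method would not cooperate.
url = ("https://localhost:8089/servicesNS/nobody/TA-SyncKVStore"
       "/storage/collections/data/mycollection/batch_save")
requests.post(url, auth=("admin", "changeme"), verify=False,
              headers={"Content-Type": "application/json"},
              data=json.dumps(rows))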

I rebuilt both of these TAs using the awesome Splunk Add-on Builder. This made it easy, in the SyncKVStore TA, to store the credentials in Splunk's internal encrypted storage. One update to my previous post on credential storage: the Add-on Builder was recently updated and now gives much better management of multiple credentials, with a "global account" pull-down selector you can use in your inputs and alert actions.


Splunk, Adafruit.io, and MQTT

I have been enjoying the Splunk HTTP Event Collector (HEC) since its introduction in Splunk v6.3. You can check out a Python class I made for it over on the Splunk Blog. That got me started back on data collection from my Raspberry Pi. I can just send data straight into Splunk using the HEC. But what if I wanted data from a remote Raspberry Pi?

Adafruit.io:

That brought me back to messing around with my beta Adafruit.io account. This is a data bus service from Adafruit that is perfect for your DIY Internet of Things projects. You can find a lot of their learning tutorials on it in the Adafruit LMS. I did some minor playing over the holiday. Then Lady Ada went and made a tutorial specifically on MQTT.

MQTT and Splunk:

I remembered seeing a modular input for MQTT on Splunkbase. Why not try it out with Adafruit.io? Well, the answer was… it's Java dependent. I love Damien's work, which is awesome as always. But the Splunk admin side of me cannot stand having to install Java to make a feature work. He is trying to convince me to make a Python-based version myself. We shall see if I can make the time. Was there an alternative? Why… yes there is. That is how we come back full circle to the HTTP Event Collector and my Python class.

Mixing Chocolate and Peanut Butter:

I took the Adafruit Python class for adafruit.io and its example code. Just import my HEC class and modify the Adafruit code a little. Now we have a bridge between the Adafruit MQTT client example and Splunk via the HEC. This let me take the value posted to a given MQTT feed on Adafruit.io and send it into Splunk, with a single listening Raspberry Pi running a Python script local to my Splunk instance.

The code I used was the MQTT Client example. Just add the import and creation of an HEC object at the top of the script, right before the Adafruit_IO import section.
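Something like this, with placeholder token and host values:

# Add above the Adafruit_IO import section. Assumes the
# http_event_collector class from my Splunk blog post is on the Python path.
from splunk_http_event_collector import http_event_collector

http_event_collector_key = "00000000-0000-0000-0000-000000000000"  # your HEC token
http_event_collector_host = "localhost"                            # your Splunk host

hec = http_event_collector(http_event_collector_key, http_event_collector_host)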

Next we add the following to the bottom of the message method in the Adafruit code.
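Roughly like this, where message() is the existing callback from the Adafruit example and the event shape is illustrative:

def message(client, feed_id, payload):
    # Called whenever the client receives a new value on a subscribed feed.
    print('Feed {0} received new value: {1}'.format(feed_id, payload))
    # New: wrap the feed value in an event and ship it to Splunk via HEC.
    event = {"event": {"feed": feed_id, "value": payload}}
    hec.sendEvent(event)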

That is it. Now as long as the script is running it takes the value from a monitored Adafruit.io MQTT feed and kicks it over into Splunk via the HEC. Enjoy!


Splunk sessionKeys and Python Indentation

On sessionKeys to the Kingdom:

I started making a scripted input app to pull in logs from the LastPass Enterprise API. Everything was progressing nicely until I found I could not retrieve the encrypted API key from Splunk where I had saved it. I was going to be smart and re-use my credentialsFromSplunk.py class that I created for alert scripts. That is when I beat my head on the wall. Scripted inputs get sent a sessionKey if you set the passAuth value for your script in the inputs.conf stanza. Stdin is also how sessionKeys are sent to alert scripts, so I figured my existing code would work great.

I kept getting authentication failures on the API. It turns out I had not put in a logDebug event for the original sessionKey as it came in, so I had not noticed an inconsistency. SessionKeys sent to scripted inputs do NOT have the "sessionKey=" string at the front, which Splunk does include when sending keys to alert scripts. Re-using my existing code, which clips off those eleven characters, broke the sessionKey value. I share it here in case you are learning new Splunk features that depend on the sessionKey: check your original values if you get authentication errors on the API. The sessionKey could contain extra text you have to remove, or it could be URL encoded.
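A defensive way to normalize the key, whichever flavor you receive, looks something like this sketch:

import sys
try:
    from urllib.parse import unquote  # Python 3
except ImportError:
    from urllib import unquote        # Python 2

raw_key = sys.stdin.readline().strip()
# Alert scripts get "sessionKey=<key>"; scripted inputs get the bare key.
if raw_key.startswith("sessionKey="):
    raw_key = raw_key[len("sessionKey="):]
# The key may also arrive URL encoded.
session_key = unquote(raw_key)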

Remember that you can submit feature enhancement requests through the Splunk support portal. It is important when you find such inconsistencies to submit them so Splunk developers can eventually fix them. I did for this one.

On Indentation in Python Scripts:

I used to give no thought to using tabs in my Python scripts as long as the code indented right. Now I am firmly in the "four spaces in place of a tab" camp. I have gone through and updated all the code in my git repo to be four-space based. I was getting unexplained bugs where code might or might not execute when expected. It turned out to be all indentation related: code was running only when an IF stanza above it was true, and a pesky hidden tab was causing it. So if you edit in vi/vim, make sure to change your config to use four spaces and put it in your .vimrc file for persistence.
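These are the settings; drop them in your .vimrc:

" Use four spaces in place of tabs.
set tabstop=4      " display tabs as four columns
set shiftwidth=4   " indent operations move by four columns
set softtabstop=4  " backspace removes a full indent level
set expandtab      " insert spaces when the Tab key is pressed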


Splunk a DNS Lookup for Abuse Contacts

Let's follow up on our DNS theme from the last post. I have used my alert scripting in the past to block attackers, such as those scanning heavily against SSH. Now I want to start emulating the complaint notifications you can get from using fail2ban. So let's start by adding a simple external command lookup that gets the abuse contact for a given IP address. We will actually use the method found in the fail2ban complain module, so big thanks to them!

We want to have a search like this:
tag=authentication action=failure | stats count values(user) by src_ip | lookup abuseLookup ip AS src_ip

Once you add the transforms stanza and Python script below, the command should work in Splunk. Keep in mind that, like the dnsLookup, this has to be set up on any search head that needs it. I also have not yet made this handle IPv6, which abusix.com can do with its lookups. The new abuseLookup will return a field to your events called abusecontact. You can then use that however you want when reporting events.

First edit your transforms.conf to add this stanza:
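The stanza looks like this (field names match the search above):

[abuseLookup]
external_cmd = abuseLookup.py ip abusecontact
fields_list = ip, abusecontact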

Now create the Python script abuseLookup.py in $SPLUNK_HOME/etc/system/bin/.
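In sketch form it follows the fail2ban complain method: reverse the IP's octets and query the TXT record in the Abusix contact zone. Treat the zone name and details as illustrative; they are taken from fail2ban's complain action.

#!/usr/bin/env python
# abuseLookup.py - external lookup returning the abuse contact for an IP.
import csv
import subprocess
import sys

ABUSE_ZONE = "abuse-contacts.abusix.org"  # illustrative, per fail2ban's complain action

def abuse_contact(ip):
    # e.g. 1.2.3.4 becomes 4.3.2.1.abuse-contacts.abusix.org
    query = ".".join(reversed(ip.split("."))) + "." + ABUSE_ZONE
    try:
        answer = subprocess.check_output(["dig", "+short", "TXT", query])
        return answer.decode().strip().strip('"')
    except subprocess.CalledProcessError:
        return ""

def main():
    # Splunk external lookups receive and return CSV on stdin/stdout.
    reader = csv.DictReader(sys.stdin)
    writer = csv.DictWriter(sys.stdout, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row.get("ip"):
            row["abusecontact"] = abuse_contact(row["ip"])
        writer.writerow(row)

if __name__ == "__main__":
    main()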


Splunk Alert Scripts – Automating Control

A big thanks to the members of the @SplunkDev team who were helpful and patient with my questions while I pulled this together. Thanks guys: @gblock, @damiendallimore and David Noble.

In Splunk circles, you often hear about the holy grail of using Splunk to actively control other systems. It can be hard to find details or good examples on HOW to do it. I am always working on something new that deepens my technical skills; I had not previously dealt with REST APIs or Splunk alert scripts, and this post is the result. Used well, this can replace manual daily operations tasks, changing Splunk from a tool into a team member.

We will cover a working example of using Splunk alert results to update a Google Spreadsheet via the Drive Python SDK. Once you understand how it works, you can make your own controls for any system that supports REST API calls, such as having an Intrusion Prevention System block a list of IP addresses from a scheduled Splunk alert.

We will leverage a Splunk blog post on saving credentials in a Splunk app to avoid leaving our Google credentials hard coded and exposed in the alert script. It turns out alert scripts can use the same mechanism, though it is not well documented. I built a Python class for retrieving those credentials from Splunk so you can re-use the code across many alert scripts. The scripts can all be found in the supporting GitHub repo. You will be able to use these as a framework for your own alert scripts to drive actions in other systems. I will not be stepping through the code itself, as it is fairly well commented. There are plenty of moving parts to this, so you need to be an experienced Splunk administrator to get it working. The benefit is that once you get one working you can make new variants with little effort.
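At its heart, the class reads the sessionKey from stdin and calls splunkd's storage/passwords REST endpoint. A minimal sketch, where "myapp" is a placeholder app namespace and the class in the repo does more error handling:

import sys
import requests
import xml.etree.ElementTree as ET

# Alert scripts receive "sessionKey=<key>" on stdin.
session_key = sys.stdin.readline().strip().replace("sessionKey=", "")

# Ask splunkd for the credentials stored in the app's context.
response = requests.get(
    "https://localhost:8089/servicesNS/nobody/myapp/storage/passwords",
    headers={"Authorization": "Splunk %s" % session_key},
    verify=False)

# Each credential entry exposes the decrypted value as clear_password.
root = ET.fromstring(response.text)
for key in root.iter("{http://dev.splunk.com/ns/rest}key"):
    if key.get("name") == "clear_password":
        clear_password = key.text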

[Image: GoogleSpreadsheet]

Continue reading “Splunk Alert Scripts – Automating Control”


Splunk Presenting Data in statusboard on iPad

Splunk is a great tool for digging into data and presenting the results. Sometimes, you just want a status board of results that comes to you without having to log into a web application. A wonderful app for this is the iPad app statusboard by Panic software.

You could always create a panel on your statusboard that links to the URL of a file for presentation. However, this means your data is not protected by authentication. Panic added Dropbox support, so you can now make a panel that pulls from a CSV or JSON file. You can also AirPlay to an Apple TV or direct connect the iPad to a TV to present the dashboard on a large display.

In this post I will cover how I combined a Splunk alert script in Python, Dropbox, and statusboard to get the result below. I am displaying the number of failed login attempts against my WordPress blog by country code for the previous 7 days. Keep in mind this is a Splunk instance running on my laptop with minimally sensitive information. I would never run Dropbox directly on a work-related production Splunk server. An alternative method would be a scheduled script that pulls the results out of Splunk via the REST API and writes them to a CSV in the Dropbox folder. I will do that version of this post in the future.
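The core of the alert script half is simple. Splunk passes a legacy alert script the path to its gzipped CSV results as argument 8, and we unpack that into a file in the local Dropbox folder for statusboard to pick up. In sketch form, with illustrative paths and any statusboard-specific CSV formatting omitted:

#!/usr/bin/env python
import csv
import gzip
import sys

results_file = sys.argv[8]  # gzipped CSV of the alert's search results
output_path = "/Users/me/Dropbox/statusboard/wp_failed_logins.csv"

with gzip.open(results_file, "rt") as gz:
    rows = list(csv.reader(gz))

with open(output_path, "w") as out:
    csv.writer(out).writerows(rows)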

[Image: WordPress Logins Statusboard]

Continue reading “Splunk Presenting Data in statusboard on iPad”


Splunk Alert Script – OSX Notification Center

I want to start making some custom alert scripts. As usual, I like to practice with a live example. I have SSH remote access and Apache enabled on my laptop. When at work I keep a map up in Splunk on my laptop showing the source IP location of any attempts to connect to it. If you start beating on my laptop it results in an instant ban hammer in the network IPS.

I sometimes miss seeing the map updates when busy. An alert history that is quickly accessible would make it easier to handle the scanning systems. I decided on an alert, run every 15 minutes, that checks the hits on Apache. These logs happen to go into an index called os_osx, and I tagged the combined_access source type as "web".

index=os_osx tag=web | stats count by clientip

Now the fun part. I am working on my Python skills, so I did the alert script in Python. This required calling the OS X shell command osascript in order to execute the AppleScript that generates the actual Notification Center message. It took a minute of experimentation to get the right combination of escaped quotes to build the AppleScript command.

We get a result like this:

[Image: AlertSample]

And here is the alert script, which I saved as osx-alert.py in the /Applications/splunk/bin/scripts folder on my laptop. That is the script I chose to call when saving the search above as an alert.
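In sketch form it looks like this, with illustrative message text:

#!/usr/bin/env python
# osx-alert.py - pop a Notification Center message for each result row.
import csv
import gzip
import subprocess
import sys

results_file = sys.argv[8]  # Splunk passes the gzipped results CSV as argument 8

with gzip.open(results_file, "rt") as gz:
    for row in csv.DictReader(gz):
        text = "Apache hits from %s: %s" % (row.get("clientip"), row.get("count"))
        # osascript runs the AppleScript; note the escaped quoting around the text.
        script = 'display notification "%s" with title "Splunk Alert"' % text
        subprocess.call(["osascript", "-e", script])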


Splunking the Laundry

I have been working on learning Python lately.  One of the best ways to learn is to pick small goals and achieve them.

The laundry room of my apartment building uses a service called Laundryview.com to let people see the status of the washer and dryer units, including time remaining. I have my Raspberry Pi handy, so I set out to put together a Python script to scrape the machine status every fifteen minutes and push the data into splunkstorm.com. This way I can actually trend the machine usage to determine what days of the week and times are most available. Plus, I wanted to see if I could do something new. Below is a sample graph from splunkstorm showing the in-use pattern for the washing machines.

[Image: Washing machine status graph]

If you want to see the Python script just click more. Warning: it is down and dirty. I could have made things more elegant, but it works and I have not had time to polish it up. You will see I use lxml to parse the mobile version of the site for the machine status from a table.
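The full script is behind the link, but the approach boils down to this sketch. The page URL, XPath, and Storm credentials are placeholders, and the real parsing depends on the room's page layout:

#!/usr/bin/env python
# Scrape the mobile Laundryview page and forward machine status to Splunk Storm.
import requests
from lxml import html

PAGE_URL = "http://m.laundryview.com/laundry_room.php?lr=XXXX"  # placeholder room URL
STORM_URL = "https://api.splunkstorm.com/1/inputs/http"         # Storm REST input endpoint
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
PROJECT_ID = "YOUR_PROJECT_ID"

page = html.fromstring(requests.get(PAGE_URL).content)

# Each table row holds a machine name and its status/time remaining.
for row in page.xpath("//table//tr"):
    cells = [cell.text_content().strip() for cell in row.xpath("td")]
    if len(cells) >= 2:
        event = "machine=%s status=%s" % (cells[0], cells[1])
        requests.post(STORM_URL, auth=("x", ACCESS_TOKEN),
                      params={"index": PROJECT_ID, "sourcetype": "laundry"},
                      data=event)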

Continue reading “Splunking the Laundry”
