December 22, 2013: 9:26 am: Splunk

Getting Started

I am often asked how to get started with Splunk when someone becomes interested. The steps below are the same ones I follow myself.

  1. Get the latest build of Splunk and install it on a machine you can test with. Usually this is your daily use laptop or desktop.
  2. Consider your license options. Splunk licensing is based on how much data per day you index into Splunk for searching. The free license will let you index up to 500MB per day. One thing many Splunk administrators do is get a development license for their personal workstation. This will let you index up to 10GB per day and unlock all the enterprise features. It is great for prototyping and testing your parsing, apps, etc. on your workstation before moving them to your production system.
  3. Change the default admin password on Splunk once you log in for the first time. The last thing you want is to be in a coffee shop and have someone poking into data you have indexed into Splunk that you might not want to share.
  4. Change the web interface to use https. Sure, it is the default Splunk SSL certificate, but it is better than no encryption at all. Just enable it under Settings -> System Settings -> General Settings.

If you do not end up using a development license, or your demo license runs out, be sure to firewall Splunk so it cannot be accessed from outside your local machine. Refer back to my comment about someone in a coffee shop digging through your data.
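For reference, both of those protections can also be set from the config side. This is a sketch that assumes a default install layout; restart Splunk after editing:

```
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
# serve Splunk Web over https (uses the bundled certificate by default)
enableSplunkWebSSL = true
# bind Splunk Web to localhost so it cannot be reached from the network
server.socket_host = 127.0.0.1
```

The web UI toggle does the same thing as the first setting; the second is handy if you would rather not rely on a host firewall rule.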


September 14, 2011: 8:46 pm: Presentation

Here are my slides and the tutorial I made for rolling your own logging VM. Between the slides and the tutorial you can find all the links I referenced.

The VM tutorial uses Ubuntu Linux, syslog-ng and Splunk.  I go over how to use syslog-ng with fifo queues to handle multiple sources, and even how to rewrite forwarded syslog events coming from Kiwisyslog before indexing in Splunk.  The tutorial zip has both PDF and ePub formats in it.

*update* I was asked some questions today during my presentation about MS Log Parser, so I added my post on it to the link list below.  Also, for those downloading my actual logging VM from the link I gave to those who attended my talk: the URL does redirect to Dropbox, so do not be surprised.

*second update* A question came up today on a forensics mailing list about searching some evtx event log files.  I suggested using MS Log Parser to replay the output to syslog, with the target being Splunk, as in my logging VM tutorial.  Then the logs are easily searchable.
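The replay command would look roughly like this. It is a sketch: the evtx file name and syslog target are placeholders, and it assumes Log Parser 2.2's EVT input and SYSLOG output formats are available on your box:

```
LogParser -i:EVT -o:SYSLOG "SELECT TimeGenerated, EventID, Strings INTO @192.168.1.50 FROM Security.evtx"
```

From there the logging VM's syslog-ng listener picks the events up and Splunk indexes them like any other syslog source.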


October 21, 2010: 7:19 pm: Admin Tricks

This is sort of a follow-up to my SSH screencast series on remote access to your Mac.  Maybe you are paranoid like me and want to know when a connection has been made to your Mac, when a wrong username has been tried, or even when a login on a good username has failed.  You also want to know this no matter where you are.

I was inspired by the script written by Whitson Gordon over at Macworld on automating turning off your wireless AirPort interface.  Note that what I have below has only been tested on my Snow Leopard setup.  I leave it up to you if you are on Leopard or even Tiger.  BTW, update your system if you are as far back as Tiger. C’mon, join the modern world.

You will need Growl installed; also install growlnotify, and last you need a Growl-to-push notification service like Prowl.  Then have the Prowl app on your iPhone or iPad.

Read on for the scripts and how to get it all working.
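The heart of those scripts can be sketched like this; the sshd log phrases below are my assumptions about what Snow Leopard writes to secure.log, so check your own log first:

```shell
#!/bin/sh
# Decide whether a line from /var/log/secure.log looks like a login event
# worth a push notification. The match patterns are illustrative guesses.
is_suspicious() {
  case "$1" in
    *"Invalid user"*|*"Failed to authenticate"*|*"authentication error"*)
      return 0 ;;
    *)
      return 1 ;;
  esac
}

# Real usage would tail the log and hand matches to growlnotify, which
# Growl then forwards to Prowl on your iPhone or iPad:
#   tail -F /var/log/secure.log | while read -r line; do
#     is_suspicious "$line" && growlnotify -t "SSH alert" -m "$line"
#   done
```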


November 14, 2009: 8:00 am: Admin Tricks

Perhaps you have made yourself a logging VM, or even a logging machine out of an old laptop, using my PDF instructions.  At home I actually turned a really old IBM Thinkpad A22m into an Ubuntu logging machine.  Just like my directions, only no VMware.

I send all my network hardware logs via syslog to that machine.  BUT I also made one simple change to the syslog.conf on every Mac in my house.  Now all my Mac logs collect on that machine for searching in Splunk.

  1. Open Terminal on your Mac.
  2. sudo vi /etc/syslog.conf
  3. Edit the file and add the following line, substituting your own logging machine's IP address.
    *.*               @loggingmachineipaddress
  4. Make sure to use an actual IP address in place of loggingmachineipaddress.  I tried using the Bonjour/mDNS name, like logger.local, and my Macs never sent logs consistently.  After changing to the IP address it worked.
  5. Next, if you are on Leopard, you can run the following two Terminal commands to restart syslogd and pick up the config change.  Otherwise you could also just reboot your Mac.
  6. sudo launchctl unload /System/Library/LaunchDaemons/com.apple.syslogd.plist
  7. sudo launchctl load /System/Library/LaunchDaemons/com.apple.syslogd.plist

November 13, 2009: 2:27 pm: Admin Tricks

Recently I wanted to build a log collection virtual machine.  I settled on a combination of syslog-ng and Splunk.  Syslog-ng lets you do filtering, message rewriting, and routing to multiple destination types.  Splunk v4 gives you a nice ability to search the gathered logs.  You can follow my two documents: the roll-your-own guide covers building the VM, and the getting-started guide covers the last setup tweaks to use and collect certain event types I decided would make a good starting example set.

We use Ubuntu Server 9.10 32-bit with syslog-ng v3 and Splunk v4 in this tutorial.  I built mine in VMware Fusion on my Mac, but you should be able to adapt it to your own box or virtualization of choice.

November 11, 2009: 5:05 am: Admin Tricks

Lately I have been working on making a VMware virtual machine combining syslog-ng version 3 and Splunk.  I wanted to leverage syslog-ng for routing of messages and for rewriting messages from an existing Kiwisyslog server.

Let’s say you have all your network gear sending events to an existing Kiwisyslog install.  You can add an action to forward the messages and include the original source IP.  The problem is that the original IP becomes part of the message.  When it reaches Splunk, you would rather it see the messages as having come from the original host, so you get the best mapping to host fields in Splunk searches.

So we use syslog-ng to receive the forwarded messages and rewrite each message before it is picked up by Splunk.  We tell syslog-ng to listen on UDP port 3514; this is the port we tell Kiwisyslog to forward events to.  Next we tell syslog-ng to write the events to a Linux fifo queue while applying the rewrite.  From there it is easy to tell Splunk to pull events from the fifo.

So click more to see the config I used in syslog-ng to make this work.  The solution is a combination of telling syslog-ng to NOT parse the incoming messages then to apply the rewrite rule.  I do plan on writing a pdf guide on building the logging vm from scratch soon.  But for now you can check out the config below.
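To give a flavor of it, the pieces fit together roughly like this in syslog-ng v3 syntax. The fifo path and the subst() pattern are placeholders, not the exact values from my config:

```
# listen for Kiwisyslog-forwarded events without parsing them
source s_kiwi { udp(port(3514) flags(no-parse)); };

# strip the leading original-source IP that Kiwisyslog prepends
# (the pattern here is illustrative)
rewrite r_kiwi { subst("^[0-9.]+: ", "", value("MESSAGE")); };

# a fifo that Splunk is configured to read as an input
destination d_fifo { pipe("/var/run/kiwi.fifo"); };

log { source(s_kiwi); rewrite(r_kiwi); destination(d_fifo); };
```

The flags(no-parse) part is the important trick: it keeps syslog-ng from re-interpreting the forwarded line before the rewrite runs.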


December 5, 2006: 2:15 pm: General

I have been having fun learning how to combine Dumpevt from Somarsoft and MS Log Parser. Let me say, once you start to get the hang of it you can do some cool things. Also, MS makes a great PDF of Security Event information. You can modify the below to make your own reports for those various IDs.
For example, the below makes a CSV (comma separated values) file showing all user accounts that had EventID 644 lockouts the previous day. Download Dumpevt, MS Log Parser, and the header file I made, all into one folder. Then put the commands below into a bat file in the same folder. Remove my comments that are in bold.

  • First we call Dumpevt for three domain controllers. Obviously this bat file has to be executed as a user who has rights to pull the logs remotely. Dumpevt will actually concatenate the dumps into one file.

del c:\logs\644.csv
del c:\logs\errors.txt
dumpevt /computer=PDC /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt
dumpevt /computer=BDC01 /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt
dumpevt /computer=BDC02 /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt

  • Second we create a temp.csv file concatenating the header I provided with the output of the dumpevt calls.

type dumpevt-header.csv > temp.csv
type 644.csv >> temp.csv

  • Next we call Log Parser. We tell it the input is in CSV format and the first row is the header. We specify what format we want the timestamps in for output. Next we select all fields, parse out the user account name where EventID is 644 and the date is the previous day. We go from temp.csv into a new file, temp2.csv, and have the output in date-time order.

LogParser -i:CSV -headerRow on -iTSFormat:"yyyy-MM-dd" "SELECT *, SUBSTR(EXTRACT_TOKEN(Strings, 1, '^'), 23) AS Account INTO temp2.csv FROM temp.csv WHERE EventID = 644 AND Date = TO_DATE(SUB(SYSTEM_TIMESTAMP(), TIMESTAMP('2', 'd'))) ORDER BY Date, Time"
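As a sanity check on that extraction, the EXTRACT_TOKEN/SUBSTR step can be mimicked in plain shell. The Strings value here is a made-up 644 example, not real Dumpevt output:

```shell
# Hypothetical caret-delimited dumpevt Strings field for a 644 event
strings='ADMINISTRATOR^Target Account Name:   jdoe^WORKSTATION01'

# Token 1 (the second caret-delimited field), then drop the first 23
# characters of label, mirroring SUBSTR(EXTRACT_TOKEN(Strings, 1, '^'), 23)
account=$(printf '%s' "$strings" | cut -d'^' -f2 | cut -c24-)
echo "$account"   # -> jdoe
```

Note Log Parser's SUBSTR index is 0-based while cut's character positions are 1-based, hence 23 becoming -c24-.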

  • Last we run temp2.csv through Log Parser once more. This generates a CSV file called 644report.csv with the columns Date, Time, Computer, and the Account that was locked out. Note it drops all entries where the user account name is blank; this happens with some 644 events. I am not sure why yet, but I am still teaching myself all about this log parsing and interpretation.

LogParser -i:CSV -headerRow on -iTSFormat:"yyyy-MM-dd" "SELECT Date, Time, Computer, Account INTO 644report.csv FROM temp2.csv WHERE STRLEN(Account) > 1"