Perhaps you have made yourself a logging VM, or even a logging machine out of an old laptop using my PDF instructions. At home I actually turned a real old IBM ThinkPad A22m into an Ubuntu logging machine, just like my directions, only without VMware.
I send all my network hardware logs via syslog to that machine. BUT I also made one simple change to the syslog.conf on every Mac in my house, and now all my Mac logs collect on the machine for searching in Splunk.
- Just open Terminal on your Mac.
- sudo vi /etc/syslog.conf
- Edit the file and add the following line, substituting your own logging machine's IP address.
- Make sure to use an actual IP address in place of loggingmachineipaddress. I tried using the Bonjour/mDNS name like logger.local and my Macs never consistently sent logs; after switching to the IP address it worked reliably.
- Next, if you are on Leopard, you can run the following two Terminal commands to restart syslogd and pick up the config change. Otherwise you can just reboot your Mac.
- sudo launchctl unload /System/Library/LaunchDaemons/com.apple.syslogd.plist
- sudo launchctl load /System/Library/LaunchDaemons/com.apple.syslogd.plist
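The forwarding line itself is not reproduced above, so here is a sketch of a standard BSD syslog.conf forwarding entry; the `*.*` selector (every facility at every level) is my assumption, not necessarily the exact selector from the original instructions:

```
# /etc/syslog.conf addition (sketch): forward all messages to the logger.
# Replace loggingmachineipaddress with your logging machine's real IP.
# Note: the whitespace between selector and action should be a tab.
*.*	@loggingmachineipaddress
```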
Recently I wanted to build a log collection virtual machine. I settled on a combination of syslog-ng and Splunk. Syslog-ng lets you do filtering, message rewriting, and routing to multiple destination types. Splunk v4 gives you a nice ability to search the gathered logs. So you can follow my two documents: the roll-your-own guide covers building the VM, and the getting-started guide covers the last setup tweaks to collect certain event types I decided would make a good starting example set.
We use Ubuntu Server 9.10 32-bit with syslog-ng v3 and Splunk v4 in this tutorial. I built mine in VMware Fusion on my Mac, but you should be able to adapt this to your own box/virtualization of choice.
Lately I have been working on making a VMware virtual machine combining syslog-ng version 3 and Splunk. I wanted to leverage syslog-ng for routing messages and for rewriting messages forwarded from an existing kiwisyslog server.
Let's say you have all your network gear sending events to an existing kiwisyslog install. You can add an action to forward the messages and include the original source IP. The problem is that the original IP becomes part of the message text. When it reaches Splunk, you would rather it see the messages as having come from the original host, so you get the best mapping to host fields in Splunk searches.
So we use syslog-ng to receive the forwarded messages, then rewrite each message before it is picked up by Splunk. We tell syslog-ng to listen on UDP port 3514; this is the port we tell kiwisyslog to forward events to. Next we tell syslog-ng to write the events to a Linux FIFO while applying the rewrite. From there it is easy to tell Splunk to pull events from the FIFO.
So click more to see the config I used in syslog-ng to make this work. The solution is a combination of telling syslog-ng NOT to parse the incoming messages and then applying the rewrite rule. I do plan on writing a PDF guide on building the logging VM from scratch soon, but for now you can check out the config below.
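For reference, here is a minimal sketch of what such a syslog-ng v3 config can look like. The object names, the FIFO path, and especially the subst() pattern (which depends on exactly how kiwisyslog prepends the source IP) are my assumptions, not the exact config from the post:

```
# Sketch only -- adapt names, fifo path, and the regex to your setup.
source s_kiwi {
    # Listen on the port kiwisyslog forwards to; no-parse keeps
    # the forwarded message text intact instead of re-parsing it.
    udp(port(3514) flags(no-parse));
};
rewrite r_strip_ip {
    # Strip a leading "original-IP " that kiwisyslog prepended
    # (the pattern is a guess at kiwi's forward format).
    subst("^[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+[ \t]+", "", value("MESSAGE"));
};
destination d_fifo {
    # Splunk is then configured to read this fifo as an input.
    pipe("/var/log/syslog-ng.fifo");
};
log { source(s_kiwi); rewrite(r_strip_ip); destination(d_fifo); };
```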
Continue reading “Logging – syslog-ng rewrite kiwiSyslog forwards”
I have been having fun learning how to combine Dumpevt from Somarsoft and MS Log Parser. Let me say, once you start to get the hang of it you can do some cool things. Also, MS publishes a great PDF on Security Event information; you can modify the commands below to make your own reports for those various event IDs.
For example, the following makes a CSV (comma-separated values) file showing all user accounts that had EventID 644 lockouts the previous day. Download Dumpevt and MS Log Parser, and put them, along with the header file I made, into one folder. Then put the commands below into a bat file in the same folder, removing my comments (the dash-prefixed lines).
- First we call Dumpevt for three domain controllers. Obviously this bat file has to be executed as a user who has rights to pull the logs remotely. Dumpevt will actually concatenate the dumps into one file.
dumpevt /computer=PDC /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt
dumpevt /computer=BDC01 /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt
dumpevt /computer=BDC02 /logfile=sec /outdir=c:\logs\ /outfile=644.csv /all >> c:\logs\errors.txt
- Second, we create a temp.csv file by concatenating the header I provided with the output of the Dumpevt calls.
type dumpevt-header.csv > temp.csv
type 644.csv >> temp.csv
- Next we call Log Parser. We tell it the input is in CSV format and the first row is the header, and we specify the timestamp format to use for output. We select all fields and parse out the user account name where the EventID is 644 and the date is the previous day. We read from temp.csv into a new file, temp2.csv, and sort the output in date/time order.
LogParser -i:CSV -headerRow on -iTSFormat:"yyyy-MM-dd" "SELECT *, SUBSTR(EXTRACT_TOKEN(Strings, 1, '^'), 23) AS Account INTO temp2.csv FROM temp.csv WHERE EventID = 644 AND Date = TO_DATE(SUB(SYSTEM_TIMESTAMP(), TIMESTAMP('2', 'd'))) ORDER BY Date, Time"
- Last, we run temp2.csv through Log Parser once more. This generates a CSV file called 644report.csv with the columns Date, Time, Computer, and the Account that was locked out. Note it drops all entries where the user account name is blank; this happens with some 644 events. I am not sure yet why, but I am still teaching myself all about this log parsing and interpretation.
LogParser -i:CSV -headerRow on -iTSFormat:"yyyy-MM-dd" "SELECT Date, Time, Computer, Account INTO 644report.csv FROM temp2.csv WHERE STRLEN(Account) > 1"