Splunk: Setting Up License Usage Trending

You get a good bit of license usage trending when you install the Deployment Monitor and Splunk on Splunk applications. If you don't use those apps, though, the data in the _internal index ages out over time and you lose your trends beyond approximately 30 days.

I prefer to set up my own index and collect the summarized usage data into it, so I can keep it indefinitely and build easy graphs from it in my daily admin dashboard. This is also handy on a Splunk instance where you do not have the CPU cores to spare for Deployment Monitor's many scheduled searches, such as your admin laptop instance.

Lastly, you may need this data over the long term to justify buying more Splunk license in your next budget as your average usage approaches your license limit.

Make your personal Admin App

First we will create the app that will hold our personal admin searches, dashboards, and so on (a sketch of the resulting app.conf follows the steps).

  1. Go into Apps->Manage Apps
  2. Add New App
  3. Use a naming scheme something like ORG-ADMIN. If this is a personal app on your admin laptop, use MY-ADMIN.
  4. In the folder name field, use the same value as in step 3.
  5. I would set the version to 1.0, leave the radio button set to Visible, and fill in your name as the author.
  6. Leave the selected template as barebones
  7. Click Save to create your new Application
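
Behind the scenes, the barebones template just lays down an app directory. Below is a minimal sketch of what the resulting app.conf looks like, assuming the MY-ADMIN folder name from the steps above; Splunk Web may write slightly different or additional keys depending on version.

    # $SPLUNK_HOME/etc/apps/MY-ADMIN/default/app.conf
    [ui]
    is_visible = 1
    label = MY-ADMIN

    [launcher]
    author = Your Name
    description = Personal admin searches and dashboards
    version = 1.0

    [install]
    is_configured = 0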

Making the Summary Index

  1. Make sure you are in your Admin app by pulling down Apps and selecting MY-ADMIN
  2. Go into Settings->Data->Indexes
  3. Click Add
  4. Name the index “summary_license”. A note on naming schemes: try to decide ahead of time whether you will use underscores or dashes. I would use underscores in index names and dashes in sourcetypes, for example. Just be consistent once you choose.
  5. Take the defaults for the other options unless you know for sure what you want them to be (a minimal indexes.conf equivalent is sketched below).
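
If you prefer configuration files over Splunk Web, a minimal indexes.conf equivalent looks something like the sketch below. The paths shown are the conventional defaults derived from the index name, and a new index defined this way needs a Splunk restart to take effect.

    # $SPLUNK_HOME/etc/apps/MY-ADMIN/local/indexes.conf
    [summary_license]
    homePath   = $SPLUNK_DB/summary_license/db
    coldPath   = $SPLUNK_DB/summary_license/colddb
    thawedPath = $SPLUNK_DB/summary_license/thaweddb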

Creating the Scheduled Data Collection

A comment about scheduling searches: every search consumes available CPU cores, so you end up playing what is almost an admin game of Tetris with the schedule.

Never schedule your own searches on the hour, five-minute, fifteen-minute, or thirty-minute marks. The reason is that if you ever install the Enterprise Security app (or similar), it likes to run jobs on those marks. So I like to schedule searches 2 or 3 minutes to either side of those marks. Some Splunk admins assign a time window to certain departments to spread things out, such as sales-related jobs at 7 minutes after the hour. A few illustrative cron schedules are shown below.
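
Some example cron schedules in that spirit (cron fields are minute, hour, day of month, month, day of week):

    3 1 * * *       runs daily at 1:03 AM, offset from the top of the hour
    33 2 * * *      runs daily at 2:33 AM, offset from the half hour
    7,37 * * * *    runs twice an hour at :07 and :37 instead of :00 and :30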

License data is audited by Splunk nightly just after midnight system time. So we will want to make a scheduled search on that data to run sometime in the early morning hours.

  1. You should still be within your Admin app; if not, go back into it. This saves steps later by ensuring you are in the right app context as we make the new saved search.
  2. Go into Settings->Knowledge->Searches and Reports
  3. Click Add New and make sure you are in your MY-ADMIN app context
  4. Name the search “Admin – Summary Licensing”
  5. Paste the following into the search field
    index=_internal source=*usage.log type=Usage | eval category="splunk_metric" | eval subcategory="indexing" | eval src_type="license_usage" | stats sum(b) as b by st h s pool poolsz category subcategory src_type | collect index=summary_license
  6. Set the start time to -1d@d and the end time to @d. This basically means “yesterday,” with the boundaries at midnight.
  7. Check the box to schedule the search.
  8. Change the schedule type to “cron”
  9. Use the cron schedule “3 1 * * *”, which runs at 1:03 AM each day (see the savedsearches.conf sketch after these steps).
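
Saving the search writes a stanza to the app's savedsearches.conf. Here is a rough sketch of the equivalent, assuming the name and schedule above; check $SPLUNK_HOME/etc/apps/MY-ADMIN/local/savedsearches.conf for what Splunk Web actually wrote.

    [Admin – Summary Licensing]
    enableSched = 1
    cron_schedule = 3 1 * * *
    dispatch.earliest_time = -1d@d
    dispatch.latest_time = @d
    search = index=_internal source=*usage.log type=Usage | eval category="splunk_metric" | eval subcategory="indexing" | eval src_type="license_usage" | stats sum(b) as b by st h s pool poolsz category subcategory src_type | collect index=summary_license

After the first overnight run, a quick sanity check confirms the summary index is filling:

    index=summary_license | stats count by st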

Creating our Daily Admin Dashboard

  1. You should still be within your Admin app; if not, go back into it.
  2. Click Dashboards
  3. Click Add New
  4. Use the title “My-Daily-Admin”
  5. Click the Shared in App permissions tab
  6. Click Create Dashboard
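
Behind the scenes the new dashboard is just a Simple XML view stored in the app, typically under $SPLUNK_HOME/etc/apps/MY-ADMIN/local/data/ui/views/. A minimal skeleton, assuming the file name Splunk generates from the title:

    <dashboard>
      <label>My-Daily-Admin</label>
      <!-- each panel we add below becomes a <row>/<panel> element here -->
    </dashboard>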

Create the Splunk Login Activity Panel

  1. Click Edit->Edit Panels
  2. Click Add Panel
  3. Choose a title of “Splunk Web Login Activity (past 7 days)”
  4. Paste the following into the search field
    index=_internal sourcetype=splunk_web_service action=login | table _time, user, status, clientip | sort -_time
  5. Change the time range to last 7 days and click Add Panel to save it.
  6. I like to leave this one as a statistics table visualization (the Simple XML for the panel is sketched below)
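
In Simple XML that panel comes out roughly like the sketch below. The exact markup varies a bit between Splunk versions, and the earliest/latest values here just approximate the Last 7 days preset.

    <row>
      <panel>
        <table>
          <title>Splunk Web Login Activity (past 7 days)</title>
          <search>
            <query>index=_internal sourcetype=splunk_web_service action=login | table _time, user, status, clientip | sort -_time</query>
            <earliest>-7d</earliest>
            <latest>now</latest>
          </search>
        </table>
      </panel>
    </row>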

Create the Active Forwarders panel

  1. Click Edit->Edit Panels
  2. Click Add Panel
  3. Choose a title of “Active Forwarders (past 7 days)”
  4. Paste the following into the search field
    index=_internal sourcetype=splunkd component="StatusMgr" | timechart span=1h dc(sourceHost) AS ActiveForwarders
  5. Change the time range to last 7 days and click Add Panel to save it.
  6. I like to use the area chart visualization for this one
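
The chart panels follow the same Simple XML pattern, with the visualization chosen through chart options. A sketch for this area chart, under the same version caveat as above:

    <row>
      <panel>
        <chart>
          <title>Active Forwarders (past 7 days)</title>
          <search>
            <query>index=_internal sourcetype=splunkd component="StatusMgr" | timechart span=1h dc(sourceHost) AS ActiveForwarders</query>
            <earliest>-7d</earliest>
            <latest>now</latest>
          </search>
          <option name="charting.chart">area</option>
        </chart>
      </panel>
    </row>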

Create the License Usage Week over Week panel

Note you need the Timewrap application installed for this one.

  1. Click Edit->Edit Panels
  2. Click Add Panel
  3. Choose a title of “License Usage (week over week)”
  4. Paste the following into the search field
    index=summary_license | timechart span=1d sum(b) AS bytes | eval GB=bytes/1024/1024/1024 | timechart span=1d sum(GB) | timewrap 1w
  5. Change the time range to last 3 weeks and click Add Panel to save it.
  6. I like to use the line chart visualization for this one

Create the License Daily Usage panel

  1. Click Edit->Edit Panels
  2. Click Add Panel
  3. Choose a title of “License Daily Usage (past 7 days)”
  4. Paste the following into the search field
    index=summary_license | timechart span=1d sum(b) AS bytes | eval GB=bytes/1024/1024/1024 | timechart span=1d sum(GB) AS GB
  5. Change the time range to last 7 days and click Add Panel to save it.
  6. I like to use the column chart visualization for this one

That should get you started. We will talk about some other panels in future posts, but I do like to add various graphs, radial gauges, and so on showing daily event counts for logs that collect at a forwarder. The reason is that while the Deployment Monitor application can alert you when a forwarder goes completely silent, it cannot tell you when only some of the logs that accumulate at a forwarder stop while the rest keep flowing. For example, my iMac at home sends its OS logs, but it also listens for syslog from my network gear and acts as a syslog collector. If the syslog stops but I still get OS logs, the alert would never come from the Deployment Monitor app. Checking these key “canary in a coal mine” graphs lets me know the logs I rely on most for daily operations have not broken.

I even use the SA-ldapsearch app to pull login information from Active Directory. It broke once when Java was patched on the search head where it was installed; I did not catch it for several days, and it threw off some password change trending counts. Now I can tell each morning that it is still working, because I have a panel in this dashboard showing how many AD accounts it can see.

Enjoy!
