How to view FIXEdge logs in Splunk



This how-to presumes you have a Windows server with a Splunk 6.0 instance installed. Splunk can be downloaded for free from the Splunk website.

Here is a brief explanation of how it works.

Adding FixEdge.log into Splunk

  • Log in at either http://localhost:8000 or http://splunkhostname:8000. You will land on the home page.
  • Choose "Add data".
  • On the next page, titled "Add data", choose the link "A file or directory of files".
  • Click "Next" under the option "Consume any file on this Splunk server".
  • If you are adding the FixEdge log for the first time, on the next page, named "Data preview", choose "Preview data before indexing" and then either type the path to your log file or browse to it by clicking the "Browse server" button:
  • Here you should choose the option "Start a new source type" and click "Continue".

Now you are on the "Data preview" page. The green selection under the "Event" column shows that Splunk has already tried to parse some formatted data. As you can see, it failed to separate FixEdge.log events correctly because of their specific structure. That's why you should click the link "adjust timestamp and event break settings":

Splunk offers some options here for adjusting how log records are parsed. Choose the "Timestamps" sheet. Each event record in FixEdge.log is prefixed with its severity (NOTE, WARN, ERROR, etc.), followed by white-space padding before the time-stamp itself. So you should define a regex and a format string here so that events break correctly. The severity prefix consists of 4 to 5 capital letters in square brackets, so the regex may be defined as "^\[[A-Z]{4,5}\][\s]+" (quote marks not included), and the time-stamp corresponds to the format string "%Y%m%d-%H:%M:%S.%l". Click "Apply" to see the actual event break-up. The green selection now highlights the full time-stamp, and the records are separated correctly. Click "Continue" to proceed.
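As a sanity check, the event-break regex and time-stamp format above can be tried outside Splunk. The sketch below uses Python's re and datetime modules on a hypothetical FixEdge.log line (the sample line content is an assumption; also note that Python's strptime writes subseconds as %f where Splunk's format string uses %l):

```python
import re
from datetime import datetime

# Hypothetical FixEdge.log event line (content assumed for illustration)
line = "[NOTE]    20140312-10:15:30.123 [1234] [CC_Layer] - Session initiated"

# Event-break regex from the text: a 4-5 capital-letter severity in square
# brackets, padded with whitespace before the time-stamp
prefix = re.match(r"^\[[A-Z]{4,5}\][\s]+", line)
assert prefix is not None

# The time-stamp follows the prefix; Python spells Splunk's %l as %f
ts_text = line[prefix.end():prefix.end() + 21]
ts = datetime.strptime(ts_text, "%Y%m%d-%H:%M:%S.%f")
print(ts)
```

If the regex fails to match or the time-stamp does not parse, the format string or prefix regex needs adjusting before feeding it to Splunk.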

In the next dialog you can review and manually correct the settings and give an identifying name to your new source type. Click "Save source type" to place FixEdge.log in its index.

Now go to the Search application to adjust field extraction.

Field extraction configuration

To simplify search strings and speed up searches, you should add the extracted fields to Splunk's knowledge objects so it can index them; then there is no need to parse them on every search.

To do so, return to the "Events" sheet, click the black expansion arrow in the "i" column of some event record, and click "Extract Fields":

As an example, let's break an event record into several fields. To do so we need to extract each field from the raw record with a regex. We just type field names in the Splunk-supported format (?<fieldname>), and Splunk adds those fields to its set. You can make them visible in the selection dialog by activating the corresponding check boxes.

Each field can be extracted separately with an individual regex and saved so that Splunk is aware of it. Splunk can also suggest a regex for field extraction based on examples of raw text you provide.
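The same named-group idea can be sketched outside Splunk. The example below is an illustrative assumption: the sample event line and the combined regex are not the exact expressions Splunk generates, but each (?P<fieldname>...) group plays the role of one extracted field:

```python
import re

# Hypothetical FixEdge.log event (structure assumed for illustration)
event = "[NOTE]    20140312-10:15:30.123 [1234] [CC_Layer] - Session initiated"

# Each named group (?P<fieldname>...) becomes one extracted field
pattern = re.compile(
    r"^\[(?P<severity>[A-Z]{4,5})\]\s+"
    r"(?P<timestamp>\d{8}-\d{2}:\d{2}:\d{2}\.\d+)\s+"
    r"\[(?P<thread_id>\d+)\]\s+"
    r"\[(?P<category>\w+)\]\s+-\s+"
    r"(?P<message>.*)$"
)
fields = pattern.match(event).groupdict()
print(fields)
```

In Splunk each field would normally get its own EXTRACT rule rather than one combined regex; the single pattern here just keeps the sketch compact.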

The time-stamp has the predefined field _time. You can also filter searches with keywords and logical expressions, and search results may be converted into tables with a given list of columns through pipe commands. A search result with field break-up now looks like this:

sourcetype="fix_edge_log" NO CC_Layer | table _time severity thread_id category message

Dashboards: conversion from simple to advanced XML

Any search result visualization may be saved as a dashboard for further use and editing. Several visualizations may also be combined into one dashboard.

Click the "Edit" button, then choose "Edit Source XML" from the drop-down menu to open the UI view source editor. By default, Splunk manages the dashboard layout in so-called "simple XML":

Simple XML itself has only a few options for adjustment and should be converted into "Advanced XML" for further customization.

To do so, add ?showsource=advanced to the dashboard URL, refresh the page, and copy the XML block from the page source window. Then open the view XML editor, replace the simple XML with the advanced XML you copied, and click the "Save" button. From now on your dashboard is managed explicitly as "Advanced XML".

Now you can add advanced controls to the view form, share a common search result between controls, and convert a view or set of views into a Splunk application for deployment.

Making a Splunk application

To convert saved views and dashboards into an application for simple deployment and distribution, you need to create a Splunk package.

A Splunk package is just a tar.gz archive of an application sub-folder in %SPLUNK_HOME%\etc\apps on Windows (or $SPLUNK_HOME/etc/apps on Unix-like OSes).

The application folder should be named after your package filename. It should contain your views under the sub-folder local\data\ui\views\. You should also place the application menu file default.xml under the sub-folder local\data\ui\nav\, in which you specify your application's default view.

Also, to make your application standalone and self-sufficient, you should provide configuration files with Splunk knowledge objects: the host to monitor, the data source definition, event break and field extraction expressions, access restrictions, and so on. All that data can be found under the "Knowledge objects" menu on the Splunk server home page.

The data source definition should be placed in the file local\inputs.conf, under a [monitor://...] stanza pointing at your log file, in a form like the one shown below:

host = EPRUSARW0664
disabled = false
followTail = 0
sourcetype = fix_edge_log

Event break and field extraction expressions should be placed in the file local\props.conf, under a stanza named after your source type (for example [fix_edge_log]), likewise:

TIME_FORMAT = %Y%m%d-%H:%M:%S.%l
TIME_PREFIX = ^\[[A-Z]{4,5}\][\s]+
pulldown_type = 1
EXTRACT-category = (?i)^(?:[^\]]*\]){2}\s+(?P<category>[^ ]+)
EXTRACT-message = [^\.\n]*\.\d+\s+\[\d+\]\s+\[\w+\]\s+\-\s+(?P<message>.*)
EXTRACT-severity = (?i)^\[(?P<severity>[^\]]+)
EXTRACT-thread_id = (?i)^[^\.]*\.\d+\s+(?P<thread_id>[^ ]+)
Compress the application folder into a .tar.gz archive with 7-Zip on Windows (or tar on Linux).
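The packaging step can also be scripted. The sketch below uses Python's tarfile module; the application name fixedge_app and the empty folder layout are assumptions to adapt to your own application:

```python
import os
import tarfile

# Hypothetical application name; the archive must be named after this folder
app = "fixedge_app"

# Create the expected layout: views and the navigation menu sub-folders
os.makedirs(os.path.join(app, "local", "data", "ui", "views"), exist_ok=True)
os.makedirs(os.path.join(app, "local", "data", "ui", "nav"), exist_ok=True)
# ...your view XML files and default.xml would go into those folders...

# Compress the whole folder into a .tar.gz Splunk package
with tarfile.open(app + ".tar.gz", "w:gz") as archive:
    archive.add(app)
```

The resulting fixedge_app.tar.gz can then be installed on another Splunk instance through the "Manage apps" page.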