
Stashbox: Turning a Mac Mini Into A Logstash and Kibana Server

You have a lot of boxes, and you'd like to be able to parse through the logs of all of those boxes at the same time: searching a given timestamp across a set of machines for a specific string (like a filename or a port number). Elasticsearch, Logstash, and Kibana are one way to answer that kind of need. This will involve downloading three separate packages (which, for this article, we'll do in /usr/local) and creating a config file.

First, install the latest Java JDK, available from Oracle's Java SE downloads page (jdk8-downloads-2133151.html).
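Once the JDK is installed, a quick check in Terminal confirms that Java is on your path (the exact version string will vary with the build you downloaded):

java -version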

The following downloads the latest version of Logstash and untars the package into /usr/local/logstash. (I like nesting that logstash-1.4.0 directory inside logstash so that when the next version comes out I can keep it there too; I have plenty of space, and keeping a couple of versions back helps in the event I need an old binary and can't get to it because the version I wrote a script against has since been revved out.)

curl -O https://download.elasticsearch.org/logstash/logstash/logstash-1.4.0.tar.gz
mkdir /usr/local/logstash
tar zxvf logstash-1.4.0.tar.gz -C /usr/local/logstash

Once we have Logstash, we'll grab Elasticsearch similarly:

curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.0.1.tar.gz
mkdir /usr/local/elasticsearch
tar zxvf elasticsearch-1.0.1.tar.gz -C /usr/local/elasticsearch

Then we'll download and untar Kibana in the same manner:

curl -O https://download.elasticsearch.org/kibana/kibana/kibana-3.0.0.tar.gz
mkdir /usr/local/kibana
tar zxvf kibana-3.0.0.tar.gz -C /usr/local/kibana

Next, we'll make a very simple config file, which we'll call /usr/local/stashbox.conf, that listens on port 514 for syslog (note that binding to a port below 1024 means running Logstash as root; otherwise, pick a higher port and point your clients at that instead):

input {
  tcp {
    port => 514
    type => syslog
  }
  udp {
    port => 514
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
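Before firing anything up, it's worth checking that file for typos. Logstash 1.4 can validate a config without actually running it, assuming your build includes the --configtest flag and the paths used above:

/usr/local/logstash/logstash-1.4.0/bin/logstash -f /usr/local/stashbox.conf --configtest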

Next, we'll start Elasticsearch:

/usr/local/elasticsearch/elasticsearch-1.0.1/bin/elasticsearch
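Elasticsearch takes a few seconds to come up. Once it has, a quick curl against its default HTTP port (9200) should return a short block of JSON with the node name and version:

curl http://localhost:9200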

And finally, in a different window, we'll call Logstash with that file as the config file:

/usr/local/logstash/logstash-1.4.0/bin/logstash -f /usr/local/stashbox.conf
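If you want to confirm the pipeline works before pointing real hosts at it, you can hand-craft a syslog-style line and push it at the listener with nc (the <13> priority, testhost, and myapp below are just placeholders); the event should appear in the Logstash window courtesy of the rubydebug codec:

echo "<13>$(date '+%b %d %H:%M:%S') testhost myapp[123]: hello from stashbox" | nc -w1 localhost 514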

Having each of these open in different Terminal windows allows you to see logs in stdout. Next, point a host at your new syslog box. You can use https://krypted.com//windows-server/use-syslog-on-windows for installing Windows clients or https://krypted.com//mac-security/redirect-logs-to-a-syslog-server-in-os-x/ for a Mac. Once that's done, let's get Kibana working. To do so, first edit config.js:

vi /usr/local/kibana/kibana-3.0.0/config.js

Locate the elasticsearch setting and put the URL of the host running Elasticsearch in there (yes, it can be the same box that runs Logstash, as long as you install a web server on that box). Then save the changes.
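For reference, the stock line in Kibana 3's config.js points at port 9200 on whatever host served the page, something like this:

elasticsearch: "http://"+window.location.hostname+":9200",

If Elasticsearch lives on a different box, swap in that host's URL instead (stashbox.krypted.lan below is just a placeholder):

elasticsearch: "http://stashbox.krypted.lan:9200",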

Now move the contents of that kibana-3.0.0 folder into your web directory. Assuming this is a basic OS X Server install, that would be:

cp -R /usr/local/kibana/kibana-3.0.0/* /Library/Server/Web/Data/Sites/Default/

You can then check out your Kibana site at http://localhost, or go straight to http://localhost/index.html#/dashboard/file/logstash.json for the actual search pages, which is what I've bookmarked.

(Screenshot: the default Kibana Logstash dashboard)

For example, to see the impact of periodic scripts in System Logs:

(Screenshot: Kibana filtered to show periodic script activity in the system logs)