The Village Blacksmith

Under a spreading chestnut-tree
The village smithy stands;
The smith, a mighty man is he,
With large and sinewy hands;
And the muscles of his brawny arms
Are strong as iron bands.

His hair is crisp, and black, and long,
His face is like the tan;
His brow is wet with honest sweat,
He earns whate’er he can,
And looks the whole world in the face,
For he owes not any man.

Week in, week out, from morn till night,
You can hear his bellows blow;
You can hear him swing his heavy sledge,
With measured beat and slow,
Like a sexton ringing the village bell,
When the evening sun is low.

And children coming home from school
Look in at the open door;
They love to see the flaming forge,
And hear the bellows roar,
And catch the burning sparks that fly
Like chaff from a threshing-floor.

He goes on Sunday to the church,
And sits among his boys;
He hears the parson pray and preach,
He hears his daughter’s voice,
Singing in the village choir,
And it makes his heart rejoice.

It sounds to him like her mother’s voice,
Singing in Paradise!
He needs must think of her once more,
How in the grave she lies;
And with his hard, rough hand he wipes
A tear out of his eyes.

Toiling,—rejoicing,—sorrowing,
Onward through life he goes;
Each morning sees some task begin,
Each evening sees it close;
Something attempted, something done,
Has earned a night’s repose.

Thanks, thanks to thee, my worthy friend,
For the lesson thou hast taught!
Thus at the flaming forge of life
Our fortunes must be wrought;
Thus on its sounding anvil shaped
Each burning deed and thought.

-Henry Wadsworth Longfellow

Geist Watchdog 15, SNMP, and Splunk

I have a few Geist Watchdog 15 devices in my data center. They do a good job of monitoring, but getting data out of them isn’t as easy as it could be. Their latest firmware does introduce JSON alongside XML. Unfortunately, there is no way to make API calls that return a specific time frame; you have to download the whole log file. Geist relies heavily on SNMP for pulling the information. That is normally fine, but you need the custom MIB file for the device, which makes it a pain. I tried multiple ways to have Splunk grab the values from the device and failed each time. With a deadline to produce a dashboard (it was 11pm and we had people visiting the office at 8am), I put my Google, Linux, and Splunk skills to the test.

First, let’s install the SNMP tools.

# yum install net-snmp net-snmp-devel net-snmp-utils

Let’s check the default locations of the MIBs.

# net-snmp-config --default-mibdirs

We will want to copy the MIBs to the second location.

# cp /tmp/geist_bb_mib.mib /usr/share/snmp/mibs/geist_bb_mib.mib
(Your source location will differ; /tmp/ is where I had copied the file.)

Referencing the MIB Worksheet, we can find the OID for the items we want.  In this script I selected: internalName, internalTemp, internalDewPoint, internalHumidity, tempSensorName, tempSensorTemp

Geist does not put the first period for the OID.  In the worksheet they list internalName as where the SNMP call would be to .  We also need to reference the device ID for the OID at the end of the OID.  The base for the Remote Temperature Sensor is .  To call the first Remote Temperature Sensor I would reference . and the second Sensor is .

To make the call to the device using SNMP, we will be using the snmpget command.

# /usr/bin/snmpget -m all -Ov -v 2c -c public .

-m all = Use all of the MIB files
-Ov = Print values only
-v 2c = Use SNMP version 2c
-c public = Use the public SNMP community string
 = IP address of the Watchdog 15
. = tempSensorName for Device 1

STRING: ExternalTempSensor1

We are almost there. Now to trim the output so it gives us only the second part of the response.

 # /usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'
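The awk step works because the -Ov output has the form "TYPE: value", so awk’s default whitespace splitting drops the type tag and keeps the value. You can see the same thing without a live device by simulating the snmpget output with printf:

```shell
# snmpget -Ov prints something like "STRING: ExternalTempSensor1";
# awk keeps only the second whitespace-separated field.
printf 'STRING: ExternalTempSensor1\n' | awk '{print $2}'
```

One caveat: because only field 2 is kept, a sensor name containing spaces would be truncated at the first space.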

Great, now we are getting just the value. Time to tie the field and value together. Since the internal name is going to be the same but we are gathering multiple values, I also append _temp so I can tell which field I am getting.

InternalName01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
InternalTemp01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section01=$InternalName01"_temp,"$InternalTemp01
echo $Section01

Almost there, now let’s add a date/time stamp.

InternalName01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
InternalTemp01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section01=$InternalName01"_temp,"$InternalTemp01
echo -e `date --rfc-3339=seconds`","$Section01

2016-05-16 22:07:57-05:00,ExternalTempSensor1_temp,871

I repeated the section for the different pieces of sensor data I wanted and ended up with a small script.


InternalName01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
InternalTemp01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section01=$InternalName01"_temp,"$InternalTemp01
echo -e `date --rfc-3339=seconds`","$Section01

InternalDewPoint01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section02=$InternalName01"_dewpoint,"$InternalDewPoint01
echo -e `date --rfc-3339=seconds`","$Section02

InternalHumidity01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section03=$InternalName01"_humidity,"$InternalHumidity01
echo -e `date --rfc-3339=seconds`","$Section03

RemoteName01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
RemoteTemp01=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section04=$RemoteName01"_temp,"$RemoteTemp01
echo -e `date --rfc-3339=seconds`","$Section04

RemoteName02=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
RemoteTemp02=`/usr/bin/snmpget -m all -Ov -v 2c -c public . | awk '{print $2}'`
Section05=$RemoteName02"_temp,"$RemoteTemp02
echo -e `date --rfc-3339=seconds`","$Section05

2016-05-16 22:12:57-05:00,Base_temp,873
2016-05-16 22:12:57-05:00,Base_dewpoint,620
2016-05-16 22:12:57-05:00,Base_humidity,43
2016-05-16 22:12:57-05:00,ExternalSensor1_temp,688
2016-05-16 22:12:57-05:00,ExternalSensor2_temp,717
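The repeated blocks can also be folded into a pair of helper functions. This is just a sketch under my own naming — `poll` and `emit_record` are hypothetical helpers, and the host/OID arguments are placeholders, not values from the MIB worksheet:

```shell
#!/bin/sh
# Hypothetical helper: run the snmpget call from the post against a
# host/OID pair and keep only the value (second field of -Ov output).
poll() {
    /usr/bin/snmpget -m all -Ov -v 2c -c public "$1" "$2" | awk '{print $2}'
}

# Hypothetical helper: print one CSV record in the same shape as the
# script's output: <rfc3339 timestamp>,<name>_<suffix>,<value>
emit_record() {
    printf '%s,%s_%s,%s\n' "$(date --rfc-3339=seconds)" "$1" "$2" "$3"
}

# Sketch of the main flow; HOST and the *_OID variables are placeholders.
# name=$(poll "$HOST" "$NAME_OID")
# emit_record "$name" temp     "$(poll "$HOST" "$TEMP_OID")"
# emit_record "$name" dewpoint "$(poll "$HOST" "$DEWPOINT_OID")"
```

Each additional sensor then costs one `poll`/`emit_record` pair instead of a copy-pasted block.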

I created the folders /opt/scripts/ and /opt/scripts/logs/ and placed the script in /opt/scripts/. I made the script executable with:

# chmod +x /opt/scripts/

I then added it to the crontab. Note that percent signs are special characters in crontab entries and must be escaped with a backslash.

# crontab -e

*/1 * * * * /opt/scripts/ >> /opt/scripts/logs/`date +"\%Y\%d\%m"`_geist.log

You can verify that the script is set to run with:

# crontab -l

*/1 * * * * /opt/scripts/ >> /opt/scripts/logs/`date +"\%Y\%d\%m"`_geist.log
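One thing worth noticing about the `%Y%d%m` format: it produces year, then day, then month, so the log for May 16, 2016 lands in 20161605_geist.log. You can preview the path a given day's entries will be appended to:

```shell
# Preview the log path the cron job writes to, using the same date
# format as the crontab entry (year, day, month).
printf '/opt/scripts/logs/%s_geist.log\n' "$(date +%Y%d%m)"
```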

Now we can log in to Splunk and add the log file as a data input. After you log in, go to Settings and then Data inputs.


Under the Files & directories, click the Add new link.


Under the Full path to your data, enter the path to the log file you are writing in the crontab.  Check the box for the More settings option.


You can set the Host that will be indexed with your data. For the source type, select From list and then select csv. You can then select an index for the log files.


Now we will set up the field extractions.  You will need to edit the props.conf and transforms.conf files.  If you want to keep this in a certain application, change the file path to $SPLUNK_HOME/etc/apps/{appname}/local/props.conf.

# vi $SPLUNK_HOME/etc/system/local/props.conf

[csv]
REPORT-Geist = REPORT-Geist

# vi $SPLUNK_HOME/etc/system/local/transforms.conf

[REPORT-Geist]
DELIMS = ","
FIELDS = "DateTime","SensorName","SensorValue"

Restart Splunk and you should be able to search your SNMP values.

# $SPLUNK_HOME/bin/splunk restart

The Hacker Manifesto turns 30

The Hacker Manifesto turns 30 today. I remember the first time I read it. I still get goosebumps. I lived the era of the BBS. I was the kid tying up the phone line. I remember the rush of connecting to systems and exploring, talking to people I didn’t know but somehow did know. We shared knowledge and experience.
We were the Keyboard Cowboys, the System’s Samurai, and the Phone Phreaks.

\/\The Conscience of a Hacker/\/