Wednesday, October 9, 2013

Importing Nessus CSV Reports to Splunk from the Command Line!

Problem solved!  Hurricane Labs provides a nice Splunk app to consume Nessus CSV files, but I did not want to manually download a new CSV from the Nessus web interface and then move it to my Splunk server. I could have created a Samba share from my Splunk server to my PC and saved the output from the Nessus web interface straight to the share, but that is still too much manual work!

After a lot of searching, I found some good information on the Nessus discussion pages:

https://discussions.nessus.org/message/17812#17812
cmerchant@responsys.com answers their own question:

#!/bin/bash

AUTH=$(wget --no-check-certificate --post-data 'login=nessus&password=password' https://server:8834/login -O -| grep -Po '(?<=token\>)[^\<]+(?=\<\/token)')
FILE=$(wget --no-check-certificate --post-data 'token='$AUTH'&report=XXXXXX&xslt=csv.xsl' https://server:8834/file/xslt -O - | grep -Po '(?<=/file/xslt/download/\?)[^\"]+(?=\"\>)')

wget --no-check-certificate --post-data 'token='$AUTH'&'$FILE'&step=2' https://server:8834/file/xslt/download -O test.csv

This got me moving toward a solution. I had never done any web page parsing with wget and XSLT, so it was about time to learn...
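The heavy lifting in that one-liner is GNU grep's -P (Perl-compatible) mode, where look-behind and look-ahead assertions extract just the token value from the XML reply. Here it is on a stand-alone sample (the XML snippet below is fabricated, not real Nessus output):

```shell
# Fabricated example of the kind of XML the /login endpoint returns
reply='<reply><contents><token>0123456789abcdef</token></contents></reply>'

# (?<=token\>) requires 'token>' immediately before the match and
# (?=\<\/token) requires '</token' immediately after it; neither is
# part of the output, so only the token value itself prints.
echo "$reply" | grep -Po '(?<=token\>)[^\<]+(?=\<\/token)'
```

Because the assertions match without consuming, grep -o prints only the text between the tags — no XML parser needed.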

My requirements were:

  • No interaction - must be able to be run in cron
  • Grab all completed Nessus results
  • Save the file with the Friendly Report name so Splunk can use the file name as the Report Name

Here are the results. This needs some cleanup and more documentation, but it is completely usable as is, except that you will need to replace xxxxxx with your password and x.x.x.x with your Nessus server IP.
(Word wrap didn't play nice here; be careful with your cut and paste.)


#!/bin/bash

#Variables
SPLUNK_NESSUS=/mnt/nessus

#Retrieve AUTH token
token="$(/usr/bin/wget -q --no-check-certificate --post-data 'login=nessus&password=xxxxxx' https://x.x.x.x:8834/login -O - | grep -Po '(?<=token\>)[^\<]+(?=\<\/token)')"

#Get list of reports
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://x.x.x.x:8834/report/list -O - | grep -Po '(?<=name\>)[^\<]+(?=\<\/name)' > /tmp/reports

#Get Friendly Names
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://x.x.x.x:8834/report/list -O - | grep -Po '(?<=readableName\>)[^\<]+(?=\<\/readableName)' > /tmp/names

#Merge two files
/usr/bin/pr -tmJ --sep-string=" " /tmp/reports /tmp/names > /tmp/named.reports

for i in $(cut -d' ' -f1 /tmp/named.reports)
do
#Get filename for this report
FILENAME=$(/usr/bin/wget -q --no-check-certificate --post-data "token=$token&report=$i&xslt=csv.xsl" https://x.x.x.x:8834/file/xslt -O - | grep -Po '(?<=/file/xslt/download/\?fileName=)[^\"]+(?=\"\>)')

#Get files
#Match the report number back to its readable name (spaces stripped for the filename)
READABLENAME=$(grep "^$i " /tmp/named.reports | cut -d' ' -f2- --output-delimiter='')
sleep 5
/usr/bin/wget -q --no-check-certificate --post-data "token=$token&fileName=$FILENAME&step=2" https://x.x.x.x:8834/file/xslt/download -O "$SPLUNK_NESSUS/$READABLENAME.csv"
done

#Cleanup
rm /tmp/reports
rm /tmp/names
rm /tmp/named.reports

#note
# Remove files in /opt/nessus/var/nessus/users/nessus/files on nessus server
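A quick aside on the pr -tmJ step, which pastes the report IDs and friendly names together line by line. On invented sample files it behaves like this:

```shell
# Invented sample data standing in for the report/list grep output
printf 'aaaa-1111\nbbbb-2222\n' > /tmp/demo.reports
printf 'Weekly Scan\nDMZ Scan\n' > /tmp/demo.names

# -t: suppress headers, -m: merge the files in parallel columns,
# -J: join full lines instead of truncating them to column width
pr -tmJ --sep-string=" " /tmp/demo.reports /tmp/demo.names

rm /tmp/demo.reports /tmp/demo.names
```

Each output line is "<report id> <readable name>", which is why the download loop can split on the first space to recover the ID and the name.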

If you use this please send me an email rossw@woodhome.com



DNS Visibility

Doug Burks wrote a second post on the Security Onion blog about DNS visibility.
http://securityonion.blogspot.com/2013/10/got-dns-visibility.html

Some of the original ideas come from a post by Johannes Ullrich:
https://isc.sans.edu/diary/A+Poor+Man%27s+DNS+Anomaly+Detection+Script/13918

Doug comments:
"For those running Bro [1] on Security Onion [2], I've modified the script [3]."
He posts the code at:
[3] - http://code.google.com/p/security-onion/wiki/DNSAnomalyDetection



#!/bin/bash

BRO_LOGS="/nsm/bro/logs"
TODAY=`date +%Y-%m-%d`
YESTERDAY=`date -d yesterday +%Y-%m-%d`
OLD_DIRS=`ls $BRO_LOGS |egrep -v "current|stats|$TODAY|$YESTERDAY"`
TMPDIR=/tmp
OLDLOG=$TMPDIR/oldlog
NEWLOG=$TMPDIR/newlog
SUSPECTS=$TMPDIR/suspects
for DIR in $OLD_DIRS; do zcat $BRO_LOGS/$DIR/dns* |bro-cut query; done | sort | uniq -c | sort -k2 > $OLDLOG
zcat $BRO_LOGS/$YESTERDAY/dns* |bro-cut query | sort | uniq -c | sort -k2 > $NEWLOG
join -1 2 -2 2  -a 2 $OLDLOG $NEWLOG | egrep -v '.* [0-9]+ [0-9]+$' | sort -nr -k2 | head -10 > $SUSPECTS
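The join line is where the anomaly detection happens: -a 2 keeps every query from yesterday's log, and the egrep -v then discards any line carrying counts from both files, leaving only queries that never appeared in the older logs. A toy run with fabricated domains:

```shell
# Fabricated query lists standing in for the bro-cut output
printf 'example.com\nexample.com\ngoogle.com\n' | sort | uniq -c | sort -k2 > /tmp/demo.old
printf 'example.com\nevil.example.net\nevil.example.net\nevil.example.net\ngoogle.com\n' \
  | sort | uniq -c | sort -k2 > /tmp/demo.new

# Queries seen in both files end with two counts and get filtered out;
# only the never-before-seen domain survives, with yesterday's count
join -1 2 -2 2 -a 2 /tmp/demo.old /tmp/demo.new | egrep -v '.* [0-9]+ [0-9]+$' | sort -nr -k2 | head -10

rm /tmp/demo.old /tmp/demo.new
```

Here only evil.example.net makes it through to the suspects list, sorted by how many times it was queried yesterday.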

This script will run through *ALL* of your Bro DNS logs on your Security Onion install. I have modified the script to allow you to look back over any number of days and compare that summary to yesterday's DNS logs.
Examples:

  • last week compared to yesterday: $ sh DNSAnomalyDetection 7
  • 2 days ago compared to yesterday: $ sh DNSAnomalyDetection 2
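The look-back window leans on GNU date's relative-date syntax ("-N Day"). Assuming GNU coreutils, each pass of the loop prints the dated directory name for one prior day, counting down until it reaches yesterday:

```shell
# Walk back from 3 days ago to 2 days ago, as the script does for a
# look-back of 3; each line matches a dated Bro log directory name
TIMEFRAME=3
while [ $TIMEFRAME -ne 1 ]
do
  date -d "-$TIMEFRAME Day" +%Y-%m-%d
  TIMEFRAME=$((TIMEFRAME - 1))
done
```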

-----------------
#!/bin/bash
BRO_LOGS="/nsm/bro/logs"
TIMEFRAME=$1
TODAY=`date +%Y-%m-%d`
YESTERDAY=`date -d yesterday +%Y-%m-%d`
TMPDIR=/tmp
OLDLOG=$TMPDIR/oldlog
NEWLOG=$TMPDIR/newlog
SUSPECTS=$TMPDIR/suspects
while [ $TIMEFRAME -ne 1 ];
do
 PROCESSING_DAY=`date -d "-$TIMEFRAME Day" +%Y-%m-%d`
 zcat $BRO_LOGS/$PROCESSING_DAY/dns* |bro-cut query;
 TIMEFRAME=$((TIMEFRAME - 1))
done | sort | uniq -c | sort -k2 > $OLDLOG
zcat $BRO_LOGS/$YESTERDAY/dns* | bro-cut query | sort | uniq -c | sort -k2 > $NEWLOG
join -1 2 -2 2 -a 2 $OLDLOG $NEWLOG | egrep -v '.* [0-9]+ [0-9]+$' | sort -nr -k2 | head -10 > $SUSPECTS
-Ross Warren