Tuesday, September 9, 2014

cbQosCMDropPkt demystified

It is surprisingly difficult to get stats on QoS drops from a Cisco router via SNMP. Several nested OIDs will eventually get you to the proper index and class-map, but following that breadcrumb trail is not straightforward. Sure, it is documented by Cisco, but the trail is very confusing.
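
To give a rough idea of the trail (the index values below are made up purely for illustration): each cbQosCMDropPkt row is indexed by a policy index and an object index, the object index maps to a cbQosConfigIndex in the objects table, and that config index finally gets you the class-map name.

# walk the drop counters; the last two index components are the policy and object index
snmpwalk <v3 options> <host> 1.3.6.1.4.1.9.9.166.1.15.1.1.13
#   ...166.1.15.1.1.13.1593.1595 = 25   <- 25 dropped packets
# map that object to its cbQosConfigIndex
snmpget <v3 options> <host> 1.3.6.1.4.1.9.9.166.1.5.1.1.2.1593.1595
# look up the class-map name (cbQosCMName) with the returned config index
snmpget <v3 options> <host> 1.3.6.1.4.1.9.9.166.1.7.1.1.1.<cbQosConfigIndex>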

My project required that I grab the QoS drops from the different class-maps on multiple routers and then feed them into a Splunk dashboard. Scripting the multiple snmpwalks and outputting CSV for ingestion by Splunk was the best option at my disposal.

Introducing get_qos-v3.sh

Usage : get_qos-v3.sh Customer Host_IP Host_Name Priv_Proto(DES/AES)

> get_qos-v3.sh Cust1 10.10.10.4 USA-Winslow "DES"

This script will output the following:

> cat /nsm/snmp/customer/Cust1/Cust1-10.10.10.4-qos.log

07-03-2014 17:00:01,Cust1,10.10.10.4,Winslow,class-default,sla,0
07-03-2014 17:00:01,Cust1,10.10.10.4,Winslow,class-default,class-default,0
07-03-2014 17:00:01,Cust1,10.10.10.4,Winslow,class-default,control,0
07-03-2014 17:00:01,Cust1,10.10.10.4,Winslow,class-default,media-ports,25
07-03-2014 17:00:01,Cust1,10.10.10.4,Winslow,MAP-QoSParentPolicy,class-default,25
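
To get those CSV lines into Splunk I simply monitor the log location. A minimal sketch of an inputs.conf stanza (the sourcetype name here is my own placeholder, use whatever your dashboard expects):

[monitor:///nsm/snmp/customer/*/*-qos.log]
disabled = false
sourcetype = cisco_qos_drops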

My requirements dictated customer separation as well as individual sites listed separately. Remove those fields if you don't need them! (Or ask me to edit them out of the code.)

Before you run the script you will need to edit the top of the script to match your SNMPv3 settings and log file location. user, auth_passwd, priv_passwd and log_file are the only variables you should need to set. Please note that user and auth_passwd need the single (') quotes around them, while priv_passwd does not. It was extremely frustrating troubleshooting those quotes.

This script assumes SNMPv3. If you need SNMPv2 functionality, write me and we can work up a new command line.
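
For reference, here is a rough, untested sketch of what the v2c change would look like (the community string is a placeholder): drop the auth/priv variables and swap the SNMP options so that cmd_variables becomes something like this.

version=2c
community='public'
cmd_variables="-v $version -M $mibs_dir -m ALL -c $community"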

> cat get_qos-v3.sh 
#!/bin/bash

# http://www.activexperts.com/admin/mib/Cisco/CISCO-CLASS-BASED-QOS-MIB/
# ftp://ftp.cisco.com/pub/mibs/oid/CISCO-CLASS-BASED-QOS-MIB.oid

if [ "$2x" = "x" ];
        then
        echo "Usage : $0 Customer Host_IP Host_Name Priv_Proto(DES/AES)"
        echo " "
        exit 1
fi

customer=$1
host=$2
host_name=$3
version=3
#add user here - keep the quotes
user='SNMPv3 user'
auth_mode=authPriv
auth_proto=SHA
#add auth_password - keep the quotes
auth_passwd='auth_password'
priv_proto=$4
#add priv_password here - no quotes this time
priv_passwd=priv_password 
mibs_dir=/usr/share/mibs/
cmd_variables="-v $version -M $mibs_dir -m ALL -u $user -l $auth_mode -a $auth_proto -A $auth_passwd -x $priv_proto -X $priv_passwd " 
#change log location - keep quotes
log_file="/nsm/snmp/customer/$customer/$customer-$host-qos.log"

##
#Should not have to edit below this line!!
##

#Save Field Separator
OldFS=$IFS

# Get cbQosCMDropPkt with snmpwalk
OID=1.3.6.1.4.1.9.9.166.1.15.1.1.13
timestamp=$(date +"%m-%d-%Y %T")
cbQosCMDropPkt_WALK=$(snmpwalk $cmd_variables $host $OID)

#New Field separator to parse the snmpwalk output
IFS=$'\n'

#For each line in cbQosCMDropPkt_WALK determine index and QOS drops
i=1
for I in $cbQosCMDropPkt_WALK;
        do
cbQosCMDropPkt_index=`echo $I | awk 'BEGIN {FS="."} {print $16}' | awk 'BEGIN {FS="="} {print $1}'`
CM_parent_index=`echo $I | awk 'BEGIN {FS="."} {print $15}' | awk 'BEGIN {FS="="} {print $1}'`
cbQosCMDropPkt=`echo $I  | awk '{print $4}'`

#Set Field separator back to original
IFS=$OldFS

### Get class-map name
#Match up the index from cbQosCMDropPkt to cbQosConfigIndex
   OID="1.3.6.1.4.1.9.9.166.1.5.1.1.2.$CM_parent_index.$cbQosCMDropPkt_index"
   cbQosConfigIndex=`snmpget $cmd_variables $host $OID | cut -d" " -f4`

   OID="1.3.6.1.4.1.9.9.166.1.7.1.1.1.$cbQosConfigIndex"
   cbQosCMName_classmap=`snmpget $cmd_variables $host $OID | sed 's/\"//g' | awk '{print $4}'`
### 

#Clear variables
cbQosParentObjectsIndex1=
cbQosParentObjectsIndex2=
cbQosConfigIndex2=
cbQosCMName_parent=

#What site does this class-map belong to (double query to cbQosParentObjectsIndex)
OID="1.3.6.1.4.1.9.9.166.1.5.1.1.4.$CM_parent_index.$cbQosCMDropPkt_index"
cbQosParentObjectsIndex1=`snmpget $cmd_variables $host $OID |awk '{print $4}'`

if [ $cbQosParentObjectsIndex1 -ne $CM_parent_index ]; then

   OID="1.3.6.1.4.1.9.9.166.1.5.1.1.4.$CM_parent_index.$cbQosParentObjectsIndex1"
   cbQosParentObjectsIndex2=`snmpget $cmd_variables $host $OID |awk '{print $4}'`

   OID="1.3.6.1.4.1.9.9.166.1.5.1.1.2.$CM_parent_index.$cbQosParentObjectsIndex2"
   cbQosConfigIndex2=`snmpget $cmd_variables $host $OID |awk '{print $4}'`

   OID="1.3.6.1.4.1.9.9.166.1.7.1.1.1.$cbQosConfigIndex2"
   cbQosCMName_parent=`snmpget $cmd_variables $host $OID |sed 's/\"//g' | awk '{print $4}'`

else

   OID="1.3.6.1.4.1.9.9.166.1.5.1.1.2.$CM_parent_index.$cbQosParentObjectsIndex1"
   cbQosParentObjectsIndex2=`snmpget $cmd_variables $host $OID |awk '{print $4}'`

   OID="1.3.6.1.4.1.9.9.166.1.6.1.1.1.$cbQosParentObjectsIndex2"
   cbQosCMName_parent=`snmpget $cmd_variables $host $OID | sed 's/\"//g' | awk '{print $4}'`

fi

        echo $timestamp,$customer,$host,$host_name,$cbQosCMName_parent,$cbQosCMName_classmap,$cbQosCMDropPkt >> $log_file

 i=$((i+1))
done

Thursday, January 9, 2014

Open Source Vulnerability Scanning

I love being able to use open source software and intelligence to perform security tasks. When asked to do some vulnerability scanning away from my commercial tools, I found some work by Adam Ziaja.


On a Kali Linux machine I installed his script called “vulnerability check”. Adam has put together what he calls “A simple script uses open source software (nmap, vFeed and DPE) and performs almost same task as Nessus or AVDS”.

After following his instructions, installing the software, and updating the databases, I created my own wrapper script around his work.


vc_run
usage: vc_run network_range report_name
example: ./vc_run 10.10.1.0/24 report-10.10


#!/bin/bash


# Map the network with software version discovery
nmap -sV $1 -oX /root/reports/$2.xml > /root/reports/$2.nmap.txt


# Feed authors script with our NMAP Results
php vc.php /opt/vFeed/vfeed.db /opt/dpe/dpe_db.xml /root/reports/$2.xml > /root/reports/$2.vfeed.txt


# Convert the xml to html (install xsltproc)
xsltproc /root/reports/$2.xml -o /root/reports/$2.html


# Uncomment to open in browser at the end of the run
#firefox /root/reports/$2.html


Additionally, after running the scan, each CVE can be evaluated on three items: (1) risk, (2) whether an OS (Microsoft, Linux) patch is available, and (3) whether a public exploit is available.

The following command lines were used to acquire this additional detail:


/opt/vFeed/vfeedcli.py get_risk CVE-2010-3972
Severity: High
Top vulnerability: True
       [cvss_base]: 10.0
       [cvss_impact]: 10.0
       [cvss_exploit]: 10.0
PCI compliance: Failed


/opt/vFeed/vfeedcli.py get_ms CVE-2010-3972
[Microsoft_ms_id]: MS11-004
[Microsoft_ms_title]: Vulnerability in Internet Information Services (IIS) FTP Service Could Allow Remote Code Execution


[stats] 1 Microsoft MS Patch(s)


/opt/vFeed/vfeedcli.py get_msf CVE-2010-3972
[msf_id]: iis75_ftpd_iac_bof.rb
[msf_title]: Microsoft IIS FTP Server Encoded Response Overflow Trigger
[msf_file]: metasploit-framework/modules/auxiliary/dos/windows/ftp/iis75_ftpd_iac_bof.rb


[stats] 1 Metasploit Exploits/Plugins



These results should be summarized in the report to give system admins a prioritized, actionable list that yields immediate, tangible results. The remaining vulnerabilities can be remediated as time allows and as patches or fixes are released by vendors.

There is additional work that could be done: it would be interesting to parse the output, grep for the CVEs, and then script the get_risk, get_ms and get_msf commands for a bit more automation. A rough sketch of that idea is below.
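
This is a minimal, untested sketch reusing the wrapper's report naming from above; the script itself and the loop are my own, only the vfeedcli.py calls come from the commands shown earlier.

#!/bin/bash
# Hypothetical follow-up: pull every CVE out of the vFeed report and run the
# three vfeedcli.py lookups against each one.
REPORT=/root/reports/$1.vfeed.txt
for cve in $(grep -oE 'CVE-[0-9]{4}-[0-9]+' $REPORT | sort -u); do
        echo "===== $cve ====="
        /opt/vFeed/vfeedcli.py get_risk $cve
        /opt/vFeed/vfeedcli.py get_ms $cve
        /opt/vFeed/vfeedcli.py get_msf $cve
done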


Tuesday, November 19, 2013

Code accepted into Splunk App!

Bill Matthews informed me that the script I wrote and referenced in a previous post has made it into the Hurricane Labs Vulnerability Management v1.5 app for Splunk!

 http://apps.splunk.com/app/1093/

They cleaned it up and put it in /opt/splunk/etc/apps/HurricaneVulnerabilityManagement/bin/Nessus.sh:
#!/bin/bash

#Variables
SPLUNK_NESSUS=/mnt/nessus
SERVER="x.x.x.x"

#Retrieve AUTH Token
token="$(/usr/bin/wget -q --no-check-certificate --post-data 'login=USERNAME&password=PASSWORD' https://$SERVER:8834/login -O - | grep -Po '(?<=token\>)[^\<]+(?=\<\/token)')"

#Get list of reports
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://$SERVER:8834/report/list -O - | grep -Po '(?<=name\>)[^\<]+(?=\<\/name)' > /tmp/reports

#Get Friendly Names
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://$SERVER:8834/report/list -O - | grep -Po '(?<=readableName\>)[^\<]+(?=\<\/readableName)' > /tmp/names

#Merge two files
/usr/bin/pr -tmJ --sep-string=" " /tmp/reports /tmp/names > /tmp/named.reports

for i in $(cut -d' ' -f1 /tmp/named.reports);
do
#Get Filenames for reports
FILENAME=$(/usr/bin/wget -q --no-check-certificate --post-data 'token='$token'&report='$i'&xslt=csv.xsl' https://$SERVER:8834/file/xslt -O - | grep -Po '(?<=/file/xslt/download/\?fileName=)[^\"]+(?=\"\>)')

#Get files
#build Readable name to report number match
READABLENAME=$(grep $i /tmp/named.reports | cut -d' ' -f2- --output-delimiter='')
sleep 5
/usr/bin/wget -q --no-check-certificate --post-data 'token='$token'&fileName='$FILENAME'&step=2' https://$SERVER:8834/file/xslt/download -O $SPLUNK_NESSUS/$READABLENAME.csv;
done;

#Cleanup
#rm /tmp/reports
#rm /tmp/names
#rm /tmp/named.reports


Wednesday, October 9, 2013

Importing Nessus CSV reports to SPLUNK from the Command Line!

Problem solved! Hurricane Labs provides a nice Splunk app to consume Nessus CSV files, but I did not want to manually download a new CSV from the Nessus web interface and then move it to my Splunk server. I could have made a Samba share from my Splunk server to my PC and just saved the output from the Nessus web interface to the share. Still too much manual work!

After a lot of searching I found some good information on the Nessus discussion pages:

https://discussions.nessus.org/message/17812#17812
cmerchant@responsys.com answers their own question:

#!/bin/bash

AUTH=$(wget --no-check-certificate --post-data 'login=nessus&password=password' https://server:8834/login -O -| grep -Po '(?<=token\>)[^\<]+(?=\<\/token)')
FILE=$(wget --no-check-certificate --post-data 'token='$AUTH'&report=XXXXXX&xslt=csv.xsl' https://server:8834/file/xslt -O - | grep -Po '(?<=/file/xslt/download/\?)[^\"]+(?=\"\>)')

wget --no-check-certificate --post-data 'token='$AUTH'&'$FILE'&step=2' https://server:8834/file/xslt/download -O test.csv

This got me moving toward a solution. I had never done any web page parsing with wget and javascripts, so it was about time to learn...

My requirements were:

  • No interaction - must be able to be run from cron (see the example crontab entry after this list)
  • Grab all completed Nessus results
  • Save the file with the Friendly Report name so Splunk can use the file name as the Report Name
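
For the cron requirement, something like the entry below is what I had in mind (the script path, log path and schedule are placeholders):

# Hypothetical crontab entry: pull fresh Nessus CSVs every night at 2 AM
0 2 * * * /opt/scripts/nessus_to_splunk.sh >> /var/log/nessus_to_splunk.log 2>&1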

Here are the results. This needs some cleanup and more documentation, but it is completely usable as is, except you will need to replace xxxxxx with your password and x.x.x.x with your Nessus server IP.
(Word wrap didn't play nice here, so be careful with your cut and paste.)


#!/bin/bash

#Variables
SPLUNK_NESSUS=/mnt/nessus

#Retrieve AUTH Token
token="$(/usr/bin/wget -q --no-check-certificate --post-data 'login=nessus&password=xxxxxx' https://x.x.x.x:8834/login -O - | grep -Po '(?<=token\>)[^\<]+(?=\<\/token)')"

#Get list of reports
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://x.x.x.x:8834/report/list -O - | grep -Po '(?<=name\>)[^\<]+(?=\<\/name)' > /tmp/reports

#Get Friendly Names
/usr/bin/wget -q --no-check-certificate --post-data "token=$token" https://x.x.x.x:8834/report/list -O - | grep -Po '(?<=readableName\>)[^\<]+(?=\<\/readableName)' > /tmp/names

#Merge two files
/usr/bin/pr -tmJ --sep-string=" " /tmp/reports /tmp/names > /tmp/named.reports

for i in $(cut -d' ' -f1 /tmp/named.reports);
do
#Get Filenames for reports
FILENAME=$(/usr/bin/wget -q --no-check-certificate --post-data 'token='$token'&report='$i'&xslt=csv.xsl' https://x.x.x.x:8834/file/xslt -O - | grep -Po '(?<=/file/xslt/download/\?fileName=)[^\"]+(?=\"\>)')

#Get files
#build Readable name to report number match
READABLENAME=$(grep $i /tmp/named.reports | cut -d' ' -f2- --output-delimiter='')
sleep 5
/usr/bin/wget -q --no-check-certificate --post-data 'token='$token'&fileName='$FILENAME'&step=2' https://x.x.x.x:8834/file/xslt/download -O $SPLUNK_NESSUS/$READABLENAME.csv;
done;

#Cleanup
rm /tmp/reports
rm /tmp/names
rm /tmp/named.reports

#note
# Remove files in /opt/nessus/var/nessus/users/nessus/files on nessus server
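#  a hypothetical cleanup example (run on the Nessus server itself; the 7-day age is arbitrary):
#  find /opt/nessus/var/nessus/users/nessus/files -type f -mtime +7 -delete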

If you use this, please send me an email: rossw@woodhome.com



DNS Visibility

Doug Burks wrote up a second post on the Security Onion blog about DNS visibility.
http://securityonion.blogspot.com/2013/10/got-dns-visibility.html

Some of the original ideas come from a post by Johannes Ullrich:
 https://isc.sans.edu/diary/A+Poor+Man%27s+DNS+Anomaly+Detection+Script/13918

Doug comments:
"For those running Bro [1] on Security Onion [2], I've modified the script [3]."
He posts the code at:
[3] - http://code.google.com/p/security-onion/wiki/DNSAnomalyDetection



#!/bin/bash

BRO_LOGS="/nsm/bro/logs"
TODAY=`date +%Y-%m-%d`
YESTERDAY=`date -d yesterday +%Y-%m-%d`
OLD_DIRS=`ls $BRO_LOGS |egrep -v "current|stats|$TODAY|$YESTERDAY"`
TMPDIR=/tmp
OLDLOG=$TMPDIR/oldlog
NEWLOG=$TMPDIR/newlog
SUSPECTS=$TMPDIR/suspects
for DIR in $OLD_DIRS; do zcat $BRO_LOGS/$DIR/dns* |bro-cut query; done | sort | uniq -c | sort -k2 > $OLDLOG
zcat $BRO_LOGS/$YESTERDAY/dns* |bro-cut query | sort | uniq -c | sort -k2 > $NEWLOG
join -1 2 -2 2  -a 2 $OLDLOG $NEWLOG | egrep -v '.* [0-9]+ [0-9]+$' | sort -nr -k2 | head -10 > $SUSPECTS

This script will run through *ALL* of your DNS/Bro logs on your Security Onion install. I have modified the script to allow you to look back over any number of days and compare that summary to yesterday's DNS logs.
Examples:
last week compared to yesterday ($ sh DNSAnomalyDetection 7)
2 days ago compared to yesterday ($ sh DNSAnomalyDetection 2)

-----------------
#!/bin/bash
BRO_LOGS="/nsm/bro/logs"
TIMEFRAME=$1
TODAY=`date +%Y-%m-%d`
YESTERDAY=`date -d yesterday +%Y-%m-%d`
TMPDIR=/tmp
OLDLOG=$TMPDIR/oldlog
NEWLOG=$TMPDIR/newlog
SUSPECTS=$TMPDIR/suspects
while [ $TIMEFRAME -ne 1 ];
do
 PROCESSING_DAY=`date -d "-$TIMEFRAME Day" +%Y-%m-%d`
 zcat $BRO_LOGS/$PROCESSING_DAY/dns* |bro-cut query;
 TIMEFRAME=$((TIMEFRAME - 1))
done | sort | uniq -c | sort -k2 > $OLDLOG
zcat $BRO_LOGS/$YESTERDAY/dns* | bro-cut query | sort | uniq -c | sort -k2 > $NEWLOG
join -1 2 -2 2 -a 2 $OLDLOG $NEWLOG | egrep -v '.* [0-9]+ [0-9]+$' | sort -nr -k2 | head -10 > $SUSPECTS
-Ross Warren

Friday, September 6, 2013

Summer is over! Now time for the Transonic Buffet

Summer is over and the kids are in school! The traffic in NoVA is back.

It has been a while since I have posted; I was busy with a few other things.

I passed my GCIA and wrote an article for Coast Guard Forum Magazine.

The GCIA was the hardest SANS test I have taken. A lot of raw packet hex conversions, and a big focus on ICMP codes!?! But it is done. I can add to the alphabet soup at the end of my name, change my LinkedIn profile and join a few more groups.

The article for Coast Guard Forum Magazine was a request from our marketing department to explain cyber security for our company: four questions that needed to be answered in 200-300 words. After a few revisions with an editor, the final text has been sent to the publisher. I am waiting anxiously to see my name in print in the realm of cyber security! I made sure to mention Security Onion and NSM!

Let's end today with a quote from Elon Musk, found in the Hyperloop design details document.

"And, when you get right down to it, going through transonic buffet in a tube is just fundamentally a
dodgy prospect." - Musk