Monday, January 14, 2013

[UPDATED] DNS Scraping for Corporate AV Detection - "Admins, stop sharing your cache with everyone!"

NEW TOOL: Scrape-DNS

**Since this was first posted, the technique has been implemented in Metasploit (which means it's on Kali Linux), Recon-ng, and the ArchAssault Project.

Back at my old job, we used cache snooping techniques (Scraping) to check for evidence of client systems that were attempting to resolve known malware sites.

We would use the list at Mayhemiclabs.com and compare it to our cached DNS entries.

So, why don't we do something badass like that, but to support the penetration test or red team mission?

Using standard cache snooping techniques, you can determine which anti-virus vendors might be in use on a client's network.

HOW? Simple. By making non-recursive queries to the client's DNS servers for known AV update site domains.

Yes, it is that simple.

To query cached DNS entries, you need only make a NON-recursive request to a target DNS server.

Dig seems to yield the most reliable results.
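
For a single domain, a snoop is just a Dig query with recursion switched off (target-dns-server is a placeholder here):

dig @target-dns-server update.symantec.com +norecurse

If the record is already sitting in the cache, the response comes back with an ANSWER SECTION; if it is not, the answer section comes back empty.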

First we need to create a list of sites to check for. 

Let's put the list of sites in updates.list:
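
Something along these lines works; treat the entries below as examples only and build out your own list of vendor update domains:

update.symantec.com
www.update.symantec.com
liveupdate.symantec.com
www.liveupdate.symantec.com
download797.avast.com
www.download797.avast.com
update.nai.com
www.update.nai.com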



As you can see, I have added “www.” to each entry as a modified duplicate. 

You can simply run the following command:

dig @129.71.1.1 -f updates.list +norecurse | grep -A 2 "ANSWER SECTION" | sort -u | sed '/^$/d' | sed 's/^/[+] Success - /g'






Lookie there. Symantec, Sophos, Forefront, etc. Jackpot!

Looks ghetto, but it works and you can get a decent screenshot for your report.

VERY IMPORTANT! If you forget to use the +norecurse option, you will have just filled the DNS cache with your own list of sites, and every entry will look cached from then on. You will now have to wait for those entries to expire before proceeding.

So, after learning this lesson and playing with various and hilarious website lists, I opted to just write a simple script. One that has pre-populated site lists and won’t let me forget the +norecurse option :D

I call it Scrape-DNS.

This script leverages Dig to perform non-recursive lookups against vulnerable DNS servers in order to determine if certain domains are stored in the DNS server’s cache. 

Usage:


Check DNS servers for interesting cached entries

Examples:

./scrape.sh -t 8.8.8.8 -a
./scrape.sh -t 8.8.8.8 -o
./scrape.sh -t 8.8.8.8 -u
./scrape.sh -t 8.8.8.8 -i custom_sites.list

OPTIONS:
   -h     Show this message
   -a     All Mode
   -i     Import Mode
   -u     Common AV Mode
   -o     Obscene Mode
   -t     Target DNS Server
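
Under the hood there is not much more to it than the Dig one-liner from earlier wrapped in some option parsing. A stripped-down sketch of that idea (this is NOT the actual Scrape-DNS source; the option handling and default list name are my own assumptions) looks roughly like this:

#!/bin/bash
# Sketch of a Dig-based cache snooping wrapper (not the real Scrape-DNS).
TARGET=""
LIST="common_av.list"   # assumed name for a pre-populated AV site list

usage() { echo "Usage: $0 -t <dns-server> [-i <site list>]"; exit 1; }

while getopts "t:i:h" OPT; do
    case $OPT in
        t) TARGET=$OPTARG ;;
        i) LIST=$OPTARG ;;
        *) usage ;;
    esac
done

[ -z "$TARGET" ] && usage
[ -f "$LIST" ] || { echo "[-] Site list $LIST not found"; exit 1; }

# +norecurse is hard-coded so a forgotten flag can never populate the target's cache.
dig @"$TARGET" -f "$LIST" +norecurse \
    | grep -A 2 "ANSWER SECTION" \
    | sort -u | sed '/^$/d' | sed 's/^/[+] Success - /g'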

Some results are funnier than others.

Using Free Porn Mode, er... Obscene Mode:


So, the next time you need to demonstrate this vulnerability (DNS Cache Snooping Enabled), or want to perform recon before crafting your payloads, you can try this technique with Dig, or you can Scrape-DNS.

Alternate method: If non-recursive queries are not available, we can try the 'timing method'.

The idea here is: if I request a domain 5 times and the first response takes 130ms, the second 51ms, the third 50ms, then 53ms and 52ms, we can infer that the requested domain was NOT in the server's cache the first time we requested it, but obviously it was by the second, third, and so on, assuming the test is conducted before the entry expires.

Here is what that might look like:


Let's check for update.symantec.com:

dig @dns-server-ip update.symantec.com | grep -A 5 "Query time:"
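
Run it a handful of times and compare the numbers. Against a cached entry, every run, including the very first, reports something like this (the time shown is only illustrative):

;; Query time: 2 msec

Consistently low times from the first request onward suggest the record was already in the cache.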

Looks like update.symantec.com IS in the server's cache. They might have Symantec AV running in the environment?

Let's try Avast:

dig @dns-server-ip download797.avast.com | grep -A 5 "Query time:"




And it looks like download797.avast.com was NOT in the server's cache when we first checked. They MAY not be running Avast.

You can also run continuous checks to interrogate the DNS server. I won't get into it here, but I just used a for loop and sleep(s) to pull that off, as sketched below.
Scrape-DNS v2 will include the option for continuous collection. Assuming anyone cares enough to use v1 and gives any feedback :)
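
In the meantime, the rough shape of that loop is below (the interval, the count, and dns-server-ip are all placeholders/assumptions):

# Re-run the same timing check every 30 seconds, ten times,
# and keep only the reported query time from each run.
for i in $(seq 1 10); do
    dig @dns-server-ip update.symantec.com | grep "Query time:"
    sleep 30
done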

[UPDATED]

I found a cool blog entry, "Poor man's bar graphs in bash" by Ger-Jan. He wrote a portable bash loop that builds bar graphs right in the terminal.

So, I quickly incorporated it into the scraping routine so that I can visually compare DNS response times and make a better judgement call when using the timing method to find cached DNS entries.
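
Ger-Jan's exact loop is worth reading at the source; the gist is something like this (the input format and the 10ms-per-character scale are my own assumptions, not his code):

# Poor man's bar graph: print one '#' per 10ms of query time.
# Expects lines of "<domain> <milliseconds>", e.g. collected from the loop above.
while read -r DOMAIN MS; do
    BAR=$(printf '#%.0s' $(seq 1 $(( MS / 10 + 1 ))))
    printf '%-30s %5s ms  %s\n' "$DOMAIN" "$MS" "$BAR"
done < times.txt   # times.txt is an assumed file name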

Here is a screenshot:



Looks like Sophos is NOT in use on this network. OR, they never update o.O !!


I will add this feature to Scrape-DNS in the next version or so. 

Enjoy,

@304geek