Abusing Web Filters Misconfiguration for Reconnaissance

Published: 2019-11-22. Last Updated: 2019-11-22 06:34:39 UTC
by Xavier Mertens (Version: 1)

Yesterday, an interesting incident was detected while working at a customer SOC. They use a “next-generation” firewall that implements a web filter based on categories. This is common in many organizations today: users' web traffic is allowed or denied based on a URL categorization database (with categories like “adult content”, “hacking”, “gambling”, …). How was it detected?

We received notifications about suspicious traffic based on the detection of “bad” websites (read: “not allowed” in the firewall policy). The alerts read like:

The IP x.x.x.x tried to access the URL xxxxxxxxx (matching category: xxxxxxxxx)

Why was it more suspicious than usual? It was generated by an “external” IP address (not belonging to any of the customer's subnets, routed or non-routable). This was strange and deserved a deeper investigation! There was no routing issue and no spoofing: the traffic really came from the wild Internet and triggered the web filter. After a deeper check with the team in charge of the firewall, they discovered the mistake: the web filter was also applied to traffic arriving on the external interfaces, and the default deny page was returned to the visitor! The offending IP address was (ab)using the web filter feature to perform some reconnaissance: to detect which websites are allowed or denied by the organization. How? Just visit the website to be tested by connecting to the public address of the target!

To achieve this, just adapt your '/etc/hosts' file (UNIX) or '%SystemRoot%\System32\drivers\etc\hosts' (Windows) and add a static entry:

x.x.x.x    www.maliciouswebsite.com

Where ‘x.x.x.x’ is an IP address belonging to the customer.
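
With the static entry in place, a quick manual check is just a matter of fetching the site and looking at what comes back (assuming, for this sketch, that the customer's deny page contains a recognizable string like 'blocked', which will differ per vendor):

$ curl -s http://www.maliciouswebsite.com/ | grep -i "blocked"

If the string is present, the site is filtered by the customer's policy; if nothing comes back, the category is probably allowed.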

This is not very practical if you must test a lot of sites. Thanks to curl[1], we can automate it in a nice way. Let’s save the malicious sites to be tested in a file, one hostname per line:

$ cat <<__END__ >sites.txt
www.malicious1.com
www.malicious2.be
www.malicious3.tk
__END__

Now, use the power of curl! It has a nice feature to resolve a site to a specific address, via the '--resolve' parameter, which takes a 'host:port:address' triplet:

$ cat sites.txt | while read URL
do
  curl -s --resolve $URL:80:x.x.x.x http://$URL | grep -qi "blocked" || echo "$URL is NOT blocked"
done

This command visits the potentially malicious URLs by connecting to the customer's IP address. If a website is not allowed by the policy, the default access-denied page is returned, and we search it for an interesting keyword like 'blocked'. Keep this in mind for your future security tests: always try to access a suspicious URL combined with an IP address of your target.
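
The same trick should also work for sites filtered over HTTPS: '--resolve' accepts port 443 as well, and '-k' ignores the certificate mismatch caused by the firewall's deny page. A sketch, reusing the same hypothetical hostnames and keyword:

$ cat sites.txt | while read URL
do
  curl -sk --resolve $URL:443:x.x.x.x https://$URL | grep -qi "blocked" || echo "$URL is NOT blocked"
done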

Since this incident, the firewall policy has been fixed!

[1] https://isc.sans.edu/forums/diary/Exploiting+the+Power+of+Curl/23934

Xavier Mertens (@xme)
Senior ISC Handler - Freelance Cyber Security Consultant
PGP Key
