Abusing Web Filters Misconfiguration for Reconnaissance
Yesterday, an interesting incident was detected while working at a customer SOC. They use a “next-generation” firewall that implements a web filter based on categories. This is common in many organizations today: users' web traffic is allowed or denied based on a URL categorization database (categories like “adult content”, “hacking”, “gambling”, …). How was it detected?
We received notifications about suspicious traffic based on the detection of “bad” websites (read: sites not allowed by the firewall policy). The alert read like:
The IP x.x.x.x tried to access the URL xxxxxxxxx (matching category: xxxxxxxxx)
Why was it more suspicious than usual? It was generated by an “external” IP address (one not belonging to any of the organization's subnets, routed or non-routable). This was strange and deserved a deeper investigation! There was no routing issue and no spoofing: the traffic really came from the wild Internet and triggered the web filter. After a deeper check with the team in charge of the firewall, they discovered the mistake: the web filter was also applied on incoming interfaces and the default deny page was returned to the visitor! The offending IP address was (ab)using the web filter feature to perform reconnaissance: to detect which websites are allowed or denied by the organization. How? Just visit the website to be tested by connecting to the public address of the target!
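At the HTTP level, this boils down to sending a request for the hostname under test to the target's public IP address. A quick single-site probe could look like this (a sketch: 'x.x.x.x', the hostname, and the 'blocked' keyword of the deny page are placeholders/assumptions):
$ curl -s -H "Host: www.maliciouswebsite.com" http://x.x.x.x/ | grep -i "blocked"
If the deny page comes back, the category is filtered by the policy; anything else suggests the request was let through.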
To achieve this, just adapt your '/etc/hosts' (UNIX) or '%SystemRoot%\System32\drivers\etc\hosts' (Windows) and add a static entry:
x.x.x.x www.maliciouswebsite.com
Where ‘x.x.x.x’ is a public IP address belonging to the customer.
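Once the entry is in place, any HTTP client on the machine will send its requests for that hostname to x.x.x.x. A quick sanity check could look like this (a sketch; the hostname is the hypothetical entry added above):
$ curl -sv http://www.maliciouswebsite.com/ -o /dev/null 2>&1 | grep "Connected to"
The verbose output should show the connection going to x.x.x.x instead of the real site.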
This is not very practical if you must test a lot of sites. Thanks to curl[1], we can automate it in a nice way. Let's save the malicious sites to be tested in a file, one hostname per line:
$ cat <<__END__ >sites.txt
www.malicious1.com
www.malicious2.be
www.malicious3.tk
__END__
Now, use the power of curl! It has a nice feature to resolve sites to a specific address (via the '--resolve' parameter, which takes a 'host:port:address' triple):
$ cat sites.txt | while read URL
do
  curl -s --resolve $URL:80:x.x.x.x http://$URL | grep -i "blocked" >/dev/null || echo "$URL is NOT blocked"
done
This command visits the potentially malicious URLs by connecting to the customer's IP address. If the website is not allowed by the policy, the default access-denied page is returned, and we search it for an interesting keyword like 'blocked'. Keep this in mind for your future security tests: always try to access a suspicious URL combined with an IP address of your target.
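If the deny page is also served over HTTPS, the same loop could be extended to probe port 443 as well (a sketch under assumptions: 'x.x.x.x' and the 'blocked' keyword remain placeholders, and '-k' is used because the firewall will most likely present its own certificate):
$ cat sites.txt | while read URL
do
  for PORT in 80 443
  do
    [ "$PORT" = "443" ] && SCHEME=https || SCHEME=http
    curl -sk --resolve $URL:$PORT:x.x.x.x $SCHEME://$URL | grep -qi "blocked" || echo "$URL is NOT blocked on port $PORT"
  done
done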
Since this incident, the firewall policy has been fixed!
[1] https://isc.sans.edu/forums/diary/Exploiting+the+Power+of+Curl/23934
Xavier Mertens (@xme)
Senior ISC Handler - Freelance Cyber Security Consultant