The "Yes, but..." syndrome
This weekend, I worked on a pentest report that had been pending for a while. I'll be honest: I'm lazy about writing reports (like many of us, no?). During a pentest, it is mandatory to keep evidence of all your findings: not only the tools you used and how you used them, but as many details as possible (screenshots, logs, videos, papers, etc.). Every day, we had a quick debriefing meeting with the customer to go over the new findings. The first feedback was often a "Yes, but...":
Me: "We were able to connect a USB stick and to drop unwanted tools to the laptop!"
Customer: "Yes, but it will be disabled soon, the deployment process is ongoing!"
Me: "We were able to find sensitive documents on an admin's desktop!"
Customer: "Yes, but the admin forgot to delete it"
Me: "We were able to connect to remote servers using the same credentials!"
Customer: "Yes, but the operators are not supposed to connect on this network segment"
I could write tons of examples like these! When you're engaged in a pentest, it is executed within a defined scope at a given time "t". If your targeted infrastructure is not ready yet, postpone the project. And if you think you are ready, accept that you may be compromised!
When speaking to customers, I like to compare a pentest to a plane crash. Even if a plane crash often results in many deaths, planes can be considered safe. Statistically, flying is less dangerous than driving to the airport in your car! Modern planes are very reliable: critical components are redundant, and strong procedures and controls are in place. Planes are designed to fly in degraded mode. The cabin crew is also trained to handle such situations. Finally, plane maintenance is scheduled at regular intervals.
How to explain that, from time to time, a plane crash still occurs? Most of the time, post-crash investigations reveal that the crash is the result of a series of small or negligible issues which, taken out of context, are not critical. But a small incident might introduce a second one, then a third one, etc. If a series of such small incidents occurs, nasty things may happen, up to... a crash! This is called the "butterfly effect", which describes how a small change in a deterministic nonlinear system can result in large differences in a later state (definition from Wikipedia).
Keep in mind that the same may occur during a pentest. A small issue in a configuration file, combined with files left in a public directory, an unpatched system, and a lack of security awareness among the operators, might result in a complete compromise of your infrastructure. Avoid "Yes, but..." comments and take appropriate action to solve the issues.
Xavier Mertens
ISC Handler - Freelance Security Consultant
PGP Key
Comments
Your post brought back some excellent "Yes, but..." moments.
Russell