DFIR Exposed #1: The Crime of Silence

by David Cowen - Hacking Exposed Computer Forensics Blog



Hello Reader,
          I've often been told I should commit to writing down some of the stories of the cases we've worked so as not to forget them. I've been told I should write a book of them, and maybe some day I will. Until then I wanted to share some of the cases we've worked where things went outside the norm, to help you be aware not of what usually happens, but of what happens when humans get involved.

Our story begins...
It's early January, the Christmas rush has just ended, and my client reaches out to me, stating:

"Hey Dave, Our customer has suffered a breach and credit cards have been sent to an email address in Russia"

No problem; this is unfortunately fairly common, so I respond that we can meet the client as soon as they are ready. After contracts are signed, we are informed there are two locations we need to visit:

1. The datacenter where the affected servers, which have not yet been preserved, are hosted
2. The offices where the developers who noticed the breach worked

Now at this point you are saying, Dave... you do IR work? You don't talk about that much. No, we don't talk about it much, for a reason. We do IR work mainly through attorneys to preserve privilege, and I've always been worried that making it a public part of our offering would affect the DF part of our services, as my people would be flying around the country.

BTW, did you know that IR investigations led by an attorney are considered work product under case law, from a case I'm happy to say I worked on? Read more here: https://www.orrick.com/Insights/2015/04/Court-Says-Cyber-Forensics-Covered-by-Legal-Privilege

So we send one examiner to the datacenter while we gather information from the developers. Now you may be wondering why we were talking to the developers and not the security staff. It was the developers who found the intrusion, after trying to track down an error in their code and comparing the production code to their checked-in repository. Once they compared the two, they found a change in their shopping cart code: the form submitted with the payment information was being processed normally while also being emailed to a Russian-hosted email address.
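If you've never done that kind of comparison yourself, here's a minimal sketch of the idea in Python: diff every deployed file against a clean checkout of the repository. The paths and the *.php pattern are purely illustrative assumptions on my part, not the developers' actual process or file layout.

    # Minimal sketch: compare deployed code against a clean repo checkout.
    # Paths and the *.php pattern below are illustrative only.
    import difflib
    from pathlib import Path

    DEPLOYED = Path("/var/www/shop")       # code pulled from the production server (example path)
    REPO = Path("/tmp/shop-checkout")      # clean checkout of the repository (example path)

    for deployed_file in DEPLOYED.rglob("*.php"):
        repo_file = REPO / deployed_file.relative_to(DEPLOYED)
        if not repo_file.exists():
            print("Only in production:", deployed_file)
            continue
        diff = list(difflib.unified_diff(
            repo_file.read_text(errors="replace").splitlines(),
            deployed_file.read_text(errors="replace").splitlines(),
            fromfile=str(repo_file), tofile=str(deployed_file), lineterm=""))
        if diff:
            print("\n".join(diff))

Anything that shows up only in production, or differs from the repository, is exactly the kind of change that led them to the malicious shopping cart edit.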

The developers claimed this was their first knowledge of any change, and company management was quite upset with the datacenter, as in their reading of the hosting contracts it was supposed to provide security. Ideas of liability and litigation against the hosting provider were floating around, and I was put on notice to see what evidence existed to support that.

It was then that I got a call from my examiner at the datacenter. He let me know that one of the hosting company's employees had handed him a thumbdrive while he was imaging the systems, saying only:

 'You'll want to read this'

You know what? He was right!

On the thumbdrive was a transcript of a ticket that had been opened by the hosting company's SOC. In the transcript it was revealed that a month earlier the SOC staff had informed the same developers who claimed to have no prior knowledge of an intrusion that a foreign IP had logged into their VPS as root ... and that probably wasn't a good thing.

I called the attorney right away and let her know she likely needed to switch her focus from possible litigation against the hosting provider to an internal investigation to find out what actually happened. Of course we still needed to finish our investigation of the compromise itself to make sure the damage was understood from a notification perspective.

Step 1. Analyzing the compromised server

Luckily for us, the SOC ticket showed us when the attacker had first logged in as the root account, which we were able to verify through the carved syslog files. We then went through the servers, located the affected files, established the mechanism used, and helped them define the time frame of compromise so they could go through their account records to find all the affected customers.
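For readers who haven't chased this kind of lead before, here's a minimal sketch of what checking carved syslog content for that first root login can look like. The file name and the IP in the sample line are assumptions for illustration; the "Accepted ... for root from ..." pattern is the standard sshd success message, but verify against whatever your carved logs actually contain.

    # Minimal sketch: pull successful root SSH logins out of carved syslog data.
    # The file name and sample IP are illustrative, not from this case.
    import re

    # sshd logs successful logins like:
    #   Jan  3 04:12:56 web01 sshd[2114]: Accepted password for root from 203.0.113.7 port 52114 ssh2
    PATTERN = re.compile(r"Accepted \S+ for root from (\d{1,3}(?:\.\d{1,3}){3})")

    def root_logins(path):
        """Yield (raw log line, source IP) for each successful root login."""
        with open(path, errors="replace") as log:
            for line in log:
                match = PATTERN.search(line)
                if match:
                    yield line.rstrip(), match.group(1)

    for line, ip in root_logins("carved_auth.log"):
        print(ip, "->", line)

The earliest hit from the foreign address is what lets you anchor the start of the compromise window against the date in the SOC ticket.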

Unfortunately for our client, it was the Christmas season and one of their busiest times of year. Luckily for the client, it happened after Black Friday, which IS their busiest time of the year. After identifying the access, modifications and exfil methods, we turned our focus to the developers.

We talked to the attorney and came up with a game plan. First we would inform them that we needed to examine each of their workstations to make sure they were not compromised and open for re-exploitation, which was true. Then we would go back through their emails, chat logs and forensic artifacts to understand what they knew and/or did when they were first notified of the breach. Lastly we would bring them in to be interviewed to see who would admit to what.

Imaging the computers was uneventful, as you always hope it will be, but the examination turned out to be very interesting. The developers used Skype to talk to each other, and if you've ever analyzed Skype before you know that by default it keeps history Forever. There in the Skype chats were the developers talking to each other about the breach when it happened, asking each other questions about the attacker's IP address, passing links and answers back and forth.
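If you haven't dug into Skype history before: the classic desktop client kept it in a SQLite database (main.db) under the user's profile, and a few lines of Python are enough to walk the message table. The path and column names below (Messages, author, timestamp, body_xml) reflect that old schema as I recall it, so treat them as assumptions and confirm them against your own copy before relying on the output.

    # Minimal sketch: dump chat history from a classic Skype main.db.
    # Path and column names are assumptions from the old desktop-client schema.
    import sqlite3
    from datetime import datetime, timezone

    DB_PATH = r"C:\Users\dev1\AppData\Roaming\Skype\dev1\main.db"  # illustrative path

    con = sqlite3.connect(DB_PATH)
    rows = con.execute(
        "SELECT timestamp, author, body_xml FROM Messages ORDER BY timestamp"
    )
    for ts, author, body in rows:
        # Skype stores message times as Unix epoch seconds
        when = datetime.fromtimestamp(ts, tz=timezone.utc)
        print(when.isoformat(), author, body)
    con.close()

Sorted by timestamp, that history is what laid out the developers' conversation about the intrusion on the day the SOC first flagged it.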

And then.... Nothing

Step 2. Investigating the developers

You see, in many cases investigations are not strictly about the technical analysis (some are, though); there is always the human element, which is why I've stayed so enthralled by this field for so long. In this case the developers were under the belief that they were going to be laid off after Christmas, so rather than take action they decided it wasn't their problem and went on with their lives. They did ask the hosting provider for recommendations of what to do next, but never followed up on them.

A month later they were informed they were not being laid off, and instead were going to be transferred to a different department. With the knowledge that this was suddenly their problem again, they decided to actually look at the hosted system and found the modified code.


Step 3. Wrapping it up

So, knowing this and comparing notes with the attorney, we brought them in for interviews.

In the first round we simply asked questions to see what they would say, who would admit what, and possibly who could keep their jobs. When we finished talking to all the developers, all of whom pretended to know nothing of the earlier notification, we documented their 'facts' and thanked them.

Then we asked them back in and, one fact at a time, showed them what we knew. Suddenly memories returned, apologies were given, and the chronology of events was established. As it turns out, the developers never notified management of the issue until they knew they were going to remain employed; until then they just sat on it.

Needless to say, they no longer had that transfer option open as they were summarily terminated.

So in this case, a breach that should have lasted four hours at most (from the SOC's login notice to remediation) lasted through 30 days of Christmas shopping, because the developers of the eCommerce site committed the crime of silence for purely human reasons.