Saturday, November 30, 2013

Daily Blog #161: Sunday Funday 12/1/13

Hello Reader,
        It's Sunday Funday time! Let's get into some more real world scenarios and combine some different types of analysis.

The Prize:


  • A $200 Amazon Gift Card

The Rules:
  1. You must post your answer before Monday 12/2/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
You are involved in a case involving emails containing confidential information that were sent to outside parties. You've been given an image of one of the outside parties' computers and the subject and date of the email that was sent. The image was created two weeks after the email was sent. You've located the message on the image, but the suspect has denied accessing any attachments.

Please detail how on a Windows 7 system running Outlook 2007 you can determine:
1. What attachments were accessed in the last two weeks
2. When attachments were accessed
3. How many times attachments were accessed

Good luck!
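One artifact entrants will likely reach for: Outlook 2007 extracts opened attachments into the OutlookSecureTempFolder (the "OLK"-named directory whose path is stored under HKCU\Software\Microsoft\Office\12.0\Outlook\Security). As a hedged sketch of just the time-window triage step, here's how you might filter recovered OLK-folder records by access time; the file records below are hypothetical stand-ins for what your forensic tool would pull from the image.

```python
from datetime import datetime, timedelta

# Hypothetical (name, last-accessed) records recovered from the suspect's
# OutlookSecureTempFolder ("OLK...") directory on the image.
olk_records = [
    ("confidential_report.pdf", datetime(2013, 11, 25, 14, 3)),
    ("holiday_flyer.docx", datetime(2013, 10, 2, 9, 15)),
]

image_date = datetime(2013, 11, 30)   # when the image was created
window = timedelta(days=14)           # "last two weeks" from the challenge

# Attachments Outlook extracted (i.e., that were opened) within the window
recent = [(name, ts) for name, ts in olk_records if image_date - ts <= window]
for name, ts in recent:
    print(name, ts.isoformat())
```

This only covers the "what" and "when"; access counts would come from other artifacts entirely, which is what the challenge is really asking about.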

Daily Blog #160: Saturday Reading 11/30/13

Hello Reader,
        It's Saturday! Time for more links to make you think as we go through what I've been reading this week.

1. The Handler Diaries is a blog I've been following recently; this isn't a new post, but I don't think I've linked it before. This post covers the discovery phase of an incident, which can be the most crucial phase, http://blog.handlerdiaries.com/?p=128.

2. Our DFIR friends in NOLA have put out a master class on registry forensics over on Hacker Academy. It only costs $399, and it's gotten high marks from Ken Pryor. I plan to give it a go later this month, https://hackeracademy.com/masterclass/registry-forensics.

3. There is a good article up on Forensic Focus on what we can determine from the usage of the Windows 8 File History feature, http://articles.forensicfocus.com/2013/11/24/2869/.

4. Lee Whitfield over on the Forensic 4:cast has announced the beginnings of the 2014 Forensic 4:Cast awards and is looking for comments on how to improve them. If you have ideas please let him know! http://forensic4cast.com/2013/11/4cast-awards-2014/

5. Hexacorn has a new blog post up on how to set up your work environment to get things done faster, http://www.hexacorn.com/blog/2013/11/25/doing-things-faster/. It's not directly forensics related, but following his tips could help you find your zen-like analysis state.

6. The AccessData Users Conference opened up a CFP for the first time, https://www.ad-users.com/call-for-speakers.  It's a fun conference, held a week apart from CEIC and both in Las Vegas, so if you speak at both you could stay in Vegas even longer!

7. SANS has been accredited to offer a Masters program in conjunction with their training, http://www.sans.edu/accreditation. That's pretty neat and a nice option for those of you that are looking to get both technical training and higher education credentials.


That's all for this week, things get slower around Thanksgiving in the states. I hope you had a great holiday with your families as I did with mine. Please make time tomorrow for another Sunday Funday!

Friday, November 29, 2013

Daily Blog #159: Making the forensic lunch

Hello Reader,
       This is normally where I would post a video and tell you about the guests we have on the Forensic Lunch this week, except there was no Forensic Lunch this week! Seeing as it is a holiday weekend in America, success seemed unlikely, so we have punted until next week. So I thought this week I would address a question I get asked a lot: what equipment are you using to make the Forensic Lunch?

We have 4 microphones, all from Sterling Audio, on mic stands that run into a Focusrite 18i20 for preamp and mixing. We then have an output running from the Focusrite into the line in on the PC in the room. For video we have an HP LifeCam on top of the TV to get a wide angle of the room.

From there it's all about the right combination of Google services with the right account to make things work. If you want to run a Hangout On Air event like we do, you need to do the following:

1. Link a YouTube account to a Google+ page
2. SMS verify your account; this will enable Hangouts On Air
3. As the same Google+ page, create an event for the time you want the event to occur; do not enable it as a hangout
4. On the day of your event, create a Hangout On Air as the Google+ page
5. Once it has started, enable the Q&A module, which will, if you did all the right things as the right account, find your Google+ event and link the Hangout On Air to it
6. Give the hangout link to the people you'd like to join
7. Use the Cameraman app to set your options for how the video will be displayed and recorded within the hangout
8. Click go on air when you are ready to broadcast
9. Take questions with the Q&A app and make sure to click which ones you are answering for those watching later
10. Take the YouTube link when you're done to share the video

So there you go! The next hard part is finding people to come on the air. Speaking of which, I'm looking for guests for next week's Forensic Lunch, so please email me, dcowen@g-cpartners.com, to get on.

Thursday, November 28, 2013

Daily Blog #158 Happy Thanksgiving!

Hello Reader,
          It's Thanksgiving in America and I had a great day of cooking and enjoying family time. My wife always tells me that I should break up the technical posts with a recipe now and then so let's do that.

The HECF Blog Smoked Turkey

If you are my friend on Facebook, you would know that on Father's Day I got a Weber Smokey Mountain smoker, and I've been reaching my full potential as a Texan ever since by using it. I used it this morning to make a smoked turkey that was met with happy bellies.

I started my recipe by following the recipe listed here:
http://amazingribs.com/recipes/chicken_turkey_duck/ultimate_smoked_turkey.html

There are a lot of words there, and if you want to make the most amazing meal of your Thanksgiving you should read all of them. The summary, though, is as follows.

1. It takes multiple days to actually defrost a turkey, make sure you buy it 4 days before and let it rest in the fridge or cooler to fully defrost.
2. Many turkeys sold in stores today have already been put into some kind of brine; only brine your turkey if the label does not say some version of 'contains up to x% water and spices'
3. The night before you are going to cook, get a rub (either store bought or follow a recipe on amazing ribs) and mix it with equal parts olive oil to make it a wet rub.
4. Cut off the excess fat around the neck and cavity as well as anything binding the legs together
5. Place the rub on the skin but also reach into the neck cavity and get it under the skin and on to the meat directly and then let it sit overnight
6. The morning of, place onions/garlic/orange peels and thyme in the cavity
7. Take a little extra olive oil and salt and place it on the skin so it will crisp better
8. Make a drip pan to put under your turkey with onions, herbs, and chicken broth and the giblets
9. Heat up your smoker/grill to 325 and then place the bird on with a drip pan underneath to catch the drippings
10. Cook the bird until the internal temp hits 160
11. Remove the bird from the smoker and then let it rest 30mins
12. Take the dripping pan and dump it into a blender
13. Mix the blended drippings, flour and heavy cream w/ chicken broth and pepper and stir to make a gravy

This is what the turkey will look like!


There you go!

Wednesday, November 27, 2013

Daily Blog #157: Metadiver!

Hello Reader,
            Another tool from our lab has escaped into the light of day. David Dym, on his RedRock blog, has posted the first version of Metadiver, http://redrocktx.blogspot.com/2013/11/introducing-metadiver.html. Metadiver was born out of frustration with most forensic suites' inability to display all the relevant metadata embedded in many of the file formats available today.

What Metadiver does is call shell32 to query all the metadata fields the handler for that file type makes available, recursing through directories as it goes, and then write it all out so you can quickly review the metadata you're interested in. You should give the tool a shot and see if you find some metadata your tools aren't showing you.
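Metadiver's shell32 property-handler queries are Windows-specific, but the recurse-and-collect pattern it's built around is easy to picture. Here's a rough cross-platform sketch that records only the basic stat fields; Metadiver itself pulls far richer, format-specific properties than this.

```python
import csv
import os
from datetime import datetime, timezone

def collect_metadata(root, out_csv):
    """Walk a directory tree and write one CSV row of metadata per file.

    Metadiver queries shell32 property handlers for format-specific fields;
    this sketch only records what os.stat exposes, to show the pattern.
    """
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["path", "size_bytes", "modified_utc"])
        for dirpath, _dirs, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                st = os.stat(path)
                writer.writerow([
                    path,
                    st.st_size,
                    datetime.fromtimestamp(st.st_mtime, timezone.utc).isoformat(),
                ])
```

You'd point it at a mounted image, e.g. `collect_metadata(r"F:\mounted_image", "report.csv")`, then review the CSV for the fields you care about.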

Tuesday, November 26, 2013

Daily Blog #156: The search for the next great intern

Hello Reader,
         I'd like to take the time this post to inform you of two things:

1. We are accepting applications currently for a paid internship for the spring 2014 semester.
If you are willing to move to the DFW metro area, or are already here, and are looking for an internship, I'm looking for you! This is a paid research internship focused on file system forensics and other new types of analysis where you will be helping break new ground! Mail your resume to dcowen@g-cpartners.com with the subject 'Intern Spring 2014'.

2. We are preparing a game of sorts for what I like to call the 'next great intern' contest for fall 2014

To make things more interesting and to help us find candidates who shine with their problem solving skills better than their resumes we are preparing a contest. The contest will be a series of puzzles that lead to each other that you can solve at your own pace. The winner of the contest will be given a paid internship for the Fall 2014 semester!

So email me now and keep a look out for our first clue coming soon!

Monday, November 25, 2013

Daily Blog #155: Sunday Funday 11/24/13 Winner!

Hello Reader,
     Another Sunday Funday come and gone, and some good entries this week! I really liked this week's winner because he went into physical access controls (badge logs), network access controls (firewall logs), and host access controls (event logs and artifacts) to narrow down his suspects and fully examine the facts at hand. Great job Steve M!

The Rules:
  1. You must post your answer before Monday 11/25/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Your board of directors has received an email from a Gmail address, sent from the Thunderbird mailer at 9pm at night, with insider information about the company and a demand for action, or the sender will go to the press.  IT security has found the IP address of the company firewall at one of the smaller branches in the email header and passed the data to you. The branch has only 8 employees, and normal office hours end at 5pm.

Please detail how you will:
1. Determine which system sent the email
2. Determine which user of the system sent the email

The Winning Answer:
Steve M
Considering this case may involve litigation, I would stress collecting good case notes and following proven processes throughout the investigation.

To initially narrow the search of suspects, I would first check physical access badge logs to determine if anyone was in the office at 9pm.  If employees can access the Internet via this firewall when connected via VPN, I would check VPN access logs as well.

On the network side, I would begin by looking at the perimeter firewall traffic logs for mail connections to Google/gmail, specifically on ports 993/TCP (IMAP), 995/TCP (POP3), and 25/TCP or 587/TCP (SMTP) which would be used by Thunderbird.  Google may switch services between various IP ranges, but as of right now it appears most mail related addresses resolve within the 173.194.0.0/16 subnet.  It is probably a safe assumption to say Google won't move their mail services outside of this network/port range in the foreseeable future.  Once I have identified an internal IP address as communicating to Google over these ports during the window the email was received, I would look for DHCP logs or an asset inventory system to determine which internal system had the IP address at that time.

Once I have identified a suspect internal system, I would proceed with host level forensics to find the user with that account's information in his/her Thunderbird profile (assuming Win7):

- First, I would take an image of the system for investigation and evidence purposes.  I would also duplicate it for a working copy.

- I would sweep the system for directories matching "C:\Users\\AppData\Roaming\Thunderbird\Profiles\", to see which Windows logon accounts use Thunderbird.

- At this point, I would perform a raw text search of the entire contents of this directory (including "ImapMail" and "Mail") for the specific source email address in question.  If hits were found, I would focus the remainder of my investigation on this profile.

- If no hits were found, I may have to extract the usernames from all Thunderbird profiles on this system.  Considering both usernames and passwords can be decrypted with this method, I would NOT do this on a large number of potentially unrelated profiles due to privacy concerns.  Instead I would revisit my case notes to see if I can pinpoint the suspected user via other means (event log records and other filesystem activity within the same time period for example).

- Once I've identified a profile with hits, I would extract the profile folder from the disk for further investigation.  Mozilla stores the usernames and passwords encrypted in the "moz_logins" table of the "signons.sqlite" sqlite db and the encryption keys in the "key3.db" file.

- I would use a tool such as "ThunderbirdPassDecryptor" to decrypt the username/password from this profile using these files.  At that point, the source email address (and password) should be known, and the owner of the Windows profile can be considered the source.
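The port-and-netblock triage Steve describes can be sketched with Python's stdlib ipaddress module. The log format, internal addresses, and the Google netblock below are illustrative assumptions, not a definitive parser; verify the ranges against your own firewall's log format before relying on anything like this.

```python
from ipaddress import ip_address, ip_network

# Netblock as observed circa 2013 per Steve's answer -- verify before use.
GOOGLE_MAIL_NET = ip_network("173.194.0.0/16")
MAIL_PORTS = {25, 587, 993, 995}  # SMTP, submission, IMAPS, POP3S

def suspect_connections(log_lines):
    """Yield (src, dst, port) for entries matching the Thunderbird-to-Gmail
    profile. Assumes a hypothetical 'src dst port' whitespace-separated
    firewall log format."""
    for line in log_lines:
        src, dst, port = line.split()
        if ip_address(dst) in GOOGLE_MAIL_NET and int(port) in MAIL_PORTS:
            yield src, dst, int(port)

log = [
    "10.1.5.23 173.194.70.108 587",   # hypothetical hit during the 9pm window
    "10.1.5.40 8.8.8.8 53",           # DNS noise
]
print(list(suspect_connections(log)))
```

The surviving internal IP then goes to DHCP logs or asset inventory, exactly as the answer lays out.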

Saturday, November 23, 2013

Daily Blog #154: Sunday Funday 11/24/13

Hello Reader,
        It's Sunday Funday time! Let's get into some more real world scenarios and combine some different types of analysis.

The Prize:

  • A $200 Amazon Gift Card

The Rules:
  1. You must post your answer before Monday 11/25/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Your board of directors has received an email from a Gmail address, sent from the Thunderbird mailer at 9pm at night, with insider information about the company and a demand for action, or the sender will go to the press.  IT security has found the IP address of the company firewall at one of the smaller branches in the email header and passed the data to you. The branch has only 8 employees, and normal office hours end at 5pm.

Please detail how you will:
1. Determine which system sent the email
2. Determine which user of the system sent the email

Good luck!

Daily Blog #153: Saturday Reading 11/23/13

Hello Reader,
           It's Saturday! I had a bit too much fun playing the Hearthstone beta last night, so I didn't post this the night before like I usually do; no reason not to share good links though!

1. Forensic Lunch went down yesterday! We had Mari DeGrazia on to talk about her research into SQLite deleted data recovery and Eric Zimmerman talking about being the first Xways Xpert and OsTriage v2. Watch it here http://hackingexposedcomputerforensicsblog.blogspot.com/2013/11/daily-blog-152-forensic-lunch-112213.html.

2. Yogesh Khatri has been putting up some good blog posts this week regarding changes in USB device forensics in Windows 8. He's done this in two posts: the first is on new registry entries from USB device removal with timestamps, http://www.swiftforensics.com/2013/11/windows-8-new-registry-artifacts-part-1.html, very cool! The second talks about which event logs are not being created on USB device insertion and removal, http://www.swiftforensics.com/2013/11/event-log-entries-for-devices-in.html. This is great stuff, and hopefully he'll keep going!

3. In an interesting civil case over on the CYB3RCRIM3 blog, an unhappy consumer sued Best Buy and represented himself, http://cyb3rcrim3.blogspot.com/2013/11/the-laptop-malware-and-consumer-sales.html. This case is interesting to me because the claims revolved around not just the typical warranty issues but also the malware/spyware found on his computer. Good reading for anyone buying computers and warranties from a retailer.

4. On Forensic Focus there is a new article up on new metadata found in OSX Mavericks; read it here http://articles.forensicfocus.com/2013/11/13/os-x-mavericks-metadata/. The article goes into two different types of new metadata found in OSX Mavericks: email attachments saved to disk and file tagging.

5. Harlan has a new post up on using the 'sniper forensics' methodology of examination to quickly find malware and reduce analysis time. He then goes into working with Volatility and his steps taken in using it for memory analysis. A good read you can see here http://windowsir.blogspot.com/2013/11/sniper-forensics-memory-analysis-and.html.

6. If you are doing forensics on OSX systems your going to run into virtual machines as most users run their Windows apps in Parallels of Fusion. This can be a pain as you want a forensic image to work with in most of your tools. This article on appleexaminer goes through how to convert these images to raw/dd images using qemu http://www.appleexaminer.com/MacsAndOS/Analysis/VirtDiskConv/VirtDiskConv.html.

7. Dealing with Dropbox on Windows XP and want to decrypt more of the databases? Magnet Forensics has updated their tool to now work against any Dropbox database, and it's free! http://www.magnetforensics.com/decrypting-the-config-dbx-file/

8. Forensic Femmes has a good interview with Sk3tchmoose aka Melissa Augustine about her work in DFIR http://christammiller.com/2013/11/19/forensic-femmes-4-melissa-augustine/

9. The Volatility guys put up some more training dates, http://volatility-labs.blogspot.com/2013/09/2014-malware-and-memory-forensics.html, this is a class I'd like to take in the future!

That's all for this week, lots of good stuff out there. Sunday Funday is coming up shortly after!

Friday, November 22, 2013

Daily Blog #152: Forensic Lunch 11/22/13

Hello Reader!,
            Another Forensic Lunch already? I know, right! Another week has gone by and we have a great Forensic Lunch for you this week, not that I would tell you it's not great every week. This week Mari DeGrazia joined us to talk about her work building a python parser for recovering deleted data from SQLite databases, and Eric Zimmerman came on to talk to us about passing the new X-Ways Xpert certification and the upcoming OSTriage v2, which will be available for non law enforcement use!

You can read Mari's blog here: http://az4n6.blogspot.com/ 
To read up more on OsTriage read the forensic focus thread here: http://www.forensicfocus.com/Forums/viewtopic/p=6565347/


Want to be on the lunch? Just email me dcowen@g-cpartners.com and I'd love to have you on!

BTW If you've been wanting to listen to the lunch as a podcast you can now! Just subscribe to it here: http://www.learndfir.com/?feed=podcast

Thursday, November 21, 2013

Daily Blog #151: Automating FTK Filter creation

Hello Reader,
           Normally I would just include something like this in a Saturday Reading link, but given that today was pretty busy and this is something pretty useful to those of you using FTK, I thought it was worth its own post.

If you use FTK then you know about the power of filters; much like other tools, you can use filters in FTK to lock down your views to different dates, hashes, file types, paths, categories, etc... We use this feature a lot to take advantage of some of the harder-to-find FTK features like LNK metadata export. If you are using filters on a regular basis, and using long filters like showing only files with a certain hash value, you should check out this tool written by David Dym; read the blog post here http://redrocktx.blogspot.com/2013/11/scripting-with-ftk-filters.html.

You might know David Dym as the author of ShadowKit, but I know him as a fellow employee of G-C Partners, where we've been using this tool on a number of cases. In the example shown in his blog entry he is getting FTK to show only the itemids listed. We do this a lot with attorneys: providing them a spreadsheet of files with itemids included and telling them to mark which ones they are interested in. Then we can just export those itemids to FTK in filter form and easily export the data they want.

That's just one example but you can extend and automate any number of large and long filters this way and then just import them into your case.
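I won't reproduce David's tool here (and FTK's actual filter file format is its own thing), but the automation around it can be sketched: pulling the attorney-marked itemids out of a review spreadsheet so they can be spliced into a filter. The column names and marking convention below are assumptions; adjust to your actual export.

```python
import csv
import io

def marked_itemids(csv_text, flag_column="interested", id_column="itemid"):
    """Return the item IDs an attorney flagged in a review spreadsheet.

    Hypothetical column names; any of 'x', 'yes', 'y', '1' counts as a mark.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row[id_column] for row in reader
            if row[flag_column].strip().lower() in ("x", "yes", "y", "1")]

# A tiny illustrative spreadsheet as the attorney might return it
sheet = """itemid,filename,interested
1001,report.docx,x
1002,photo.jpg,
1003,ledger.xlsx,yes
"""
ids = marked_itemids(sheet)
print(ids)  # these are what you'd feed into your generated FTK filter
```

From there, David's scripting approach handles turning that list into an importable filter.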

Tomorrow is the forensic lunch!

Wednesday, November 20, 2013

Daily Blog #150: Forensic artifacts from renaming accounts in Windows 7

Hello Reader,
             One of the things I enjoy is talking to other examiners out there and hearing about mysteries they find that our current knowledge base does not cover. I like hearing about these because then I can try to help by setting up a test platform to determine what is causing the underlying mystery and identify a new artifact that we can all benefit from. Such an instance happened yesterday in a discussion with an examiner, who can identify himself if he chooses to, regarding a system he was looking at.

The system in question was running Windows 7 and had a peculiar situation occurring. When the examiner looked at the file system, a user SID was associated with a name we'll call 'NameB'; however, the SAM and event logs referenced the same SID but with another user name that we'll call 'NameA'. The same SID, and thus the same user, was being referenced by two different names by two different sources, both generated by the system itself. This was odd to say the least, and the examiner pointed out that he saw 'NameB' in the 'HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList' key while he saw 'NameA' in the SAM registry located within %SYSTEM DRIVE%\Windows\System32\Config\SAM.

My hypothesis, which I thought unlikely, was that there was a bug in Windows: if an account is first created and then renamed, the two registries would be out of sync. So to test this hypothesis I took one of my stock Windows 7 virtual machines and did the following:

1. I created two accounts
           a. standarduser - A non administrative account
           b. testuser - An administrative account
2. I then logged into each account and logged off of them to make sure all profile data was created
3. I then rebooted the system to make sure all changes were flushed to the system registries
4. I then logged in as a third user and renamed both accounts
           a. I renamed standarduser to notstandarduser
           b. I renamed testuser to NotTestUser
5. I then inspected the registries

What I found was interesting. The profile names in the Users directory and within the stub of the SAM file remained the same, but within the 'V' key under SAM\Domains\Account\Users\\V I found the old and new names listed. 

'testuserNotTestUser'
'standarduserNotStandardUser'

I need to find the specification of this key so we can parse it automatically, as there is no termination character between the two names.
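Absent that specification, one workable heuristic is to split the concatenated value against account names recovered elsewhere on the image (profile directory names, the SAM stub, ProfileList). A minimal sketch, assuming you've already built such a candidate list; this is not a real V-value parser:

```python
def split_renamed(pair, known_names):
    """Try to split an 'oldnameNEWNAME' string from the SAM V value.

    known_names: account names recovered elsewhere on the image, e.g.
    profile directory names, the SAM key stub, or ProfileList entries.
    Returns (old_name, new_name) or None if no candidate fits.
    """
    for name in known_names:
        if len(name) < len(pair):
            if pair.startswith(name):          # candidate is the old name
                return name, pair[len(name):]
            if pair.endswith(name):            # candidate is the new name
                return pair[:-len(name)], name
    return None

# 'testuser' survives in the Users directory and SAM stub per the test above
print(split_renamed("testuserNotTestUser", ["testuser"]))
```

A real parser would read the V value's binary header rather than guess, but this gets you the rename pair when corroborating names exist.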

So if in the future you have a case where your names of ownership and login don't line up, check SAM\Domains\Account\Users\\V to find out if the account was renamed. Of course, if you only relied on the SID you wouldn't have this problem, but most of us like to attribute a username as well, as that's what others outside of our field would understand.

Hope this was helpful! Leave a comment if you've seen something similar or have found other changes that can cause similar behavior.

Tuesday, November 19, 2013

Daily Blog #149: PFIC Day 2 Notes

Hello Reader,
           Here are my notes from Day 2 of PFIC; this is the last of these posts, as I didn't attend the day 3 sessions in depth since snow was falling and clients were calling. I'll be updating these posts with the slides from the relevant lectures so you can see those as well.

Day 2 - PFIC Notes

8:00am Session - Ira Winkler 'The Cyber Jungle'

Ira is very personable; I like his show as well as him.
Two good stories so far: the first promoting InfraGard (Ira is the president of his local InfraGard chapter), the other involving credit card fraud.

Why does the media ask dumb questions on tv? The guest gives them dumb questions to ask

Executives don't want to disclose and notify, this is something I also have found

CryptoLocker story time

Pointing out FUD about CryptoLocker that's out there; a bad media report showed a technical person saying that firewalls, service packs, and good passwords could have prevented CryptoLocker.

Another good story, this one about a reporter's experience with some attorneys.

Reporters are under pressure to get multiple stories a day. This can hurt parties who can't handle the media well and provide answers to questions quickly.

An interesting story about how ankle bracelet wearers in Las Vegas are removing their bracelets, committing crimes, and then putting the bracelets back on when they get back to their houses. The bracelets are not being monitored actively and the process is broken.

Downtown streetlights in Las Vegas will be able to monitor audio in the future. In the near future officers will be able to monitor this audio via iOS apps on their phones. Ira wonders if anyone is properly securing this channel, applying ISO 27k or another security standard, to prevent non-LEO from listening.

Make sure to listen to cyberjungleradio.com for his weekly podcast. Link to site: http://thecyberjungle.com/index.php

10:00am Session - Python for web application security testing


This is a talk on writing python code for web app testing rather than popular tools.

Recommends Head First Programming to learn python

Showing how to build a buffer overflow script in python
All of these scripts and example app is on a dropbox shared folder for those that want to try this at home.

This isn't your normal DFIR presentation, very infosec focused. The audience seems interested though so that's good.



Showing how web apps store data and failed logins from buffer overflow attempts within a user authentication form. This is not a python tutorial but rather a show of what's possible and what it leaves behind.

Edited some code and talked about what things affect and change.

Moved on to XSS attacks
Talking about output escaping (a la PHP's htmlspecialchars; html.escape in Python) to prevent XSS
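For reference, the htmlspecialchars-style escaping mentioned above maps to html.escape in Python's stdlib; a one-liner shows the idea:

```python
import html

# Escaping untrusted input so the browser renders it as text
# instead of executing it as markup.
user_input = '<script>alert("xss")</script>'
safe = html.escape(user_input)  # escapes < > & and quotes
print(safe)
```

Render `safe` into the page instead of `user_input` and the payload displays as literal text.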

Moving on to how to use python to do testing and getting over common hurdles. First hurdle is basic auth

Don't store credentials within code; retrieve them via prompts to the user on execution

All functions covered so far are built-in python libs.
He is now going into Scapy, which is a 'full featured library for performing network operations': a packet capture/manipulation/creation/replay lib


Live demonstration of capture, reviewing and replaying traffic with scapy
Showing the built in fuzzer within scapy
Showing how to spoof the traffic in your fuzzing with scapy

Ending now and discussing the benefits of python. He's not saying not to use off-the-shelf tools, but if you want to be successful and understand more, getting lower level with python directly will allow you to be more versatile.

10:30am Session - Me!

It was amazing!
It was wonderful!
Offers of free coffee were given!
I'm writing this before my session but this is how I want it to go.
In reality it went well, but I had live demos fail, as they are apt to do; even Excel was crashing on me. Luckily I added in pre-generated results to move things forward.

11:30am Session - Jake Williams IaaS forensics

IaaS is the acronym that represents most of the cloud virtualized systems we talk about: infrastructure as a service
Get an Incident Response plan and make sure it contains what to do for both your internal and externally hosted assets
You are stuck trusting the hypervisor at some base level
In a commercially hosted cloud you don't have access to the hypervisor (amazon) if you are a privately hosted cloud (your own esx server) you do have access to the hypervisor.
You need to validate that the hypervisor has not been compromised
If the hypervisor has been tampered with you need to collect additional evidence.
Jake has found an esx server where the hypervisor was compromised and thus can no longer say it doesn't happen. If the hypervisor is compromised then the attacker can control physical memory outside of the guest os and guest os artifacts.
There are hypervisor logs that you should be collecting.
This is not typical though, but you should grab the logs to be sure
The vm-support command will output a tgz file with the log and vm inventories that you need
USB over IP devices are separately logged by the hypervisor versus USB devices physically plugged in
Don't use shared admin accounts if you want easy attribution of admin actions
Introspection isn't easily detected by the attacker and can be normally used to collect data outside of the attackers view
Inband (non hypervisor based actions) are bad because bad guys can easily detect your response effort
You can't do out-of-band actions on public clouds (Amazon) as they don't give you hypervisor access, so you're stuck with traditional live response
Making full disk images of cloud hosts is typically difficult as your bandwidth to the site is your bottleneck.
Amazon, and hopefully soon Rackspace, will write your data to a physical disk and mail it to you
You supply the drive and cables, they charge you $80 per disk, they will accept a shipping label so you can get it via fedex
Accounting records will be provided but they don't do Chain of Custody
The Amazon feature mentioned, called 'bulk export', is not meant as a forensic/IR service
A good alternative is to spin up a forensic/ir virtual instance so you can keep the data within the cloud and speed your investigation
Have a dongle restricted software you want to run in the cloud? Use USB over IP
The hardest part of dealing with hosted/cloud hosted systems is making sure the tech is going to follow your procedures and not shut down the system or kill the vm instance
Snapshots are great, memory is better
Public cloud (amazon, etc..) don't allow you to request physical memory out of band from the hypervisor
Public cloud snapshots are disk states but not memory states
If you capture the memory to a network share, make sure you lock down who can access them or else you may have non authorized personnel accessing secrets
You can still do CoC yourself; F-Response is a great imaging solution for cloud hosts
If you get compromised public providers like amazon limit their liability in case of a compromise from their end to a refund of that months fees
If you don't want to use F-Response, FAU is another good tool for live cloud imaging, but make sure to run it over an encrypted tunnel
Protect your memory dumps, possibly encrypt them
Out of band imaging is still the best option
HP has internal resources that can out of band image a HP hosted cloud server
The issue with imaging logical disks in non-VMware clouds is that tools often can't find the end of the disk and keep writing forever
Test your tools in your cloud as part of your IR plan to find out which ones fail silently
Hypervisor imaging is as simple as snapshotting

1:30pm Session - Memory forensics with Chad Tilbury

I should have gone into this session but I was too busy talking to people through lunch. I did see the end and recognized a subset of slides from FOR508, but he ended it with a nice preview of Mac and Linux memory forensics.

2:30pm Session - Recovering your costs in ediscovery

Quote from a judge in Fair Housing Center of Southwest Michigan v. Hunt where the judge chastised a party for turning the litigation into an e-discovery workshop.
Nice review of which ESI costs can be recoverable, this is good information for me to advise my clients when they are not aware this exists.
If you want to recover costs you have to show detail and provide affidavits that explain why the work was necessary and how the costs break down.
Don't be vague on invoices and document your work if you want your costs to be recoverable for your client in the event they prevail
Moore v. Weinstein - The prevailing party received $36,196, of which the e-discovery service provider's fees made up $22,000, after asking for $40,000
In-house work done within a party's firm needs to have reasonable costs, and the work done must justify the rate applied
A fun sidebar about Thor and S.H.I.E.L.D. and whether working with Thor would show the government endorsing a religion.
Interesting, court rulings have come out stating that native productions of documents are not recoverable costs
No cost for hosting, courts still compare data hosting to warehouses holding paper - non recoverable costs
Forensic costs within ediscovery is recoverable, forensic investigation fees of an expert witness are also recoverable separately
Second 'geek break' discussion on how wills would be affected by the 12 regenerations of Doctor Who

Monday, November 18, 2013

Daily Blog #148: Sunday Funday 11/17/13 Winner!

Hello Reader,
        Another Sunday Funday come and gone, a new victor arises to claim the prize! This week I put out a challenge that I know we've covered in different aspects here on the blog, CD burning artifacts, to see what you would come back with. While some of the responses covered what we've talked about here, some of you went beyond and found additional artifacts! This week the 'earliest most complete submission wins' rule came into effect. The winning answer this week from Martijn Veken was received at 8:34am central time, beating the other great submissions by hours.

The Challenge:
     Your client has given you three CD-ROMs that contain their trade secrets. They want you to determine as much information as possible about the CDs, specifically:
1. Which system burned them
2. What software created the CDs
3. When they were burned
4. If there were other CDs burned
5. Which user burned the CDs

The client is a small company with 5 systems of which you've been given access to all of them. Each of the 5 systems runs Windows 7.

The Winning Answer:

Martijn Veken

1. Which system burned them
If you figured out at what time the CD’s were burned (see answer 3), check the system eventlog for event id 133, this indicates that files were burned to CD using Windows Explorer. If so, in the registry under key "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\CD Burning\StagingInfo", there are keys indicating where files were staged before they were burned to the CD. You can use file system forensics to investigate what was in the folder to try to match them to the disc. You can also check the timestamp of the registry key to see at what time it was written to search more specifically.
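To illustrate the triage step: once you've exported the StagingInfo subkeys with your registry tool of choice (RegRipper, a hive viewer, etc.), a short script can turn them into a burn-staging report. The dictionary layout and sample values below are assumptions about an exported intermediate format, not a registry-parsing library call:

```python
from datetime import datetime, timezone

# Hypothetical export of the StagingInfo subkeys: one entry per
# staged burn, with the staging path value and the key's last-write time
staging_info = {
    "Volume{00000000-0000-0000-0000-000000000001}": {
        "StagingPath": r"C:\Users\suspect\AppData\Local\Microsoft\Windows\Burn\Burn",
        "LastWrite": "2013-11-20T14:32:10Z",
    },
}

def staging_report(info):
    """Summarize staging folders and when each key was last written."""
    rows = []
    for name, values in sorted(info.items()):
        when = datetime.strptime(values["LastWrite"], "%Y-%m-%dT%H:%M:%SZ")
        rows.append((name, values["StagingPath"], when.replace(tzinfo=timezone.utc)))
    return rows
```

Each key's last-write time then gives you a tight window to search the file system and event logs around.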

If another tool was used, there are clues on which tool this was on the CD (see step 2). Look for indications in the prefetch, RunMRU and user assist to see if the tool has run on the system. If the tool is or used to be present, look for the temp folders or log files it produces to see if you can match it to the CD.
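On the UserAssist point: value names under that key are ROT13-encoded, so decoding them is enough to spot a burning tool's executable. A quick sketch (the sample encoded entry is made up for illustration):

```python
import codecs

def decode_userassist(value_name):
    """UserAssist value names are ROT13-encoded; decode to a readable path."""
    return codecs.decode(value_name, "rot_13")

# Hypothetical encoded entry for a burning application
encoded = "P:\\Cebtenz Svyrf\\Areb\\areb.rkr"
print(decode_userassist(encoded))  # C:\Program Files\Nero\nero.exe
```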

2. What software created the CDs
If it’s ISO9660, usually the name of the application that has created the CD is in the session start section, somewhere just after 0x8000.

3. When they were burned
If it’s ISO9660, there are a couple of timestamps indicating the time of burn in the session start section. If you have figured out on which system the discs were created, check eventlog to see if there are any events (event id 1) that indicate that the system time was changed prior to burning the disc.
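Both the application name (answer 2) and the creation timestamp live in the ISO9660 Primary Volume Descriptor, which starts at sector 16 (offset 0x8000) of a disc image. A minimal parser sketch using the ECMA-119 field offsets (application identifier at bytes 574-701, creation date at 813-829 of the descriptor):

```python
SECTOR = 2048
PVD_OFFSET = 16 * SECTOR  # 0x8000

def parse_pvd(iso_path):
    """Pull the burning application and creation time from an ISO9660 PVD."""
    with open(iso_path, "rb") as f:
        f.seek(PVD_OFFSET)
        pvd = f.read(SECTOR)
    if pvd[0] != 1 or pvd[1:6] != b"CD001":
        raise ValueError("No Primary Volume Descriptor at 0x8000")
    app = pvd[574:702].decode("ascii", "replace").strip()
    raw = pvd[813:830]  # "YYYYMMDDHHMMSScc" digits + GMT-offset byte
    created = "%s-%s-%s %s:%s:%s" % (
        raw[0:4].decode(), raw[4:6].decode(), raw[6:8].decode(),
        raw[8:10].decode(), raw[10:12].decode(), raw[12:14].decode(),
    )
    return app, created
```

Remember these timestamps come from the burning system's clock, which is why checking for system-time-change events matters.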

4. If there were other CDs burned
If the CD’s have been burned with Windows Explorer, there will be events with id 133 in the eventlog. The registry key described in step 1 will contain entries for staging folders. Examine these forensically to see if there are residues of files there.

Other burning applications also usually have a temp or staging folder for burning CD’s. You can check these folders for residues indicating that files have been burned to a CD.

5. Which user burned the CDs
In most cases, the location of the log or staging files in the users AppData folder will indicate which user created the CD’s.

If not, use the time that the CD was created to check the security event log for audit events that indicate which user was logged on to the system at the time of the creation of the CD’s. To burn a disk, a user usually needs to logon physically to the system, so look for logons of types 2 and 7 prior to burning the disc.
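With the Security log exported and parsed by your tool of choice, filtering for those interactive logons around the burn time is straightforward. The event dictionaries below are an assumed intermediate format, not a real EVTX parser's output; 4624 is the Vista/Win7 successful-logon event:

```python
from datetime import timedelta

INTERACTIVE = {2, 7}  # 2 = console logon, 7 = workstation unlock

def logons_near(events, burn_time, window_hours=4):
    """Return interactive logons (4624) within window_hours before burn_time."""
    start = burn_time - timedelta(hours=window_hours)
    return [
        e for e in events
        if e["event_id"] == 4624
        and e["logon_type"] in INTERACTIVE
        and start <= e["time"] <= burn_time
    ]
```

Network logons (type 3) are deliberately excluded, since burning a disc normally requires a user at the console.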

Make some time for next week's Sunday Funday and you too can win a prize worth researching for!