Saturday, August 31, 2013

Daily Blog #70: Sunday Funday 9/1/13

Hello Reader,
           It's that time again, Sunday Funday time! For those not familiar, every Sunday I throw down the forensic gauntlet by asking a tough question. To the winner go the accolades of their peers and prizes hopefully worth the time they put into their answer. This week I am changing things up and letting the winner pick their choice of prizes!

The Prize:
  • A signed copy of the new book
The Rules:
  1. You must post your answer before Midnight PDT (GMT -7)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
One of the things that's important in any investigation is knowing what's normal: what should be there and what's missing. On a Windows 7 system with an SSD, what forensic artifacts no longer get created or maintained by default?

Friday, August 30, 2013

Daily Blog #69: Saturday Reading 8/31/13

Hello Reader,
        It's Saturday!

1. We had another Forensic Lunch on Friday, watch it here. We talked about legitimate uses for Tor, the CryptoParty movement, our new Plist parser and much more! Give it a watch and tune in live next week and get involved!

2. Over on Mari DeGrazia's 'Another Forensic Blog' there is a new tool for parsing Safari binary cookies, http://az4n6.blogspot.com/2013/08/safari-binary-cookies-now-with-more.html. Mari has written and put up for download a tool that will parse an entire directory of binary cookies at once, as well as decode the Google Analytics values! Well done, Mari!

3. On the Ride the Lightning blog there is an interesting write up on a news article that is making the rounds, http://ridethelightning.senseient.com/2013/08/insurance-company-refuses-to-pay-for-data-breach-will-yours.html. The midwestern grocery chain Schnucks had a data breach back in March and had to disclose the loss of 2.4 million credit card numbers. Like most companies facing the legal and operating expenses of a breach, remediation and notification, Schnucks filed a claim on their insurance for it. The insurance company has denied their claim, and the resulting lawsuit (I'm guessing one is coming) will make for interesting case law on what an insurance company has to cover for breached companies.

4. I enjoy the cyb3rcrim3 blog as it's one of the few legal blogs out there that focuses on the evolving case law of digital evidence. This week an interesting post went up http://cyb3rcrim3.blogspot.com/2013/08/staples-laptop-and-4th-amendment.html describing an individual who was prosecuted for possession of child pornography after bringing his computer to Staples to have a virus removed; the technician found the material and reported it. If you are in a corporate environment and want to understand more about your obligations and the law regarding what to do when child pornography is located on a system in your possession, read on.

5. On the Mandiant M-unition blog there is a good first post in a series on determining proof of program execution https://www.mandiant.com/blog/execute/. I watched Mary and her co-worker at Mandiant give a great presentation at the DFIR summit on the Application Compatibility Cache, aka shimcache, and learned some interesting caveats to shimcache evidence on a per-OS-version basis.

6. Chad Tilbury has posted an hour long video with SANS on an introduction to Windows Memory Analysis. While it's not something you can read, it is worth your time to watch if you are unfamiliar with the topic! https://www.youtube.com/watch?feature=player_embedded&v=SjDH_vTuefM

7. Lastly, two new SANS courses are coming out that I plan to finagle my way into sitting through. The first is FOR 572, which is the new advanced network forensics class. The concept behind FOR 572 is really neat: they took all the network captures from the attacks you discover in FOR 508 and built a class showing how to do the same level of response on the network level. Very cool stuff.

The second class FOR 585 is all about mobile forensics. In the course of my investigations I do mobile phones on a regular basis and I am always looking for new ways to get more data. Heather Mahalik and Cindy Murphy have worked hard to build quite a course and I'm waiting to see what new tricks they have up their sleeves.

That's all for today! Tomorrow is Sunday Funday, I hope you're ready for a challenge!

Daily Blog #68: Forensic Lunch 8/30/13

Hello Reader,
       It's time for another Forensic Lunch! Join us as we talk IR tools with Kyle Maxwell, more extended MAPI fun with James Lohman, and updates on our OSX plist parser and the Triforce.


Thursday, August 29, 2013

Daily Blog #67: Understanding the artifacts DeviceClasses

Hello Reader,
         Tomorrow is the Forensic Lunch, are you going to join us? Click here to RSVP and be notified when the YouTube link is live! Want to be on the video chat? Email me dcowen@g-cpartners.com and I'll get you in the video chat room.

Today we are going to talk about the DeviceClasses registry key. Introduced in Windows Vista, the DeviceClasses subkeys are created when a plug and play device driver is successfully loaded. This is great for us because, unlike USBStor/SBP2Stor and IDE, it is not confined to any one type of connection, which makes it harder to miss connected devices. If you want to read about the PnP subsystem and how it updates the registry, go here: http://msdn.microsoft.com/en-us/library/windows/hardware/ff558808%28v=vs.85%29.aspx

I most regularly use DeviceClasses on Windows 7 systems to determine last plug-in times for external storage media. Understanding what it does, why it creates these keys, why there are so many keys, and what you should expect to find in them will help you when everything else fails.

The DeviceClasses key, found in the System registry under
SYSTEM\CurrentControlSet\Control\DeviceClasses\
contains a number of subkeys; each subkey relates to the type of device being recognized. For instance, the physical disk of an external drive would be in the GUID key: {53F56307-B6BF-11D0-94F2-00A0C91EFB8B}
reference: http://msdn.microsoft.com/en-us/library/windows/hardware/ff545824%28v=vs.85%29.aspx

While the volume that is mounted would be stored under GUID: {53F5630D-B6BF-11D0-94F2-00A0C91EFB8B}
reference: http://msdn.microsoft.com/en-us/library/windows/hardware/ff545990%28v=vs.85%29.aspx

The full list of storage device types and GUIDs can be found here:
http://msdn.microsoft.com/en-us/library/windows/hardware/ff541389%28v=vs.85%29.aspx


The full list of all GUIDs by category of device can be found here:
http://msdn.microsoft.com/en-us/library/windows/hardware/ff553412%28v=vs.85%29.aspx
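As a quick triage aid, the storage-class GUIDs above can be kept in a small lookup table. A minimal Python sketch; the disk and volume GUIDs come straight from the examples in this post, and the CD-ROM entry is added from the MSDN storage class list as one more common class:

```python
# Map common storage device-interface class GUIDs to friendly names for
# labeling DeviceClasses subkeys during triage. Disk and Volume GUIDs
# are the two discussed above; CdRom is added from the MSDN list.
STORAGE_CLASS_GUIDS = {
    '{53f56307-b6bf-11d0-94f2-00a0c91efb8b}': 'GUID_DEVINTERFACE_DISK',
    '{53f5630d-b6bf-11d0-94f2-00a0c91efb8b}': 'GUID_DEVINTERFACE_VOLUME',
    '{53f56308-b6bf-11d0-94f2-00a0c91efb8b}': 'GUID_DEVINTERFACE_CDROM',
}

def label_class(subkey_name):
    """Return a friendly name for the class GUID ending a DeviceClasses
    entry name, or the raw GUID if it's not in the table. Uses the LAST
    brace-delimited GUID because volume entries embed two GUIDs."""
    guid = subkey_name[subkey_name.rindex('{'):].lower()
    return STORAGE_CLASS_GUIDS.get(guid, guid)
```

Extend the table from the full MSDN GUID list linked above if you are hunting for a less common device class.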

What's important to understand here is that all PnP devices are recorded under these subkeys. If you have a case where you think a non-standard device may have been utilized (a voice recorder, a media player without direct physical disk access, a web cam, etc.) and a driver was installed at some point, you should look up the GUID that corresponds to that type of device, review the subkeys, and determine what was plugged in.

For example, if you have a USB storage device plugged in, it will add keys under the two GUIDs listed earlier containing information relevant to the disk/volume. The following examples come from the GUID key:
SYSTEM\CurrentControlSet\Control\DeviceClasses\{53F56307-B6BF-11D0-94F2-00A0C91EFB8B}

As an example here is what a Tableau connected via Firewire looks like:
##?#SBP2#Tableau&Forensic_SATA_Bridge&LUN0&REV15#000ecc01000f4063#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
Here is what an eSATA attached SSD looks like:
##?#IDE#DiskPatriot_Torqx_2_32GB_SSD________________S5FAM014#4&35e86db3&0&0.2.0#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
Here is what a F-Response iSCSI disk looks like:
##?#SCSI#Disk&Ven_FRES&Prod_FRES#1&1c121344&0&000000#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
This is what an external non generic USB Disk looks like:
\##?#USBSTOR#Disk&Ven_FUJITSU&Prod_MHT2060AH&Rev_#43527242060A&0#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
This is what an external generic USB Disk looks like:
##?#USBSTOR#Disk&Ven_&Prod_&Rev_PMAP#07032AE8AFBE2481&0#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
This is what a USB attached android phone looks like:
##?#USBSTOR#Disk&Ven_Android&Prod___UMS_Composite&Rev___00#8&d3cc3f0&0&304D1905736FB98E&0#{53f56307-b6bf-11d0-94f2-00a0c91efb8b}
When looking at the volume GUIDs you'll see the full key:
SYSTEM\CurrentControlSet\Control\DeviceClasses\{53F5630D-B6BF-11D0-94F2-00A0C91EFB8B}

A MagicISO mounted image looks like:
##?#SCSI#CdRom&Ven_MagicISO&Prod_Virtual_DVD-ROM&Rev_1.0A#1&2afd7d61&0&0000#{53f5630d-b6bf-11d0-94f2-00a0c91efb8b}
The volume of the generic USB storage device we referenced above looks like:
##?#STORAGE#VOLUME#_??_USBSTOR#DISK&VEN_&PROD_&REV_PMAP#07032AE8AFBE2481&0#{53F56307-B6BF-11D0-94F2-00A0C91EFB8B}#{53f5630d-b6bf-11d0-94f2-00a0c91efb8b}
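The entry names above are '#'-delimited, so they can be pulled apart programmatically. Here is a hypothetical Python sketch written against those samples; the field layout (enumerator, device ID, instance/serial, class GUID) is an assumption drawn from the examples rather than a documented format, and SBP2/IDE entries deviate from the Ven_/Prod_ pattern:

```python
def parse_deviceclass_entry(name):
    """Split a DeviceClasses entry name into its '#'-delimited pieces.
    Layout assumed from the samples above:
    ##?#<enumerator>#<device id>#<instance/serial>#<class GUID>"""
    parts = name.lstrip('\\').split('#')
    enumerator = parts[3]            # e.g. USBSTOR, SBP2, SCSI, IDE
    device_id = parts[4]             # e.g. Disk&Ven_FUJITSU&Prod_MHT2060AH&Rev_
    instance = parts[5]              # serial number or Windows-generated ID
    fields = dict(f.split('_', 1) for f in device_id.split('&') if '_' in f)
    return {
        'enumerator': enumerator,
        'vendor': fields.get('Ven', ''),
        'product': fields.get('Prod', ''),
        # An '&' early in the instance string usually means Windows
        # generated the ID because the device reported no unique serial.
        'serial': instance.split('&')[0],
        'class_guid': parts[6],
    }
```

Running this over the generic USB disk example would pull out the empty vendor/product strings and the 07032AE8AFBE2481 serial, which is the piece you carry forward when matching against LNK files and other artifacts.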
If you haven't taken a look at your own DeviceClasses key, go take a look; there may be more facts you can determine from your own usage of your forensic workstation than you thought.

Many times in our analysis we focus on what's common; USB storage is the most common type of external storage, for instance. There is no guarantee, though, that your custodian will oblige your convenience, so if you are dealing with something nonstandard this is a great place to determine the last plug-in time of your device.

I'll go into key analysis and linkage in the Usage post, so be sure to watch the Forensic Lunch live tomorrow and ask questions!




Wednesday, August 28, 2013

Daily Blog #66: Understanding the artifacts setupapi.log/setupapi.dev.log

Hello Reader,
            Friday is quickly coming up, have you made plans to spend your lunch hour with us? You can eat while we talk and then type your questions so you can be polite and not talk with your mouth full. You can RSVP for the lunch here, https://plus.google.com/u/0/events/ccu3b7246h9sk16jpg79l2co9mo?authkey=CJ3X6u7G6PjlSw, and email me dcowen@g-cpartners.com if you want to be on it!

            Today is a relatively simple post, but I think I need to address it separately to be complete. Today we are going to talk about setupapi.log (XP/2000/2003), aka setupapi.dev.log (Vista/7/8).

Windows XP/2000/2003
Starting with Windows 2000, and continuing with Windows XP and 2003, the underlying installer system (setup) began logging, for debug and troubleshooting purposes, all of the drivers it loaded for devices. The log is called setupapi.log and is located under %systemdrive%\Windows. The underlying system and configuration for this logging is detailed on the following MSDN page:
http://msdn.microsoft.com/en-us/library/windows/hardware/ff550882(v=vs.85).aspx

By default the logging level will be:
0x00000020
Log errors and warnings.


So in this logfile you will capture all of the devices loaded onto the system, with timestamps and which drivers were loaded. This is important for determining:
  • When external devices were plugged in for the first time
  • When a malicious driver was loaded onto a system
  • What drivers were loaded for an unknown device to determine its functionality
  • Proving a device was successfully installed and accessible

If you want to be exact in your interpretation of each logged line refer here:

Windows Vista/7/8
The setup service log was split into two logs in Vista moving forward, both now located in %systemdrive%\windows\inf:
setupAPI.dev.log - Device and driver installations 
setupapi.app.log - Application installations

The MSDN specification for these two logs can be found here:

The device log is similar to the prior version but the application log is new and is of interest. In order to interpret the setupapi.app.log you need to refer to the following device install codes:

I want to do some more research into this log as I've been finding some interesting entries relating to my use of a network scanner. I'll write a new blog post just about this file after we've done some testing.

The same type of data we talked about for the XP-era log can be found within these logs as well. If you have not been including this data in your analysis, make sure to do so! There are several factors about the setupapi logs that are important in your examination:
  • They are created by default and cannot be turned off without a registry change
  • They do not delete themselves, so you should have every device ever plugged in
  • In an OS upgrade they would remain and indicate when the new OS was installed
  • Many system cleaners focus on registry keys and miss the data located here
  • It's the only exact source of first plug-in times
  • If the OS is reinstalled, the log format is carvable
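Pulling those first plug-in times out of the log can be scripted. A rough Python sketch against the Vista/7 setupapi.dev.log layout, assuming each install opens with a '>>>  [Device Install ...]' header followed by a '>>>  Section start <timestamp>' line; verify the exact line formats against your own log before relying on this:

```python
import re

# Assumed Vista/7 setupapi.dev.log section markers
HEADER = re.compile(r'>>>\s+\[Device Install.*?-\s*(?P<dev>.+?)\]')
START = re.compile(r'>>>\s+Section start '
                   r'(?P<ts>\d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2})')

def first_install_times(log_lines):
    """Return {device id: earliest install timestamp} from a
    setupapi.dev.log, keeping only the first section seen per device."""
    firsts = {}
    pending = None
    for line in log_lines:
        m = HEADER.match(line)
        if m:
            pending = m.group('dev')
            continue
        m = START.match(line)
        if m and pending:
            firsts.setdefault(pending, m.group('ts'))  # keep earliest only
            pending = None
    return firsts
```

Feed it the file's lines and you get one earliest timestamp per device instance path, which is exactly the "first plug-in" fact the bullet list above describes.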
Tomorrow I'll see if there are any other artifacts I need to include before we talk about stitching it all together.

Tuesday, August 27, 2013

Daily Blog #65: Understanding the artifacts EMDMgmt

Howdy Reader,
            Another good Sunday Funday has come and gone. I want to make these contests fun and accessible for you and for those vendors who have graciously provided prizes worth your time and effort! Have an idea of how to make Sunday Funday better? Comment here or email me dcowen@g-cpartners.com. Also remember that this Friday we will be doing another Forensic Lunch and we will be showing the first alpha of our Plist parsing tool. You can register for the Forensic Lunch here to be notified when it begins, get any updates, and ask questions! If you want to be on the video chat for the Forensic Lunch and have something to talk about, email me dcowen@g-cpartners.com!

Today we are going back to the understanding series before I get more sidetracked wanting to write about another topic. We've covered six artifacts so far in stitching together what it takes to really show usage, but we are not done yet! Now we need to talk about a registry key first introduced in Windows Vista called EMDMgmt. Harlan has talked about it here: http://windowsir.blogspot.com/2013/04/plugin-emdmgmt.html and earlier in his blog as well. EMDMgmt, or External Memory Device Management, is part of the ReadyBoost service first provided in Vista. Whether ReadyBoost is enabled or not, the EMDMgmt key will be populated with all available external storage devices where it could write ReadyBoost data. To make sure it can uniquely identify a volume, it includes both the device identification and the volume serial number of the attached device.

This is important for us in our investigations because it is the only key outside of MountPoints that lets us link an external device found in the system registry to the volume serial number/volume name stored in LNK files and Jump Lists. To quote the Microsoft TechNet article found here:

ReadyBoost consists of a service implemented in %SystemRoot%\System32\Emdmgmt.dll that runs in a Service Host process, and a volume filter driver, %SystemRoot%\System32\Drivers\Ecache.sys. (Emd is short for External Memory Device, the working name for ReadyBoost during its development.) When you insert a flash device like a USB key into a system, the ReadyBoost service looks at the device to determine its performance characteristics and stores the results of its test in HKEY_LOCAL_MACHINE\Software\Microsoft\Windows NT\Currentversion\Emdmgmt, seen in Figure 1.
Now, many times you will see USB mentioned here, but it's not just USB devices you will find in this key. I find eSATA, USB, Firewire, and local disks: any non-system-drive storage that is plugged in will be stored here with the device identification, volume label and volume serial number. This can be very helpful when you are trying to understand why a device you know was accessed does not appear in USBStor. There are times when either a) the device isn't USB, or b) the driver loads a hybrid driver (CD-ROM/storage) and the drive will appear as a local disk instead.
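Since an EMDMgmt subkey name is just the device identification with the volume label and a decimal volume serial number appended after the class GUID, you can split it apart and convert the serial into the hex form LNK files and Jump Lists record. A hypothetical sketch; the sample key name in the test is made up, and real entries vary:

```python
def parse_emdmgmt_name(name):
    """Split an EMDMgmt subkey name into device path, volume label, and
    volume serial number. The serial is decimal in the key name but hex
    (e.g. 6498-FACF) in LNK files and Jump Lists."""
    guid_end = name.rindex('}') + 1    # class GUID ends the device path
    device_path = name[:guid_end]
    label, _, serial_dec = name[guid_end:].rpartition('_')
    # The decimal serial may be written as a signed value; mask to 32 bits
    vsn = int(serial_dec) & 0xFFFFFFFF if serial_dec else 0
    return {
        'device_path': device_path,
        'volume_label': label,
        'volume_serial': f'{vsn >> 16:04X}-{vsn & 0xFFFF:04X}',
    }
```

The hex volume serial this produces is the value to match against the volume serial numbers your LNK/Jump List parser reports.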

The one problem for analysts is that ReadyBoost is disabled by default on SSD drives, at least on Windows 7 (part of the Windows 7 optimizations for SSDs). This can lead to a lot of false positives of anti-forensics or spoliation from an inexperienced examiner. So you are back to timeline analysis to determine which drives were plugged in at what time if you have an SSD user.

Now, if the system is Vista or 7 (I have not checked 8) and your suspect does not have an SSD, this key is created by default. If it does not exist, check to see if the ReadyBoost service was disabled (some users complain about its performance), but that disabling would have to have occurred before the first external storage device was plugged in. Otherwise you have a good indication of anti-forensics if this key is missing.

EMDMgmt is something I've learned to rely on, and tools like Woanware's USBDeviceForensics and TZWorks' USB Storage Parser rely on it to uniquely match drives. If you haven't looked at it before I would encourage you to do so; it will make your life much easier!

Tomorrow we continue to wrap up the current understanding series!

Monday, August 26, 2013

Daily Blog #64: Sunday Funday 8/25/13 Winner(s)!

Hello Reader,
           Another Sunday Funday has come and gone and this week I decided to change things up again. I gave out two questions and allowed people to either answer one or both of them. I thought this would be a fun way to allow for varying levels of difficulty within the same challenge. I had several good responses and picking a winner was tough.

The Challenge:
Two questions this week! Answer one or both for those over achievers:
Question 1:
Your client has a home computer running Windows 7 and uses Internet Explorer for his web access. He has switched jobs and is working on a competing product. An opposing expert has issued a report stating that your client must have accessed a website containing the new competing product earlier than the internet history shows, because he found the same fragment of a page in the unallocated space of a shadow copy he imaged. He is alleging that this earlier access shows he was working for the competitor before he quit his job.

Why is he wrong?

The Winning Answer:
Sajeev Nair
Unallocated is not monitored by VSS, so what they were seeing was the general unallocated and not associated with VSS. Only volumes are tracked. 

This can be confirmed by 
checking the registry value HKLM\SOFTWARE\Microsoft\WINDOWS NT\CURRENTVERSION\SPP\Clients\ {09F7EDC5-294E-4180-AF6A-FB0E6A0E9513}, this value determines which volume is currently monitored by VSS. 

Additionally, the below registry key determines, which files are not copied  by VSS. 

HKLM\System\CurrentControlSet\Control\Backup Restore\FilesNotToSnapshot
What I liked about this answer is that it included the registry keys to see what is and is not being included within VSS snapshots. The second place answer actually talked about overlays, which was great, but looked at it from a live system perspective rather than the issue at hand. There is more to this, and Joachim Metz's papers are a great place to start if you want to understand the possible misinterpretations you'll likely have to face in the future.

Question 2:
Your suspect has a new work computer running Windows 7 and uses Google Chrome to access the internet. He has switched companies and used to work for your client. You have found Google Chrome history that predates his employment at the new company that reflects accesses to your client's system.

Why is the activity there?

The Winning Answer:
Paul Bobby

You can 'log in' to Google Chrome, so to speak. Browsing history from other devices, such as home computers or previous work computers, that also use Chrome (and log in to synch changes) are now visible on this new work computer.
If you synch Chrome by 'signing in', open a new tab. In the lower right you will see an "other devices" pull down.

What I liked about this answer is that it mentioned not just the Chrome sync service but also how to identify where the other devices are. Knowing how to distinguish which system created the history Chrome shows is important when assigning activity to a system!

Thanks everyone for playing! Winners, please send me a message with the prize you are picking. Tomorrow it's back to either the business of forensics or finishing the understanding the artifacts series.

Saturday, August 24, 2013

Daily Blog #63: Sunday Funday 8/25/13

Hello Reader,
           It's that time again, Sunday Funday time! For those not familiar, every Sunday I throw down the forensic gauntlet by asking a tough question. To the winner go the accolades of their peers and prizes hopefully worth the time they put into their answer. This week I am changing things up and letting the winner pick their choice of prizes!

The Prize:
The Rules:
  1. You must post your answer before Midnight PDT (GMT -7)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Two questions this week! Answer one or both for those over achievers:
Question 1:
Your client has a home computer running Windows 7 and uses Internet Explorer for his web access. He has switched jobs and is working on a competing product. An opposing expert has issued a report stating that your client must have accessed a website containing the new competing product earlier than the internet history shows, because he found the same fragment of a page in the unallocated space of a shadow copy he imaged. He is alleging that this earlier access shows he was working for the competitor before he quit his job.

Why is he wrong?

Question 2:
Your suspect has a new work computer running Windows 7 and uses Google Chrome to access the internet. He has switched companies and used to work for your client. You have found Google Chrome history that predates his employment at the new company that reflects accesses to your client's system.

Why is the activity there?

That does it for this week. I think this challenge should be more accessible to a wider breadth of people. Answer both questions and win both prizes!

Friday, August 23, 2013

Daily Blog #62: Saturday Reading 8/24/13

Hello Reader,
         It's Saturday? How did that happen? Time for another Saturday Reading. I've been trying to expand my reading list to find more DFIR related blogs and websites to bring you good content every Saturday. What I've noticed is that most of the old blogs I was following have fallen silent. If you know of a good current DFIR blog, let me know in the comments; I'm always looking to learn. Let's get to today's reading.

1. Over on the e-discovery team blog they've published an easy to read version of the upcoming proposed FRCP changes. For those not familiar, the FRCP is the Federal Rules of Civil Procedure, which lay out the rules for how civil courts operate. The last major change came in 2010, I believe, when communications and drafts between experts and lawyers became work product. If you are a current or aspiring expert witness you should make sure to keep up with the changes, http://e-discoveryteam.com/2013/08/16/proposed-amendments-to-the-rules-the-easy-to-read-e-discovery-only-version/

2. Forensic Focus has been getting some good articles posted lately from a variety of sources. I thought this article, http://www.forensicfocus.com/News/article/sid=2085/, was quite good. It covers recovering data from damaged mobile phones. While chip-off is a well known technique, it's not often that someone writes a publicly accessible primer on how to do it.

3. In cases that will likely show up in your future news, Sharon Nelson has a short write up on the upcoming legal battles expected now that Bitcoin will officially be regulated. http://ridethelightning.senseient.com/2013/08/court-rules-that-sec-has-authority-to-regulate-bitcoin-investments.html

4. When we were doing some Mac research I got linked to this blog post by Mari DeGrazia and her research into MS Office plists, http://az4n6.blogspot.com/2013/08/ms-office-recent-docs-plist-parser.html. It's a great write up showing how she broke down the plists and found timestamps and other embedded data. I extended her concepts into some other plists and found more timestamps there as well.

5. I thought this was interesting and not something I had heard of before. Jason Hale has an interesting write up on extracting not which user last saved an Excel spreadsheet but which user last opened it! http://dfstream.blogspot.com/2013/07/ms-excel-and-biff-metadata-last-opened.html Very cool stuff.

6. Open source hardware is something that I think is pretty interesting. This project shows you how to create an open source write blocker and drive imager, http://digitalfire.ucd.ie/?page_id=1011. So if you are feeling handy this looks like a fun weekend project.

7. I'm very tempted to go to the Open Source Digital Forensics conference, http://www.basistech.com/about-us/events/open-source-forensics-conference/; it's the one conference I'm considering attending this year that I'm not speaking at!

8. We had another Forensic Lunch! You can watch the latest episode here, http://www.youtube.com/watch?v=kOBW2R4N2HA

Well, that's what I have for you today. Hopefully we will figure out our audio issues with the additional two mics on the Forensic Lunch for next week's show. Tomorrow is Sunday Funday, so make some time for some Windows forensic fun!

Daily Blog #61: Forensic Lunch 8/23/13

Hello Reader,
          It's time for another Forensic Lunch!


Thursday, August 22, 2013

Daily Blog #60: On the business of being in business in the expert business, business

Hello Reader,
         We've been going strong, 60 days of blogs. To those of you who have kept up daily, I salute you. I've been keeping things mainly technical, but I know there is a wider audience of people who are also curious about other aspects of computer forensics. With that in mind, I'm going off course today to talk about the business of computer forensics.

My partner Rafael Gorgal and I started G-C Partners back in 2005; it's been 8 years and we've learned a lot in the process. I've seen other forensicators attempt to start their own independent labs over the years with varying degrees of success, and I thought it would help more of you to know what it takes to start, run, and succeed at a civil digital forensics laboratory.

1. Pre-Requirements for a successful lab 

Before you start a lab you should make sure that you actually are ready to do so. I've seen many a person go out and spend time and money putting out their sign without realizing what it takes to attract the business they need to succeed.

You will need, at a minimum:

a. At least one person who has already successfully testified as an expert witness in digital forensics


This is probably where I've seen the most problems for people trying to start new labs. You really, really need someone (preferably you to keep costs down) who has testified before to get civil expert witness work.

If you have been doing forensics work internally for a company for 15 years that's great, but lawyers only want to put experts on the stand who have testified before. If this person is not you, or can't be you, that's OK. You need to find a partner who has testified to fulfill this requirement while you spend your time getting business, managing the lab and working with clients.

This is where law enforcement has an advantage: most LE investigators will testify far earlier in their careers than their civil counterparts. As such it's more common to see part time or retired law enforcement set up shop to offer services to the civil side.

If you only plan to take on IR work I would still encourage you to find a testifying expert to work with, as it will make the law firms your IR client retains more comfortable with your work product. In the end there may never be a lawsuit against the attacker you detected, but the people whose data was breached may sue, and the IR team may have to testify.

b. The ability to qualify for whatever licensing is required in your state/country


You may have a disagreement with licensing for digital forensics, but for those of you who live in a country/state with such restrictions it's something you have to do. Take the time to research the requirements for your country/state and find out what's needed. For instance, in Texas you have to become a licensed PI company, but the PI manager's qualifications don't provide any provisions for experience as an examiner. Instead they want someone who has either:

i. Been a PI manager for another company for 3 years
ii. Has a degree in criminal justice
iii. Has taken a course to earn a certificate of competency as a PI manager

Point iii didn't exist until a couple of years after the law was changed, so many existing labs had to pay existing PI managers to 'manage' their company until they had the requisite number of years managing a licensed PI company and could take the manager's test. The manager's test, BTW, contains zero questions about forensics.

c. Enough cash reserves to last three months without payment


When you first start out and are working your first cases, you'll be quite excited when you submit your first invoices. You'll be very sad when they don't get paid within 30 days. When you are a new vendor to a large company you will first have to be placed into the accounting system for payment, and then your invoice will sit out the requisite waiting period they established before they will mail a check.

When all your clients are new you should expect that it will take two months to get paid for your first month's work (if you are lucky enough to get work your first month). With that in mind, make sure you have enough cash in the bank for the following:

i. Your expenses
ii. Business insurance
iii. Office expenses, unless you are going to work from home (which is just fine)
iv. Taxes
v. Expenses for travel and hard drives

If at the start of the third month you have no business, or your bills are still being questioned and you don't have enough cash saved to go beyond three months, it's time to go look for another job!

d. Enough prior exposure to local law firms and companies for word of mouth to build


This is important and only really understood by those who get the work already. When a lawyer looks to hire an expert witness they don't just go to Google and search for 'expert witness'. They call other lawyers in their law firm and their lawyer friends and ask who they have used and are happy with.

Word of mouth is what civil forensic labs live and die by. No matter how much money you spend on marketing, ad words, websites, cards, logos, flyers, brochures, or super special deals on your services, it won't matter if someone else has not already used you and liked you.

There is a considerable amount of risk involved for a lawyer picking an expert witness. If you can't produce a good report and provide good testimony, their case will suffer. Worse, if your findings are proven to be incorrect or you are disqualified as an expert, they will look to their clients as if they didn't do their job right, possibly leading to negligence lawsuits against both the lawyer and you.

So understand and respect this requirement; you can't change it. I don't know how IR work is gained, but I will tell you that civil expert witness work comes from the outside law firm, no matter how many good friends you have working within a company who recommend you to in-house legal.

e. Licensed copies of your preferred tools


Please, please, please don't ever use pirated software in your work as a testifying expert. If you are going for a full open source shop that's great, but never use pirated software. If it comes out in testimony that you don't have a license for the software you used to generate your results, it will not reflect well on your testimony.


I think that's enough for this post, I'll be writing more about this topic as the blog moves forward to give me a variety of topics to discuss. I hope my past experience is helpful to you and will lead you to future success. We need more independent labs out there, every case needs at least two experts! 

Wednesday, August 21, 2013

Daily Blog #59: Understanding the artifacts ShellBags

Hello Reader,
           Another day, another blog. They say if you've done something for two weeks it becomes a habit. Well, it's been two months and I will tell you that I know each evening that I should be writing tomorrow's blog, but life (and good TV shows/movies) often gets in the way. So I just got back from lunch and it's time to push through the remaining usage artifacts so we can talk about the combined analysis of them. I think after I'm done with all of these posts I will feel some relief, but also have another list of artifacts I need to go into more technical detail on in the future. Blog posts sometimes just write other blog posts, but mainly your comments are what help drive the direction of my writing. Also please note that if you have not added me to your Google+ circles and made your comment limited, I can't see it.

Let's talk about Shell Bags! Shellbags are one of my favorite Windows artifacts as they reveal so much about what data the custodian was interested in. For a technical primer on shell bags, go here:
http://computer-forensics.sans.org/blog/2011/07/05/shellbags
and
http://windowsir.blogspot.com/2012/08/shellbag-analysis.html

As has been stated, shellbags record a user's preferences for each folder viewed within the GUI Explorer. That is important, as the only ways I know of to view folders without creating a shellbag entry are to:

  • Load a command prompt
  • Utilize a third party file system navigation tool
  • Browse for files inside of an application that does not use the win32 browser call
Otherwise, if a folder is accessed and viewed within the GUI, a shellbag entry is going to be made to record the user's preferences. As a byproduct of storing those preferences (item list type, window size, sorting) it also stores the MAC times of the directory, the full path, the last time of update to the registry key and, in Windows 7, the MFT record number. 
For the most in depth treatise on the shell item format and how its changed between Windows versions read this: https://googledrive.com/host/0B3fBvzttpiiSajVqblZQT3FYZzg/Windows%20Shell%20Item%20format.pdf

This is important. Why, you ask? While full paths are great for static drive letters, without volume serial numbers (as we find in LNK files) we have no way to uniquely match them to removable devices without doing some deep timeline analysis showing what was attached at what times. The addition of the MFT record number (consisting of the entry number and sequence number) allows us to uniquely match the directories and files recorded in the shellbags to the directory/file located on external media.
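For readers who want to work with that MFT record number directly: Windows packs it into a single 64-bit file reference, with the entry number in the low 48 bits and the sequence number in the high 16 bits. Here is a minimal sketch in Python (the helper names are mine, not from any particular tool):

```python
import struct

def decode_mft_reference(ref):
    """Split a 64-bit NTFS file reference into (entry number, sequence number).

    The low 48 bits are the MFT entry number; the high 16 bits are the
    sequence number, which increments each time the entry is reused.
    """
    entry = ref & 0x0000FFFFFFFFFFFF
    sequence = ref >> 48
    return entry, sequence

def decode_mft_reference_bytes(raw):
    """Same split, starting from the 8 raw little-endian bytes as stored on disk."""
    (ref,) = struct.unpack("<Q", raw)
    return decode_mft_reference(ref)
```

For example, `decode_mft_reference(0x0002000000000005)` returns entry 5 with sequence 2; if the sequence number in the shellbag entry no longer matches the live MFT, the original file or directory has been deleted and its entry reused.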

Now I just assumed something of you, reader; I assumed you understand the power of shellbags in getting more information about what was contained on removable devices. The shellbag entries are stored on a per-user basis and are not limited in scope to just the local disks. Whatever removable or network based storage the user views through the GUI Explorer gets recorded. As far as I know, and please leave a comment and correct me if I'm wrong, the shellbags are the only artifact that will reveal the existence of directories accessed without the need of a file being accessed within them. LNK files do get created pointing to directories at times, but not with the breadth and depth that the shellbag entries show you. 

So, shellbags are awesome. You should be checking them. 
This is my favorite tool to check them with:

Don't exclude them from your analysis just because it's not a built-in feature of your tool.

Tomorrow we move onward towards more artifacts and greater understanding!

Tuesday, August 20, 2013

Daily Blog #58: Understanding the artifacts Jump Lists

Hello Reader,
         Another Sunday Funday is behind us and from it I've identified another blog series that needs to be written. We are trucking along through the artifacts needed to better understand usage. We've covered LNK files, the USN Journal, USBStor and User Assist. Today we are going to jump into Jump Lists, which first made their appearance in Windows 7.

If you've never heard of Jump Lists before go here, http://www.forensicswiki.org/wiki/Jump_Lists, this blog post assumes you are familiar with them and seeks to help you better understand them. For instance I won't be explaining the difference between automatic/custom jump lists or where to find them and their structures.

If you want to read the most thorough write up of Jump Lists I've seen to date go here: http://articles.forensicfocus.com/2012/10/30/forensic-analysis-of-windows-7-jump-lists/.

An easy way to think about jump lists, though not technically accurate, is as a chained series of LNK files stored on a per-application basis. The biggest fundamental difference between Jump Lists and the LNK files that analysts know and love is that LNK files were created, stored and maintained for the Explorer shell (with the one exception I know of being Microsoft Office). Through program shortcuts, recent documents, Office application documents, etc... there is a shared set of LNK files maintained. Jump Lists do not entirely replace this functionality but rather extend it, moving the tracking of recently used documents from the registry to individual jump lists kept on a per-application basis through automatic destination files.

In short, if you are analyzing a Windows 7 system and you are not parsing/analyzing the jump lists then you are missing evidence. Many up to date forensic suites are not parsing jump list data structures yet and instead will carve LNK files from custom destination lists. Get a tool that handles them correctly:

TZWorks: http://tzworks.net/prototype_page.php?proto_id=20
Woanware: http://www.woanware.co.uk/?p=265

This is good news for us, as it means that which documents are being accessed through an application is no longer maintained only in the registry through MRUs, and we get much more data to analyze on a per-file basis. Some applications, notably Microsoft Office, emulated this functionality through LNK files in prior versions of Windows, but Jump Lists extend it through automatic destinations. MRU keys only keep the date of the last file accessed for that key and the order of last access, while automatic Jump Lists record the same type of data a LNK file does and extend it.

One of the difficulties investigators have had with jump lists is matching the AppID that makes up the jump list's file name back to the application it was tracking. Luckily for us, Hexacorn seems to have solved this issue and made a Perl script for our use (Yay Perl!):
http://www.hexacorn.com/blog/2013/04/30/jumplists-file-names-and-appid-calculator/ which will allow you to generate the AppID for any given string.
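The same idea can be sketched in Python. Note that the specifics below, the polynomial, the all-ones initial value, and hashing the lowercased UTF-16LE path, are taken from public write-ups of this research and should be treated as assumptions; verify output against Hexacorn's calculator before relying on it:

```python
# Polynomial reported in public jump-list AppID write-ups (assumption).
POLY = 0x92C64265D32139A4

def crc64(data, poly=POLY):
    """Plain bitwise CRC-64: all-ones init, no reflection, no final XOR (assumptions)."""
    crc = 0xFFFFFFFFFFFFFFFF
    for byte in data:
        crc ^= byte << 56
        for _ in range(8):
            if crc & (1 << 63):
                crc = ((crc << 1) ^ poly) & 0xFFFFFFFFFFFFFFFF
            else:
                crc = (crc << 1) & 0xFFFFFFFFFFFFFFFF
    return crc

def appid(path):
    """Return a 16-hex-digit AppID for an application path.

    Lowercasing and UTF-16LE encoding before hashing are assumptions
    taken from public write-ups, not verified against Windows itself.
    """
    return format(crc64(path.lower().encode("utf-16-le")), "016x")
```

The point is the shape of the technique: a fixed checksum over a normalized path means you can brute-force candidate application paths and match them against the jump list file names found on disk.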

So in short, and I probably will want to revisit this blog post, Jump Lists extend the analysis you previously did with LNK files; like LNK files they record recent document access on a per-user basis, but they are further broken out on a per-application basis. One thing mentioned across all of the major sources is that jump lists are not deleted when a program is uninstalled, and they would not be deleted by any system cleaner that is not 'Windows 7 aware'. So if you are not currently taking them into account in your investigations you should change that today.

Monday, August 19, 2013

Daily Blog #57: Sunday Funday 8/18/13 Winner!

Hello Reader,
        Another Sunday Funday behind us and another winning answer given. This week we did a Linux challenge and, from the lack of responses despite high readership, I would say that this is a weak point for most of you. I have noted this and will devote some future blog posts to Linux forensics. This week's winner is Tony Micah Lambert, congratulations Tony! By having the confidence to give the contest a try you have won! Here was this week's challenge:

The Challenge:

The suspect is believed to have taken source code from his past employer and made use of it in the development of a new product. For an Ubuntu Linux system (any modern version, 11 forward) where the user is using Gnome and CVS, answer the following:

1. Where would you look to see what devices had been connected
2. Where would you look to see what files/directories had been accessed
3. Where would you look for user activity related to source code development

Here is Tony's answer:
You can view what devices have been recently connected by consulting the syslog at /var/log/syslog. Depending on how much time has passed, you may have to look for /var/log/syslog.x or syslog.x.gz where x = a sequential number. This log will have enough unique information about a device for it to be identified.

For files/directories and user activity in CVS, all checkouts, commits, and updates can be checked from the history file located at $CVSROOT/CVSROOT/history (assuming proper configuration). This information can be easily accessed using the CVS "history -u " command to filter results for a specific developer's username.
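Tony's syslog approach can be sketched in a few lines of Python. The patterns below match the kernel's usual USB enumeration messages, but exact wording varies by kernel version, so treat the regexes as a starting point rather than a definitive parser (the sample lines in the test are hypothetical):

```python
import re

# The kernel's USB enumeration messages as they typically appear in
# /var/log/syslog on a stock Ubuntu install (wording varies by kernel).
ID_RE = re.compile(r"usb ([\d.-]+): New USB device found, idVendor=(\w+), idProduct=(\w+)")
SERIAL_RE = re.compile(r"usb ([\d.-]+): SerialNumber: (\S+)")

def usb_events(lines):
    """Pair each enumerated USB port with its vendor/product IDs and serial number."""
    devices = {}
    for line in lines:
        m = ID_RE.search(line)
        if m:
            devices[m.group(1)] = {"vendor": m.group(2), "product": m.group(3)}
        m = SERIAL_RE.search(line)
        if m and m.group(1) in devices:
            devices[m.group(1)]["serial"] = m.group(2)
    return devices
```

Run it over the concatenation of syslog and its rotated predecessors (oldest first) and you get a per-port history of attached devices with the serial numbers needed to match devices across systems.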

 Tony is correct in his answer, but there is much more to find! The clue I left here was Gnome, and in the Linux forensic series I'll focus on the artifacts created by the additions that make Linux user friendly. Until then, well done Tony! Let me know which prize you would prefer.

Saturday, August 17, 2013

Daily Blog #56: Sunday Funday 8/18/13

Hello Reader,
           It's that time again, Sunday Funday time! For those not familiar every Sunday I throw down the forensic gauntlet by asking a tough question. To the winner go the accolades of their peers and prizes hopefully worth the time they put into their answer. This week I am changing things up and letting the winner pick their choice of prizes!

The Prize:
The Rules:
  1. You must post your answer before Midnight PST (GMT -7)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but i will not release it in a blog post

The Challenge:

The suspect is believed to have taken source code from his past employer and made use of it in the development of a new product. For an Ubuntu Linux system (any modern version, 11 forward) where the user is using Gnome and CVS, answer the following:

1. Where would you look to see what devices had been connected
2. Where would you look to see what files/directories had been accessed
3. Where would you look for user activity related to source code development

Good luck! I'm having fun switching up operating systems to expand the challenge to more participants! I'm looking to see who else out there is having fun with some mainly unknown Linux forensic artifacts.

Friday, August 16, 2013

Daily Blog #55: Saturday Reading 8/16/13

Hello Reader,
      Wait, it's Saturday? Where did the week go? It's time for another Saturday Reading, where I list out what I've been reading this week and what tools we've been trying out. Let's get started.

1. We had another Forensic Lunch yesterday, http://www.youtube.com/watch?v=wOHG_pwHyRo. Brian Lockery came on to talk about the Crimes Against Children Conference and his products. We talked about our efforts to get TSK's API to bind to Perl, and Matthew talked about his formal education towards a bachelor's in computer forensics.

2. If you watched the forensic lunch you heard me talk about SWIG, http://swig.org/, which is a pretty neat project. If you want to bind a C/C++ API to your choice of language (C#, Java, Perl, Python, Ruby, etc..) it will auto generate code to wrap the functions and make them available. It takes some work to learn but it does work!

3. I finally got the website for the book done, http://www.learndfir.com, and the links are all up for the new book. Just click on the cover to be taken to it! Next we need to upload the images we made for the analysis chapters so you can solve the cases at home.

4. Are you a Perl monk like I am? If so you should check out Inline::C, http://search.cpan.org/~sisyphus/Inline-0.53/C/C-Cookbook.pod#The_Main_Course, which allows you to embed C and call out to C libraries within Perl. The code gets compiled at run time and then cached, allowing for C speed with Perl execution.

5. For those of you who heard Matthew talking about his college experience getting a degree in computer forensics, here are the programs he is graduating from. He is getting his Bachelor's in Information Assurance and Forensics from OSUIT http://www.osuit.edu/academics/information_technologies/ba_about.html and got his associate's in forensics from Richland https://www1.dcccd.edu/catalog/programs/degree.cfm?degree=digi_forensics_aas&loc=8

6. James Webb has proffered a maturity model for organizations to measure their incident response capabilities against. I thought it was a good write up: http://blog.jameswebb.me/2013/08/modeling-ir-program-maturity.html

7. Over on the SANS blog Ira Victor has a nice writeup on his experience at Blackhat and Defcon, http://computer-forensics.sans.org/blog/2013/08/11/case-leads-a-forensicators-take-on-blackhatdefconbsides. These are traditionally very infosec focused conferences so Ira has found those takeaways that are most relevant to forensics.

8. If you watched last week's Forensic Lunch we talked about extended MAPI parsing in Outlook. David Nides was nice enough to share a free package that parses this data, http://www.dimastr.com/redemption/home.htm, called Outlook Redemption. Check it out!

That's all this week, make sure you come back tomorrow for Sunday Funday! Another challenge and another prize for those that are ready to flex their forensic mental muscles!

Daily Blog #54: Forensic Lunch 8/16/13

Hello Reader!,
    It's Friday and time for another Forensic Lunch! You can watch it live and ask us questions, or watch the recording later.

See you tomorrow for Saturday Reading!

Thursday, August 15, 2013

Daily Blog #53: The TriForce is now Patent Pending

Hello Reader,
             Short one today as I'm trying to get some work done; tomorrow I'll continue our 'understanding the artifacts' series with jump lists.

I'm happy to announce that we've filed a provisional patent on the TriForce and the new types of analysis we can perform with it. I debated for a long time whether or not to patent what was possible, but in the end we came to the conclusion that the following facts were true:

1. If we didn't someone else might
2. I want to make sure that those people who want to extend our research in the free and open source world are protected from patent trolls
3. I want to make sure commercial entities who try to profit from our research feel the need to contribute back to our research for a license
4. I thought the relationships and the new facts they uncover were unique enough to justify a patent

So if in the future you see 'Patent Pending' on a beta update, now you know. Once the patent filing is public I'll link to it so you can see all the use cases we've thought of for the future. I've said before that I think we are starting a new branch of computer forensics related to file system journaling forensics. I believe this is true, and I think it will extend what we do now for post mortem and incident response into more proactive detection of bad events, actors and software.

Let me know how you feel about it in the comments; I want to make sure we are open and honest with the community as we continue forward, both in our research and in the free/commercial products we plan to make.

Wednesday, August 14, 2013

Daily Blog #52: Understanding the artifacts LNK Files

Hello Reader,
               Time to continue the series of understanding the artifacts, building up to a deeper understanding of proving usage. Today we are going to go into a well known artifact, LNK files, and then move through Jump Lists, MRU keys and the other artifacts we use to establish use, and explain how to stitch them together. Along the way we will detail the nuances that can change your opinion or possibly lead to misinterpretation.

LNK files are one of the simplest artifacts, and many, many, many people have written about them. Here are some of my favorite LNK write ups if you are reading this and are not familiar with them:

http://www.forensicswiki.org/wiki/LNK
http://www.forensicfocus.com/link-file-evidentiary-value
http://windowsir.blogspot.com/2013/06/there-are-four-lights-lnk-parsing-tools.html

The funny thing about artifacts as simple as LNK files is that they reveal as much information to the examiner as they care to know. When I do interviews for a position at G-C I ask a series of questions relating to artifacts and what they mean to the examiner. This isn't a trick question, which I explain to the interviewee, but rather a gauge to determine how far down the rabbit hole the examiner has gone. As an example, for LNK files I would ask the following:

'What can you determine from a LNK file?'

I can determine the rough expertise of an examiner by how many of the following points they answer with. I then take this in combination with other artifact questions/scenarios and the depth of their answers to determine their level of forensic experience, rather than focusing on their resume.

Beginner Answer:
A LNK file reveals what files and/or programs a user accessed.

Intermediate Answer:
A LNK file reveals what files and/or programs a user accessed, and the network path and MAC address of where the access took place.

Experienced Answer:
A LNK file reveals what files and/or programs a user accessed and the network path and MAC address of where the access took place. In addition it contains the timestamps captured from the file and/or program being accessed that represent the file at the time the access took place.

Senior Answer:
A LNK file reveals what files and/or programs a user accessed and the full path\network path and MAC address of where the access took place. In addition it contains the timestamps captured from the file and/or program being accessed that represent the file at the time the access took place. It also contains the volume serial number of the device, which you can use to match the LNK back to the volume the file came from if not a network data source. In addition LNK files contain shell items, allowing the examiner to determine the type of folder being accessed (volume/network/file/uri).

Expert Answer:
A LNK file contains two sets of timestamps relevant to the examiner. The first set of MAC times belong to the LNK file itself, it reveals by creation date when the file was first accessed as recorded by this LNK file. The modification time records the last time the LNK file was updated and should reflect the last successful access. The second set of dates is maintained within the LNK file and represents the MAC times of the file being accessed based on the last successful access to the file from the LNK file. In order to determine prior states of the file you can examine the restore points (XP), shadow copies (win 7) and carved LNK files to find all the other versions of this LNK file that also reference this file and volume serial number/shell item uniquely. Each updated set of internal MAC times represents another successful access of the file through the LNK File and should be counted towards usage.

Now if you noticed, I didn't say the Expert Answer had to go into depth on the technical structure of what all can be contained within a LNK file; that isn't as important to me as the ability to properly interpret what the data means in the context of analysis. I assume that anyone who can give me an expert answer already has the technical knowledge of the file format to give additional facts when needed, but I find that people who give just technical information are missing the larger picture of what the data means in their analysis and what they can prove with it.

So with that said, tomorrow we will continue on with usage artifacts. Do you think I missed something, or do you have an even better answer? Leave it in the comments, I'm always interested in additional views on analyzing familiar artifacts! 

Tuesday, August 13, 2013

Daily Blog #51: Understanding the artifacts USNJrnl

Hello Reader,
        I'm going to change tracks this week and focus on a deeper understanding of the USNJrnl and its associated artifacts to prove usage, from our challenge two weeks ago. To prepare for this series I want to take a bit to explain what each of the artifacts we rely on for proof of usage was created for. When we are complete I hope you will look at your cases in a different way.

Today we are going to talk about the USNJrnl. The USN Jrnl, or Update Sequence Number Journal, aka the Change Journal, was first introduced in Windows 2000 but didn't get enabled by default until Windows Vista (that I know of, please leave a comment if you have evidence of other default states/OSes). I have seen it enabled for EFS encrypted drives on XP but I can't say if that's a default setting. The concept of the change journal was simple: many programs need to know when files are changed so they can be backed up, compressed, scanned, replicated, etc...

Prior to the change journal, a developer would have to register hooks or shims into the operating system for all reads and writes to be notified that a file was being created/modified/deleted and to process it. The Change Journal gave developers a central API to monitor that covered all subscribing functions and prevented a lot of unnecessary overhead. You can read more about the basics of the Change Journal here on Wikipedia. The original announcement of it was made in September 1999 and can be found here; it's interesting that it took as long as it did for it to be enabled by default. You can see that it was being marketed to developers as a way to centrally monitor file system changes and improve performance.


The current change journal development documents are here, and if you are relying on change journal evidence in your cases you should be familiar with the use case scenarios, because things are not as black and white as they appear. What do I mean by that? In our testing we've found that a file left open overnight and accessed at different times will create multiple USN open/close/delete events. You cannot rely on the file open and file close times of a file to determine total time of access; they may only be showing you the times of activity against a file that was open the entire time. In addition we've found file close/file delete being used to close a file handle but not to delete the file itself.
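If you want to validate those patterns yourself, the records in $UsnJrnl:$J follow the USN_RECORD_V2 layout documented in the Windows SDK. Here is a minimal sketch of decoding one record (the field layout and reason flag values are from winioctl.h; the helper function itself is mine):

```python
import struct

# A selection of reason flags from the Windows SDK's winioctl.h.
USN_REASONS = {
    0x00000001: "DATA_OVERWRITE",
    0x00000002: "DATA_EXTEND",
    0x00000100: "FILE_CREATE",
    0x00000200: "FILE_DELETE",
    0x00001000: "RENAME_OLD_NAME",
    0x00002000: "RENAME_NEW_NAME",
    0x80000000: "CLOSE",
}

def parse_usn_record_v2(buf, offset=0):
    """Parse one USN_RECORD_V2 as laid out in the Windows SDK.

    Fixed portion: RecordLength, Major/MinorVersion, FileReferenceNumber,
    ParentFileReferenceNumber, Usn, TimeStamp (FILETIME), Reason, SourceInfo,
    SecurityId, FileAttributes, FileNameLength, FileNameOffset (60 bytes total),
    followed by the UTF-16LE file name.
    """
    (rec_len, major, minor, file_ref, parent_ref, usn, timestamp,
     reason, source_info, security_id, attrs, name_len, name_off) = \
        struct.unpack_from("<IHHQQQQIIIIHH", buf, offset)
    name = buf[offset + name_off : offset + name_off + name_len].decode("utf-16-le")
    reasons = [label for bit, label in USN_REASONS.items() if reason & bit]
    return {"length": rec_len, "file_ref": file_ref, "parent_ref": parent_ref,
            "usn": usn, "filetime": timestamp, "reasons": reasons, "name": name}
```

Walking the journal with a parser like this is how you can see the multiple open/close cycles on a single long-open file described above, rather than trusting any one record's timestamps.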

I'm going to go into more detail on how individual Change Journal operations work and get logged as we move forward, so I don't want to get ahead of myself. In summary, remember that the Change Journal keeps track of file system changes for the benefit of subscribing services. If you are unsure of a pattern of records you're seeing, the best thing you can do is build a virtual machine and try to recreate those actions to validate your assumptions. The Change Journal is not as simple as we all thought it to be! Tomorrow I'm going to continue talking about usage artifacts and then go into depth on the Change Journal and the rest of them. 

Monday, August 12, 2013

Daily Blog #50: Sunday Funday 8/11/13 Winner!

Hello Reader,
        Wow, there are a lot of OSX and Time Machine loving DFIR people out there! I received a lot of submissions and they are all very good. I had to read over and compare the submissions but one was a clear standout. Congratulations to Sarah Edwards (@iamevltwin) who brought an answer so well written it had to be in a PDF to include the figures she referenced!

Here was the Challenge:
This week on the Forensic Lunch we have been talking about OSX and Time Machine forensics. So let's have an OSX/Time Machine challenge!

You have been given a timemachine drive that had multiple systems backing up to it over the network. After imaging it you need to determine what has been done, answer the following questions:

1. What are the different types of backups you could find on a timemachine drive
2. How can you distinguish which hosts backup you are looking at
3. How would you extract a single backup for a specific date
4. What is the difference between a timemachine backup and a .mobilebackup

Here is Sarah's winning answer, with a PDF link to read offline here: https://www.dropbox.com/s/lnlz9r6eerxlmhy/SundayFunday_TimeMachine.pdf

So it would appear the bar has been raised this week! Sarah, let me know which prize you prefer, you earned it.

Saturday, August 10, 2013

Daily Blog #49: Sunday Funday 8/11/13

Hello Reader,
           It's that time again, Sunday Funday time! For those not familiar every Sunday I throw down the forensic gauntlet by asking a tough question. To the winner go the accolades of their peers and prizes hopefully worth the time they put into their answer. This week I am changing things up and letting the winner pick their choice of prizes!

The Prize:
  • Winner's choice: a free ticket to PFIC or a year license to AccessData's Triage tool
The Rules:
  1. You must post your answer before Midnight PST (GMT -7)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but i will not release it in a blog post

The Challenge:
This week on the Forensic Lunch we have been talking about OSX and Time Machine forensics. So let's have an OSX/Time Machine challenge!

You have been given a timemachine drive that had multiple systems backing up to it over the network. After imaging it you need to determine what has been done, answer the following questions:

1. What are the different types of backups you could find on a timemachine drive
2. How can you distinguish which hosts backup you are looking at
3. How would you extract a single backup for a specific date
4. What is the difference between a timemachine backup and a .mobilebackup

There, that's not too bad now is it? I look forward to your answers!

Friday, August 9, 2013

Daily Blog #48: Saturday Reading 8/10/13

Hello Reader,
            It's Saturday! Hooray! The week is over and FedEx pickup ends earlier today, meaning you either have extra time in the lab or some time at home. Either way, get some coffee and let's get our forensic reading going.

1. Joachim Metz has updated his volume shadow specification paper, not this week but recently enough that I didn't read it until this week. If you are at all curious about how the Volume Shadow Service data structures are stored, then read this for what I believe to be the most detailed guide outside of whatever internal team at Microsoft developed it. In addition, if you care more about the usage of volume shadow copies in your analysis and the existence of unallocated space in VSCs, you should read the paper he presented, which will answer questions you didn't even know you had.

2. Did you read yesterday's blog? No? Oh well we had another Forensic Lunch with David Nides, Kyle Maxwell, Joseph Shaw and the fine fellows I work with at G-C Partners. Tune in and keep up with what I think was a great hour of forensic discussion.

3. Andrea London has posted the slides for her DefCon talk, http://www.strozfriedberg.com/wp-content/uploads/2013/08/DefCon-2013.pdf, titled 'The Evidence Self Destructing Message Apps Leave Behind'. Her talk covers a wider base of these applications than I've seen covered before, and it's a good read as she and Kyle O'Meara go deep into the file system internals and network traffic exchanged.

4. Lenny Zeltser posted a nice retrospective on how teaching malware analysis has grown, http://blog.zeltser.com/post/57795714681/teaching-malware-analysis-and-the-expanding-corpus-of. It's a nice short read and reinforces the idea that his advice remains the same 10 years later:
  • Too many variables to research without assistance
  • Ask colleagues, search Web sites, mailing lists, virus databases
  • Share your findings via personal Web sites, incidents and malware mailing lists

5. If you are doing USB device forensics and have a Windows 8 system that Woanware's USB Device Forensics application does not support yet, then check out TZWorks' USB Storage Parser. So far it's the only tool I have that takes the multiple Windows 8 USB artifacts and combines them into a single report of activity.

6. Hal Pomeranz put out a new Command Line Kung Fu entry this week, http://blog.commandlinekungfu.com/2013/08/episode-169-move-me-maybe.html, always a good read.

7. On an earlier Forensic Lunch you may have heard Rob Fuller talk about anti-forensic custom hard drive firmware. Going more into that topic, here is a great article about hard drive hacking showing how these firmware changes are researched, implemented and performed. If you are dealing with an advanced subject you might want to be aware of these new possibilities! http://spritesmods.com/?art=hddhack

8. In this week's Forensic Lunch we talked about parsing carved binary plists. For those of you looking to implement your own parsers, or just trying to understand the format better, here are two sources: the OSX code for binary plists, http://opensource.apple.com/source/CF/CF-550/CFBinaryPList.c, and a great write up on plist forensics by CCL, http://www.cclgroupltd.com/images/property%20lists%20in%20digital%20forensics%20new.pdf.

That's all I have for this Saturday Reading. I hope these links are enough to get you through your day. Tomorrow is Sunday Funday and I have yet another challenge waiting for you to solve. This week we will have 'winner's choice', where the winner can pick from a free ticket to PFIC or a year license to AccessData's Triage tool!

Daily Blog #47: Forensic Lunch 8/9/13

Hello Reader,
Going to try something different today and see if I can embed our Forensic Lunch live stream in the blog!

Forensic Lunch is something we are trying to do every Friday, where we talk about updates to research from around the community as well as our challenges and successes here in the G-C lab. If all goes well you can watch the show either live or recorded in the embedded YouTube player below!



Tomorrow is Saturday Reading, and I have some good articles and papers to pass on. Don't forget Sunday for our weekly forensic contest!

Thursday, August 8, 2013

Daily Blog #46: Understanding the Artifacts USBStor

Hello Reader,
               No time to finish my Gmail code review, so I'm going to continue the 'understanding the artifacts' posts to keep things going. I got some good responses yesterday from the prolific Joachim Metz regarding what he's seen in User Assist keys, which I updated the post to include. The more we share our knowledge with each other the better picture we have of what's true and what's possible, so if you see something you feel is missing please let me know and I'll incorporate it!

USBStor

Most of us doing forensics are familiar with the USBStor key; we look to it to identify USB devices plugged into a system and to recover the make, model (unless it's generic) and serial number (as Windows reports it) of each device. USBStor also has at least two sister keys, IDE (for physical disks) and SBP2stor (for FireWire), all of which serve the same purpose. This is one of the first registry artifacts many examiners are made aware of, since knowing which USB external storage devices have been attached is so important to most investigations. Many times I'm asked, as I've stated in the prior post, 'Is the computer logging this to track us? Did the NSA request this feature?'. The answer is, as far as I know, no.
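The make, model and revision mentioned above are packed into the USBSTOR subkey name itself (e.g. `Disk&Ven_SanDisk&Prod_Cruzer&Rev_8.02`, with the Windows-reported serial as the child key beneath it). As a rough sketch of pulling those fields apart, here is a hypothetical helper; the function name and sample key name are mine, not from any particular tool:

```python
def parse_usbstor_key(name):
    """Split a USBSTOR subkey name like
    'Disk&Ven_SanDisk&Prod_Cruzer&Rev_8.02' into its parts."""
    info = {"type": None, "vendor": None, "product": None, "revision": None}
    parts = name.split("&")
    # The first field is the device class (Disk, CdRom, etc.).
    info["type"] = parts[0]
    for part in parts[1:]:
        if part.startswith("Ven_"):
            info["vendor"] = part[4:]      # reported manufacturer
        elif part.startswith("Prod_"):
            info["product"] = part[5:]     # reported product name
        elif part.startswith("Rev_"):
            info["revision"] = part[4:]    # firmware revision
    return info

print(parse_usbstor_key("Disk&Ven_SanDisk&Prod_Cruzer&Rev_8.02"))
```

In practice you would feed this the subkey names enumerated from an exported SYSTEM hive rather than a hard-coded string.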

Instead, the USBStor key and its sisters are part of a convenience mechanism for the user: they associate a known device with its loaded driver! Without these keys, every time you inserted an external device (USB, eSATA or FireWire in this example) the system would have to look up the right driver, check whether it already has it, and then load it. Thanks to this caching of known device-to-driver pairs, the device comes up quickly on each subsequent plug-in.

You might ask, well, why doesn't Windows stop keeping knowledge of a device after so many days? The answer is that it's more inefficient to check and expire registry keys, only to recreate them if the device is plugged in again later, than to simply keep them around, since hard drive space is no longer at a premium.

This understanding can help you explain odd scenarios. For instance, let's say a generic USB device was plugged in (many white-labeled devices do not identify a specific manufacturer) and from its name you cannot determine what kind of device it was: storage, or connectivity of some kind (CD-ROM, phone, MP3 player that does not expose its file system). You can look at the driver that was loaded to determine what functionality Windows made available to the custodian, and how the custodian could have made use of the device on this system.

It's this kind of deeper understanding that will lead to better explanations, testimony and fact finding. I hope you'll look to understand these artifacts more deeply, and let me know in the comments below if you think there is functionality I'm missing!