Daily Blog #99: Sunday Funday 9/29/13 Winner!


Hello Reader,
        Another challenge has found a victor! Congratulations to Steve M, who provided this week's best answer and really went into depth. Next week get ready for a full forensic image challenge, and read what Steve M has to say today and again next year when it appears in print in Hacking Exposed Computer Forensics, 3rd Edition!

The Challenge:
Your suspect has a Windows XP system and you have evidence from the UserAssist records that he ran CCleaner a month ago, but the count shows it has been run multiple times before. Write out what your methodology would be to determine:

  • If system cleaning took place
  • If wiping took place
  • What is now missing

The Winning Answer:
From Steve M
The UserAssist key on the suspect system indicates CCleaner was run one month ago, but the count indicates it has been run more than once.  Here is how I would answer the questions outlined in the contest:

1) If system cleaning took place

System cleaning, by default, will remove files from several locations, including browser-specific Temporary Internet Files, Cookies, and histories.  The date of last execution recorded in the UserAssist registry key can be treated as a checkpoint: we can then check whether the relevant system activity only appears after that date.  For example, if the suspect system only has cookies created after the date CCleaner was last run, that would serve as a strong indicator that system cleaning took place.

To determine the options enabled when CCleaner was last run, I would look at "HKCU\Software\Piriform\CCleaner" for the user who last used the software.  Each option has a registry value that enables/disables the checkbox in the GUI, and would be a good indicator of what options were set the last time the application was used.  It is entirely possible the suspect modified the settings after the last run or enabled/disabled settings without actually performing a clean, so these should not be considered hard evidence.
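Here is a minimal sketch of that registry check using the python-registry library against an exported copy of the suspect's NTUSER.DAT; the hive filename is a placeholder and the option value names vary by CCleaner version.

from Registry import Registry  # python-registry

hive = Registry.Registry("NTUSER.DAT")   # exported copy of the suspect's hive

try:
    key = hive.open("Software\\Piriform\\CCleaner")
except Registry.RegistryKeyNotFoundException:
    print("No CCleaner key in this hive")
else:
    # Options are stored as string values, e.g. "(App)Wipe Free Space" = "False"
    for value in key.values():
        print("%s = %s" % (value.name(), value.value()))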

To determine if the user had run system cleaning based on disk contents, I would look for the presence (and contents) of the following files:

Internet Explorer:
C:\Documents and Settings\<username>\Local Settings\Temporary Internet Files\Content.IE5\index.dat & subdirectories
C:\Documents and Settings\<username>\Cookies
C:\Documents and Settings\<username>\Local Settings\History\History.IE5

Chrome:
C:\Documents and Settings\<username>\Local Settings\Application Data\Google\Chrome\User Data\

Firefox:
C:\Documents and Settings\<username>\Local Settings\Application Data\Mozilla\Firefox\Profiles\<profile>\cookies.sqlite
C:\Documents and Settings\<username>\Local Settings\Application Data\Mozilla\Firefox\Profiles\<profile>\downloads.sqlite
C:\Documents and Settings\<username>\Local Settings\Application Data\Mozilla\Firefox\Profiles\<profile>\places.sqlite
C:\Documents and Settings\<username>\Local Settings\Application Data\Mozilla\Firefox\Profiles\<profile>\search.sqlite

These artifacts can be investigated using commercial products (Internet Evidence Finder, ChromeAnalysis Plus, EnCase) or free tools (Redline, Galleta, Pasco).  If only files created after the CCleaner execution date are present, that would indicate system cleaning took place.  A lack of files or directories would warrant further investigation to determine whether the system is commonly used for web browsing.

Additionally, by default CCleaner will clear certain Windows logs as well.  Specifically, the logs in C:\Windows\system32\wbem\logs can be inspected for their earliest entries.  If the earliest entries all appear after the last execution time of CCleaner, it is likely these logs were cleared by the system cleaning process as well.
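A minimal triage sketch of that timeline check, assuming the image is mounted read-only on a Windows analysis box (where st_ctime reports creation time) and the CCleaner last-run time has already been pulled from UserAssist; the mount path and timestamp below are placeholders.

import os
import datetime

LOG_DIR = r"F:\Windows\system32\wbem\logs"            # mounted image, placeholder path
LAST_RUN = datetime.datetime(2013, 8, 29, 14, 0, 0)   # CCleaner run time from UserAssist

earliest = None
for root, dirs, files in os.walk(LOG_DIR):
    for name in files:
        st = os.stat(os.path.join(root, name))
        created = datetime.datetime.fromtimestamp(st.st_ctime)  # creation time on Windows
        if earliest is None or created < earliest:
            earliest = created

print("Earliest surviving log file creation:", earliest)
if earliest and earliest > LAST_RUN:
    print("All surviving logs post-date the last CCleaner run")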

2) If wiping took place

CCleaner also offers a free-space wiping utility, which identifies all unallocated clusters and fills them with "0"s (nulls).  To determine whether this has been performed, a low-level disk analysis tool can be used (EnCase, dd, etc.).  Specifically, viewing a hex dump of the contents of any unallocated clusters will show them containing null characters instead of miscellaneous residual deleted data.  While some of the wiped clusters may now hold data from activity after the CCleaner run, it is unlikely the majority of them will, given today's large-capacity disks.  Therefore, viewing several unallocated clusters at random should give a good indication of whether they have been zeroed out.
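To put a rough number on that, here is a minimal sketch that samples random sectors of a raw image and counts how many are entirely null. It samples the whole disk rather than true unallocated clusters (which would require walking the filesystem's allocation bitmap), so treat it only as a first-pass indicator; the image filename is a placeholder.

import random

IMAGE = "suspect.dd"   # placeholder raw image name
SECTOR = 512
SAMPLES = 1000

zeroed = 0
with open(IMAGE, "rb") as f:
    f.seek(0, 2)
    size = f.tell()
    for _ in range(SAMPLES):
        f.seek(random.randrange(0, size - SECTOR))
        if f.read(SECTOR) == b"\x00" * SECTOR:   # sector is entirely nulls
            zeroed += 1

print("%d of %d sampled sectors were zero-filled" % (zeroed, SAMPLES))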

Additionally, CCleaner appears to store the settings used upon last execution in "HKCU\Software\Piriform\CCleaner", so the investigator could extract the suspect user's registry hive and search for "(App)Wipe Free Space" to see whether the option is enabled (it is "False" by default, meaning "Wipe Free Space" is disabled).  The investigator could also launch the CCleaner.exe executable on a mirror copy of the drive as the user who last executed it and check whether "Wipe Free Space" is checked (it is not by default).  This would provide good confidence if the data agreed, but by itself it would not be sufficient to say that wiping had been performed, since the action does not need to complete for the preference to persist.

3) What is now missing

Assuming CCleaner was run to perform system cleaning and wipe free space on the sole physical disk drive, important artifacts regarding browsing history, system restore points, and deleted but not overwritten files may now be unrecoverable.  Therefore, it may be much more difficult for an investigator to find the information they are looking for (as per the product's intentions).  However, some of these artifacts may be recovered by examining the system's pagefile (stored in memory during the execution of CCleaner, then flushed to disk), slack space on the disk (not overwritten by "wipe free space"), off-device backups to either a network storage device or external media, and proxy solutions which are typically deployed in an enterprise.  The analysis will be harder, but not necessarily impossible.


Also Read: Daily Blog #98

Daily Blog #98: Sunday Funday 9/29/13 - Detecting CCleaner Challenge


Hello Reader,
          It's Sunday Funday time again! I have some more images but I'm saving them for some bigger prizes, so this week will be another scenario question. I was speaking this week at the Texas Lawyer Technology Summit about spoliation, so I thought that would be a fun topic for this week's Sunday Funday.

The Prize:

  • Your chance to become a contributing author to the 3rd edition of Hacking Exposed Computer Forensics. Yeah, you read that right, we are in the process of updating Hacking Exposed Computer Forensics; win this week and you'll get to update a chapter and be listed as a contributing author in the 3rd edition! How's that for a good prize?

The Rules:
  1. You must post your answer before Monday 9/30/13 2AM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Your suspect has a Windows XP system and you have evidence from the UserAssist records that he ran CCleaner a month ago, but the count shows it has been run multiple times before. Write out what your methodology would be to determine:
  • If system cleaning took place
  • If wiping took place
  • What is now missing
 Good luck future co-author!

Also Read: Daily Blog #97 

Daily Blog #97: Saturday Reading 9/28/13


Hello Reader,
          It's Saturday! It's been a great week in the lab over at G-C, with lots of good research and good cases keeping us going. This week I have another pile of good links for you to read over a hopefully uneventful weekend.

1. We had another Forensic Lunch this week with Harlan Carvey, Zoltan Szabo and Jake Williams joining us in the lab for a great hour of discussion. We covered shell items, the Richland College digital forensics program, the updated FOR610, and more on OSX 10.8's Document Revision functionality to recover wiped files. You can watch it here and remember to make time to watch it live next week so you can ask your questions.

2. Windows 8 is slowly being adopted and, with research surfacing new forensic artifacts, I'm waiting for my next Windows 8 system to come into the lab. This post written by Jared Atkinson of the Invoke IR blog has a fascinating write-up on new data being found in Windows 8 prefetch files; read it here http://www.invoke-ir.com/2013/09/whats-new-in-prefetch-for-windows-8.html

3. On Monday Lenny Zeltser is giving a webcast introducing how to do behavioral analysis of malware https://www.sans.org/webcasts/introduction-behavioral-analysis-malicious-software-97180 I'm planning on tuning in myself.

4. Over on Eric Huber's blog 'A Fistful of Dongles' he has part 2 of his critique of the current state of academia in relation to digital forensics, http://www.ericjhuber.com/2013/09/ever-get-feeling-youve-been-cheated.html, it's a good read and part of why I stay involved with the Richland College program.

5. Mark Spencer over at Arsenal Recon has released his own image mounting tool under a dual license on GitHub, https://github.com/ArsenalRecon/Arsenal-Image-Mounter. Now you may wonder why this is noteworthy when you could be using FTK Imager or another image mounting program. The reason to be excited is that Mark has figured out how to get the mounted image to show up as a physical disk rather than a network mount. This means that tools like vssadmin and others that require a physical disk will finally work right without having to convert the image to a VHD!

6. Following up on our Forensic Lunch talk with Harlan Carvey about shell items, check out this write-up on Harlan's blog http://windowsir.blogspot.com/2013/09/artifacts.html where he goes into further detail on the format and why MFT reference numbers appear there now.

7. Over on Corey's Journey Into Incident Response blog he has a great write up on triaging malware incidents, http://journeyintoir.blogspot.com/2013/09/triaging-malware-incidents.html, a great read if you are trying to get your process together and want to learn from Corey who clearly has done the work.

8. On the SketchyMoose blog there is a good writeup, http://sketchymoose.blogspot.com/2013/09/total-recall-script-released.html, on a script for parsing memory dumps for known items of interest. The script is extensible so you can change it to fit your needs as well.

That's all for this Saturday, it was a good week in DFIR! Tomorrow is Sunday Funday so get ready!

Also Read: Daily Blog #96

Daily Blog #96: Forensic Lunch 9/27/13 - Discussion with Jake Williams, Harlan Carvey, and Zoltan Szabo


Hello Reader,
          Today we had another great Forensic Lunch! On today's show we had:

Jake Williams - Talking about FOR610 for SANS and the addition of a whole sixth day for a NetWars-style malware reverse engineering competition, fun stuff!

Harlan Carvey - Talking about Windows shell items and how they are embedded, parsed and understood. Windows shell items have greatly expanded in Windows Vista/7 and 8 in their usage and the information they now contain.

Zoltan Szabo - Talking about his digital forensics program at Richland College, where they offer an associate's degree in digital forensics.

Matt and I talking about the 9/22/13 Sunday Funday and showing how to recover the contents of a previously wiped OSX file.

Hope you enjoy and try to tune in next week to see it live!
Come back tomorrow for another Saturday Reading!


Also Read: Daily Blog #95

Daily Blog #95: Webmail artifacts from Sunday Funday 9/15/13


Hello Reader,
         Tomorrow we have a pretty great Forensic Lunch coming together with Harlan Carvey, Zoltan Szabo and us in the lab talking forensics; you can RSVP here for it. Today we are going to look at what remnants are left from the uploading and attachment of files from the Sunday Funday 9/15/13 image. We will finish this how-to series next week with CD burning detection.

If you haven't already done so, you can download the forensic image that we have been working off of.

 This bit of analysis was probably the most surprising to me because of how little I found.  I can normally fire IEF at an image and recover messages viewed and some attachments, but in this case our suspect image gave up very little, and within the image we can't find any direct reference to the attachment of files.

Instead, our best evidence of uploading via Gmail comes from the Internet Explorer LastActive session cache. Located in the path \users\suspect\appdata\local\microsoft\internet explorer\recovery\last active is a file named {6140DFAE-14EE-11E3-B113-080027F68913}.dat that is not a valid index.dat-formatted file. This file is a 'Travel Log' and contains a number of appended OLE streams recording which sites were visited so a crashed session can be restored in the future.
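For those who want to poke at the Travel Log themselves, here is a minimal sketch using the olefile Python library to walk the OLE streams and do a crude UTF-16 strings pass over them; it is not a proper Travel Log parser, and the file name below is the one from this image.

import re
import olefile

path = "{6140DFAE-14EE-11E3-B113-080027F68913}.dat"   # exported Travel Log file

ole = olefile.OleFileIO(path)
for stream in ole.listdir():
    data = ole.openstream(stream).read()
    # Crude strings pass: decode as UTF-16LE and pull anything that looks like a URL
    text = data.decode("utf-16-le", errors="ignore")
    for url in re.findall(r"https?://\S+", text):
        print("/".join(stream), url)
ole.close()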


Contained within these streams is evidence of open tabs that show emails being composed as well as the name of the Gmail account being accessed:
https://mail.google.com/mail/u/0/?shva=1 #compose\TCompose Mail - ntglty512@gmail.com - Gmail
 There are multiple compose tabs recorded, but other than the JavaScript necessary to attach a file, there are no artifacts recording a successful attachment.

In this case our suspect composed a new message on Gmail, attached a file and sent it, but never viewed the message he received.  What we can see is the inbox counter of unread messages increase in the window title after our suspect sent himself a message:

https://mail.google.com/mail/u/0/?shva=1 #inbox\NInbox (2) - ntglty512@gmail.com - Gmail
 But no proof of what they sent.

We will be experimenting in the lab to determine what conditions need to occur for the inbox and attachment confirmations to be swapped to the pagefile, as well as what images are requested when an attachment is uploaded, in the hopes of firming up this analysis. Until then, those are the facts as we see them!

Did you find something else? Let me know in the comments and let's all learn from each other!

Daily Blog #94: Determining what was accessed from USB on Sunday Funday 9/15/13


Hello Reader,
          Friday's Forensic Lunch is looking pretty good so far; our first confirmed guest this week is Harlan Carvey. Click the link above to RSVP and receive reminders and notification of when the stream begins at noon CST (GMT -5) so you can watch live and ask questions.

Within this post I make heavy use of FTK Imager and TZWorks tools, mainly because for me it's a fast and convenient way to triage a forensic image to determine what occurred. You can use any other tool and come to the same results I did. If you find additional artifacts of interest please leave a comment, as I'm only covering those I think most relevant to the challenge.

Let's continue our analysis of the Sunday Funday 9/15/13 forensic image. Today let's talk about what we can determine was copied to the external drives found in the forensic image. If you read the answer key from last week, you know that three files were copied and pasted to one external drive. Of those files only one was accessed. Knowing this, I exported the following user data to find evidence of files on the external device:

1. Lnk Files
Remember, in Windows 7 the default location for a user's LNK files is no longer %user home%\recent; it's now \users\<username>\appdata\roaming\microsoft\windows\recent. In the case of our suspect image it's located under \users\suspect\appdata\roaming\microsoft\windows\recent, as seen below:
[Screenshot: the suspect's recent folder]

There are a lot of LNK parsers out there; I used the TZWorks LNK file parser, which reveals the following file accessed from the E: drive with volume serial number ba95-34d6:

{CLSID_MyComputer}\E:\Acme 2013 Budget.rtf                  

With a creation date of 8/31/13 00:53:48 UTC which reflects the time the file was copied onto the external media.


2. Jump lists
Next I exported out the two directories that contain the jump lists for this user AutomaticDestinations and CustomDestinations, which are also located under \users\suspect\appdata\roaming\microsoft\windows\recent.

I then parsed the contents of both directories with the TZWorks jump list parser. 

In my review of that data I see:


{CLSID_MyComputer}\E:\Acme 2013 Budget.rtf                  

This is the same file found in the LNK files above with the same volume serial number and creation date. It was opened with AppID 2b88af31b31e51e0 which belongs to the Microsoft Word Viewer installed on the suspect system.
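If you want to look under the hood of that jump list, the *.automaticDestinations-ms files (the file name prefix is the AppID) are OLE compound files: one numbered stream per entry, each an embedded LNK, plus a DestList stream holding the MRU metadata. A minimal sketch to list the streams with the olefile library, assuming the AutomaticDestinations folder has been exported to the working directory:

import glob
import olefile

# Exported copy of ...\recent\AutomaticDestinations (placeholder path)
for path in glob.glob(r"AutomaticDestinations\*.automaticDestinations-ms"):
    ole = olefile.OleFileIO(path)
    streams = ["/".join(s) for s in ole.listdir()]
    print(path, "->", streams)   # numbered LNK streams plus "DestList"
    ole.close()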

3. Index.dat
When files are opened within Explorer you'll often find an index.dat entry associated with them, and in this case we find that here as well. I exported the index.dat from \users\suspect\appdata\local\microsoft\windows\history\history.IE5

I then parsed the output with the TZWorks index.dat parser to find the same file again:
Visited: Suspect@file:///E:/Acme%202013%20Budget.rtf

at the same time as seen in the prior two sources.
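If you just need a quick look without a full parser, a crude strings-style pull of the "Visited:" records from the exported index.dat looks like the sketch below; a real parser (Pasco, the TZWorks tool, etc.) also decodes the FILETIME stamps inside each URL record.

import re

with open("index.dat", "rb") as f:     # exported copy of the history index.dat
    data = f.read()

# Pull printable "Visited:" entries straight out of the binary
for match in re.finditer(rb"Visited: [\x20-\x7e]+", data):
    print(match.group().decode("ascii"))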

4. Shell bags 
Next I exported the USRCLASS.DAT from \users\suspect\appdata\local\microsoft\windows

and parsed it with the TZWorks shellbag parser, finding access to an E:\ drive but no underlying directories, as none were accessed.

Conclusions

So we can conclude that at least one file was copied to and accessed from an external drive on 8/31/13, but we cannot determine the total scope of files copied. We know two additional files were copied onto the drive, so why can't we see them? The answer is that the system does not record which files are copied and pasted, even in a data transfer between drives. Instead, the artifacts only record which data was accessed from the external drive, not its total contents.

In many cases our suspect is nice enough to copy large numbers of files and directories, after which they access them to make sure they have what they need. This makes for good evidence to prove what was on the drive, but it still does not fully document what was contained on the external drive. It's at this point that I normally inform counsel of my findings and we begin the legal process to demand the return of the drive to determine the extent of the data copied.

Tomorrow let's do web analysis of this image leading up to another Friday Forensic Lunch!

Daily Blog #93: FileZilla Artifacts


Hello Reader,
         Continuing from last week's blogs on how to solve the 9/15/13 Sunday Funday, we've covered how to quickly determine which USB devices were connected; today let's look at another method that was used to transfer data: FileZilla.

This is an interesting piece of analysis for me, as it's fun to see what an application you use on a regular basis leaves behind. In the case of FileZilla, our source of evidence is the XML configuration files that FileZilla leaves behind in the %user%\appdata\roaming\filezilla directory. For our contest image that was \users\suspect\appdata\roaming\filezilla:

[Screenshot: FileZilla configuration files]

The first file we want to look at is filezilla.xml, which retains user preferences:

[Screenshot: filezilla.xml]

You can see that contained within this XML is the last local directory the user viewed with the FTP client. Why is this important? When uploading files the user has to pick which local directory to upload them from; in this case that's \Users\Suspect\Contacts.

Next we want to know what server he was connecting to; for that we look at recentservers.xml, also located in the same directory.

[Screenshot: recentservers.xml]

We can see that our suspect connected to the IP 192.168.1.229 and authenticated as the user NT-GLTY.
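Both files are plain XML, so a quick triage script is easy. Here is a minimal sketch with Python's standard ElementTree, assuming the two files have been exported from the suspect profile; the Server/Host/User element names are from FileZilla 3.x and should be verified against the copies in your image.

import xml.etree.ElementTree as ET

# Recent connections: Host and User per <Server> entry
for server in ET.parse("recentservers.xml").iter("Server"):
    print("Server:", server.findtext("Host"), "User:", server.findtext("User"))

# Preferences: dump every <Setting name="...">value</Setting> pair and look
# for the last local directory entry in the output
for setting in ET.parse("filezilla.xml").iter("Setting"):
    print(setting.get("name"), "=", (setting.text or "").strip())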

I've checked the $logfile and $usnjrnl and neither records which files were accessed on 9/3/13 between 9:43pm and 9:47pm when these uploads occurred, and as this is a Win7 system that no longer updates access times when a file is accessed, we get no clues from the MFT. Instead we can determine that FileZilla executed from the presence of the prefetch file located in \windows\prefetch and the temp files seen in the $usnjrnl:$j:

[Screenshot: $usnjrnl:$j entries]

So the conclusion we can reach is that our suspect connected to an FTP server running on 192.168.1.229 as the user NT-GLTY on 9/3/13 between 9:43pm and 9:47pm and accessed the directory C:\Users\Suspect\Contacts. As we see no file creations during this time period in the $usnjrnl:$j, we can assume no files were downloaded. We can also propose that files from the Contacts directory were uploaded, but we won't know that without examining the FTP server logs.

The only remaining artifact to find would be the deleted journal for the queue.sqlite3 database. The database itself contains no deleted records, so only the deleted journal for the SQLite database may contain this data. Matthew and I will take a look at that this week.

Thanks for following along, tomorrow lets look into what we can determine from the Gmail web accesses.

Daily Blog #92: Sunday Funday 9/22/13 Winner!


Hello Reader,
This was a fun challenge; the clue here really was the specific version of OSX I referenced, 10.8, which added a new feature called 'Revert to Last Save' featured here: https://www.apple.com/osx/whats-new/features.html under Auto Save and detailed in the Apple support article here: https://support.apple.com/kb/HT4753. While there have been some interesting security write-ups on this artifact, we haven't found much forensic write-up on this feature.

If you watched the Forensic Lunch on Friday, we even talked about this feature at the end of it. Of the submissions I received, only one person clued into this feature, and that was Dave Pawlak, who submitted a very nice write-up that I'm attaching in this post.

The Challenge:

 Your suspect is running OSX 10.8 Mountain Lion. He has wiped a document from his system that you know was downloaded from his Google Drive, that based on its name in his internet history appears to be related to a business plan, and that ends with an iWork extension. Based on the recent items plists it would appear as though this document was edited on this system, but there are no Time Machine backups.

What can you do to recover any of the contents of this wiped document?

The Winning Answer:
 
Dave Pawlak

@meatball4rensix


Here is the text of his answer:

Solution:
More recent versions of Mac OS X have a document revision directory which can be very significant to investigations.  The reason for its significance is not only that some document details can be recovered, but that snapshots of documents are taken which can later be individually reviewed, allowing the investigator a peek into the user's mindset.

Using Terminal type sudo sh to enter a privileged user shell.  Enter your user password to grant access and follow along.

Change directories to root by typing cd / and pressing return.

To view a listing of hidden files type ls -alF.

We see a hidden directory titled “.DocumentRevisions-V100/”.  We know it’s hidden because of the “.” preceding the directory name.

Change directories by typing cd .DocumentRevisions-V100/.

View the files in the directory by typing ls -alF.

The content we are looking for and hoping to find is buried in the PerUID directory.  Change directories by typing cd PerUID/.  The User ID (UID) here comes from OS X's user ID system: rather than using the 1000 range of identifiers, Mac OS X assigns the first user the number 501.

Note: If more than one user account on the particular Mac OS X installation has edited and saved a document, expect to see more than one UID listed here.  My box currently has only one installed user account, thus only one UID is present.  You can check the users on a Mac OS X system by entering a sudo shell and navigating to /private/var/db/dslocal/nodes/Default/users.

Change directories into the correct UID.  In this example we’ll use 501.  Type cd 501/ then press enter.  Type ls -alF to review the directory contents.

Here we see a number of entries.  By changing the command to pipe the output through sort (ls -alF | sort) we can arrange the listing in some relative order.

Change directories to 13/ by typing cd 13/ and pressing enter.  List the directory contents and find the com.apple.documentVersions/ directory.

Change directories to com.apple.documentVersions/.  List the directory contents and see the file identified with a unique identifier containing a “.pages” file extension, which is an iWork file extension.

Type open C03FD46F-0D9B-4440-81C5-89AD9EF7F70A.pages-tef/

Additionally, using the GUI you can navigate to ~/Library/Mobile Documents/com~apple~Pages/Documents or /iWork Previews

It’s possible the user has his/her iWork documents set to back up to iCloud, in which case these locations may be fruitful.  In this case, a thumbnail from the iCloud interface could be located.  Clicking the file and pressing the spacebar to activate the Quick Look feature reveals the iCloud thumbnail.

You might also try carving for Apple Pages documents if it really came down to it.  Look for the file header “PK”, which is \x50\x4B\x03\x04\x14\x00\x00\x00\x00\x00

The footer changes, but it appears from limited testing that the footer is “index.xmlPK”, which is \x69\x6E\x64\x65\x78\x2E\x78\x6D\x6C\x50\x4B, followed by 14 bytes which are variable in value before the end of the file.
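A minimal carving sketch built on the signatures described above; it reads a raw image into memory (fine for a small test image, chunk it for real evidence) and cuts everything between each header hit and the next footer plus the 14 trailing bytes. The image filename is a placeholder.

HEADER = b"\x50\x4B\x03\x04\x14\x00\x00\x00\x00\x00"
FOOTER = b"\x69\x6E\x64\x65\x78\x2E\x78\x6D\x6C\x50\x4B"   # "index.xmlPK"
TRAILER = 14   # variable bytes that follow the footer

with open("suspect_mac.dd", "rb") as f:   # placeholder raw image name
    data = f.read()

start = 0
count = 0
while True:
    start = data.find(HEADER, start)
    if start == -1:
        break
    end = data.find(FOOTER, start)
    if end == -1:
        break
    count += 1
    with open("carved_%03d.pages" % count, "wb") as out:
        out.write(data[start:end + len(FOOTER) + TRAILER])
    start = end + len(FOOTER)

print("Carved %d candidate documents" % count)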

Great work Dave! There is more here to know involving how to use the SQLite database maintained by 'Versions' to match stored clusters to generational documents, and we will talk about it during this week's Forensic Lunch and in future blog posts!
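Until then, a minimal sketch for exploring that database on an exported copy; /.DocumentRevisions-V100/db-V1/db.sqlite is where it typically lives, but verify the path against your image, and the tables will vary by OS X version.

import sqlite3

# Work on an exported copy, never the live file
con = sqlite3.connect("db.sqlite")
for (table,) in con.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    print(table)
    for row in con.execute("SELECT * FROM %s LIMIT 5" % table):
        print("  ", row)
con.close()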

Daily Blog #91: Sunday Funday 9/22/13 - OSX 10.8 Mountain Lion Challenge



Hello Reader,
           It's that time again, Sunday Funday time! For those not familiar every Sunday I throw down the forensic gauntlet by asking a tough question. To the winner go the accolades of their peers and prizes hopefully worth the time they put into their answer.

The Prize:


The Rules:
  1. You must post your answer before Monday 9/23/13 1PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:


Your suspect is running OSX 10.8 Mountain Lion. He has wiped a document from his system that you know was downloaded from his Google Drive, that based on its name in his internet history appears to be related to a business plan, and that ends with an iWork extension. Based on the recent items plists it would appear as though this document was edited on this system, but there are no Time Machine backups.

What can you do to recover any of the contents of this wiped document?


Daily Blog #90: Saturday Reading 9/21/13


Hello Reader,
       Another week has ended, and for those of us not in the lab this weekend or onsite responding to some rude ruffian ruining an otherwise ideal weekend, it's time to give yourself a coffee break and get some forensic reading done.

1. If it's the first item on my Saturday Reading list it must be this week's Forensic Lunch; you can watch this week's show here http://www.youtube.com/watch?v=e4EjVftQ56o. I think this week's show was pretty great as it involved three guests, Blazer Catzen, Jonathan Tomczak and Suzanne Widup, talking for the majority of the hour instead of me! Topics this week included the Verizon VERIS database project, MFT reference numbers in jump lists, LNK files and shellbags with TZWorks, and HTML5 offline artifacts.

2. If you do forensics on iPhones, this week Linux Sleuthing has a pretty great write-up on how he got a semi-functional device into DFU mode and then back to normal using a couple of different packages, http://linuxsleuthing.blogspot.com/2013/09/iphone-recovering-from-recovery.html.

3. If you are triaging for malware then Corey Harrell's blog post this week is going to help, http://journeyintoir.blogspot.com/2013/09/tr3secure-data-collection-script.html. His updated tr3secure script will grab the most common artifacts, both volatile and non-volatile, from a system to help you get to the facts faster.

4. The last person I saw actively blog about GPS device forensics was Forensics from the Sausage Factory, but it looks like the fork() blog has taken up the mantle; to see the current state of his very thorough testing read here http://forensicsblog.org/research-gps-device-analysis/

5. Curious about what happens in the clean rooms at drive recovery shops? Watch this video by Scott Moulton where he films the whole process https://www.youtube.com/watch?feature=player_embedded&v=g3Dqld3PLNY

6. Hexacorn is always a good read; he really knows his stuff. He's currently running a series on going beyond the normal malware persistence locations, http://www.hexacorn.com/blog/2013/09/19/beyond-good-ol-run-key-part-4/, and it is worth your time to read.

7. Jimmy Weg has put up a nice tutorial on what you need to know when mounting your forensic images as physical disks to get them to boot in VMware 10: http://justaskweg.com/?p=1355.

That's all I have for this week. Did I leave off your blog or article? Let me know! I am always looking for more reading material to learn more and share more. Tomorrow it's going to be another Sunday Funday with our most popular prize back up for grabs, a 4TB external hard drive.

Get Ready!

Also Read: Daily Blog #89

Daily Blog #89: Forensic Lunch 9/20/13 - Discussion with Suzanne Widup, Jonathan Tomczak, and Blazer Catzen


Hello Reader,
           We have another great Forensic Lunch for you, thanks to all of you who watched live with us! I hope you can join us for the next live broadcast so you can get your questions in. This week we had:
  • Suzanne Widup with Verizon DBIR talking about VCDB
  • Jonathan Tomczak with TZWorks talking about new developments in tracking LNK files, jump lists and shellbags with MFT reference numbers back to the files they reference, using GENA and other tools
  • Blazer Catzen with Catzen Forensics talking about more HTML5 offline content research, LinkedIn iOS message recovery, comparing tools for parsing iOS backups, and tool testing
  • Matt and myself talking about HFS+ Journaling, rewriting the current NTFS Journal parser and other topics
 

 Links for this week:
For the VCDB you can get an overview here:
http://public.tableausoftware.com/views/vcdb/Overview
The VCDB Github is located here:
https://github.com/vz-risk/VCDB
And the currently open issues are here:
https://github.com/vz-risk/VCDB/issues?state=open

You can visit Tzworks here:
https://www.tzworks.net/

And get the tools shown today here:
LNK Parser: https://www.tzworks.net/prototype_page.php?proto_id=11
Jump list parser: https://www.tzworks.net/prototype_page.php?proto_id=20
Shellbag parser: https://www.tzworks.net/prototype_page.php?proto_id=14
GENA here: https://www.tzworks.net/prototype_page.php?proto_id=28

I hope you like it, if you want to be on the Forensic Lunch just send me an email dcowen@g-cpartners.com we are always looking for new people to come and share with us.


Also Read: Daily Blog #88

Daily Blog #88: Solving the USB Device connections for Sunday Funday 9/15/13


Hello Reader,
            I thought it would be helpful for many of you who want to get some practice to walk through how to solve the image we used for last week's Sunday Funday. This image is actually based on chapter 13 of the new book 'Infosec Pro Guide to Computer Forensics', but you don't need to buy it to learn how to do forensics.

In this post I'm going to start with how to map out which USB devices were connected to the system. I am going to do this the easier way: exporting the right artifacts from the VHD image with FTK Imager and then loading them into Woanware's USBDeviceForensics.

First export the System and Software registry hives found under \windows\system32\config


The System hive holds the USBStor and DeviceClasses keys, while the Software hive holds the EMDMGMT key, all of which we've talked about in prior posts.
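If you want to sanity-check the tool's output (or just prefer a script), here is a minimal sketch using the python-registry library to walk USBSTOR in the exported SYSTEM hive; the hive filename is a placeholder, and the instance key's last-write time is only one of several timestamps you would normally correlate.

from Registry import Registry  # python-registry

system = Registry.Registry("SYSTEM")   # exported SYSTEM hive (placeholder name)

# Resolve the current control set, then walk the USBSTOR device tree
current = system.open("Select").value("Current").value()
usbstor = system.open("ControlSet%03d\\Enum\\USBSTOR" % current)

for device in usbstor.subkeys():        # e.g. Disk&Ven_...&Prod_...&Rev_...
    for instance in device.subkeys():   # instance name usually ends in the serial number
        print(device.name(), instance.name(), instance.timestamp())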

Next, as this is a Windows 7 system, export the setupapi.dev.log from the \windows\inf directory in order to get the most accurate first plug-in time.


Now we export the NTUSER.DAT from the suspect's home folder, \users\suspect


With all four artifacts exported we can load them up in Woanware's USBDeviceForensics tool


 Once parsed I like to export the data shown as a CSV


 And finally load it into Excel for easy viewing

[Screenshot: USBDeviceForensics output in Excel]

Cut off from this screenshot are all the dates and times from the registry keys parsed by the tool.

There you go, that's my easy go-to way of getting external device information into an easily reviewable format and, with some good color coding, an easy deliverable for those who are requesting the information.

Tomorrow is another Forensic Lunch, and then we will continue showing how to examine this image next week!

Daily Blog #87: Slides and Link from today's Austin HTCIA presentation


Hello Reader,
        We had a good time in Austin today, where they gave us almost three hours to talk journaled filesystem forensics, and boy did we! We went through NTFS, EXT3 and HFS+ with demos of different aspects of NTFS and HFS+ journal forensics. We are also releasing the beta of our HFS+ parser as we continue to expand our research.

Today's slides are here:
https://docs.google.com/file/d/0B_mjsPB8uKOANE1rbG9ySmxkTzg/edit?usp=sharing

The signup link for the ANJP beta (NTFS Parser) is here:
https://docs.google.com/forms/d/1GzOMe-QHtB12ZnI4ZTjLA06DJP6ZScXngO42ZDGIpR0/viewform

The signup link for the HFS+ Journal parser public beta is here:
https://docs.google.com/forms/d/1_Zrf7LfmnklJfJ7CteecdAiAWGdRkNp2ltqqHuYFncQ/viewform

Tomorrow we'll start a walk through of the Sunday Funday image.

Daily Blog #86: Sunday Funday 9/15/13 Answers


Hello Reader,
       Thank you to all of you who attempted our first full forensic image challenge. We are going to be alternating between images and scenarios for Sunday Fundays, and I'll continue trying to tweak the format and deadlines so all of you can have a chance! Today let's give you the answer key to this Sunday Funday, and then we'll go into depth on how to recover this data.


A total of 13 files with the bonus (10 for the challenge, 3 for the bonus); one file, “HowToCatchARoadrunner.bmp”, failed to send (though an attempt was made), so it may show as a false positive, but maybe someone catches that.  Acme.zip contains two files.  The original files were placed in the Recycle Bin, so the copies only made it out with the zip file.  Words in blue are the files for the case.
FTP – FileZilla
  Contacts folder contents:
  - Company A.contact
  - Company B.contact
  - Company C.contact
  - Suspect.contact

Webmail - Gmail
  - Birdseed Facts.rtf
  - How to spear Bird.rtf
  - HowToCatchARoadrunner.bmp (this one failed to send)

USB Devices
  Microcenter 32 GB USB 3.0
  - "Unknown Device" 12FE:5200
  - VendorID: 13FE - ProductID: 5200
  - Revision: 0110
  - Serial No. 0707335DB6A54359

  Verbatim Store 'n' Go - 512MB (data copied to this device)
  - VendorID: 08EC - ProductID: 0008
  - Revision: 0100
  - Serial No. 0AC1F7605250196A
  - Clicked Start, then "Documents"; selected "Acme 2013 Budget", "Acme Employee Bonuses", and "Passwords"
  - Copied the selections to the thumb drive (drive E:)
  - Opened the "Acme 2013 Budget.rtf" document
  - Closed windows
  - Removed device at 2:49 pm

  Netbook Essentials - Flash Media Solutions - Thumb Drive 2GB
  - Shows as USB Disk 2.0
  - VendorID: 13FE - ProductID: 3200
  - Revision: 0110
  - Serial No. 079805001BB401AC

  Netbook Essentials - Flash Media Solutions - Thumb Drive 2GB (with red markings on it)
  - Shows as USB Disk 2.0
  - VendorID: 13FE - ProductID: 3200
  - Revision: 0110
  - Serial No. 0798050023450032

  Patriot Memory USB - 64 GB
  - VendorID: 13FE - ProductID: 5200
  - Revision: 0110
  - Serial No. 0701342394A3C4813

CD Burning (bonus)
  Acme.zip
  - Burned from the Desktop, using the Windows default burning utility.
  - There are two files inside the zip file:
    - Information for Patent.rtf
    - Prototype.bmp

This week we will go through these sections with a focus on the ones that people had the most problems with. I hope you keep up and if you didn't try and want some practice, download the image and see if you can find everything we left behind!

Also Read: Daily Blog #85