
I am going to interrupt the series to just write a small love note to accessdata.

Dear FTK,
I know we've had some tough times together in the past. Me cussing at a crashed index, you not responding to my mouse clicks. There were times I thought we wouldn't last and that I would find someone else who would fulfill my needs. Then I saw the new you (FTK 2.2.1), and when I actually exported the emails from an indexed search into a recreated recursive directory path matching the PST folder structure they came from, I held my breath. When I then saw that actual MSG files were contained in the right folders, my heart skipped a beat. Then when I saw that the attachment was actually in place in the MSG ... I knew everything would work out.

G-C Partners, LLC

Seriously though, for those who didn't immediately get this joke: a lot of the forensic tools available on the market for the last 10 years have had some real gaps in functionality that made our lives torture. One of the most basic missing features was the ability to export an email found while reviewing an image in a forensic tool back to an MSG or PST, instead of just a text or HTML export that wasn't even compliant with the RFC specifications most tools need to convert it. If we didn't have it in MSG or PST, most law firms and ediscovery firms could not process it.

FTK 2.2.1 has fixed that issue, and for this my office will gain many, many hours of productivity back instead of running my very long process to reassemble the data from other tools' outputs.

Back to the series in the next post, thanks for reading.

Howdy Reader,

It's been quite some time since my last blog post, and I apologize. Things have been pretty busy; apparently the recession/depression has really spurred civil crimes, and I also had a very nice vacation. In our last time together we discussed the more detectable methods suspects use to remove data from their systems. I left off the most common and lengthy portion of the post so I could give it the detail and supporting documentation it deserves. In this post we will continue the concept, exploring method 3, with methods 4 and 5 in the next. This series focuses on Microsoft Windows systems, as they are the most popular business systems in use; I will write a Linux- or Mac-specific series at another time.

Method 3 – Copying data to an external drive

  1. How did they take data from the system

    The first step you should take on a Windows system is examining the contents of the registry keys that track storage devices plugged into the system. Located in the system registry file, under the control sets, are at least three keys keeping track of three types of external storage devices:

    1. USB Devices

    USB storage devices have their information stored under the USBSTOR key found under:


    1. Firewire Devices

    Firewire devices that are also storage devices can be found in the system registry under the system control set as well at:


    1. eSATA Devices

    eSATA devices that are plugged into the system can be found in the system registry under the system control set as well at:


    It's important to know that the type of eSATA enclosure (for instance, I was testing with a SimpleTech ProDrive) will not appear in the IDE registry key. The type and serial number of the drive will appear in the registry, but you will have no way to identify from the registry which enclosure the drive was in. Of course you can compare the drives in the enclosure to find the right drive, but if drafting a subpoena you will not be able to specify which enclosure the drive is in.

    1. Responsive to all types of devices:

    There are additional control sets under the system key, some of which are duplicates of currentcontrolset, so make sure to check each one. Each numbered controlset, such as controlset001, is a configuration state of the system that booted successfully at one time. Currentcontrolset points to the numbered controlset that was last booted from successfully.

    An easy way to parse out these registry entries is with RegRipper, which creates a nice text file containing the most useful parts of the registry for the forensic examiner. Its latest version does not include the sbp2 key, but I'm sure it will be added soon.

    There is one registry entry under these keys for each storage device that has been attached to the computer since it was first installed. If these keys do not exist or are empty, then someone has run a system cleaner, as each key is only created on the first attachment of a storage device; the exception is IDE, which will exist if there is an IDE drive in the system. Remember, these entries are in the system registry, so they apply to every user who has used the system. This means that on a multi-user system you still have to verify who plugged the device in during the dates and times we find. These entries will cover digital cameras, thumb drives, external hard drives, iPods, cell phones; anything that provides some type of storage and is accessed as a drive letter. Each entry will contain the parent ID, the vendor ID, and what is marked as the serial number of the device. The serial number reported to Windows is not always the serial number printed on the physical device, and this varies by manufacturer, so when requesting these devices in a subpoena or other form make sure to specify it 'as reported to Microsoft Windows'.

    The last written date of the registry key for each device entry tells you the last time the device was plugged into the system. We can determine the first time the device was plugged in by taking the device name we found in the USBSTOR/SBP2/IDE keys and searching for it in the setupapi.log file, found in the 'windows' directory in Windows XP, and in setupapi.dev.log, located under 'windows\inf' in Vista.
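That setupapi.log search is easy to script. Here is a minimal sketch; the log excerpt below is fabricated in the general style of an XP setupapi.log (real layouts vary by OS and service pack, so treat the timestamp format as an assumption):

```python
import re

# Made-up excerpt in the style of a Windows XP setupapi.log
SAMPLE_LOG = """\
[2009/03/22 21:25:59 1040.3 Driver Install]
#-019 Searching for hardware ID(s): usbstor\\disk&ven_sandisk
#I123 Doing full install of "USBSTOR\\DISK&VEN_SANDISK\\0000161511&0"
[2009/03/23 09:02:11 1040.3 Driver Install]
#-019 Searching for hardware ID(s): usb\\vid_05ac&pid_1261
"""

def first_install_time(log_text, device_id):
    """Return the section timestamp of the first install entry mentioning device_id."""
    current_ts = None
    for line in log_text.splitlines():
        # Section headers carry the timestamp; remember the most recent one
        m = re.match(r"\[(\d{4}/\d{2}/\d{2} \d{2}:\d{2}:\d{2})", line)
        if m:
            current_ts = m.group(1)
        elif device_id.lower() in line.lower():
            return current_ts
    return None

print(first_install_time(SAMPLE_LOG, "0000161511"))
```

The device ID you feed it is the serial-number portion pulled from the USBSTOR entry.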

    To find out each additional time the device was plugged into the system we can look at the backed-up copies of the system registry located in the restore points. For Windows XP these are located under the 'system volume information folder\rp'. There is a new version of RegRipper for restore point registry examination, called ripxp, that will run the ripper not only against the current registry file but also against all the previous copies of it in the restore points.
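Collecting those older SYSTEM hives for ripxp can be done with a simple glob. This sketch builds a fake XP-style restore-point tree (the folder names mirror the real "_restore{GUID}\RPnn\snapshot" layout, but everything here is synthetic) and then finds every hive copy:

```python
import glob, os, tempfile

# Fabricate a Windows XP restore-point tree for demonstration
root = tempfile.mkdtemp()
for rp in ("RP12", "RP13"):
    snap = os.path.join(root, "System Volume Information",
                        "_restore{0F6C-4D}", rp, "snapshot")
    os.makedirs(snap)
    open(os.path.join(snap, "_REGISTRY_MACHINE_SYSTEM"), "wb").close()

# Each hit is an older copy of the SYSTEM hive to run RegRipper/ripxp against
hives = sorted(glob.glob(os.path.join(root, "System Volume Information",
                                      "_restore*", "RP*", "snapshot",
                                      "_REGISTRY_MACHINE_SYSTEM")))
for h in hives:
    print(h)
```

On a real mounted image you would point `root` at the partition's mount point instead.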

    Windows Vista restore points are renamed to "system restore" points and utilize the Volume Shadow Copy Service to keep previous versions of files and system files on a separate volume, depending on the configuration and version of Windows Vista. You can use programs such as ShadowExplorer to access these volumes on a live system (or an image running in a VM), where you can browse the point-in-time backup of each partition on the system for the same registries. I have not found a forensic tool to date that can mount these shadow volumes in the way ShadowExplorer can.

  2. What did they take

    If our suspect did not wipe the system clean of this information, we now know all of the external devices they could have copied information to. The extent of what they copied onto these devices is not as well recorded by the system. There are several ways a suspect may attempt to copy data to an external drive.

    1. Backup programs

      There are a variety of backup programs a suspect can use. Some come bundled with the external media and others are built into the operating system. We can determine what backup program the suspect ran using the techniques discussed in part 2. Once you've identified the software used, a quick Google search should reveal what logging, if any, the software leaves behind. For instance, in Carreker Corporation v. Cannon et al (4:06-cv-00175-RAS-DDB) we found the use of Dantz Retrospect, which creates a log file for each backup performed, recording the configuration, directories backed up, files backed up, and total data copied for each backup done with the software.

    2. Copy programs

      Some suspects will choose to use utilities such as robocopy or xxcopy to copy the data to external media. In those cases the techniques discussed in part 2 will help you identify what program they used and when they copied the data.

    3. Standard copy and paste

      If our suspect copied the data to some external media with just a copy and paste or drag and drop there will be no record that I have found to date to reflect it.

      The next thing we have to determine is what they copied on to the external drive. There are two reliable methods generated by Windows automatically that can tell us what files and/or directories they accessed from the external media.

      1. LNK Files

      Windows shortcut or 'LNK' (pronounced 'link') files have been a standard feature of Windows since Windows 95. LNK files, as most forensic examiners refer to them, are created for a variety of reasons. What is most important for this method is that for files and directories opened in Windows Explorer, a LNK file will be created in the user's recent directory ('\documents and settings\user name\recent' in Windows XP, '\users\user name\recent' in Windows Vista), and if a program such as Microsoft Office is associated with the file, then a second LNK file will be located in the program's own recent directory in the application data directory. In its normal usage, the LNK file allows a user to quickly access the file it points to. We can examine the LNK files and see which of them show that the file or directory they point to existed on an external disk. For more information on LNK files read this or this. For a free utility that will parse these files and others, try Windows File Analyzer (most forensic tools already have this capability, either built in or through a provided script).

      The LNK file tells us many important facts about a file that it points to.

      1. The time the file was first accessed on this computer

        The created date of the LNK file tells you the first time the file was accessed through Windows Explorer. If the modification date differs from the creation date, then you also have the last time the file was opened.

      2. The time the file was first created on the media it resides on

        The LNK file captures the creation, modification, and access dates, as well as the size of the file it points to, within the LNK file structure. This lets us know the creation time of the file, which reflects when the file was first copied onto the media, so we can determine when the data our suspect took was first copied onto it. We will only know this if the suspect accessed the files or directories after copying them to the disk.

      3. The name, type and volume serial number of the media the file resided on.

        Using this we can determine which of the files accessed came from external media and match them up to the devices we identified in this section.
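Those embedded fields sit at fixed offsets in the LNK header, so pulling them out takes only a few lines. A minimal sketch, using the offsets from the published Shell Link binary format; the header bytes below are synthetic, fabricated just to demonstrate the parse:

```python
import struct
from datetime import datetime, timedelta

def filetime_to_dt(ft):
    # FILETIME = 100-nanosecond intervals since 1601-01-01 UTC
    return datetime(1601, 1, 1) + timedelta(microseconds=ft // 10)

def lnk_embedded_info(data):
    """Read the target's MAC times and size from a ShellLinkHeader."""
    created, accessed, written = struct.unpack_from("<3Q", data, 28)
    size = struct.unpack_from("<I", data, 52)[0]
    return (filetime_to_dt(created), filetime_to_dt(accessed),
            filetime_to_dt(written), size)

# Build a minimal synthetic header: size, CLSID, flags, attributes,
# three FILETIMEs, and a 4096-byte target size
delta = datetime(2009, 3, 22) - datetime(1601, 1, 1)
ft = (delta.days * 86400 + delta.seconds) * 10**7
header = struct.pack("<I16sII3QI", 76, b"\x00" * 16, 0, 0, ft, ft, ft, 4096)
print(lnk_embedded_info(header))
```

With a real case you would read the first 60 bytes of each .LNK file in the user's recent directory instead of fabricating a header.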

      1. MRU

        Most Recently Used entries in the registry exist for multiple types of applications and Windows components. They keep track of the last files opened by the user for that application, but they only track the file opened and the date on which it was opened. The only way to determine whether the file was opened from external media is to check whether the drive letter shown was not local to the system. You can easily pull out most MRUs with RegRipper.
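That drive-letter check can be expressed as a tiny helper. The set of local drives is an assumption you would establish per system before running it:

```python
def likely_external(path, local_drives=("C",)):
    """Flag MRU paths whose drive letter is not a known local volume."""
    if len(path) < 2 or path[1] != ":":
        return False                # UNC or relative path: handle separately
    return path[0].upper() not in {d.upper() for d in local_drives}

print(likely_external(r"E:\Client Lists\2009.xls"))   # True
```

Note that drive letters get reused, so a hit here only tells you the path was non-local at the time; matching it to a specific device still requires the volume serial number or the USBSTOR entries.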

At this point we can determine what external devices were in use and what files we can prove were placed on them. The last two methods to discuss are copying data to network locations and uploading data to file hosting websites.

Hello everyone,
Sorry for the lack of new posts. I've been out of the country for work and vacation the last couple of months, but I'm back and ready to finish the series. In good news, while onsite for two months I ran across a number of new OWA log formats that I wrote new parsers for, which I will be posting soon.

Howdy Reader,

In the prior posts in this series we've talked about how to determine if our suspect burned a CD, and what programs he ran before he left. In this post we will discuss the ways our suspect could have taken data out of the system and how we can find out what they took. There are several options available to someone who wants to take data, depending on the environment they are in. The most common are: burn a CD (which we talked about before), send out an email via their company's servers, send out an email via a webmail service, copy the data to an external drive, copy data to another network location, or upload data to a web-based file hosting service. Any combination of these methods can be used, so we have two challenges:

  1. We need to determine what method or methods they used to take data from the system
  2. We need to determine what files they took using these methods

Method 1 – Email via corporate servers

  1. How did they take data from the system

How we are able to recreate activities will depend on the email system in place. Typically a suspect using this method will be emailing files and forwarding messages to their personal account. The sophistication of the suspect is pretty low in this method, but occasionally even these suspects will delete the messages. Every email system has its own particular quirks for the recovery and analysis of messages, enough so that we will have separate posts for each one at a later date. In most cases, if a user has deleted the email messages, we have three sources of recovery:

  1. Recovery of deleted messages from the local system

    When an email is sent from a suspect's system it is typically saved in the local, and possibly the server side, 'sent' box. How we recover the message depends on how much time has passed since the user deleted the message, what other activity has occurred on the system since that time, and whether or not the user purposefully attempted to push the email out of the local email archive.

    i. Messages recovered from the application database structure

    Most email systems have an accompanying client-side application (GroupWise, Outlook, Notes) that stores the emails received in a database-like structure. Typically, when emails are deleted they remain within that structure until they are flushed out (such as by using Outlook's compact and repair function). Until that occurs, most of the major commercially available forensic tools (Paraben's Email Examiner, EnCase Forensic, Forensic Toolkit) can recover these deleted messages if they support the file format. If you are looking for a free option you might try the steps outlined here, but that could become quite burdensome if you have a large number of messages to recover.

    ii. Messages recovered from the unallocated space

    Depending on the email client used, the data will either be in plain text in the unallocated space for easy carving, or it may be stored in some binary format, as is the case with deleted messages from most Outlook PSTs. PSTs support a data encoding format known as Outlook compressible encryption. When this data is pushed out of the PST structure, either over time or through a compact and repair operation, the message will exist in the unallocated space but be encoded with OCE. The only tool I know of currently that can search for OCE-encoded data in the unallocated space is EnCase. However, the temporary files made by Microsoft Word, which has been the default editor for emails inside of Outlook since at least 2003, are recoverable as plain or Unicode text in the unallocated space.

  2. Recovery of deleted messages from the live server

    Most email servers don't flush emails from their database when the user deletes them. If you or your IT contact has administrative access to the email server, you can ask them to recover recently deleted messages from the live server. In Exchange there is what has been called the 'dumpster' functionality, which will retain emails for a definable number of days (a default of 14, I believe). In GroupWise you can recover deleted messages using tools like Network Email Examiner or the salvage utility. In Lotus Notes you can either check whether the 'soft delete' option was set, and for how long it retains messages, or again use commercial tools. There are not many (any?) open source or free tools for dealing with enterprise email server solutions.

    If the email server has flushed out the message, recovering it from the unallocated space becomes more difficult, since I don't know the encoding of the message. If you are dealing with a sendmail/IMail-style server like iPlanet, you can recover the messages in the unallocated space through regex searches for headers.
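Those header searches are straightforward regex work. A minimal sketch; the byte string here is a toy stand-in for a chunk of unallocated space read from an image, with null bytes mimicking the slack around a carved message:

```python
import re

# Toy stand-in for unallocated space; addresses are fabricated
raw = (b"\x00junk\x00"
       b"From: jdoe@corp.example.com\r\n"
       b"To: jdoe.home@webmail.example.com\r\n"
       b"Subject: Q2 client list\r\n"
       b"\x00more junk\x00")

# Match a plain-text RFC-style header block: From / To / Subject in sequence
header_re = re.compile(rb"From: [^\r\n]+\r\nTo: [^\r\n]+\r\nSubject: [^\r\n]+")
hits = header_re.findall(raw)
for h in hits:
    print(h.decode("ascii", "replace"))
```

In practice you would read the image in chunks and also run a UTF-16 variant of the pattern, since Windows writes many of these strings as Unicode.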

  3. Recovery of deleted messages from backup

    When all else fails you can go to backup, usually tape. Restore the email database for the relevant time intervals to hopefully capture the email, assuming it was not purged the same day it was deleted, which is not very typical. You can restore the tape with either the native software (NetBackup, Backup Exec, ARCserve, etc.) or with software that supports your tape format (Ontrack PowerControls, Quest Recovery Manager for Exchange). Then you will need to access the email database either with the native email server in a recovery environment or with a tool that supports reading the database directly (Network Email Examiner, Ontrack PowerControls, Quest Recovery Manager for Exchange). Once you've done this you can see if the emails you are looking for exist. This is a very detailed topic on the variances of backup software and available data for the forensic examiner; I plan to make separate posts about each of the major backup formats and tape examination techniques.

  4. What did they take

    Now that we have the email messages, we can see what was forwarded and attached to them and make a list of those files, email addresses, and subject matters. In my job I have no knowledge of internal matters, so I hand this data over to counsel so they can determine whether what was sent contained relevant data.

Method 2 – Email via webmail

  1. How did they take data from the system

    The first step I would perform is examining the user's internet history. If they did not clear it, you can look through it for the webmail websites they have been visiting. The majority of webmail services are either free or, in the case of a hosted website they own, will be using a free webmail package (like SquirrelMail). We can use these sites to get unique keywords displayed in either the HTML source of the page or in the rendered page itself to identify specific pages of interest.
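Triaging the history for webmail visits is easy to script once another tool has exported the URLs. A sketch, with a deliberately tiny and illustrative host list (you would extend it with the webmail services relevant to your case), and fabricated history entries:

```python
from urllib.parse import urlparse

# Illustrative only; extend with hosts relevant to the case
WEBMAIL_HOSTS = {"mail.google.com", "mail.yahoo.com", "mail.live.com"}

# Fabricated sample of exported internet-history URLs
history = [
    "http://mail.google.com/mail/?ui=2&view=cm",
    "http://intranet.corp.example.com/home",
    "https://mail.yahoo.com/compose",
]

# Distinct webmail hosts the user actually visited
visited_webmail = sorted({urlparse(u).hostname for u in history} & WEBMAIL_HOSTS)
print(visited_webmail)
```

A hit list like this tells you which services' page layouts to study for the unique keywords mentioned above.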

    The nature of how we send and receive data to websites dictates what is and is not recoverable. We can recover pages they have viewed, such as the contents of a mailbox or folder. We can see messages they received and the form used to write an email, but not the email they wrote. So we can look for the most recent views of their inbox for emails they have sent to themselves with files attached. Many people have said that Gmail no longer leaves cached emails for us to recover. This is not in fact true; the move to the AJAX model means we no longer have a separate cached page for every email viewed, and instead we have to look at the virtual memory (pagefile.sys in Windows, the swap partition in Unix-based systems) to find these email remnants.

  2. What did they take

    Most webmail sites make a separate pop-up or page for attaching files and giving notification of successfully attached files. This is good for us, as we can recover each notification page and make a list of what files they sent themselves. In addition, some suspects are nice enough to open each email they forwarded to themselves, just to make sure they got their files.

There are three more methods to detail, but I don't want to wait another day to get this up. To be continued in part 4.

Hello Reader,

        In Part 1 we discussed how to determine if a CD was burned. Knowing what application it was burned with, and what other tools the suspect ran before they left, is also important.

  1. User Assist

One way to determine this is with the user assist registry keys. Over the years since the user assist registry keys were first discovered (they were included in our Windows analysis chapter in 2005), many people have realized the impact they can have on a case. The User Assist functionality has existed since Windows 2000; it is a registry key, divided into two parts, that keeps track of recently used programs and files for the start menu.

The user assist registry key exists in each user's ntuser.dat under the key: HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\UserAssist

There are multiple keys under this, depending on the version of Windows you are examining: two for Windows 2000, XP, and 2003, and three for Windows Vista and Server 2008. Under each you will find a Count key that contains the actual data we are looking for. Entries are encoded in ROT13, and if you are not using one of the tools listed in this blog you will need to decode them yourself to read the entries.
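ROT13 rotates only the letters A-Z, so digits, backslashes, and punctuation pass through unchanged; Python can decode it with the standard codecs module. A quick round-trip demonstration using the Office entry discussed below:

```python
import codecs

# UserAssist value names are ROT13-encoded; non-letters pass through unchanged
plain = r"UEME_RUNPATH:C:\Program Files\Microsoft Office\Office12\WINWORD.EXE"
encoded = codecs.encode(plain, "rot_13")
print(encoded)                                      # HRZR_EHACNGU:P:\Cebtenz Svyrf\...
print(codecs.decode(encoded, "rot_13") == plain)    # True
```

Since ROT13 is its own inverse, the same call both encodes and decodes.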


There are multiple tools that support analysis of the user assist registry keys (AccessData's Registry Viewer and Didier Stevens' UserAssist tool, for instance) that will quickly allow you to see:

  1. What program or file was accessed
  2. How many times the program or file has been accessed through windows explorer
  3. The last time the program or file was accessed through windows explorer

As a simple example, I use Microsoft Office a lot. In fact, I write my blog posts in it, as it can post them directly to Blogger (hopefully catching all my typos). A decoded user assist entry for Office in my registry looks like this:

"{75048700-EF1F-11D0-9888-006097DEACF9}","20","UEME_RUNPATH:C:\Program Files\Microsoft Office\Office12\WINWORD.EXE","","54","37","3/22/2009 9:25:59 PM"

This entry was found in:


Decoding the entry section by section we see:

  1. {75048700-EF1F-11D0-9888-006097DEACF9} - the registry key under user assist that this entry belongs to; data appears to be grouped into categories based on these IDs.
  2. 20 – The index number. This number increments as entries are added to the key; in this case, this is the 20th entry logged. If a program has been executed multiple times, such as my Word 2007 program, sorting by index number will give you an idea of when it was first executed.
  3. UEME_RUNPATH:C:\Program Files\Microsoft Office\Office12\WINWORD.EXE – This is two pieces of info combined into one:
    1. UEME_RUNPATH – This is the prefix for all entries that will give you a full path to the program or file being accessed
    2. C:\Program Files\Microsoft Office\Office12\WINWORD.EXE – this is the full path to the program or file executed
  4. 54 – This is the session, its use is still unknown
  5. 37 – This is the number of times the program has been executed
  6. 3/22/2009 9:25:59 PM – This is the last time the key was updated and should be the last time it was executed

Going through the user assist keys then allows us to find out what programs were being executed around the time that, for instance, a CD was burned. Sorting the entries for that time, we can see what was executed around it. If there is no corresponding entry, you may want to look at the restore points for backups of the ntuser.dat close to the time of the burn to find the program executed.

If the user assist keys are missing, two things could have occurred:

  1. The user disabled them; there will be a registry key created showing this if true.
  2. The user deleted them; this can be an indication of some type of 'cleaning' tool being run, such as Crap Cleaner.

Now, the user assist registry keys are not the only place to look for what programs have been executed. We don't want to rely solely on access times, as they change so easily and don't prove that a program was actually executed. We want to focus on artifacts created because of an execution, and there are two other well-documented sets of artifacts that show the actual execution of a program.

  1. Shortcut/Lnk Files

LNK files, so named because of the '.LNK' extension given to them, are stored in several locations depending on their function. We will discuss LNK files in more detail in the next post, as they are an extreme wealth of information, but for the purposes of this post it suffices to say that we can use a LNK file to determine if a program was executed through it.

The start menu for each user, stored from Windows 2000 on under the user's profile directory ('\documents and settings\<user>\start menu' in XP, '\users\<user>\start menu' in Vista and 2008), contains a LNK file for each of the items listed in the user's start menu when they click the start button. So each time a user loads a program through it, the modified date of the LNK file will change to reflect it. This also applies to any other instance of the LNK file, such as in the quick launch bar or on the desktop.

So for instance, my Office 2007 LNK file in the start menu shows a created time of 11/24/2008, which is when I installed Office 2007 on this computer. The modification date is 3/22/09 9:25pm, which is the last time I used the LNK file to load Office 2007. You can see that the prefetch reference below says 9:26pm; it takes a couple of seconds between the time I clicked the LNK file and when the prefetch file gets created.


  1. Prefetch Files

Stored in the \Windows\Prefetch directory, there is one .pf file for each of a maximum of 128 programs, and the last modified time is updated each time the program is executed. The Forensic Wiki has a nice write-up on prefetch files. There are several tools out there for parsing prefetch files; one that is free is part of the Windows File Analyzer program. If I were to analyze the prefetch file for Office 2007 I would see the following:

File name: WINWORD.EXE-6AC9169C.pf

Last loaded: 3/22/09 and 9:26PM

This is when I started writing this blog post; it's been a couple of days of research, catching up on old topics to see what people have figured out.

So the prefetch file is a third correlation point we can use to determine if and when a program has been executed.
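One note on the file name itself: the portion after the executable name (6AC9169C above) is a hash derived from the path the program ran from, which is why the same binary run from two locations gets two .pf files. Splitting the name into its components is trivial:

```python
def parse_pf_name(name):
    """Split a prefetch file name into (executable, path hash)."""
    stem = name[:-3] if name.lower().endswith(".pf") else name
    # rpartition handles executables that themselves contain hyphens
    exe, _, path_hash = stem.rpartition("-")
    return exe, path_hash

print(parse_pf_name("WINWORD.EXE-6AC9169C.pf"))
```

Sorting a directory listing of \Windows\Prefetch by modified time then gives a quick execution timeline to correlate against the user assist and LNK dates.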

  1. Conclusion

So we now have three separate sources on a typical Windows system that we can use to determine what programs have been executed, when (the first and last times), and how many times. These are not the only places we can look for this information, but they are three of the most reliable due to the nature of their creation and use. If you find that all of this data is missing, then it becomes almost certain that either:

  1. The system is being reimaged each time it reboots/logs in (some public access terminals do this)
  2. A cleaning/wiping tool has been run

I plan to make a post on how to determine what a user has wiped after this series, but if a cleaning tool has not been run, one or all of these sources will allow you to state for a fact what program was executed to:

  1. Run a backup program (such as the ones that are packaged with some external hard drives like retrospect)
  2. Burn a CD
  3. Run an ftp program
  4. Access some kind of archiving or copy tool

Which will then lead to the next question and our next post in the series : Part 3 - Where did it go and what did they take?

Dear Reader,

We've been discussing server-level analysis for the last couple of posts, but there is plenty to talk about on the desktop. This will be a multi-part series discussing different artifacts that we can recover that give us provable facts regarding a user's activity. It is easy to speculate on actions based on circumstantial data such as access dates, related files, or DLLs accessed on a system, but it is always better to rely on a repeatable process that creates a specific artifact each time to explain a user's action.

We only do cases that either lead to civil litigation or are in the process of civil litigation (no criminal work). One of our most common requests is the question: before this employee left, did they take any documents with them? There are several places on a system we check to determine if a user has taken a document from the system in some fashion (CD, USB drive, emailed out, printed, etc.), and in this post we will discuss how to determine if a user has burned a CD. If you are examining a Windows XP or Windows Server 2003 image (I have not been able to test this on Vista or Server 2008 yet), then the System event log will contain event IDs 7036 and 7035 generated by the Service Control Manager, with a description starting with the string "The IMAPI CD-Burning Service". There will be one such set of entries showing the service starting and stopping on each reboot, but any entry not close to a reboot will indicate that a CD is being burned from this system.

An example of a burning entry, yes my machine is named HOSS:


3:04:13 PM  Service Control Manager  The IMAPI CD-Burning COM Service service entered the running state.

3:04:13 PM  Service Control Manager  The IMAPI CD-Burning COM Service service was successfully sent a start control.

3:04:22 PM  Service Control Manager  The IMAPI CD-Burning COM Service service entered the stopped state.

If those three entries are not part of a reboot/startup sequence, then you have found a user burning a CD. These entries do not have to be in an uninterrupted sequence as you see here, but there should be a start and a stop to show a successful burn. This is not just for CDs burned by Windows directly; third-party applications will also call this service when burning a CD. You can estimate the size of the data burned to the disc by taking the number of minutes spent burning (the time between the start and stop of the service) multiplied by the write speed of the CD drive. This also applies to DVDs.
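That size estimate works out as a simple calculation: 1x CD writing moves data at 150 KB/s, so duration times the drive's rated multiplier gives an upper bound. A small sketch using the HOSS timestamps above (the date here is arbitrary, and the drive speed of 48x is assumed for illustration):

```python
from datetime import datetime

def estimate_burn_size_mb(start, stop, speed_x):
    """Upper-bound estimate of data burned: duration * rated write speed.

    1x CD writing is 150 KB/s; speed_x is the drive's rated multiplier
    (e.g. 48 for a 48x burner). This overestimates somewhat because
    lead-in/lead-out and finalization eat into the burn window.
    """
    seconds = (stop - start).total_seconds()
    return seconds * speed_x * 150 / 1024  # KB -> MB

# Service sent a start control at 3:04:13 PM, stopped at 3:04:22 PM:
start = datetime(2009, 3, 22, 15, 4, 13)
stop = datetime(2009, 3, 22, 15, 4, 22)
# At an assumed 48x, that 9-second window bounds the burn at ~63 MB.
```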

I will not discuss how to determine if a CD was accessed in this post as that is material for Part 2 – What was accessed from external drives.

Update: As per the comments below, more activities than just booting and burning will cause these event log entries to show up. I will be doing some more testing to find a better answer.

Hello Reader,

        To the end user, the BlackBerry server is just where their BlackBerry gets its email. But there are multiple methods of communication a BlackBerry is capable of, which can be relayed, logged, and recovered by an informed investigator:

  1. Email
  2. SMS
  3. Blackberry Messenger
  4. PIN Messaging
  5. Phone Call Log

The BlackBerry server can create the following types of logs:

  • ALRT - BES Alert
  • BBIM - BlackBerry Instant Messenger (4.1)
  • BBUA - BlackBerry User Administration Service (BRK)
  • CBCK - Backup Connector
  • CEXC - Exchange PIM Connector
  • CMNG - Management Connector
  • CTRL - BlackBerry Controller
  • DISP - BlackBerry Dispatcher
  • MAGT - BlackBerry Mailbox Agent (aka BlackBerry Messaging Agent)
  • MDAT - Mobile Data Services
  • MDSS - MDS Services (4.1)
  • MDSS-DISCOVERY - MDS Services (4.1)
  • POLC - Policy Service
  • ROUT - Router
  • SYNC - BlackBerry SyncServer
  • PhoneCallLog (4.1)
  • PINLog (4.1)
  • SMSLog (4.1)


(Thanks Wikipedia http://en.wikipedia.org/wiki/BlackBerry_Enterprise_Server)

  1. Email – The BlackBerry server logs will record when a device connects to the server to pull email and when the server delivers mail and other messages. When you are dealing with a time-sensitive question of whether a message was received/sent/deleted from a BlackBerry, these logs may be your best source of evidence if enough time has passed for the message to be deleted from the BlackBerry device itself before imaging. For imaging BlackBerry devices I personally use Paraben's Device Seizure (found here http://www.paraben-forensics.com/catalog/product_info.php?products_id=405) to do the device acquisition.

    The MAGT log, with a name like "<Blackberry server name>_MAGT_01_20090108_0001.txt", is a listing of every action taking place regarding the delivery of messages/calendar items/etc. to every BlackBerry communicating with the server. You will find multiple segments per day. This is the place to look if the timing of the delivery/deletion/forwarding of a message from a BlackBerry is at issue.

  2. SMS – When configured to do so the blackberry server will log into a csv file the following fields:

    "Name.ID","Email Address","Type of Message","To","From","Callback Phone Number","Body","Send/Received Date","Server Log Date","Overall Message Status","Command","UID"

    With a file name such as "SMSLog_20070927.csv" with one log being created per day.

    The file is written out in UTF-16, so be aware of that if you try to parse it out.


  3. Blackberry Messenger – This is a BlackBerry IM program that, according to my current research, will not be logged on the server without creating an account to relay all the messages to. Without prior configuration, the only way to recover these messages is from the device itself.


  4. PIN Messaging – PIN messages are those messages sent between BlackBerries directly, addressed to the PIN assigned to the BlackBerry. By default the BlackBerry server will log the following fields into a CSV:

    "Name.ID","PIN","Email Address","Type of Message","To","Cc","Bcc","From","Subject","Body","Send/Received Date","Server Log Date","Overall Message Status","Command","UID"

    With a file name such as "PINLog_20070927.csv" with one log being created per day.

    The file is written out in UTF-16, so be aware of that if you try to parse it out. I'm writing a parser now to dump them all into a MySQL database, which I will post once I correct a weird multiline message issue that I've found. Special bonus: it's a Perl script that correctly handles UTF-16.


  5. Phone Call Log – This is a log of all of the calls being made from the BlackBerry devices; note this only applies to calls made on BlackBerries connected to this BlackBerry server. It includes missed calls, outgoing calls, and incoming calls, from what I've seen to date. By default the BlackBerry server will log the following fields into a CSV:

    "Name.ID","Type of Call","Name","Phone Number","Start Date","Server Log Date","Elapsed Time","Memo","Command","UID"

    With a file name such as "PhoneCallLog_20070927.csv" with one log being created per day.

    The file is written out in UTF-16, so be aware of that if you try to parse it out.

All of the CSV files will load into Excel directly if you import them; otherwise, if there is a large number of dates in question, I would recommend parsing them into some kind of database so you can pull records by the user's name or PIN.
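Since all three log types share the UTF-16 CSV format, a short Python sketch covers the parsing step. This is a minimal sketch, assuming the logs have been copied off the server; the grouping helper simply indexes rows by the first field (Name.ID) as a stand-in for a real database load:

```python
import csv
from collections import defaultdict

def load_bes_log(path):
    """Read a BES CSV log (SMSLog/PINLog/PhoneCallLog).

    These files are written out in UTF-16, so the encoding must be set
    explicitly or the reader will choke on the byte-order mark.
    """
    with open(path, newline="", encoding="utf-16") as f:
        return list(csv.reader(f))

def group_by_name_id(rows):
    """Index rows by the first field (Name.ID) for per-user lookup."""
    grouped = defaultdict(list)
    for row in rows:
        if row:
            grouped[row[0]].append(row)
    return grouped
```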

Depending on the configuration of the BlackBerry server at the time in question, or the changes you make to a server now in preparation (if you are internal), a large amount of responsive data that the user may not believe exists will be available to you. Don't expect your BlackBerry admin to be aware that this data exists, but make sure to ask for a copy of the log directory regardless.

Dear Reader,

    Today we will not discuss OWA again. Rather, we will discuss a peculiar case of a temporary file that led into a journey of discovery into Microsoft internals.

I was working a case, Lockheed Martin v. L-3, et al (6:05-cv-1580-Orl-31KRS), which has since settled, that involved among other things several files contained on a CDROM and accessed on a laptop. On this CDROM were lots of files, and one of the issues in the case revolved around which, if any, of those files had been accessed on the laptop, showing which information may have been exposed and/or transferred to the rest of the company.

So like a good computer forensic investigator, I reviewed all of the recently used registry entries, the LNK files, and the UserAssist records for any of the files known to have come from that CD. One of the files in particular had an extension of 'shs'. 'shs' files are scrap files made when a user copies and pastes items such as PowerPoint slides; in this case it was a PowerPoint slide. I found entries showing that this specific shs file, which when loaded into PowerPoint is a single slide, was accessed on three occasions. At times corresponding to these accesses I found a temporary file on the desktop that contained keywords relevant to the case and appeared by content to be a PowerPoint document, but no matter what tool I used, it would not open. All of my file signature tools regarded the file as 'data' with no specific file type.

The opposing investigator had the system this CD was burned from, and thus had one significant advantage over me: he knew that the temporary file was related to the scrap file contained on the CD. Sure enough, when I renamed this temporary file, which no tool recognized as anything, to an extension of 'shs', it opened right away in PowerPoint, revealing the same slide as contained in the shs file on the CDROM. This left the question: how did this file get created on the desktop?

I keep reiterating the CDROM for a reason: normally, temporary files for Office documents are created in the same directory as the file you are working with. When you are working on a file in a read-only directory, like a CDROM, Office will instead create the temporary file on the desktop. So the mystery of why the file exists is solved! We already knew the scrap file was accessed, and now we have corresponding temporary files on the desktop to show it.

The opposing expert was not deterred so easily. He pointed out that the temporary file sh60.tmp had the numeral 60 in it, meaning in his opinion that it had in fact been accessed many more times than two, since 60 in hex is 96 in decimal; he claimed it was accessed approximately 95 times. That would be a very large number of accesses for a single PowerPoint slide no matter what the contents, so I was skeptical. We did some research to determine what creates the temporary file and found out it was a shared Microsoft library that many, many applications use, including the application of hotfixes and service packs. Each time a temporary file is created by anything that uses this shared library, the counter is incremented, which explains the huge jumps between the temporary files left on the desktop and the discrepancy between the counter and the number of times the rest of the forensic artifacts showed the file being accessed.
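The arithmetic behind the dispute is trivial to check. A small sketch that parses the counter out of a shNN.tmp name, with the caveat from above baked into the docstring (the file-name pattern is as observed in this case, not documented by Microsoft):

```python
import re

def tmp_counter(filename):
    """Parse the hexadecimal counter out of a shNN.tmp file name.

    The counter belongs to a shared Microsoft temp-file library and is
    incremented by every application that creates a temporary file
    through it, so it does NOT count accesses to any one document.
    """
    m = re.match(r"sh([0-9A-Fa-f]+)\.tmp$", filename, re.IGNORECASE)
    return int(m.group(1), 16) if m else None

# tmp_counter("sh60.tmp") -> 96 (0x60), the value the opposing expert
# misread as roughly 95 accesses of the slide
```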

So the moral of the story is: sometimes a temporary file isn't just a temporary file, so be careful out there and always test your assumptions. In this case, both I, for assuming the temporary file was just a temporary file, and the opposing expert, for assuming that nothing else would change the counter on the temporary file, got to learn an important lesson.

Hello Readers,

I will not be talking about OWA every time.

 In our prior time together we discussed parsing OWA logs to determine who has been accessing someone else's account. For criminal prosecution (unauthorized access) or internal investigations this might be enough, but for investigations involving the civil court system you need to show that the information accessed, and the time it was accessed, corresponds to some claim such as tortious interference.

The same OWA logs we looked at last time will allow you to do this, with some caveats. You will see a single entry for access to an item, such as:

" /exchange/USA/Attach/read.asp?obj=000000007C6A5AC4439BD948B2EDEC2B4701083907007DC649E6901ED711982E0002B3A2389C000000C0411400007DC649E6901ED711982E0002B3A2389C0000013340B20000&att=ATT-0-C9D9D5C63632DD439C1AF3C6A4B4AF8A-TOD9D1%7E1.PPT"

This is a request to open an email attachment; the obj shown here in the query is a unique identifier for the item within the Exchange database. This means that if you replay that URL while (and this is important) logged in as that user, you will be able to bring up the exact same message that was viewed at that time (if it was not deleted). If you attempt to access this object while logged in as any other user, it will deny you, even if you log in as the administrator. If you want to make sure the messages exist (meaning they were not deleted), restore the Exchange server from a backup tape covering the time period the message was viewed, and replay the URL against the restored server.
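Before replaying, it helps to break each logged request into its parts so accesses can be grouped by message. A minimal sketch using the standard library (the field names mirror the query string above; nothing here is OWA-specific beyond that):

```python
from urllib.parse import urlsplit, parse_qs

def parse_attachment_request(url):
    """Split an OWA Attach/read.asp request into page, object id, and
    attachment name so accesses can be grouped by Exchange item."""
    parts = urlsplit(url)
    qs = parse_qs(parts.query)
    return {"page": parts.path,
            "obj": qs.get("obj", [None])[0],
            "att": qs.get("att", [None])[0]}
```

Note that `parse_qs` percent-decodes the values, so an attachment name logged as `TOD9D1%7E1.PPT` comes back as the 8.3 name `TOD9D1~1.PPT`.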


These are the asp pages that can be called by an OWA user, according to about two years' worth of logs from one case I worked:


Of these we care about the following:

This is a user logging in - /exchange/USA/LogonFrm.asp

This is a user requesting to read a specific message - /exchange/USA/forms/IPM/NOTE/read.asp

This is a user opening an attachment - /exchange/USA/Attach/read.asp

This is a user composing a new message - /exchange/USA/forms/IPM/NOTE/cmpMsg.asp

This is a user reading a message request - /exchange/USA/forms/IPM/SCHEDULE/MEETING/REQUEST/mrread.asp


If you parsed out just these commands, identified by the logged-in user, you could see which specific emails, meetings, and attachments a webmail user had viewed, created, or sent using OWA, and the time at which they did so. Using these times, and matching the IP address to the suspect, you can then connect the information accessed to the time it was accessed and to the benefit they received by having that information at that time.
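A sketch of that filtering pass over the IIS logs, grouping the five pages of interest by authenticated user. The field order is an assumption based on a common W3C extended layout; check the `#Fields:` header in the actual logs and adjust the indexes to match:

```python
from collections import defaultdict

PAGES_OF_INTEREST = {
    "/exchange/USA/LogonFrm.asp",
    "/exchange/USA/forms/IPM/NOTE/read.asp",
    "/exchange/USA/Attach/read.asp",
    "/exchange/USA/forms/IPM/NOTE/cmpMsg.asp",
    "/exchange/USA/forms/IPM/SCHEDULE/MEETING/REQUEST/mrread.asp",
}

def activity_by_user(log_lines):
    """Group the interesting OWA requests by authenticated user.

    Assumes a W3C extended format whose fields begin:
    date time c-ip cs-username cs-uri-stem cs-uri-query
    (verify against the #Fields: directive in the real logs).
    """
    activity = defaultdict(list)
    for line in log_lines:
        if line.startswith("#"):  # skip log metadata directives
            continue
        fields = line.split()
        if len(fields) < 6:
            continue
        date, time, ip, user, stem, query = fields[:6]
        if stem in PAGES_OF_INTEREST:
            activity[user].append((date + " " + time, ip, stem, query))
    return activity
```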


As an example, in the case Exel Transportation Services Inc v. Total Transportation Services LLC et al (3:06-cv-00593), I used this to uncover a large industrial espionage scheme. First I used the program in the prior post to find which accounts were being used to access other email accounts in the system. Then I looked up the IP addresses and found that one of them was registered directly on ARIN to one of the ex-executives of Exel. We then broke out just the accesses made by those accounts (I mean, really, why else would the BlackBerry server administrative account or the voicemail server be logging into a website... something we had to explain to counsel) into a database divided up by type of item accessed (email, attachment, calendar).


The next part was more difficult: we had to replicate their Exchange network, AD controller, etc. to restore their Exchange server backups and replay those months to find out what our suspects were viewing. This included almost every decision maker within Exel and, according to the filings I read, about $120 million in lost business, as they were able to read the contracts sent to customers during a bidding process and always beat them. We fed the URLs into a GUI automation tool that would interact with the web browser and save the emails and attachments into MHT (full website archive) files for the lawyers' review. I couldn't, within the time frame, get a pure Perl program to work the way I needed it to.


For more information read this news article:



The case was settled out of court with a public apology written by TTS. The final piece of my analysis that led to settlement was when we matched the TTS OWA logs to the Exel OWA logs and showed the suspects logged into the TTS server with their real user names, from the same IP and at the same date/time as they were logged into the Exel OWA server with their administrative accounts.


I hope this was useful. I can post the parsers I wrote if you think they would help you in the future.


