
Daily Blog #52: Understanding the artifacts LNK Files

Hello Reader,
               Time to continue the series on understanding the artifacts, building up to a deeper understanding of proving usage. Today we are going to go into a well known artifact, LNK files, and then move through Jump Lists, MRU keys, and the other artifacts we use to establish use, explaining how to stitch them together. Along the way we will detail the nuances that can change your opinion or possibly lead to misinterpretation.

LNK files are one of the simplest artifacts and many, many, many people have written about them. Here are some of my favorite LNK write-ups if you are reading this and are not familiar with them:


The funny thing about artifacts as simple as LNK files is that they reveal as much information as the examiner cares to extract. When I do interviews for a position at G-C I ask a series of questions relating to artifacts and what they mean to the examiner. This isn't a trick question, which I explain to the interviewee, but rather a gauge to determine how far down the rabbit hole the examiner has gone. As an example, for LNK files I would ask the interviewee the following:

'What can you determine from a LNK file?'

I can determine the rough expertise of an examiner by how many of the following points they answer with. I then take this in combination with other artifact questions/scenarios and the level of depth they answer to determine their level of forensic experience rather than focus on their resume.

Beginner Answer:
A LNK file reveals what files and/or programs a user accessed.

Intermediate Answer:
A LNK file reveals what files and/or programs a user accessed, along with the network path and MAC address of where the access took place.

Experienced Answer:
A LNK file reveals what files and/or programs a user accessed and the network path and MAC address of where the access took place. In addition it contains the timestamps captured from the file and/or program being accessed, which represent the state of the file at the time the access took place.

Senior Answer:
A LNK file reveals what files and/or programs a user accessed and the full path\network path and MAC address of where the access took place. In addition it contains the timestamps captured from the file and/or program being accessed, which represent the state of the file at the time the access took place. It also contains the volume serial number of the device, which you can use to match the LNK back to the volume the file came from if the target was not on a network data source. In addition, LNK files contain shell items, allowing the examiner to determine the type of resource being accessed (volume/network/file/uri).

Expert Answer:
A LNK file contains two sets of timestamps relevant to the examiner. The first set of MAC times belongs to the LNK file itself: its creation date reveals when the target was first accessed, as recorded by this LNK file, and its modification time records the last time the LNK file was updated, which should reflect the last successful access. The second set of dates is maintained within the LNK file and represents the MAC times of the file being accessed, based on the last successful access to the file from the LNK file. In order to determine prior states of the file you can examine the restore points (XP), shadow copies (Win 7) and carved LNK files to find all the other versions of this LNK file that reference the same file, volume serial number, and shell items. Each updated set of internal MAC times represents another successful access of the file through the LNK file and should be counted towards usage.
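The two timestamp sets described above come straight out of the binary format: the filesystem MAC times belong to the .lnk file on disk, while the target's times are stored inside the 76-byte ShellLinkHeader as three FILETIME fields (offsets 0x1C, 0x24, and 0x2C per the MS-SHLLINK specification). Here is a minimal sketch of pulling out the internal set; the helper names are my own illustration, not any standard tool's API, and real casework should use a full parser that also walks the shell items and LinkInfo:

```python
# Sketch: extract the target file's internal MAC times from a LNK header.
# Offsets follow MS-SHLLINK: header is 0x4C bytes, with the target's
# creation/access/write FILETIMEs at 0x1C, 0x24, and 0x2C.
import struct
from datetime import datetime, timedelta, timezone

def filetime_to_dt(ft):
    """Convert a Windows FILETIME (100ns ticks since 1601-01-01 UTC)."""
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=ft // 10)

def parse_lnk_header(data):
    """Return the target's MAC times recorded inside the LNK header.
    These describe the *target* as of the last successful access,
    not the .lnk file itself."""
    if struct.unpack_from("<I", data, 0)[0] != 0x4C:
        raise ValueError("not a LNK file (bad HeaderSize)")
    created, accessed, written = struct.unpack_from("<QQQ", data, 0x1C)
    return {
        "target_created": filetime_to_dt(created),
        "target_accessed": filetime_to_dt(accessed),
        "target_written": filetime_to_dt(written),
    }

# Synthetic 0x4C-byte header for demonstration, with a known FILETIME
# (2014-01-01 00:00:00 UTC) placed in all three timestamp slots.
ft_2014 = int((datetime(2014, 1, 1, tzinfo=timezone.utc)
               - datetime(1601, 1, 1, tzinfo=timezone.utc)).total_seconds() * 10**7)
hdr = bytearray(0x4C)
struct.pack_into("<I", hdr, 0, 0x4C)
struct.pack_into("<QQQ", hdr, 0x1C, ft_2014, ft_2014, ft_2014)
times = parse_lnk_header(bytes(hdr))
print(times["target_created"])  # 2014-01-01 00:00:00+00:00
```

Comparing this internal set against the .lnk file's own filesystem MAC times (e.g. via `os.stat` on a mounted image) is what lets you build the access history described above.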

Now, if you noticed, I didn't say the Expert Answer had to go into depth on the technical structure of everything that can be contained within a LNK file; that isn't as important to me as the ability to properly interpret what the data means in the context of analysis. I assume that anyone who can give me an expert answer already has the technical knowledge of the file format to give additional facts when needed, but I find that people who give only technical information are missing the larger picture of what the data means in their analysis and what they can prove with it.
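As a concrete illustration of the senior-level point about volume serial numbers: the serial lives in the VolumeID structure inside the LNK's LinkInfo block, and can be matched against the serial a suspect volume reports (e.g. via `fsutil fsinfo volumeinfo`). This is a sketch under the MS-SHLLINK offsets with a synthetic LinkInfo block for demonstration; the helper name is illustrative, not a real library call:

```python
# Sketch: pull the drive serial number out of a LNK LinkInfo block.
# Per MS-SHLLINK, VolumeIDOffset sits at +0x0C of LinkInfo, and the
# DriveSerialNumber sits at +0x08 within the VolumeID structure.
import struct

def volume_serial_from_linkinfo(linkinfo):
    """Return the drive serial number as the familiar XXXX-XXXX string."""
    vol_off = struct.unpack_from("<I", linkinfo, 0x0C)[0]
    serial = struct.unpack_from("<I", linkinfo, vol_off + 0x08)[0]
    return f"{serial >> 16:04X}-{serial & 0xFFFF:04X}"

# Synthetic LinkInfo: a 28-byte header followed by a VolumeID whose
# serial field holds 0x1234ABCD.
li = bytearray(28 + 16)
struct.pack_into("<I", li, 0x0C, 28)               # VolumeIDOffset
struct.pack_into("<I", li, 28 + 0x08, 0x1234ABCD)  # DriveSerialNumber
print(volume_serial_from_linkinfo(bytes(li)))      # 1234-ABCD
```

A serial match ties a carved or orphaned LNK back to a specific volume even when the target path no longer exists, which is exactly the stitching-together this series is building toward.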

So with that said, tomorrow we will continue on with usage artifacts. Do you think I missed something or do you have an even better answer? Leave it in the comments, I'm always interested in additional views on analyzing familiar artifacts! 
