
Daily Blog #15: 7/7/2013 Sunday Funday winner!

Howdy Reader,
        I'm in Austin for the DFIR Summit, but the daily blogs must continue! Yesterday we had a particularly challenging Sunday Funday regarding detecting web server log tampering. We had a couple contenders and the winner this week is Jacob Williams! Here is Jacob's winning answer:

Wow, I wish I had access to the server.  Text based log files are one of the few places where slack space analysis can be a benefit.  There's plenty of room to potentially find evidence of log tampering (especially if the tampered log is smaller than the original).  Over three years of web server logs, I'd hope to find SOMETHING in slack space if logs were manipulated.

The first thing to check is the time series within the logs.  By time series, I mean: does every log entry come after the one before it?  This is one place where people totally screw up when modifying logs.  I've actually written scripts to check this in various formats, and again, it's a place where inexperienced forgers get caught.
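The time-series check Jacob describes can be scripted in a few lines. Here's a minimal sketch assuming each log line starts with a `YYYY-MM-DD HH:MM:SS` timestamp; the layout and sample entries are hypothetical, so adjust the parsing to your actual log format:

```python
from datetime import datetime

def find_out_of_order(lines, ts_format="%Y-%m-%d %H:%M:%S"):
    """Return line numbers whose timestamp precedes the previous entry.

    Assumes each log line begins with a timestamp in ts_format
    (hypothetical layout -- adjust parsing to your log format).
    """
    anomalies = []
    prev = None
    for num, line in enumerate(lines, 1):
        ts = datetime.strptime(line[:19], ts_format)
        if prev is not None and ts < prev:
            anomalies.append(num)
        prev = ts
    return anomalies

logs = [
    "2013-07-01 09:00:00 GET /login.aspx 200",
    "2013-07-01 09:05:12 GET /account.aspx 200",
    "2013-07-01 08:59:59 GET /admin.aspx 200",  # earlier than the prior entry
    "2013-07-01 09:10:00 GET /logout.aspx 200",
]
print(find_out_of_order(logs))  # line 3 breaks the time series
```

On real logs you'd run this per log file, since W3C logs roll daily and the sequence resets at midnight.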

Timestamps on the logs might also be useful, though they're less reliable depending on how you were provided the logs.  W3C formatted logs begin anew each day, so obviously you want to check that the timestamps are consistent with the dates of the logs.  Again, depending on how you were provided the logs (a FAT formatted thumb drive, for instance), the file timestamps may not be usable.

A piece of the case that isn't specified is whether the suspect has a static IP address.  Obviously we'll want to correlate log entries to that static IP if one exists.  If the user has a dynamic IP, check the range to make sure it is consistent with his ISP.  Three years is probably too far back to subpoena DHCP logs from the ISP, but get as much as you can.

GoGo InFlight Internet service has sort of screwed up this next one, but I want to get the suspect's travel records to identify time periods when he couldn't have had access to the Internet to make the illicit logins.  Times when a suspect is in the air, etc., are great.  Is the suspect a public speaker? Check the logs for times when he was speaking.  I like to think I'm talented, but I have a hard time hacking websites and teaching SANS FOR610 at the same time (even if I do know the material like the back of my hand).  Find as many instances of these time issues as possible.  It might be conceivable that the suspect violated the laws of time and space once, but thirty times? Fifty times? Come on, this isn't an episode of Fringe.  Of course the attacker could have been creative with his Internet access or set up an automated timed attack, but let the plaintiff prove this.  Just as with the possibility of tampering with forensic data, the simple possibility isn't sufficient to say it happened.

One of the more technical approaches I'd take would be analysis of actual usage patterns.  Are the suspect's usage patterns (i.e. pages accessed) consistent with what the plaintiff is alleging?  Does the defendant magically skip the login screen and go directly to authenticated access when everyone else must login through login.aspx?  Stuff like this can be an indicator that the logs have been tampered with.

Some of the rest depends on the style of the web application and the verbosity of logging.  If the logs contain some sort of session ID (in the URL perhaps) we should analyze how this session ID is generated.  If it is completely random, do the logs ever show our suspect using the same ID?  If so, the odds of hitting the same random session ID twice are slim to none.  Go buy a lottery ticket.  Another thing we often see is time-based session IDs, where the IDs increase over time.  Again, make sure that the IDs are increasing over time for the user's login.  If the web application places a timestamp in the URL to prevent replay attacks, make sure that the URL timestamps are consistent with the log timestamps.  Also check that they are increasing.
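A quick way to surface the "same random session ID twice" anomaly is to group session IDs by the client IPs that used them. This is a toy sketch; the tuples below are invented sample data, and in practice you'd parse the IDs out of the URL or cookie field of each log entry:

```python
from collections import defaultdict

def shared_session_ids(entries):
    """Map each session ID to the set of client IPs that used it.

    entries: (client_ip, session_id) tuples parsed from the logs.
    A supposedly random session ID appearing under two different
    client IPs is a red flag worth investigating.
    """
    seen = defaultdict(set)
    for ip, sid in entries:
        seen[sid].add(ip)
    return {sid: ips for sid, ips in seen.items() if len(ips) > 1}

entries = [
    ("10.0.0.5", "A1B2C3"),
    ("10.0.0.5", "A1B2C3"),
    ("192.168.1.9", "A1B2C3"),  # same "random" ID, different client
    ("10.0.0.7", "D4E5F6"),
]
print(shared_session_ids(entries))
```

The same grouping, keyed by user instead of IP, also makes the time-based ID check easy: sort each user's entries by timestamp and verify the IDs only ever increase.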

We also want to check the user agents being recorded. Is your suspect a total techno-tard but his user agent indicates Linux?  Mac user agent, but the user doesn't own a Mac?  Look for accepted languages in the HTTP requests.  If the user is in America and doesn't speak Chinese, then the accepted language in the HTTP headers probably won't be Chinese.
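The Accept-Language check lends itself to a one-pass filter. A minimal sketch, assuming you've already pulled the Accept-Language header out of each logged request (the sample headers below are made up):

```python
def flag_unexpected_languages(entries, expected=("en",)):
    """Flag entries whose primary Accept-Language tag isn't expected.

    entries: (line_no, accept_language_header) tuples (toy data below).
    The primary tag is the language before the first comma, with any
    region subtag (e.g. -US) stripped off.
    """
    flagged = []
    for line_no, header in entries:
        primary = header.split(",")[0].split("-")[0].strip().lower()
        if primary not in expected:
            flagged.append((line_no, header))
    return flagged

entries = [
    (1, "en-US,en;q=0.8"),
    (2, "zh-CN,zh;q=0.9"),  # Chinese headers from an English-only suspect
    (3, "en-GB,en;q=0.7"),
]
print(flag_unexpected_languages(entries))  # [(2, 'zh-CN,zh;q=0.9')]
```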

Anomalies in the logs are also something to check for.  Did the suspect's log entries happen at a particular time?  One of the ways people screw up forging logs is to change a legitimate log entry to cover up illicit activity.  In this case, check other users' patterns of behavior.  Does user X always log in between 0900 and 1100, but fail to on days when our suspect logs in at the same time (and coincidentally performs the same actions)?
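That "regular user suddenly missing" pattern can be hunted mechanically once you've reduced the logs to a per-day set of userIDs. A sketch with invented usernames:

```python
def missing_regulars(daily_logins, regular_user, suspect):
    """Return dates where the suspect appears but a normally-regular
    user does not -- a hint the regular's entry may have been rewritten.

    daily_logins: dict of date -> set of userIDs seen that day
    (toy data below; build it from the parsed logs in practice).
    """
    return sorted(d for d, users in daily_logins.items()
                  if suspect in users and regular_user not in users)

daily_logins = {
    "2013-07-01": {"alice", "bob"},
    "2013-07-02": {"alice", "bob"},
    "2013-07-03": {"suspect", "bob"},  # alice's usual login vanishes
    "2013-07-04": {"alice", "bob"},
}
print(missing_regulars(daily_logins, "alice", "suspect"))  # ['2013-07-03']
```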

One of the final things I'd check would be whether the web application logs both a text based userID and a numeric user ID.  We want to make sure that these are always consistent with one another and never reused in the logs.
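Checking that the text userID and numeric user ID always pair up is a simple dictionary pass. The sample records here are hypothetical:

```python
def id_mapping_conflicts(records):
    """Check that each text userID always pairs with the same numeric ID.

    records: (text_userid, numeric_id) tuples pulled from log entries.
    Returns the set of text userIDs observed with more than one numeric ID.
    """
    mapping = {}
    conflicts = set()
    for text_id, num_id in records:
        if mapping.setdefault(text_id, num_id) != num_id:
            conflicts.add(text_id)
    return conflicts

records = [("jdoe", 1001), ("jdoe", 1001), ("asmith", 1002), ("jdoe", 1044)]
print(id_mapping_conflicts(records))  # {'jdoe'}
```

Running the same pass in the reverse direction (numeric ID to text userID) catches the reuse case Jacob mentions.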

As a side note, I'd also want to subpoena the web server configuration and web application to audit the code.  If this is a high profile case, it's worth performing tests on the web application to ensure that the expected logging matches the actual logging.
To extend my answer a little, I'd like to add to check the Cookie and Referrer fields in the W3C formatted logs (if those attributes are being logged).

Either attribute could highlight an anomaly in forged records.  For instance, cookie values could point to session IDs inconsistent with those in the GET request (or assigned simultaneously to another user).

Referrer fields are another issue entirely.  Web applications usually have a fairly static content flow.  If the referrer fields for our suspected forged records are inconsistent with those of legitimate users, we may have found the smoking gun.  This again underscores the need to subpoena the web application for testing.  If the plaintiff claims the logs are damning "because that's how the custom web app works" we should have cause to examine the web application to determine logging fringe cases/inconsistencies.

    This was a great answer! Jake certainly showed a mastery of web log analysis in his response, and I hope you picked up some good pointers here for your own log analysis.

    This Sunday Funday was based on a real case, ILS v Partsbase, that I worked on for three years back in 2003-2006. The case went to a jury trial where we were able to successfully show that 3 years of web logs were altered to make it appear as though our client had made 1.6 million unauthorized accesses over 3 years.

    The case came together in a series of steps that Jacob touched on in his answer, but that I want to highlight and explain.
  1. Compare session state. In my case I was lucky: the developer decided to store the assigned cookies in the web logs, and they contained an environment variable holding the user's IP address. The IP stored in the cookie never matched the IP recorded in the log. 
  2. Review the user agents. Using the user agents we were able to pull out 1,100 different devices suddenly associated with my client's outgoing IP address. 
  3. Review the user agents for browser customization. This was the most critical aspect in getting the jury to understand what occurred. There was a public and a private web site with two different sets of logs. If you have a standard image rolled out to your systems, you may see a message in the browser's title bar (IE specifically) that says something like 'Provided by HECFBlog'. This information is passed along in the User-Agent string.
  4. Put all your logs into a database for better analysis and cross queries. With distinct customized browser user agents located, for example 'University of Some State', I was able to reconstruct sessions between the two logs and show that when a visit to the public side was made it contained an IP address belonging to the University, but when an image request was made to the private side my client's IP address would suddenly appear in the logs as the requester.
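The database cross-query in step 4 can be sketched with an in-memory SQLite database. The schema, user agent, and IPs here are all invented for illustration; the real logs had full W3C fields, and the join condition would need to account for the session's time window as well:

```python
import sqlite3

# Hypothetical schema: both logs reduced to (timestamp, ip, user_agent).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE public_log (ts TEXT, ip TEXT, user_agent TEXT);
CREATE TABLE private_log (ts TEXT, ip TEXT, user_agent TEXT);
""")
conn.executemany("INSERT INTO public_log VALUES (?,?,?)", [
    ("2013-07-01 09:00:00", "131.10.4.2",
     "IE8; Provided by University of Some State"),
])
conn.executemany("INSERT INTO private_log VALUES (?,?,?)", [
    ("2013-07-01 09:00:05", "66.55.44.33",
     "IE8; Provided by University of Some State"),
])

# Find sessions where the same customized user agent shows
# different source IPs across the public and private logs.
rows = conn.execute("""
    SELECT p.user_agent, p.ip AS public_ip, q.ip AS private_ip
    FROM public_log p JOIN private_log q ON p.user_agent = q.user_agent
    WHERE p.ip <> q.ip
""").fetchall()
for ua, pub_ip, priv_ip in rows:
    print(ua, pub_ip, "->", priv_ip)
```

Joining on the distinctive customized user agent is what lets you tie the two halves of a session together even when the attacker rewrote the IP in one log but not the other.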
    We still don't know who modified the logs, and due to protective orders I can't reveal everything we discovered. However, using the type of analysis Jacob described and the facts I identified above, combined with some great, simple animated PowerPoint slides, we were able to clearly demonstrate to the jury why any reasonable person could see the logs were manipulated.

The milestone series resumes tomorrow!