Daily Blog #654: Sunday Funday 3/31/19 - Google Cloud Challenge

Google Cloud Challenge - Hacking Exposed by David Cowen


Hello Reader,
           No April Fools this week. I didn't post an answer for last week's challenge because ... I didn't receive any qualifying answers. So let's try this again, shall we? A second week to show the world your expertise with the Google cloud.


The Prize:

$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 4/5/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dcowen@g-cpartners.com. Please state in your email whether you would like to be anonymous if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post.


The Challenge:
Name and describe all of the available forensic data sources provided by Google Cloud Platform

Daily Blog #653: Forensic Lunch Test Kitchen 3/26/19


Hello Reader,
        Tonight I tried to do a live stream from my hotel in Jeddah, KSA. Looking back at the recording I'm not sure how well it came out, but I was able to get some baseline testing done ahead of a better test of some SRUM recording features tomorrow night (UTC+3). In short, I did the following to see how SRUM would record it:

  • The livestream to youtube via Xsplit
  • The youtube access via chrome
  • A chrome incognito window to see if it gets tracked separately
  • Copied data to an external drive with windows explorer
  • Copied data to an internal drive with copy.exe from the command line
  • Deleted files within file explorer
  • Deleted files in the GUI
I'll let this computer run overnight and use the wired internet I have in the classroom to stream tomorrow.

You can watch the video here: https://youtu.be/0I1xgA3DhYo


Daily Blog #652: Seeking Sponsor for the Unofficial Defcon DFIR CTF 2019

Seeking Sponsor for the Unofficial Defcon DFIR CTF 2019


Hello Reader,
        Do you or your company want to provide a prize for the Unofficial Defcon DFIR CTF now in its third year? If so email me at dcowen@g-cpartners.com so we can talk. In the past SANS, Magnet Forensics, Blackbag and Metaspike have all graciously provided prizes for our worthy contenders and we'd like to open this up to all of you.

The Defcon DFIR CTF usually gets 100+ players during the event and hundreds more once the CTF is opened to the public. I'd like to expand the prize pool so we can award more cool things to more people across three groups.

  1. Top finishers at Defcon
  2. Top finishers online
  3. Noteworthy achievements (like the first perfect score)
So reach out if you are interested and I hope to hear from you soon.

Daily Blog #651: Sunday Funday 3/24/19 - FRS Google Cloud Platform Challenge

FRS Google Cloud Platform Challenge



Hello Reader,
   Let's finish this trifecta of the three major cloud compute vendors. I think that getting more of this knowledge out there will help the many random internet searches from people just trying to understand what's possible after someone else decided to move their assets to the cloud. We have a streak of new winners going, and you, yes you, reading this now could be next. I want you to be my next winner, so take the time to do some research and I look forward to hearing from you!



The Prize:

$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/29/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dcowen@g-cpartners.com. Please state in your email whether you would like to be anonymous if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post.


The Challenge:
Name and describe all of the available forensic data sources provided by Google Cloud Platform


Daily Blog #650: Solution Saturday 3/23/19 - Forensic Data Sources by Azure Challenge Solution

Forensic Data Sources by Azure Challenge Solution



Hello Reader,
         This week's challenge was met with many challenges, but they were overcome by @darizotas aka Dario B. I think you'll see in his winning post that he did a pretty thorough job documenting what exists, with solid references for following up. I'm loving all of these new people in the community getting involved and showing what they have to contribute! So next week, let that be you!


The Challenge:
Name and describe all of the available forensic data sources provided by Azure Compute

The Winning Answer:
@darizotas 
https://darizotas.blogspot.com/2019/03/azure-and-office-365-logging.html




Daily Blog #649: How to Pick Something to Test

How to Pick Something to Test - Hacking Exposed Blog by David Cowen





Hello Reader,
         One of the questions I get asked on a semi-regular basis is: how do I pick what to test/research? The answer is simpler than you would expect:

Selection pool:

  • I look at an interaction I just experienced while using the operating system
  • I think about an artifact I don't feel I fully understand
  • I am working on a case and have to find a way to recreate a behavior I found

After that, as you can see in the test kitchen videos, I spend hours testing/recreating/examining/understanding the behavior that I'm seeing. While it is possible for two different actions to result in the same behavior, typically those different actions create their own marks on the system, allowing you to determine which route the user took.

I have my travel streaming system with me now and I'm currently in an airport in Dubai awaiting my flight to Jeddah. My hope is that the hotel in Jeddah will have enough bandwidth to let me stream some testing this week, wish me luck!



Daily Blog #648: How to Stream your own Test Kitchen

How to Stream your own Test Kitchen DFIR - Hacking Exposed Blog by David Cowen



Hello Reader,
       As I prepare to get the test kitchen back in service I thought I'd share what I use, for others who are looking to do the same. I got this idea after a tweet from Gerald Davis.


So here is my setup:
Hardware: I have a Windows 10 desktop with an Nvidia GTX 980, an i7 processor, and 32GB of RAM. It's nothing special and you don't need much in order to do this. The OS is running off of an SSD and the virtual machines are running off a 2TB 7200 RPM Western Digital drive.

Broadcasting software: I'm using XSplit Broadcaster. You could use something else like OBS or Wirecast, but when I was looking XSplit was the easiest to set up and use, with all of its built-in plugins and stream support.

Hypervisor: I'm using VMware Workstation.

Mic: I'm using a HyperX Cloud 2 headset; nothing special, just a headset with a mic.

I have a 4K Samsung monitor, so I have the VM running in the upper corner of the monitor and have drawn the broadcast window over it. This allows me to keep most of the screen off camera, as it were, so I can monitor the stream, check chat, and Google things while also keeping the streamed desktop at a readable size for the viewer.

I got the Windows OS ISOs from MSDN, but you could also get eval images from Microsoft directly.

Have other questions? Let me know in the comments and I can explain more about what I do and how it works. 

Daily Blog #647: Windows Forensics in San Diego

Windows Forensics in San Diego - SANS Event by David Cowen


Hello Reader,

               Looks like I'll be heading to sunny San Diego, California to teach SANS FOR500: Windows Forensics starting May 9, 2019. The event is called Security West and it's one of the bigger SANS events of the year. If you want to learn Windows forensics, see San Diego, and catch some great bonus sessions from some amazing SANS instructors, it's a great event.

Want to learn more? Click below
https://www.sans.org/event/security-west-2019/course/windows-forensic-analysis


Daily Blog #646: Sunday Funday 3/17/19 - Azure Compute Challenge

Azure Compute Challenge



Hello Reader,
              I always appreciate it when people spend their time researching rather than doing other fun things, like playing video games or reading a non-technical book. When we share what we know, even if we don't know everything about something, it helps someone else leapfrog forward and learn more. This week let's keep our heads in the clouds for another challenge.


The Prize:

$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/22/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dcowen@g-cpartners.com. Please state in your email whether you would like to be anonymous if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post.


The Challenge:
Name and describe all of the available forensic data sources provided by Azure Compute



Daily Blog #645: Solution Saturday 3/16/19 - Amazon AWS For EC2



Hello Reader,
        Spring break is ending, which means kids are going back to school soon and I'll be back on track with blogging. Here is this week's winner!


The Challenge:
Name and describe all of the available forensic data sources provided by Amazon AWS for EC2

The Winning Answer:
Jonathan Yan

CloudTrail Logs:
CloudTrail is an audit log that is enabled by default and stores all actions on resources for an account for 90 days. For EC2 specifically, it can provide information on the user and the action they performed on a specific resource, such as EC2 KeyPairs, NetworkAcl, SecurityGroup, or Snapshot, to see if any suspicious changes were made.

EBS (Elastic Block Store) Snapshots:
Elastic Block Store volumes are the hard drives that EC2 instances use to store data. Snapshots can be taken of the EBS volumes of a compromised instance and mounted onto a trusted EC2 instance for forensic investigation. These snapshots can be taken regularly as part of backups or whilst responding to an incident. Note that the ownership of snapshots can be assigned to another AWS account to ensure they cannot be modified by anyone with permissions over a compromised account.

VPC Flow Logs:
VPC Flow Logs are a record of IP traffic to and from network interfaces within a Virtual Private Cloud (VPC), which is the segregated network that EC2 instances reside in. They can provide a trail of all network traffic to and from each EC2 instance. However, this has to be enabled per VPC and then sent to AWS CloudWatch or stored in an AWS S3 bucket, where it can then be analysed.

AWS Systems Manager:
AWS Systems Manager is a utility that can be enabled as an agent on an EC2 instance to record all the installed software, network configurations, CPU data, Windows patch versions, and specific Windows registry keys and files. It could be useful for a first glance while in the console, but it has to be enabled and configured correctly before an incident occurs to provide value. Additionally, the information shown here can also be found during a forensic investigation of the EBS volume.

AWS Inspector:
AWS Inspector is a vulnerability scanning platform that can identify vulnerabilities in applications running on EC2 instances. If enabled and configured, it could be useful during forensic investigations to narrow down which vulnerabilities may have been exploited on a host.

Cheers,
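As a quick illustration of how a responder might pull two of these sources programmatically, here is a minimal boto3 sketch (not part of the winning answer). It assumes AWS credentials are already configured, and the instance ID is a made-up placeholder; treat it as an outline rather than a ready-made IR script.

# Illustrative sketch only: query CloudTrail for events touching a suspect
# EC2 instance and snapshot its EBS volumes for later analysis.
import boto3

INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical suspect instance

# CloudTrail: recent management events that reference the suspect instance
cloudtrail = boto3.client("cloudtrail")
events = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "ResourceName", "AttributeValue": INSTANCE_ID}
    ],
    MaxResults=50,
)
for event in events["Events"]:
    print(event["EventTime"], event["EventName"], event.get("Username"))

# EBS: snapshot every volume attached to the suspect instance so the
# snapshots can later be mounted on a trusted analysis instance
ec2 = boto3.client("ec2")
volumes = ec2.describe_volumes(
    Filters=[{"Name": "attachment.instance-id", "Values": [INSTANCE_ID]}]
)
for volume in volumes["Volumes"]:
    snapshot = ec2.create_snapshot(
        VolumeId=volume["VolumeId"],
        Description="IR evidence snapshot of " + volume["VolumeId"],
    )
    print("Created snapshot", snapshot["SnapshotId"])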

Daily Blog #644: Creating decrypted images of APFS file systems encrypted with T2 Chips with MacQuisition

Creating decrypted images of APFS file systems encrypted with T2 Chips with MacQuisition by David Cowen


Hello Reader,
          Dealing with T2 chips on recent model MacBooks has been a real pain point for us in the lab, so I was very, very happy to read that Blackbag (thanks Joe and Vico!) has figured out how to transparently decrypt the physical blocks of a drive being managed by a T2 chip at imaging time. Now the important thing to understand is that this decryption is being done at image time, meaning MacQuisition is not extracting the keys for later use. Instead, Blackbag has found a way to get the T2 chip to return decrypted blocks rather than just files.

This is a big step forward, as all of the other solutions I'm aware of (including the previous version of MacQuisition) were stuck doing just file system images (logical images) of APFS drives with T2 chips. Now with this feature you can get all the data, including APFS snapshots and possibly deleted data as well.

You can read more here:
https://www.blackbagtech.com/blog/2019/03/11/macquisition-will-decrypt-physical-images-macs-t2-chip/

Daily Blog #643: Sunday Funday 3/10/19 - Amazon AWS for EC2 Challenge

Amazon AWS for EC2 Challenge - Hacking Exposed Computer Forensics Blog by David Cowen

Hello Reader,

        On this blog we focus on a lot of host-related issues, but the world is no longer confined to single on-premises hosts. This week let's set our challenge sights to the skies and start seeing what you can research about ... the cloud.



The Prize:

$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/15/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed; please email them to dcowen@g-cpartners.com. Please state in your email whether you would like to be anonymous if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post.


The Challenge:
Name and describe all of the available forensic data sources provided by Amazon AWS for EC2


Daily Blog #642: Solution Saturday 3/9/19 - Winner of OSX Mojave Challenge

Winner of OSX Mojave Challenge by David Cowen

Hello Reader,


         I love weeks when we get to crown new winners. Tun is not new to DFIR, you may have seen his tweets before, but he is new to the Sunday Funday winners circle. Tun did some great testing, which he documented below specifically for OSX. Give his work a look and join me in congratulating Tun on his win!



The Challenge:



On OSX Mojave, what information can you determine from the KnowledgeC database?


The Winning Answer:


Tun Naung 


Sarah Edwards has done quite extensive research on the knowledgeC database and wrote a few posts on her blog, mac4n6.com. I have never used a Mac before and this is a good opportunity to learn Mac forensics. So, I decided to take the challenge and installed macOS Mojave in VirtualBox. As a beginner I simply followed the same testing that Sarah did on macOS 10.13 to see the output on macOS 10.14 Mojave.

On macOS 10.14 Mojave, the knowledgeC.db database is found in the same locations as in the previous version, macOS 10.13:

  • /private/var/db/CoreDuet/Knowledge - system context database
  • ~/Library/Application Support/Knowledge - user context database

adams-iMac:~ adam$ sw_vers
ProductName:    Mac OS X
ProductVersion: 10.14
BuildVersion:   18A391

adams-iMac:Knowledge adam$ pwd
/private/var/db/CoreDuet/Knowledge
adams-iMac:Knowledge adam$ ls -l
total 6120
-rw-------  1 root  wheel   774144 Mar  8 04:25 knowledgeC.db
-rw-------  1 root  wheel    32768 Mar  8 04:25 knowledgeC.db-shm
-rw-------  1 root  wheel  2323712 Mar  8 05:23 knowledgeC.db-wal

adams-iMac:knowledge adam$ pwd
/Users/adam/Library/Application Support/knowledge
adams-iMac:knowledge adam$ ls -l
total 1008
-rw-------  1 adam  staff  425984 Mar  8 04:28 knowledgeC.db
-rw-------  1 adam  staff   32768 Mar  8 04:28 knowledgeC.db-shm
-rw-------  1 adam  staff   53592 Mar  8 04:38 knowledgeC.db-wal


From this database, we can get information about:

  • Application Usage
  • Application Activities
  • Safari Browser History

Using the DB Browser for SQLite, we can open the knowledgeC.db database and look at the structure. The database has many tables with many columns.

Note: timestamps in this database use the Mac Epoch time (01/01/2001 00:00:00 UTC).
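Since every query below shifts these values by 978307200 seconds to get from the Mac epoch to the Unix epoch, here is a minimal Python sketch of the same conversion, handy when eyeballing raw ZSTARTDATE/ZENDDATE values outside of SQLite; the sample value is made up for illustration.

# Minimal sketch: convert a Mac-epoch value (seconds since 2001-01-01 UTC)
# into a human-readable UTC timestamp. 978307200 is the offset between the
# Unix epoch and the Mac epoch, the same constant used in the SQL queries.
from datetime import datetime, timedelta, timezone

MAC_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def mac_absolute_to_datetime(seconds):
    return MAC_EPOCH + timedelta(seconds=seconds)

print(mac_absolute_to_datetime(573782400))  # example (made-up) ZSTARTDATE value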

Figure 1: Database structure of knowledgeC.db; both the system and user context databases have 16 tables

Three tables are of particular interest, and they store information which is useful and valuable for an investigator:

  • ZOBJECT – contains usage entries (Sarah mentioned four weeks of retention, but I could not test this as my Mac has only been running for a week or two)
  • ZSOURCE – source of ZOBJECT entries
  • ZSTRUCTUREDMETADATA – additional metadata associated with ZOBJECT entries

By running the SQLite query below on the system context knowledgeC.db database, we get the following stream "types":

SELECT DISTINCT ZOBJECT.ZSTREAMNAME
FROM ZOBJECT
ORDER BY ZSTREAMNAME

  • /activity/level
  • /app/activity
  • /app/inFocus
  • /app/usage
  • /display/isBacklit
  • /safari/history

For application usage information, we can run the SQLite query below, where the "/app/inFocus" stream type shows what application was in use at a given time.

SELECT
  datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
  CASE ZOBJECT.ZSTARTDAYOFWEEK
    WHEN "1" THEN "SUNDAY"
    WHEN "2" THEN "MONDAY"
    WHEN "3" THEN "TUESDAY"
    WHEN "4" THEN "WEDNESDAY"
    WHEN "5" THEN "THURSDAY"
    WHEN "6" THEN "FRIDAY"
    WHEN "7" THEN "SATURDAY"
  END "DAY OF WEEK",
  ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
  datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
  datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
  (ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
  ZOBJECT.ZSTREAMNAME,
  ZOBJECT.ZVALUESTRING
FROM ZOBJECT
WHERE ZSTREAMNAME IS "/app/inFocus"
ORDER BY "START"

-- Result: 697 rows returned in 145ms

Figure 2: Snapshot of the result of the SQLite query showing application usage

Note: the application usage is only for GUI-based applications. User attribution is an issue, as this data comes from the system context database on macOS.
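If you would rather script this than click through DB Browser for SQLite, the same /app/inFocus query can be run with Python's built-in sqlite3 module. A minimal sketch, assuming you point it at an exported copy of the system context database (the path below is hypothetical):

# Minimal sketch: run the /app/inFocus usage query against a copy of
# knowledgeC.db. Work from a copied/exported database, never the live file.
import sqlite3

DB_PATH = "evidence/knowledgeC.db"  # hypothetical exported copy

QUERY = """
SELECT
  datetime(ZOBJECT.ZCREATIONDATE + 978307200, 'UNIXEPOCH') AS entry_creation,
  datetime(ZOBJECT.ZSTARTDATE    + 978307200, 'UNIXEPOCH') AS start_time,
  datetime(ZOBJECT.ZENDDATE      + 978307200, 'UNIXEPOCH') AS end_time,
  (ZOBJECT.ZENDDATE - ZOBJECT.ZSTARTDATE)                  AS usage_seconds,
  ZOBJECT.ZVALUESTRING                                      AS app_bundle_id
FROM ZOBJECT
WHERE ZOBJECT.ZSTREAMNAME = '/app/inFocus'
ORDER BY start_time
"""

with sqlite3.connect(DB_PATH) as conn:
    for row in conn.execute(QUERY):
        print(row)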

For application activities, we can use the "/app/activity" stream type to add more context.

SELECT
  datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
  ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
  CASE ZOBJECT.ZSTARTDAYOFWEEK
    WHEN "1" THEN "SUNDAY"
    WHEN "2" THEN "MONDAY"
    WHEN "3" THEN "TUESDAY"
    WHEN "4" THEN "WEDNESDAY"
    WHEN "5" THEN "THURSDAY"
    WHEN "6" THEN "FRIDAY"
    WHEN "7" THEN "SATURDAY"
  END "DAY OF WEEK",
  datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
  datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
  (ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
  ZOBJECT.ZSTREAMNAME,
  ZOBJECT.ZVALUESTRING,
  ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__ACTIVITYTYPE AS "ACTIVITY TYPE",
  ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__TITLE AS "TITLE",
  ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__USERACTIVITYREQUIREDSTRING AS "ACTIVITY STRING",
  datetime(ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__EXPIRATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "EXPIRATION DATE"
FROM ZOBJECT
LEFT JOIN ZSTRUCTUREDMETADATA ON ZOBJECT.ZSTRUCTUREDMETADATA = ZSTRUCTUREDMETADATA.Z_PK
WHERE ZSTREAMNAME IS "/app/activity" OR ZSTREAMNAME IS "/app/inFocus"
ORDER BY "START"

-- Result: 754 rows returned in 207ms

Figure 3: Snapshot of the result of the SQLite query showing more context for the application activity

Even though I did edit the note, the activity type data is not populated. Further testing is required for this.

Note: the ZSTRUCTUREDMETADATA table has more than 100 columns, which are worth looking into to see what data the apps populate in this table.

For Safari history information, we can run a query similar to the previous ones.

SELECT
  datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
  CASE ZOBJECT.ZSTARTDAYOFWEEK
    WHEN "1" THEN "SUNDAY"
    WHEN "2" THEN "MONDAY"
    WHEN "3" THEN "TUESDAY"
    WHEN "4" THEN "WEDNESDAY"
    WHEN "5" THEN "THURSDAY"
    WHEN "6" THEN "FRIDAY"
    WHEN "7" THEN "SATURDAY"
  END "DAY OF WEEK",
  ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
  datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
  datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
  (ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
  ZOBJECT.ZSTREAMNAME,
  ZOBJECT.ZVALUESTRING
FROM ZOBJECT
WHERE ZSTREAMNAME IS "/safari/history"
ORDER BY "START"

-- Result: 57 rows returned in 28ms

Figure 4: Snapshot of the results of the SQLite query showing the Safari history

Note: if the user clears the browsing history, these entries get removed too. Private Mode browsing does not show up either.

For user correlation, we can run the SQLite query on the user context knowledgeC.db database:

SELECT DISTINCT ZOBJECT.ZSTREAMNAME
FROM ZOBJECT
ORDER BY ZSTREAMNAME

  • /event/tombstone
  • /portrait/entity
  • /portrait/topic

There is a lot more testing to be done, as this database contains a lot of information.

- Knowledge is power -

Credits:

https://www.mac4n6.com/blog/2018/8/5/knowledge-is-power-using-the-knowledgecdb-database-on-macos-and-ios-to-determine-precise-user-and-application-usage


https://sqlitebrowser.org/


https://www.hecfblog.com/2019/03/daily-blog-636-sunday-funday-3319.html

