
Saturday, April 13, 2019

Daily Blog #660: Solution Saturday 4/13/19

Hello Reader,
           This week's winner only recently discovered the blog and the contests therein. Why bring this up? It shows that you don't have to be around for years to have a chance at winning a Sunday Funday. In Michael's case he posted a comment with an answer and sent me an email, and what he is getting in reply today is a winning answer. Congratulations Michael Bryan!




Question: 
For Dropbox audit logs, what data can you determine about someone who was logged in?
What allows you to uniquely identify a file?
The Winning Answer:

Master Deputy Michael Bryan

Dropbox Audit Logs, or Activity Logs, are a feature of Dropbox business accounts. The Advanced team accounts include file-level audit logs as part of the paid service. These logs are accessible from the Account Console, which is available to the account administrator or administrators. The console provides very detailed information about team members' usage of the account, and nearly all facets of the members' interactions are recorded and can be reviewed. The following items can be viewed in the console of an Advanced account regarding FILES:
Added a file
Added a file to their Dropbox
Added a file to their Dropbox (non-team member)
Added a folder
Allowed anyone to view links to files in a shared folder
Allowed file request emails for the team
Allowed non collaborators to view links to files in a shared folder
Allowed only team members to view links to files in a shared folder
Changed a file request
Closed a file request
Copied a file
Copied a file to their Dropbox
Copied a file to their Dropbox (non-team member)
Copied a folder
Created a link to a file using an app
Created a new file request
Deleted a file
Deleted a file comment
Deleted a folder
Disabled file requests
Downloaded a file (non-team member)
Downloaded files
Edited files
Enabled file request emails for everyone
Enabled file requests
Failed to delete some files remotely
File added to a showcase
File downloaded (non-team member) from a showcase
File downloaded (team member) from a showcase
File in showcase viewed by non-team member
File in showcase viewed by team member
File removed from a showcase
Liked a file comment
Made a file viewable only to members of the file
Made a file viewable only to team members with the link
Made a file viewable to anyone with the link
Moved a file
Moved a folder
Multiple files downloaded (non-team member) from a showcase
Multiple files downloaded (team member) from a showcase
Opened a file (non-team member)
Prevented non-team members from viewing links to files in a shared folder
Previewed files
Received files via file request
Renamed a file
Renamed a folder
Requested access to a file (non-team member)
Resolved a file comment
Restored a file
Restored a folder
Restored a resolved file comment
Reverted files to a previous version
Rolled back file changes
Subscribed to file comment notifications
Successfully deleted some files remotely
Unliked a file comment
Unsubscribed from file comment notifications

Additionally, the audit logs maintain information about the users themselves. An administrator can see the following regarding member usage:
The date and time of the event
The member who initiated the event
The details of the event
The location in the form of an IP address of the team member
The logs detail who the active team members were over the last 28 days, the number of shared folders over the last 28 days, how much storage space is used, the number of links created, and which devices accessed the account over the previous 28 days. From the console you can also monitor password changes, sign-ins, connected apps, changes in sharing, changes in groups, and changes in membership.

The file's specific path and file name, along with the connected user interactions, would allow an administrator to identify a file in the log data.
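As a quick illustration, the administrator-visible fields (time, member, event details, IP) lend themselves to scripted triage once exported. A minimal sketch, assuming a hypothetical CSV export whose column names (time, member, event, ip) are placeholders rather than Dropbox's actual export format:

```python
import csv
import io

# Hypothetical CSV export of a Dropbox team activity log; the real export's
# column names may differ, so treat these headers as placeholders.
sample = io.StringIO(
    "time,member,event,ip\n"
    "2019-04-10 14:02:11,alice@example.com,Downloaded files,203.0.113.7\n"
    "2019-04-10 14:05:43,bob@example.com,Deleted a file,198.51.100.9\n"
)

def events_for_member(log_file, member):
    """Return all audit events recorded for one team member."""
    return [row for row in csv.DictReader(log_file) if row["member"] == member]

rows = events_for_member(sample, "alice@example.com")
for row in rows:
    print(row["time"], row["event"], row["ip"])
```

The same filtering approach works for narrowing on event type or source IP instead of member.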

The information for this initial and feeble attempt at an answer was gathered from poking around the internet and reading Dropbox.com help files, Dropbox Forum posts, and two blogs written by “Kevin” on metadatum.wordpress.com (who actually cites the author of this challenge in his 2013 post about Dropbox forensics).

Sunday, April 7, 2019

Daily Blog #659: Sunday Funday 4/7/19

Hello Reader,
           Sounds like Google Compute DFIR knowledge must be sparse based on the responses I've gotten ... namely none! So let's change platforms to see how well you know SaaS, Software as a Service, specifically Dropbox.

The Prize:
$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 4/12/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com. Please state in your email if you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post


The Challenge:
For Dropbox audit logs, what data can you determine about someone who was logged in?
What allows you to uniquely identify a file?

Saturday, April 6, 2019

Daily Blog #658: MUS 2019 DFIR CTF Perfect Score Achieved

Hello Reader,
           Just a note that we already have a perfect score winner!



Congratulations to Plop aka Bastien Lardy who I will be contacting about their prize!

The CTF will remain up for quite some time to allow all of you a chance to learn and get ready for the big DFIR CTF of the year, the Defcon Unofficial DFIR CTF!

Thursday, April 4, 2019

Daily Blog #657: MUS2019 DFIR CTF open to the public

Hello Reader,
    The DFIR CTF that we ran at the Magnet User Summit is now open to the public.

You can download the evidence and a 30 day license key for Magnet Axiom here:
https://drive.google.com/drive/u/0/mobile/folders/1E0lELj9NouMwSMGZCI7lXWRqYE2uQCpW?usp=sharing

You can register for the CTF and play here:
https://mus2019.ctfd.io/

Daily Blog #656: Forensic Lunch 4/3/19 Live from MUS2019

Hello Reader,
           Today we had a Forensic Lunch live from the Magnet User Summit 2019 with guests:

  • Kevin Pagano talking about his experience playing (and winning) the MUS2019 DFIR CTF
  • Jessica Hyde and Jad Saliba talking about what's next for Magnet
You can watch the video here:

Wednesday, April 3, 2019

Daily Blog #655: Magnet User Summit DFIR CTF 2019 Results

Hello Reader,
             We had a great CTF today that will soon be released to the public. I'm happy to announce the top three winners.

#1 Kevin Pagano
#2 Jonathan Rajewski
#3 Santiago Ayala

Prizes were given away, and more prizes await those who will now compete in the online public offering that will be released Thursday. Until then you can see the current scoreboard here:


https://mus2019.ctfd.io/scoreboard

You can also register for an account in anticipation of Thursday; the CTF is currently paused until then.

Sunday, March 31, 2019

Daily Blog #654: Sunday Funday 3/31/19

Hello Reader,
           No April Fools this week: I didn't post an answer for last week's challenge because ... I didn't receive any qualifying answers. So let's try this again, shall we? A second week to show the world your expertise with the Google cloud.


The Prize:
$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 4/5/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com. Please state in your email if you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post


The Challenge:
Name and describe all of the available forensic data sources provided by Google Cloud Platform

Tuesday, March 26, 2019

Daily Blog #653: Forensic Lunch Test Kitchen 3/26/19

Hello Reader,
        Tonight I tried to do a live stream from my hotel in  Jeddah, KSA. Looking back at the recording I'm not sure how well it did but I was able to get some base testing done for a better test of some SRUM recording features tomorrow night UTC +3. In short I did the following to see how SRUM would record it:

  • The livestream to YouTube via XSplit
  • The YouTube access via Chrome
  • A Chrome incognito window to see if it gets tracked separately
  • Copied data to an external drive with windows explorer
  • Copied data to an internal drive with copy.exe from the command line
  • Deleted files within file explorer
  • Deleted files in the GUI
I'll let this computer run overnight and use the wired internet I have in the classroom to stream tomorrow.

You can watch the video here: https://youtu.be/0I1xgA3DhYo

Monday, March 25, 2019

Daily Blog #652: Seeking Sponsor for the Unofficial Defcon DFIR CTF 2019

Hello Reader,
        Do you or your company want to provide a prize for the Unofficial Defcon DFIR CTF now in its third year? If so email me at dcowen@g-cpartners.com so we can talk. In the past SANS, Magnet Forensics, Blackbag and Metaspike have all graciously provided prizes for our worthy contenders and we'd like to open this up to all of you.

The Defcon DFIR CTF usually gets 100+ players during the events and 100s more once the CTF is opened to the public. I'd like to expand the prize pool so we can award more cool things to more people in three groups.


  1. Top finishers at Defcon
  2. Top finishers online
  3. Noteworthy achievements (Like first perfect score)
So reach out if you are interested and I hope to hear from you soon.

Sunday, March 24, 2019

Daily Blog #651: Sunday Funday 3/24/19

Hello Reader,
   Let's finish this trifecta of the three major cloud compute vendors. I think that getting more of this knowledge out there will help the many people making random internet searches just trying to understand what's possible after someone else made the decision to move their assets to the cloud. We have a streak of new winners, and you, yes you, reading this now, could be next. I want you to be my next winner, so take the time to do some research and I look forward to hearing from you!



The Prize:
$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/29/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com. Please state in your email if you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post


The Challenge:
Name and describe all of the available forensic data sources provided by Google Cloud Platform

Saturday, March 23, 2019

Daily Blog #650: Solution Saturday 3/23/19

Hello Reader,
         This week's challenge was met with many challenges, but they were overcome by @darizotas aka Dario B. I think you'll see in his winning post that he did a pretty thorough job documenting what exists, with solid references for following up. I'm loving all of these new people in the community getting involved and showing what they have to contribute! So next week, let that be you!


The Challenge:
Name and describe all of the available forensic data sources provided by Azure Compute

The Winning Answer:
@darizotas 
https://darizotas.blogspot.com/2019/03/azure-and-office-365-logging.html

Thursday, March 21, 2019

Daily Blog #649: How to pick something to test

Hello Reader,
         One of the questions I get asked on a semi-regular basis is: how do I pick what to test/research? The answer is simpler than you would expect:

Selection pool:

  • I look at an interaction I just experienced while using the operating system
  • I think about an artifact I don't feel I fully understand
  • I am working on a case and have to find a way to recreate a behavior I found

After that, as you can see in the test kitchen videos, I spend hours testing/recreating/examining/understanding the behavior that I'm seeing. While it is possible that two different actions can result in the same behavior, typically those different actions create their own marks on the system, allowing you to determine which route the user went through.

I have my travel streaming system with me now and I'm currently in an airport in Dubai awaiting my flight to Jeddah. My hope is that the hotel in Jeddah will have enough bandwidth to let me stream some testing this week, wish me luck!

Tuesday, March 19, 2019

Daily Blog #648: How to stream your own test kitchen

Hello Reader,
       As I prepare to get the test kitchen back in service I thought I'd share what I use for others who are looking to do the same. I got this idea after this tweet from Gerald Davis


So here is my setup:
Hardware: I have a Windows 10 desktop with an Nvidia GTX 980, an i7 processor, and 32GB of RAM. It's nothing special and you don't need much in order to do this. The OS is running off of an SSD and the virtual machines are running off a 2TB 7200 RPM Western Digital drive.

Broadcasting software: I'm using XSplit Broadcaster. You could use something else like OBS or Wirecast, but when I was looking XSplit was the easiest to set up and use with all of its built-in plugins and stream support.

Hypervisor: I'm using VMWare Workstation

Mic: I'm using a HyperX Cloud 2 headset, nothing special just a headset with a mic.

I have a 4K Samsung monitor so I have the VM running in the upper corner of the monitor and have drawn the broadcast window over it. This allows me to keep most of the screen off camera as it were so I can monitor the stream, check chat and google things while also making the streamed desktop be a readable size to the viewer.

I got the Windows OS ISOs from MSDN, but you could also get eval images from Microsoft directly.

Have other questions? Let me know in the comments and I can explain more about what I do and how it works. 

Monday, March 18, 2019

Daily Blog #647: Windows Forensics in San Diego

Hello Reader,

               Looks like I'll be heading to sunny San Diego, California to teach SANS FOR500: Windows Forensics this May 9, 2019. The event is called Security West and it's one of the bigger SANS events of the year. If you want to learn Windows Forensics, see San Diego, and catch some great bonus sessions from some amazing SANS instructors, it's a great event.

Want to learn more? Click below
https://www.sans.org/event/security-west-2019/course/windows-forensic-analysis

Sunday, March 17, 2019

Daily Blog #646: Sunday Funday 3/17/19

Hello Reader,
              I always appreciate it when people spend their time researching rather than doing other fun things, like playing video games or reading a non-technical book. When we share what we know, even if we don't know everything about something, it helps someone else leapfrog forward and learn more. This week let's keep our heads in the clouds for another challenge.


The Prize:
$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/22/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com. Please state in your email if you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post


The Challenge:
Name and describe all of the available forensic data sources provided by Azure Compute

Daily Blog #645: Solution Saturday 3/16/19

Hello Reader,
         Spring break is ending, which means kids are going back to school soon and I'll be back on track with blogging. Here is this week's winner!

The Challenge:
Name and describe all of the available forensic data sources provided by Amazon AWS for EC2

The Winning Answer:
Jonathan Yan

CloudTrail Logs
CloudTrail is an audit log that is enabled by default and stores all actions on resources for an account for 90 days. For EC2 specifically, it can provide information on the user and the action they performed on a specific resource, such as EC2 KeyPairs, NetworkAcl, SecurityGroup, or Snapshot, to see if any suspicious changes were made.
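The CloudTrail records described above can be triaged with a short script. A minimal sketch: the field names (eventTime, eventName, userIdentity, sourceIPAddress) follow CloudTrail's documented record schema, but the record values here are invented sample data:

```python
import json

# A trimmed CloudTrail log, shaped like the real service's JSON output;
# the field names match CloudTrail's record schema, but the values are invented.
log = json.loads("""{
  "Records": [
    {"eventTime": "2019-03-14T03:12:09Z", "eventName": "StopInstances",
     "userIdentity": {"userName": "mallory"}, "sourceIPAddress": "203.0.113.50"},
    {"eventTime": "2019-03-14T03:15:40Z", "eventName": "DescribeInstances",
     "userIdentity": {"userName": "ops-bot"}, "sourceIPAddress": "198.51.100.2"}
  ]
}""")

# Pull out EC2 state-change events, the suspicious ones in this scenario.
INTERESTING = {"StopInstances", "StartInstances", "TerminateInstances"}
hits = [r for r in log["Records"] if r["eventName"] in INTERESTING]
for r in hits:
    print(r["eventTime"], r["userIdentity"]["userName"],
          r["eventName"], r["sourceIPAddress"])
```

Expanding the INTERESTING set to cover security group and key pair changes covers the other resource types mentioned above.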
EBS (Elastic Block Store) Snapshots
Elastic Block Store volumes are the hard drives that EC2 instances use to store data. Snapshots can be taken of the EBS volume of a compromised instance and mounted onto a trusted EC2 instance for forensic investigation. These snapshots can be taken regularly as part of backups or whilst responding to an incident. Note that the ownership of snapshots can be assigned to another AWS account to ensure they cannot be modified by anyone with permissions over a compromised account.

VPC Flow Logs
VPC Flow Logs are a record of IP traffic to and from network interfaces within a Virtual Private Cloud (VPC), which is the segregated network that EC2 instances reside in. They can provide a trail of all network traffic to and from each EC2 instance. However, this has to be enabled per VPC and then sent to AWS CloudWatch or stored in an AWS S3 bucket, where it can then be analysed.
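A flow log record in the default version 2 format is a simple space-separated line, which makes scripted triage straightforward. A minimal sketch using AWS's documented default field order (the sample record itself is made up):

```python
# Parse a VPC Flow Log line in the default (version 2) space-separated format.
# The field order below follows AWS's documented default format.
FIELDS = ("version account_id interface_id srcaddr dstaddr srcport dstport "
          "protocol packets bytes start end action log_status").split()

def parse_flow_record(line):
    """Map one space-separated flow log line onto named fields."""
    return dict(zip(FIELDS, line.split()))

rec = parse_flow_record(
    "2 123456789012 eni-0a1b2c3d 203.0.113.12 10.0.0.5 "
    "44332 22 6 10 840 1552500000 1552500060 ACCEPT OK"
)
print(rec["srcaddr"], "->", rec["dstaddr"], "port", rec["dstport"], rec["action"])
```

Filtering on `dstport` and `action` across many lines quickly surfaces things like accepted inbound SSH from unexpected sources.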
AWS Systems Manager
AWS Systems Manager is a utility that can be enabled as an AWS agent on an EC2 instance to record all the installed software, network configurations, CPU data, Windows patch versions, and specific Windows registry keys and files. It could be useful for a first glance while in the console, but it has to be enabled and configured correctly before an incident occurs to provide value. Additionally, the information shown here can be found during forensic investigation of the EBS volume.
AWS Inspector
AWS Inspector is a vulnerability scanning platform that can identify vulnerabilities in applications running on EC2 instances. If enabled and configured, it could be useful during forensic investigations to narrow down which vulnerabilities may have been exploited on a host.
Cheers,

Tuesday, March 12, 2019

Daily Blog #644: Creating decrypted images of APFS file systems encrypted with T2 chips with MacQuisition

Hello Reader,
          Dealing with T2 chips on recent model MacBooks has been a real pain point for us in the lab, so I was very, very happy to read that BlackBag (thanks Joe and Vico!) have figured out how to transparently decrypt the physical blocks of a drive being managed by a T2 chip at imaging time. Now the important thing to understand is that this decryption is being done at image time, meaning MacQuisition is not extracting the keys for later use. Instead, BlackBag has found a way to get the T2 chip to return decrypted blocks rather than just files.

This is a big step forward, as all of the other solutions I'm aware of (including the previous version of MacQuisition) were stuck just doing file system images (logical images) of APFS drives with T2 chips. Now with this feature you can get all the data, including APFS snapshots and possibly deleted data as well.

You can read more here:
https://www.blackbagtech.com/blog/2019/03/11/macquisition-will-decrypt-physical-images-macs-t2-chip/

Monday, March 11, 2019

Daily Blog #643: Sunday Funday 3/10/19

Hello Reader,
        On this blog we focus on a lot of host related issues, but the world is no longer confined to single on premises hosts anymore. This week let's set our challenge sights to the skies and start seeing what you can research about ... the cloud.



The Prize:
$100 Amazon Giftcard

The Rules:

  1. You must post your answer before Friday 3/15/19 7PM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com. Please state in your email if you would like to be anonymous or not if you win.
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post


The Challenge:
Name and describe all of the available forensic data sources provided by Amazon AWS for EC2

Saturday, March 9, 2019

Daily Blog #642: Solution Saturday 3/9/19

Hello Reader,
         I love weeks when we get to crown new winners. Tun is not new to DFIR, you may have seen his tweets before, but he is new to the Sunday Funday winners circle. Tun did some great testing, which he documented below specifically for OSX. Give his work a look and join me in congratulating Tun on his win!


The Challenge:
On OSX Mojave, what information can you determine from the KnowledgeC database?

The Winning Answer:

Sarah Edwards has done quite extensive research on the knowledgeC database and wrote a few posts on her blog mac4n6.com. I have never used a Mac before and this was a good opportunity to learn Mac forensics. So, I decided to take the challenge and installed macOS Mojave in VirtualBox. As a beginner I simply followed the same testing that Sarah did on macOS 10.13 to see the output of macOS 10.14 Mojave.


On macOS 10.14 Mojave, the knowledgeC.db database is found in the same locations as the previous version, macOS 10.13.
  • /private/var/db/CoreDuet/Knowledge (system context database)
  • ~/Library/Application Support/Knowledge (user context database)
adams-iMac:~ adam$ sw_vers
ProductName: Mac OS X
ProductVersion: 10.14
BuildVersion: 18A391

adams-iMac:Knowledge adam$ pwd
/private/var/db/CoreDuet/Knowledge
adams-iMac:Knowledge adam$ ls -l
total 6120
-rw------- 1 root wheel  774144 Mar 8 04:25 knowledgeC.db
-rw------- 1 root wheel   32768 Mar 8 04:25 knowledgeC.db-shm
-rw------- 1 root wheel 2323712 Mar 8 05:23 knowledgeC.db-wal

adams-iMac:knowledge adam$ pwd
/Users/adam/Library/Application Support/knowledge
adams-iMac:knowledge adam$ ls -l
total 1008
-rw------- 1 adam staff 425984 Mar 8 04:28 knowledgeC.db
-rw------- 1 adam staff  32768 Mar 8 04:28 knowledgeC.db-shm
-rw------- 1 adam staff  53592 Mar 8 04:38 knowledgeC.db-wal


From this database, we can get information about:

  • Application Usage
  • Application Activities
  • Safari Browser History

Using the DB Browser for SQLite, we can open the knowledgeC.db database and look at the structure. The database has many tables with many columns.
Note: timestamps in this database use the Mac Epoch time (01/01/2001 00:00:00 UTC).
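That Mac epoch offset is why the queries that follow add 978307200 to each timestamp before converting with SQLite's UNIXEPOCH modifier. A small helper showing the same conversion in Python:

```python
from datetime import datetime, timedelta, timezone

# The Mac (Cocoa) epoch is 2001-01-01 00:00:00 UTC, which is 978307200
# seconds after the Unix epoch -- the same constant the queries add.
MAC_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def mac_to_datetime(seconds):
    """Convert a Mac-epoch timestamp (seconds since 2001-01-01 UTC) to a datetime."""
    return MAC_EPOCH + timedelta(seconds=seconds)

print(mac_to_datetime(0))  # 2001-01-01 00:00:00+00:00
```

Doing the conversion in the query (as below) or in a script after export gives the same result; the offset just has to be applied exactly once.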


Figure 1: Database structure of the knowledgeC.db; both the system and user context databases have 16 tables

Three tables are of particular interest, as they store information that is useful and valuable to an investigator.
  • ZOBJECT – contains usage entries (Sarah mentioned four weeks of retention, but I could not test this as my Mac has only been running for a week or two)
  • ZSOURCE – source of ZOBJECT entries
  • ZSTRUCTUREDMETADATA – additional metadata associated with ZOBJECT entries

By running the SQLite query below on the system context knowledgeC.db database, we get the following stream types:

SELECT DISTINCT ZOBJECT.ZSTREAMNAME
FROM ZOBJECT
ORDER BY ZSTREAMNAME

  • /activity/level
  • /app/activity
  • /app/inFocus
  • /app/usage
  • /display/isBacklit
  • /safari/history

For application usage information, we can run the SQLite query below, where the “/app/inFocus” stream type shows which application was in use at a given time.
SELECT
datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
CASE ZOBJECT.ZSTARTDAYOFWEEK
WHEN "1" THEN "SUNDAY" WHEN "2" THEN "MONDAY" WHEN "3" THEN "TUESDAY" WHEN "4" THEN "WEDNESDAY" WHEN "5" THEN "THURSDAY" WHEN "6" THEN "FRIDAY" WHEN "7" THEN "SATURDAY"
END AS "DAY OF WEEK",
ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
(ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
ZOBJECT.ZSTREAMNAME,
ZOBJECT.ZVALUESTRING
FROM ZOBJECT
WHERE ZSTREAMNAME IS "/app/inFocus"
ORDER BY "START"
-- Result: 697 rows returned in 145ms




Figure 2: Snapshot of the result of SQLite query which show the application usage


Note: the application usage is only for GUI-based applications. User attribution is an issue, as this data comes from the system context database on macOS.
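To show the query shape programmatically, here is a minimal sketch that builds a toy ZOBJECT table in memory (using only the columns the query touches; a real knowledgeC.db has far more) and applies the same /app/inFocus filter. The inserted rows are invented sample data:

```python
import sqlite3

# Build a toy ZOBJECT table with just the columns the query touches;
# a real knowledgeC.db has many more columns and tables.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ZOBJECT (
    ZSTREAMNAME TEXT, ZVALUESTRING TEXT,
    ZSTARTDATE REAL, ZENDDATE REAL)""")
con.execute("INSERT INTO ZOBJECT VALUES "
            "('/app/inFocus', 'com.apple.Safari', 573628210, 573628300)")
con.execute("INSERT INTO ZOBJECT VALUES "
            "('/safari/history', 'https://example.com', 573628215, 573628215)")

# Same pattern as the query above: Mac epoch offset plus UNIXEPOCH conversion,
# filtered to the /app/inFocus stream.
rows = con.execute("""
    SELECT datetime(ZSTARTDATE + 978307200, 'UNIXEPOCH') AS start,
           ZENDDATE - ZSTARTDATE AS usage_seconds,
           ZVALUESTRING
    FROM ZOBJECT
    WHERE ZSTREAMNAME = '/app/inFocus'
    ORDER BY start""").fetchall()

for start, usage, app in rows:
    print(start, app, f"{usage:.0f}s")
```

Running the full query from the post against a copied-out knowledgeC.db works the same way, just with `sqlite3.connect()` pointed at the file.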

For application activities, we can use the “/app/activity” stream type to add more context.
SELECT
datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
CASE ZOBJECT.ZSTARTDAYOFWEEK
WHEN "1" THEN "SUNDAY" WHEN "2" THEN "MONDAY" WHEN "3" THEN "TUESDAY" WHEN "4" THEN "WEDNESDAY" WHEN "5" THEN "THURSDAY" WHEN "6" THEN "FRIDAY" WHEN "7" THEN "SATURDAY"
END AS "DAY OF WEEK",
datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
(ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
ZOBJECT.ZSTREAMNAME,
ZOBJECT.ZVALUESTRING,
ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__ACTIVITYTYPE AS "ACTIVITY TYPE",
ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__TITLE AS "TITLE",
ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__USERACTIVITYREQUIREDSTRING AS "ACTIVITY STRING",
datetime(ZSTRUCTUREDMETADATA.Z_DKAPPLICATIONACTIVITYMETADATAKEY__EXPIRATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "EXPIRATION DATE"
FROM ZOBJECT
LEFT JOIN ZSTRUCTUREDMETADATA ON ZOBJECT.ZSTRUCTUREDMETADATA = ZSTRUCTUREDMETADATA.Z_PK
WHERE ZSTREAMNAME IS "/app/activity" OR ZSTREAMNAME IS "/app/inFocus"
ORDER BY "START"
-- Result: 754 rows returned in 207ms



Figure 3: Snapshot of the result of the SQLite query which show more context to the application activity


Even though I did edit the note, the activity type data is not populated; further testing is required here.
Note: the ZSTRUCTUREDMETADATA table has more than 100 columns, which are worth looking into to see what data the apps are populating in this table.

For Safari history information, we can run a query similar to the previous ones.
SELECT
datetime(ZOBJECT.ZCREATIONDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "ENTRY CREATION",
CASE ZOBJECT.ZSTARTDAYOFWEEK
WHEN "1" THEN "SUNDAY" WHEN "2" THEN "MONDAY" WHEN "3" THEN "TUESDAY" WHEN "4" THEN "WEDNESDAY" WHEN "5" THEN "THURSDAY" WHEN "6" THEN "FRIDAY" WHEN "7" THEN "SATURDAY"
END AS "DAY OF WEEK",
ZOBJECT.ZSECONDSFROMGMT/3600 AS "GMT OFFSET",
datetime(ZOBJECT.ZSTARTDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "START",
datetime(ZOBJECT.ZENDDATE+978307200,'UNIXEPOCH','LOCALTIME') AS "END",
(ZOBJECT.ZENDDATE-ZOBJECT.ZSTARTDATE) AS "USAGE IN SECONDS",
ZOBJECT.ZSTREAMNAME,
ZOBJECT.ZVALUESTRING
FROM ZOBJECT
WHERE ZSTREAMNAME IS "/safari/history"
ORDER BY "START"
-- Result: 57 rows returned in 28ms


Figure 4: Snapshot of the results of the SQLite query which shows the Safari history

Note: if the user clears the browsing history, these entries get removed too. Private Mode browsing does not show up either.


For user correlation, we can run the SQLite query below on the user context knowledgeC.db database.

SELECT DISTINCT ZOBJECT.ZSTREAMNAME
FROM ZOBJECT
ORDER BY ZSTREAMNAME

  • /event/tombstone
  • /portrait/entity
  • /portrait/topic

There is a lot more testing to be done, as this database contains a lot of information.

- Knowledge is power -

Credits: