Lesson 6 of 6 · 14 min read · Includes quiz

Building a Forensic Timeline

Super timelines, correlating artifacts across sources, Plaso/log2timeline concepts, presenting findings

What You'll Learn

  • Explain what a super timeline is and why combining all artifact timestamps into a single chronological view transforms an investigation
  • Describe how log2timeline/Plaso works: parsing hundreds of artifact types into a unified Plaso storage file
  • Construct a manual timeline by correlating events from disk artifacts, memory forensics, and log files
  • Apply timeline analysis techniques to identify the initial compromise, lateral movement, privilege escalation, and data exfiltration
  • Present forensic findings as a minute-by-minute reconstruction that tells a coherent narrative
  • Use timeline tools including Plaso, Timeline Explorer, and structured spreadsheets for different investigation scales
  • Correlate forensic timeline entries with SIEM data by merging Wazuh alerts with disk and memory artifacts
  • Identify and avoid common timeline pitfalls: time zone confusion, timestamp granularity differences, and evidence gaps
  • Structure a forensic investigation report that conveys findings to technical and non-technical audiences

From Artifacts to Answers

In Lessons 9.1 through 9.5, you learned to extract forensic artifacts from individual sources — disk images, Windows registries, Linux logs, and memory dumps. Each source tells part of the story. But an investigation that analyzes each source in isolation is like reading random pages from a novel — you get facts without narrative.

The forensic timeline is where every artifact comes together into a single chronological sequence. It transforms isolated data points into a story: the attacker gained access at 02:14 via SSH, escalated privileges at 02:18 with sudo, installed tools at 02:20, established a reverse shell at 02:22, began data exfiltration at 02:35, and cleared logs at 02:47.

| Investigation Approach | What You Get | What You Miss |
|---|---|---|
| Examine each artifact source individually | Facts about specific evidence types | How events relate to each other across sources |
| Build a timeline from a single source | Chronological view of one evidence type | Events only visible in other sources |
| Build a super timeline from all sources | Complete minute-by-minute narrative | Minimal gaps — every source fills in others' blind spots |

The super timeline is the forensic equivalent of a SIEM correlation. In Modules 2–5, you learned how SIEMs correlate alerts from multiple sources to detect attacks. A forensic timeline does the same thing after the fact — correlating timestamps from filesystem metadata, event logs, registry changes, memory artifacts, and network logs to reconstruct what happened and when.

What Is a Super Timeline?

A super timeline is a single chronological listing of every timestamped event from every artifact source in an investigation. A typical super timeline from a single Windows workstation can contain millions of entries — every file creation, every registry modification, every event log entry, every browser history visit, every prefetch execution, all sorted by time.

The concept was pioneered by Kristinn Gudjonsson, who created the original log2timeline tool. The modern implementation is Plaso (Plaso Langar Að Safna Öllu — Icelandic for "Plaso wants to collect everything"), which parses over 300 artifact types.

What Goes Into a Super Timeline

| Source Category | Artifact Types | Timestamp Types |
|---|---|---|
| Filesystem | NTFS MFT, ext4 inodes, FAT directory entries | Created, Modified, Accessed, Changed (MACB) |
| Windows Registry | All hive keys and values | Last written timestamp per key |
| Event Logs | Security, System, Application, PowerShell, Sysmon | Event generation time |
| Browser History | Chrome, Firefox, Edge, IE history + downloads + cache | Visit time, download time, cache entry time |
| Prefetch | Application execution records | Last 8 execution times (Win8+) |
| USB History | USBSTOR registry keys, setupapi.log | First/last connection times |
| Network | Wazuh alerts, Suricata logs, firewall logs | Alert/connection timestamps |
| Memory | Process creation times, network connection states | Process start time from EPROCESS structure |
| Linux Logs | auth.log, syslog, wtmp, bash_history (if timestamped) | Log entry timestamps |

log2timeline/Plaso: Automated Timeline Generation

For large-scale investigations, manually correlating thousands of artifacts is impractical. Plaso automates the process.

The super timeline concept — multiple evidence sources parsed into a single chronological view that reveals the complete attack narrative

Step 1: Parse All Artifacts

log2timeline.py --storage-file case001.plaso /evidence/disk_image.E01

This single command parses the entire disk image using all applicable parsers. On a 100 GB image, this can take hours but will extract millions of timestamped events from every supported artifact type.

Step 2: Filter the Timeline

A raw super timeline with 10 million entries is unusable without filtering. Plaso's psort.py tool filters and outputs the timeline:

psort.py -o l2tcsv -w timeline.csv case001.plaso "date > '2026-03-15 00:00:00' AND date < '2026-03-16 00:00:00'"

Common filters:

| Filter | Purpose |
|---|---|
| Time range | Narrow to the incident window (e.g., 24 hours around the alert) |
| Source type | Focus on specific artifacts (e.g., only event logs or only filesystem) |
| Keyword | Search for specific filenames, usernames, or IP addresses |
| Path | Limit to specific directories (e.g., user profile, /tmp) |

Step 3: Analyze in a Viewer

The CSV output can be opened in:

  • Timeline Explorer (Eric Zimmerman) — purpose-built for super timeline analysis on Windows, handles millions of rows efficiently
  • Excel/LibreOffice — works for filtered timelines under ~1 million rows
  • Elastic/OpenSearch — import the timeline for SIEM-style searching and visualization
  • grep/awk — command-line filtering for specific patterns
💡 Start with a narrow time window and expand. Do not try to analyze 10 million entries at once. Start with the time window of your first alert (often from Wazuh or Suricata), find the initial compromise event, then expand backwards (how did they get in?) and forwards (what did they do after?) in incremental steps.
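The narrow-window approach can also be applied to Plaso's CSV export with a few lines of scripting. A minimal sketch, assuming l2tcsv-style `date`, `time`, and `desc` columns (real l2tcsv output carries more columns; adjust the names to match your export):

```python
import csv
import io
from datetime import datetime

def filter_timeline(csv_text, start, end, keyword=None):
    """Keep rows inside [start, end) whose description matches an optional keyword."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # l2tcsv keeps date and time in separate columns (format assumed here)
        ts = datetime.strptime(f"{row['date']} {row['time']}", "%m/%d/%Y %H:%M:%S")
        if not (start <= ts < end):
            continue
        if keyword and keyword.lower() not in row["desc"].lower():
            continue
        rows.append(row)
    return rows
```

Start with the incident day, then widen `start`/`end` in steps as you trace the attack backwards and forwards.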

Manual Timeline Construction

Not every investigation requires Plaso. For smaller incidents or when you need to combine artifacts from different systems, a manual timeline built in a structured spreadsheet is often more effective and faster.

The Timeline Template

| Timestamp (UTC) | Source | System | Event Description | Evidence Reference | ATT&CK Technique |
|---|---|---|---|---|---|
| 2026-03-15 02:14:33 | auth.log | web-server-01 | SSH login: user 'admin' from 10.0.1.50 (publickey) | /var/log/auth.log line 4521 | T1078 — Valid Accounts |
| 2026-03-15 02:17:45 | auth.log | web-server-01 | Failed SSH: 'test' from 203.0.113.42 (1 of 847 attempts) | /var/log/btmp | T1110 — Brute Force |
| 2026-03-15 02:18:12 | auth.log | web-server-01 | sudo: admin → root, COMMAND=/usr/bin/apt install nmap | /var/log/auth.log line 4589 | T1548.003 — Sudo and Sudo Caching |
| 2026-03-15 02:19:00 | syslog | web-server-01 | systemd started /tmp/.hidden (ELF binary) | /var/log/syslog | T1059.004 — Unix Shell |
| 2026-03-15 02:22:15 | memory | WIN-DC-01 | powershell.exe (PID 7488) ESTABLISHED → 185.141.63.12:443 | Volatility netscan | T1071 — Application Layer Protocol |
| 2026-03-15 02:22:15 | memory | WIN-DC-01 | powershell.exe -enc SQBFAFgA... (Base64 download cradle) | Volatility cmdline | T1059.001 — PowerShell |
| 2026-03-15 02:35:00 | Wazuh | web-server-01 | Alert: Large outbound data transfer to 185.141.63.12 | Wazuh alert ID 15432 | T1041 — Exfiltration Over C2 |
| 2026-03-15 02:47:00 | filesystem | web-server-01 | /var/log/auth.log mtime changed (possible truncation) | stat output, inode analysis | T1070.002 — Clear Linux Logs |
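The same template translates directly into plain records in a script, which makes sorting and merging entries from several systems trivial. A minimal sketch with hypothetical entries (the field names are illustrative, not a standard schema):

```python
from datetime import datetime, timezone

# Hypothetical entries transcribed from two evidence sources (not a real case)
entries = [
    {"ts": "2026-03-15 02:18:12", "source": "auth.log", "system": "web-server-01",
     "event": "sudo: admin -> root"},
    {"ts": "2026-03-15 02:14:33", "source": "auth.log", "system": "web-server-01",
     "event": "SSH login: admin from 10.0.1.50"},
    {"ts": "2026-03-15 02:22:15", "source": "memory", "system": "WIN-DC-01",
     "event": "powershell.exe ESTABLISHED -> 185.141.63.12:443"},
]

def build_timeline(entries):
    """Parse UTC timestamps and return entries in chronological order."""
    for e in entries:
        e["dt"] = datetime.strptime(e["ts"], "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return sorted(entries, key=lambda e: e["dt"])
```

Keeping each entry's source and system as explicit fields preserves the evidence trail when rows from different machines interleave.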

Building the Manual Timeline

  1. Start with your first indicator — usually a Wazuh alert, Suricata alert, or user report
  2. Work backwards — find the initial access event (SSH login, web exploit, phishing)
  3. Work forwards — trace every action from initial access to the present
  4. Cross-reference sources — if auth.log shows a login at 02:14, check wtmp for the same session, check bash_history for commands during that session, check filesystem timestamps for files created during that window
  5. Fill gaps — if there is a 15-minute gap in your timeline, identify which artifact source might fill it (memory? network logs? SIEM alerts?)

Timeline Analysis Techniques

Identifying the Initial Compromise

The initial compromise is the single most important event to find. Everything before it is reconnaissance (interesting but less urgent). Everything after it is the attack itself.

Common indicators of initial compromise in a timeline:

| Indicator | What to Look For |
|---|---|
| First external authentication | SSH/RDP login from an unusual IP, especially outside business hours |
| Web exploit artifacts | Suspicious web server log entries followed immediately by new process execution |
| Phishing payload execution | Email attachment opened followed by child process spawning |
| Vulnerability exploitation | Crash/error log entry followed by unexpected privileged access |

Tracking Lateral Movement

Once you find the initial compromise, look for the attacker moving to other systems:

02:22:15  WIN-DC-01   powershell.exe connects to 10.0.2.30:445 (SMB)
02:22:18  FILE-SRV-01 New login: admin from 10.0.1.50 (NTLM)
02:22:25  FILE-SRV-01 PsExec service installed: PSEXESVC
02:22:30  FILE-SRV-01 cmd.exe spawned by PSEXESVC

The timeline makes lateral movement visible: the attacker pivoted from WIN-DC-01 to FILE-SRV-01 using PsExec over SMB within 15 seconds.

Identifying Data Exfiltration

Timeline analysis workflow — from initial alert through evidence correlation to narrative reconstruction and report generation

Look for patterns that indicate data collection and transfer:

  • Staging: File compression utilities (7z, zip, rar) creating archives in unusual locations
  • Collection: Access timestamps on sensitive file shares or database exports
  • Transfer: Large outbound connections to external IPs, especially using encrypted protocols (HTTPS, DNS tunneling)
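The staging indicator can be turned into a crude keyword scan over timeline descriptions. A heuristic sketch with illustrative tool and directory lists — expect false positives and tune both lists for your environment:

```python
# Illustrative lists, not exhaustive — tune for the environment under investigation
ARCHIVE_TOOLS = ("7z", "zip", "rar", "tar")
STAGING_DIRS = ("/tmp", "/dev/shm", "C:\\Windows\\Temp")

def flag_staging(events):
    """Flag (timestamp, description) events where an archiver touches a staging directory."""
    flagged = []
    for ts, desc in events:
        d = desc.lower()
        if any(tool in d for tool in ARCHIVE_TOOLS) and any(p.lower() in d for p in STAGING_DIRS):
            flagged.append((ts, desc))
    return flagged
```

Hits mark where to look next: what went into the archive, and what outbound connections followed.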

Correlating with SIEM Data

Your forensic timeline becomes far more powerful when merged with SIEM alerts from Wazuh, Suricata, and other monitoring tools.

Why SIEM Correlation Matters

| Forensic Artifact Alone | Forensic + SIEM Combined |
|---|---|
| "A file was created at 02:19" | "A file was created at 02:19, and Wazuh alert #15221 fired at 02:19 for 'New executable in /tmp'" |
| "PowerShell connected to 185.141.63.12" | "PowerShell connected to 185.141.63.12, and Suricata alert ET TROJAN Cobalt Strike C2 fired at 02:22" |
| "SSH login from 10.0.1.50 at 02:14" | "SSH login from 10.0.1.50 at 02:14, matching Wazuh alert for 'Login outside business hours'" |

How to Merge SIEM Data

curl -sk "https://wazuh-indexer:9200/wazuh-alerts-*/_search" \
  -H "Content-Type: application/json" \
  -d '{
    "query": {
      "range": {
        "timestamp": {
          "gte": "2026-03-15T02:00:00Z",
          "lte": "2026-03-15T03:00:00Z"
        }
      }
    },
    "size": 1000,
    "sort": [{"timestamp": "asc"}]
  }'

Export the results to CSV and merge with your forensic timeline. The combined view shows both what the attacker did (forensic artifacts) and what your monitoring detected (SIEM alerts) — revealing both the attack and your detection gaps.
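The merge itself is a few lines once both sides are in memory. A sketch assuming the standard Elasticsearch response shape (`hits.hits[]._source`) with an ISO-8601 `timestamp` field and a Wazuh `rule.description`; field names vary by index, so verify them against your own query results:

```python
import json
from datetime import datetime

def merge_wazuh(forensic_rows, es_response_text):
    """Fold Wazuh alert hits into a forensic timeline, sorted by UTC time.

    forensic_rows: dicts with a timezone-aware 'dt' datetime.
    es_response_text: raw JSON body returned by the wazuh-indexer query.
    """
    merged = list(forensic_rows)
    for hit in json.loads(es_response_text)["hits"]["hits"]:
        src = hit["_source"]
        merged.append({
            # Normalize the trailing 'Z' so fromisoformat accepts it on older Pythons
            "dt": datetime.fromisoformat(src["timestamp"].replace("Z", "+00:00")),
            "source": "wazuh",
            "event": src.get("rule", {}).get("description", "alert"),
        })
    return sorted(merged, key=lambda e: e["dt"])
```

The sorted output interleaves what the attacker did with what your monitoring saw, making detection gaps visible at a glance.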

Common Timeline Pitfalls

🚨 Time zone errors are the most common mistake in forensic timelines. A one-hour time zone offset can completely break your event correlation. An attacker appears to exfiltrate data before they even logged in — not because of time travel, but because you mixed UTC and local time entries. Always normalize to UTC.

Pitfall 1: Time Zone Confusion

| Source | Default Time Zone |
|---|---|
| NTFS timestamps | UTC |
| Windows Event Logs | UTC (displayed in local time by Event Viewer) |
| Linux auth.log / syslog | Local system time (check /etc/timezone) |
| Wazuh alerts | UTC |
| Suricata eve.json | UTC |
| Browser history | Varies by browser and storage format |
| Memory process times | UTC (kernel structures) |

Rule: Normalize everything to UTC before adding to your timeline.
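Normalization is straightforward with Python's `zoneinfo`. A minimal sketch: interpret a naive log timestamp in the source system's zone, then convert to UTC (determining each system's configured zone is part of evidence acquisition):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(local_ts, tz_name, fmt="%Y-%m-%d %H:%M:%S"):
    """Interpret a naive log timestamp in the given IANA zone and convert to UTC."""
    local = datetime.strptime(local_ts, fmt).replace(tzinfo=ZoneInfo(tz_name))
    return local.astimezone(ZoneInfo("UTC"))
```

For example, a syslog entry at 03:14:33 on a server configured for Europe/Berlin (UTC+1 in mid-March) normalizes to 02:14:33 UTC.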

Pitfall 2: Timestamp Granularity

Not all timestamps have the same precision:

| Source | Granularity |
|---|---|
| NTFS (MFT) | 100-nanosecond intervals |
| ext4 | 1-nanosecond intervals (kernel 4.x+) |
| Windows Event Logs | ~100ms precision |
| auth.log | 1-second precision |
| Wazuh alerts | 1-second precision |
| Volatility process times | Varies by OS version |

When two events appear to have the same timestamp, their actual order may be ambiguous. Document this uncertainty rather than guessing.

Pitfall 3: Evidence Gaps

Gaps in your timeline do not mean nothing happened. They mean you lack evidence for that period. Common causes:

  • Log rotation — older entries were compressed or deleted before acquisition
  • Anti-forensics — attacker cleared specific logs (history -c, log truncation)
  • Insufficient logging — process auditing was not enabled, command-line logging was not configured
  • Volatile evidence lost — the system was rebooted before memory was captured
  • Scope limitation — you imaged one system but the attacker used three

Always document gaps explicitly in your report: "No evidence available for WIN-DC-01 between 02:30 and 02:45 due to Event Log clearing."
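Gaps are easy to surface programmatically once your timeline is sorted. A minimal sketch over bare datetime objects; the 10-minute threshold is an arbitrary example, not a standard:

```python
from datetime import timedelta

def find_gaps(timestamps, threshold=timedelta(minutes=10)):
    """Return (gap_start, gap_end) pairs between sorted timestamps exceeding threshold."""
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > threshold:
            gaps.append((prev, cur))
    return gaps
```

Each reported gap is a prompt to ask which additional source (memory, network logs, another system) might cover that window, and to document the gap if none does.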

Pitfall 4: Timestamp Manipulation

Sophisticated attackers may manipulate timestamps using tools like timestomp (changing NTFS MACB times) or touch (changing Linux file times). Indicators of timestamp manipulation:

  • File Modified time is older than Created time (note: file copies also produce this legitimately, since copying resets Created but preserves Modified — corroborate before concluding manipulation)
  • $STANDARD_INFORMATION timestamps differ from $FILE_NAME timestamps in the MFT
  • Files with timestamps that predate the operating system installation
  • Clusters of files with identical timestamps down to the sub-second
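These indicators can be checked per file record. A heuristic sketch with illustrative field names (`si_created`/`fn_created` standing in for the $STANDARD_INFORMATION and $FILE_NAME creation times); no single flag is proof on its own, since copied files and coarse-granularity filesystems trigger them legitimately:

```python
def timestomp_indicators(record):
    """Return names of suspicious timestamp conditions for one file record.

    record: dict with datetime values 'created' and 'modified', plus optional
    'si_created'/'fn_created' MFT attribute times. Keys are illustrative.
    """
    hits = []
    if record["modified"] < record["created"]:
        hits.append("modified-before-created")
    si, fn = record.get("si_created"), record.get("fn_created")
    if si and fn and si != fn:
        hits.append("si-fn-mismatch")
    if record["created"].microsecond == 0 and record["modified"].microsecond == 0:
        # Many timestomping utilities only set second-granularity values
        hits.append("zeroed-subseconds")
    return hits
```

Run this across the files in your incident window and triage anything with two or more flags first.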

Report Writing for Forensic Investigations

The timeline is your evidence. The report is how you communicate that evidence to stakeholders who were not in the investigation.

Report Structure

| Section | Content | Audience |
|---|---|---|
| Executive Summary | 1-paragraph overview: what happened, impact, response status | C-suite, legal, management |
| Incident Overview | Scope, affected systems, detection source, investigation timeline | IT leadership, security management |
| Findings | Minute-by-minute reconstruction from your forensic timeline | Technical responders, other analysts |
| IOCs | IP addresses, domains, file hashes, mutexes, YARA rules | SOC team, threat intel, peer organizations |
| Root Cause | How the attacker gained initial access and what controls failed | Security architecture, risk management |
| Recommendations | Specific remediation steps and preventive measures | IT operations, security engineering |
| Evidence Appendix | Tool outputs, raw timeline excerpts, screenshots | Legal, compliance, peer review |

Write for your audience, not for yourself. The executive summary should be understandable by someone with no technical background. The findings section should be detailed enough that another forensic analyst could independently verify your conclusions. The recommendations should be actionable by the IT team. Each section serves a different reader.

The Findings Narrative

Transform your timeline into a narrative. Instead of:

02:14:33 - SSH login admin from 10.0.1.50
02:18:12 - sudo apt install nmap
02:19:00 - /tmp/.hidden executed

Write:

At 02:14 UTC on March 15, the attacker authenticated to web-server-01 via SSH using compromised credentials for the 'admin' account, originating from internal IP 10.0.1.50 (the previously compromised workstation). Within four minutes, the attacker escalated to root privileges via sudo and installed reconnaissance tools (nmap, netcat, Impacket). At 02:19, a previously staged ELF binary at /tmp/.hidden was executed, establishing a reverse shell to the attacker's C2 infrastructure.

The narrative adds context, causation, and interpretation — turning data into understanding.

Key Takeaways

  • A super timeline combines all timestamped artifacts from all sources into a single chronological view — transforming isolated facts into a coherent narrative
  • Plaso/log2timeline automates super timeline generation from disk images, parsing 300+ artifact types into a unified storage file
  • Manual timelines (spreadsheet-based) are effective for focused investigations and when combining artifacts from multiple systems
  • The analysis workflow: start at the first alert → work backwards to initial compromise → work forwards through the full attack chain → fill gaps
  • SIEM correlation dramatically enhances forensic timelines by adding monitoring context (what was detected vs. what was missed)
  • Normalize all timestamps to UTC before building your timeline — time zone errors are the most common and most damaging forensic mistake
  • Acknowledge evidence gaps explicitly rather than guessing — gaps caused by log rotation, anti-forensics, or insufficient logging must be documented
  • Watch for timestamp manipulation indicators: impossible chronological orders, mismatched MFT timestamps, or suspiciously identical timestamps
  • Forensic reports serve multiple audiences — executive summary for leadership, detailed findings for responders, IOCs for the SOC, and recommendations for remediation teams
  • Transform timeline data into narrative form that explains the how and why, not just the what and when

What's Next

You have completed the Digital Forensics module — from evidence acquisition and chain of custody through Windows artifacts, Linux investigation, memory forensics, and timeline construction. You now have the skills to conduct a full forensic investigation from detection to report. Module 10 takes you into YARA — Malware Detection, where you will write pattern-matching rules to identify malware by its binary characteristics, connect YARA to Velociraptor for enterprise-wide scanning, and build a detection library that bridges forensic findings with proactive threat hunting.

Knowledge Check: Building a Forensic Timeline

10 questions · 70% to pass

  1. What is a 'super timeline' in digital forensics?
  2. What does Plaso parse from a disk image to generate a super timeline?
  3. When starting timeline analysis, what is the recommended approach for handling a raw super timeline with millions of entries?
  4. What is the single most common and damaging mistake in forensic timeline construction?
  5. In Lab 9.6, you are building a timeline and notice a 15-minute gap with no events. What should you do?
  6. How does correlating forensic timeline entries with Wazuh SIEM alerts improve an investigation?
  7. During Lab 9.6 timeline analysis, you find a file where the Modified timestamp is OLDER than the Created timestamp on an NTFS filesystem. What does this indicate?
  8. Which section of a forensic investigation report is intended for C-suite executives and non-technical stakeholders?
  9. When building a manual timeline, what is the correct sequence for analysis after finding your first indicator?
  10. Why is it important to transform raw timeline data into narrative form in the Findings section of a forensic report?
