What You'll Learn
- Explain what a super timeline is and why combining all artifact timestamps into a single chronological view transforms an investigation
- Describe how log2timeline/Plaso works: parsing hundreds of artifact types into a unified Plaso storage file
- Construct a manual timeline by correlating events from disk artifacts, memory forensics, and log files
- Apply timeline analysis techniques to identify the initial compromise, lateral movement, privilege escalation, and data exfiltration
- Present forensic findings as a minute-by-minute reconstruction that tells a coherent narrative
- Use timeline tools including Plaso, Timeline Explorer, and structured spreadsheets for different investigation scales
- Correlate forensic timeline entries with SIEM data by merging Wazuh alerts with disk and memory artifacts
- Identify and avoid common timeline pitfalls: time zone confusion, timestamp granularity differences, and evidence gaps
- Structure a forensic investigation report that conveys findings to technical and non-technical audiences
From Artifacts to Answers
In Lessons 9.1 through 9.5, you learned to extract forensic artifacts from individual sources — disk images, Windows Registry hives, Linux logs, and memory dumps. Each source tells part of the story. But an investigation that analyzes each source in isolation is like reading random pages from a novel — you get facts without narrative.
The forensic timeline is where every artifact comes together into a single chronological sequence. It transforms isolated data points into a story: the attacker gained access at 02:14 via SSH, escalated privileges at 02:18 with sudo, installed tools at 02:20, established a reverse shell at 02:22, began data exfiltration at 02:35, and cleared logs at 02:47.
| Investigation Approach | What You Get | What You Miss |
|---|---|---|
| Examine each artifact source individually | Facts about specific evidence types | How events relate to each other across sources |
| Build a timeline from a single source | Chronological view of one evidence type | Events only visible in other sources |
| Build a super timeline from all sources | Complete minute-by-minute narrative | Minimal gaps — every source fills in others' blind spots |
The super timeline is the forensic equivalent of SIEM correlation. In Modules 2–5, you learned how SIEMs correlate alerts from multiple sources to detect attacks. A forensic timeline does the same thing after the fact — correlating timestamps from filesystem metadata, event logs, registry changes, memory artifacts, and network logs to reconstruct what happened and when.
What Is a Super Timeline?
A super timeline is a single chronological listing of every timestamped event from every artifact source in an investigation. A typical super timeline from a single Windows workstation can contain millions of entries — every file creation, every registry modification, every event log entry, every browser history visit, every prefetch execution, all sorted by time.
The concept was pioneered by Kristinn Gudjonsson, who created the original log2timeline tool. The modern implementation is Plaso (Plaso Langar Að Safna Öllu — Icelandic for "Plaso wants to collect everything"), which parses over 300 artifact types.
What Goes Into a Super Timeline
| Source Category | Artifact Types | Timestamp Types |
|---|---|---|
| Filesystem | NTFS MFT, ext4 inodes, FAT directory entries | Created, Modified, Accessed, Changed (MACB) |
| Windows Registry | All hive keys and values | Last written timestamp per key |
| Event Logs | Security, System, Application, PowerShell, Sysmon | Event generation time |
| Browser History | Chrome, Firefox, Edge, IE history + downloads + cache | Visit time, download time, cache entry time |
| Prefetch | Application execution records | Last 8 execution times (Windows 8 and later) |
| USB History | USBSTOR registry keys, setupapi.log | First/last connection times |
| Network | Wazuh alerts, Suricata logs, firewall logs | Alert/connection timestamps |
| Memory | Process creation times, network connection states | Process start time from EPROCESS structure |
| Linux Logs | auth.log, syslog, wtmp, bash_history (if timestamped) | Log entry timestamps |
log2timeline/Plaso: Automated Timeline Generation
For large-scale investigations, manually correlating thousands of artifacts is impractical. Plaso automates the process.
Step 1: Parse All Artifacts
log2timeline.py --storage-file case001.plaso /evidence/disk_image.E01
This single command parses the entire disk image using all applicable parsers. On a 100 GB image, this can take hours but will extract millions of timestamped events from every supported artifact type.
Step 2: Filter the Timeline
A raw super timeline with 10 million entries is unusable without filtering. Plaso's psort.py tool filters and outputs the timeline:
psort.py -o l2tcsv -w timeline.csv case001.plaso "date > '2026-03-15 00:00:00' AND date < '2026-03-16 00:00:00'"
Common filters:
| Filter | Purpose |
|---|---|
| Time range | Narrow to the incident window (e.g., 24 hours around the alert) |
| Source type | Focus on specific artifacts (e.g., only event logs or only filesystem) |
| Keyword | Search for specific filenames, usernames, or IP addresses |
| Path | Limit to specific directories (e.g., user profile, /tmp) |
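The same kinds of filters can also be applied to the exported CSV itself. The sketch below assumes a simplified l2tcsv-style layout (real psort output has many more columns) and narrows a timeline to a time window plus a keyword:

```python
import csv
import io
from datetime import datetime

# Simplified l2tcsv-style sample; real psort output has many more columns.
SAMPLE = """\
date,time,source,desc
03/15/2026,02:14:33,LOG,SSH login admin from 10.0.1.50
03/15/2026,02:47:00,FILE,/var/log/auth.log mtime changed
03/16/2026,09:00:00,LOG,routine cron job
"""

def filter_timeline(text, start, end, keyword=None):
    """Yield rows inside [start, end) whose description contains the keyword."""
    for row in csv.DictReader(io.StringIO(text)):
        ts = datetime.strptime(f"{row['date']} {row['time']}", "%m/%d/%Y %H:%M:%S")
        if start <= ts < end and (keyword is None or keyword in row["desc"]):
            yield row

hits = list(filter_timeline(SAMPLE,
                            datetime(2026, 3, 15), datetime(2026, 3, 16),
                            keyword="SSH"))
print(len(hits))           # 1: only the SSH login falls in the window and matches
```

The same pattern scales from a quick triage script to the loader for a spreadsheet or OpenSearch import.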
Step 3: Analyze in a Viewer
The CSV output can be opened in:
- Timeline Explorer (Eric Zimmerman) — purpose-built for super timeline analysis on Windows, handles millions of rows efficiently
- Excel/LibreOffice — works for filtered timelines under ~1 million rows
- Elastic/OpenSearch — import the timeline for SIEM-style searching and visualization
- grep/awk — command-line filtering for specific patterns
Start with a narrow time window and expand. Do not try to analyze 10 million entries at once. Start with the time window of your first alert (often from Wazuh or Suricata), find the initial compromise event, then expand backwards (how did they get in?) and forwards (what did they do after?) in incremental steps.
Manual Timeline Construction
Not every investigation requires Plaso. For smaller incidents or when you need to combine artifacts from different systems, a manual timeline built in a structured spreadsheet is often more effective and faster.
The Timeline Template
| Timestamp (UTC) | Source | System | Event Description | Evidence Reference | ATT&CK Technique |
|---|---|---|---|---|---|
| 2026-03-15 02:14:33 | auth.log | web-server-01 | SSH login: user 'admin' from 10.0.1.50 (publickey) | /var/log/auth.log line 4521 | T1078 — Valid Accounts |
| 2026-03-15 02:17:45 | auth.log | web-server-01 | Failed SSH: 'test' from 203.0.113.42 (1 of 847 attempts) | /var/log/btmp | T1110 — Brute Force |
| 2026-03-15 02:18:12 | auth.log | web-server-01 | sudo: admin → root, COMMAND=/usr/bin/apt install nmap | /var/log/auth.log line 4589 | T1059 — Command Interpreter |
| 2026-03-15 02:19:00 | syslog | web-server-01 | systemd started /tmp/.hidden (ELF binary) | /var/log/syslog | T1059.004 — Unix Shell |
| 2026-03-15 02:22:15 | memory | WIN-DC-01 | powershell.exe (PID 7488) ESTABLISHED → 185.141.63.12:443 | Volatility netscan | T1071 — Application Layer Protocol |
| 2026-03-15 02:22:15 | memory | WIN-DC-01 | powershell.exe -enc SQBFAFgA... (Base64 download cradle) | Volatility cmdline | T1059.001 — PowerShell |
| 2026-03-15 02:35:00 | Wazuh | web-server-01 | Alert: Large outbound data transfer to 185.141.63.12 | Wazuh alert ID 15432 | T1041 — Exfiltration Over C2 |
| 2026-03-15 02:47:00 | filesystem | web-server-01 | /var/log/auth.log mtime changed (possible truncation) | stat output, inode analysis | T1070.002 — Clear Linux Logs |
Building the Manual Timeline
- Start with your first indicator — usually a Wazuh alert, Suricata alert, or user report
- Work backwards — find the initial access event (SSH login, web exploit, phishing)
- Work forwards — trace every action from initial access to the present
- Cross-reference sources — if auth.log shows a login at 02:14, check wtmp for the same session, check bash_history for commands during that session, check filesystem timestamps for files created during that window
- Fill gaps — if there is a 15-minute gap in your timeline, identify which artifact source might fill it (memory? network logs? SIEM alerts?)
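The cross-referencing steps above amount to merging per-source event lists into one chronologically sorted view. A minimal sketch, using hypothetical events already normalized to UTC:

```python
from datetime import datetime

# Hypothetical events per source: (UTC timestamp, source, description)
auth_log = [("2026-03-15 02:14:33", "auth.log", "SSH login: admin from 10.0.1.50")]
memory   = [("2026-03-15 02:22:15", "memory",   "powershell.exe ESTABLISHED -> 185.141.63.12:443")]
wazuh    = [("2026-03-15 02:35:00", "Wazuh",    "Large outbound data transfer alert")]

def build_timeline(*sources):
    """Flatten all per-source event lists and sort chronologically."""
    events = [e for src in sources for e in src]
    return sorted(events, key=lambda e: datetime.strptime(e[0], "%Y-%m-%d %H:%M:%S"))

timeline = build_timeline(memory, wazuh, auth_log)
for ts, source, desc in timeline:
    print(ts, source, desc)   # earliest first, regardless of input order
```

This is exactly what the spreadsheet's sort-by-timestamp column does; scripting it matters once you are combining exports from several systems.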
Timeline Analysis Techniques
Identifying the Initial Compromise
The initial compromise is the single most important event to find. Everything before it is reconnaissance (interesting but less urgent). Everything after it is the attack itself.
Common indicators of initial compromise in a timeline:
| Indicator | What to Look For |
|---|---|
| First external authentication | SSH/RDP login from an unusual IP, especially outside business hours |
| Web exploit artifacts | Suspicious web server log entries followed immediately by new process execution |
| Phishing payload execution | Email attachment opened followed by child process spawning |
| Vulnerability exploitation | Crash/error log entry followed by unexpected privileged access |
Tracking Lateral Movement
Once you find the initial compromise, look for the attacker moving to other systems:
02:22:15 WIN-DC-01 powershell.exe connects to 10.0.2.30:445 (SMB)
02:22:18 FILE-SRV-01 New login: admin from 10.0.1.50 (NTLM)
02:22:25 FILE-SRV-01 PsExec service installed: PSEXESVC
02:22:30 FILE-SRV-01 cmd.exe spawned by PSEXESVC
The timeline makes lateral movement visible: the attacker pivoted from WIN-DC-01 to FILE-SRV-01 using PsExec over SMB within 15 seconds.
Identifying Data Exfiltration
Look for patterns that indicate data collection and transfer:
- Staging: File compression utilities (7z, zip, rar) creating archives in unusual locations
- Collection: Access timestamps on sensitive file shares or database exports
- Transfer: Large outbound connections to external IPs, especially using encrypted protocols (HTTPS, DNS tunneling)
Correlating with SIEM Data
Your forensic timeline becomes far more powerful when merged with SIEM alerts from Wazuh, Suricata, and other monitoring tools.
Why SIEM Correlation Matters
| Forensic Artifact Alone | Forensic + SIEM Combined |
|---|---|
| "A file was created at 02:19" | "A file was created at 02:19, and Wazuh alert #15221 fired at 02:19 for 'New executable in /tmp'" |
| "PowerShell connected to 185.141.63.12" | "PowerShell connected to 185.141.63.12, and Suricata alert ET TROJAN Cobalt Strike C2 fired at 02:22" |
| "SSH login from 10.0.1.50 at 02:14" | "SSH login from 10.0.1.50 at 02:14, matching Wazuh alert for 'Login outside business hours'" |
How to Merge SIEM Data
curl -sk "https://wazuh-indexer:9200/wazuh-alerts-*/_search" \
-H "Content-Type: application/json" \
-d '{
"query": {
"range": {
"timestamp": {
"gte": "2026-03-15T02:00:00Z",
"lte": "2026-03-15T03:00:00Z"
}
}
},
"size": 1000,
"sort": [{"timestamp": "asc"}]
}'
Export the results to CSV and merge with your forensic timeline. The combined view shows both what the attacker did (forensic artifacts) and what your monitoring detected (SIEM alerts) — revealing both the attack and your detection gaps.
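One simple merge strategy is to attach every alert that fired within a few seconds of each forensic event. A sketch under assumed field names (the alert shape loosely mirrors a Wazuh export, but real field names vary by version):

```python
from datetime import datetime, timedelta

# Hypothetical forensic events and SIEM alerts (field names are assumptions)
forensic = [
    {"ts": "2026-03-15T02:19:00", "desc": "systemd started /tmp/.hidden"},
    {"ts": "2026-03-15T02:35:00", "desc": "large archive created in /tmp"},
]
alerts = [
    {"timestamp": "2026-03-15T02:19:02", "rule": "New executable in /tmp"},
]

def correlate(events, alerts, window=timedelta(seconds=5)):
    """Attach every SIEM alert that fired within `window` of each event."""
    out = []
    for ev in events:
        t = datetime.fromisoformat(ev["ts"])
        matches = [a["rule"] for a in alerts
                   if abs(datetime.fromisoformat(a["timestamp"]) - t) <= window]
        out.append({**ev, "siem_alerts": matches})
    return out

merged = correlate(forensic, alerts)
print(merged[0]["siem_alerts"])   # ['New executable in /tmp'] (detected)
print(merged[1]["siem_alerts"])   # [] (a detection gap worth reporting)
```

Events with an empty `siem_alerts` list are exactly the detection gaps your report's recommendations section should address.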
Common Timeline Pitfalls
Time zone errors are the most common mistake in forensic timelines. A one-hour time zone offset can completely break your event correlation. An attacker appears to exfiltrate data before they even logged in — not because of time travel, but because you mixed UTC and local time entries. Always normalize to UTC.
Pitfall 1: Time Zone Confusion
| Source | Default Time Zone |
|---|---|
| NTFS timestamps | UTC |
| Windows Event Logs | UTC (displayed in local time by Event Viewer) |
| Linux auth.log / syslog | Local system time (check /etc/timezone) |
| Wazuh alerts | UTC |
| Suricata eve.json | UTC |
| Browser history | Varies by browser and storage format |
| Memory process times | UTC (kernel structures) |
Rule: Normalize everything to UTC before adding to your timeline.
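In Python, that normalization can be sketched with the standard-library zoneinfo module; here a naive auth.log timestamp is interpreted as US Eastern local time, then converted:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(naive_local, tz_name):
    """Attach the source system's time zone to a naive timestamp,
    then convert it to UTC for the timeline."""
    return naive_local.replace(tzinfo=ZoneInfo(tz_name)).astimezone(ZoneInfo("UTC"))

# A naive auth.log timestamp from a server configured for US Eastern time
entry = datetime(2026, 3, 15, 2, 14, 33)
print(to_utc(entry, "America/New_York").isoformat())
# 2026-03-15T06:14:33+00:00  (EDT is UTC-4 in mid-March)
```

Note that zoneinfo handles daylight-saving transitions for you, which is precisely where manual offset arithmetic goes wrong.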
Pitfall 2: Timestamp Granularity
Not all timestamps have the same precision:
| Source | Granularity |
|---|---|
| NTFS (MFT) | 100-nanosecond intervals |
| ext4 | Nanosecond precision (requires 256-byte inodes, the default on modern systems) |
| Windows Event Logs | ~100ms precision |
| auth.log | 1-second precision |
| Wazuh alerts | 1-second precision |
| Volatility process times | Varies by OS version |
When two events appear to have the same timestamp, their actual order may be ambiguous. Document this uncertainty rather than guessing.
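A quick illustration of why granularity matters: two NTFS events tens of milliseconds apart collapse into the same second in a 1-second-granularity source, so their order is unrecoverable from that source alone:

```python
from datetime import datetime

# Two events 40 ms apart (NTFS stores 100 ns precision, shown here as microseconds)
file_created = datetime(2026, 3, 15, 2, 22, 15, 120000)
proc_started = datetime(2026, 3, 15, 2, 22, 15, 160000)

# A 1-second-granularity source (auth.log, Wazuh) records both as the same second
same_second = (file_created.replace(microsecond=0)
               == proc_started.replace(microsecond=0))
print(same_second)   # True: the true ordering is lost at this granularity
```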
Pitfall 3: Evidence Gaps
Gaps in your timeline do not mean nothing happened. They mean you lack evidence for that period. Common causes:
- Log rotation — older entries were compressed or deleted before acquisition
- Anti-forensics — attacker cleared specific logs (history -c, log truncation)
- Insufficient logging — process auditing was not enabled, command-line logging was not configured
- Volatile evidence lost — the system was rebooted before memory was captured
- Scope limitation — you imaged one system but the attacker used three
Always document gaps explicitly in your report: "No evidence available for WIN-DC-01 between 02:30 and 02:45 due to Event Log clearing."
Pitfall 4: Timestamp Manipulation
Sophisticated attackers may manipulate timestamps using tools like timestomp (changing NTFS MACB times) or touch (changing Linux file times). Indicators of timestamp manipulation:
- File Modified time is earlier than Created time (note: copying a file also produces this pattern legitimately on NTFS, since the copy gets a new Created time but keeps the original Modified time, so corroborate with other indicators)
- $STANDARD_INFORMATION timestamps differ from $FILE_NAME timestamps in the MFT
- Files with timestamps that predate the operating system installation
- Clusters of files with identical timestamps down to the sub-second
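The first indicator above is easy to check programmatically. A sketch over hypothetical $STANDARD_INFORMATION timestamps pulled from an MFT parse; treat hits as leads rather than proof, since file copies can produce the same pattern:

```python
from datetime import datetime

# Hypothetical $STANDARD_INFORMATION timestamps from an MFT parse
files = [
    {"name": "report.docx",
     "created":  datetime(2026, 3, 10, 9, 0, 0),
     "modified": datetime(2026, 3, 12, 14, 30, 0)},
    {"name": "evil.exe",    # Modified predates Created: flag for review
     "created":  datetime(2026, 3, 15, 2, 20, 0),
     "modified": datetime(2020, 1, 1, 0, 0, 0)},
]

def flag_suspect_timestamps(files):
    """Return files whose Modified time is earlier than Created time.
    Treat hits as leads: legitimate file copies also produce this pattern."""
    return [f["name"] for f in files if f["modified"] < f["created"]]

print(flag_suspect_timestamps(files))   # ['evil.exe']
```

Flagged files are good candidates for the $FILE_NAME comparison described above, since $FILE_NAME timestamps are harder for common timestomping tools to alter.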
Report Writing for Forensic Investigations
The timeline is your evidence. The report is how you communicate that evidence to stakeholders who were not in the investigation.
Report Structure
| Section | Content | Audience |
|---|---|---|
| Executive Summary | 1-paragraph overview: what happened, impact, response status | C-suite, legal, management |
| Incident Overview | Scope, affected systems, detection source, investigation timeline | IT leadership, security management |
| Findings | Minute-by-minute reconstruction from your forensic timeline | Technical responders, other analysts |
| IOCs | IP addresses, domains, file hashes, mutexes, YARA rules | SOC team, threat intel, peer organizations |
| Root Cause | How the attacker gained initial access and what controls failed | Security architecture, risk management |
| Recommendations | Specific remediation steps and preventive measures | IT operations, security engineering |
| Evidence Appendix | Tool outputs, raw timeline excerpts, screenshots | Legal, compliance, peer review |
Write for your audience, not for yourself. The executive summary should be understandable by someone with no technical background. The findings section should be detailed enough that another forensic analyst could independently verify your conclusions. The recommendations should be actionable by the IT team. Each section serves a different reader.
The Findings Narrative
Transform your timeline into a narrative. Instead of:
02:14:33 - SSH login admin from 10.0.1.50
02:18:12 - sudo apt install nmap
02:19:00 - /tmp/.hidden executed
Write:
At 02:14 UTC on March 15, the attacker authenticated to web-server-01 via SSH using compromised credentials for the 'admin' account, originating from internal IP 10.0.1.50 (the previously compromised workstation). Within four minutes, the attacker escalated to root privileges via sudo and installed reconnaissance tools (nmap, netcat, Impacket). At 02:19, a previously staged ELF binary at /tmp/.hidden was executed, establishing a reverse shell to the attacker's C2 infrastructure.
The narrative adds context, causation, and interpretation — turning data into understanding.
Key Takeaways
- A super timeline combines all timestamped artifacts from all sources into a single chronological view — transforming isolated facts into a coherent narrative
- Plaso/log2timeline automates super timeline generation from disk images, parsing 300+ artifact types into a unified storage file
- Manual timelines (spreadsheet-based) are effective for focused investigations and when combining artifacts from multiple systems
- The analysis workflow: start at the first alert → work backwards to initial compromise → work forwards through the full attack chain → fill gaps
- SIEM correlation dramatically enhances forensic timelines by adding monitoring context (what was detected vs. what was missed)
- Normalize all timestamps to UTC before building your timeline — time zone errors are the most common and most damaging forensic mistake
- Acknowledge evidence gaps explicitly rather than guessing — gaps caused by log rotation, anti-forensics, or insufficient logging must be documented
- Watch for timestamp manipulation indicators: impossible chronological orders, mismatched MFT timestamps, or suspiciously identical timestamps
- Forensic reports serve multiple audiences — executive summary for leadership, detailed findings for responders, IOCs for the SOC, and recommendations for remediation teams
- Transform timeline data into narrative form that explains the how and why, not just the what and when
What's Next
You have completed the Digital Forensics module — from evidence acquisition and chain of custody through Windows artifacts, Linux investigation, memory forensics, and timeline construction. You now have the skills to conduct a full forensic investigation from detection to report. Module 10 takes you into YARA — Malware Detection, where you will write pattern-matching rules to identify malware by its binary characteristics, connect YARA to Velociraptor for enterprise-wide scanning, and build a detection library that bridges forensic findings with proactive threat hunting.
Knowledge Check: Building a Forensic Timeline
What is a 'super timeline' in digital forensics?
What does Plaso parse from a disk image to generate a super timeline?
When starting timeline analysis, what is the recommended approach for handling a raw super timeline with millions of entries?
What is the single most common and damaging mistake in forensic timeline construction?
In Lab 9.6, you are building a timeline and notice a 15-minute gap with no events. What should you do?
How does correlating forensic timeline entries with Wazuh SIEM alerts improve an investigation?
During Lab 9.6 timeline analysis, you find a file where the Modified timestamp is OLDER than the Created timestamp on an NTFS filesystem. What does this indicate?
Which section of a forensic investigation report is intended for C-suite executives and non-technical stakeholders?
When building a manual timeline, what is the correct sequence for analysis after finding your first indicator?
Why is it important to transform raw timeline data into narrative form in the Findings section of a forensic report?