Timeline Analysis Fundamentals
Core concepts and techniques for building forensic timelines during incident investigations.
Last updated: February 2026
Purpose and Scope
Timeline analysis is the process of reconstructing a sequence of events from multiple data sources to understand what happened during a security incident. A well-constructed timeline reveals attacker movements, identifies impacted systems, and establishes the scope of compromise. This playbook covers foundational concepts for building and analyzing timelines.
Prerequisites
- Log access: Centralized logs from endpoints, network devices, authentication systems, and applications
- Time synchronization: Understanding of time zones and NTP configuration across data sources
- Investigation context: Initial indicators or alerts that triggered the investigation
- Documentation system: Tool for recording findings and maintaining the timeline
Why Timelines Matter
Timelines answer critical questions:
- When did the initial compromise occur?
- What systems were accessed and in what order?
- What actions did the attacker take on each system?
- When did data exfiltration or damage occur?
- Is the attacker still present?
Without a timeline, investigations become fragmented and incomplete. Key events may be missed, leading to incomplete containment and remediation.
Timeline Data Sources
Endpoint Sources
- File system metadata: Creation, modification, and access times for files
- Windows Event Logs: Security, System, Application, PowerShell, and Sysmon logs
- Registry timestamps: Last write times on registry keys
- Prefetch files: Application execution history on Windows
- Browser history: Web activity with timestamps
- Shellbags: Folder access history
- Jump lists: Recent file and application usage
Network Sources
- Firewall logs: Connection allow and deny events
- Proxy logs: Web requests with timestamps and user info
- DNS logs: Query history showing domain lookups
- Zeek logs: Detailed network connection and protocol metadata
- NetFlow: Connection summaries with byte and packet counts
Authentication Sources
- Active Directory logs: Logon events, account changes, group modifications
- VPN logs: Remote access connections
- Cloud identity logs: Azure AD, Okta, or other IdP sign-in records
- Application authentication: Service-specific login events
Application Sources
- Email logs: Message delivery, mailbox access, rule changes
- Database logs: Query history and access patterns
- Application audit logs: User actions within business applications
- Cloud service logs: AWS CloudTrail, GCP audit logs, Azure activity logs
Time Normalization
The Time Zone Challenge
Different systems log in different time formats and zones:
- UTC vs local time
- Various timestamp formats (epoch, ISO 8601, custom)
- Systems with incorrect clock settings
- Daylight saving time transitions
Normalization Process
- Identify the time zone and format for each data source
- Convert all timestamps to a single reference (typically UTC)
- Validate time accuracy by correlating known events across sources
- Document any time drift or anomalies discovered
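The normalization steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production parser: the source names, raw values, and the assumption that the endpoint's naive timestamps are already UTC are all hypothetical. The three raw values were chosen to represent the same instant, showing how normalization makes cross-source correlation possible.

```python
from datetime import datetime, timezone

# Hypothetical raw timestamps as they might arrive from three sources.
# All three represent the same instant: 2024-02-28 12:00:00 UTC.
RAW_EVENTS = [
    ("firewall", "1709121600"),                 # Unix epoch seconds (UTC)
    ("proxy",    "2024-02-28T07:00:00-05:00"),  # ISO 8601 with UTC offset
    ("endpoint", "2024-02-28 12:00:00"),        # naive; source docs say UTC
]

def to_utc(source: str, raw: str) -> datetime:
    """Normalize a raw timestamp string to a timezone-aware UTC datetime."""
    if raw.isdigit():  # epoch seconds
        return datetime.fromtimestamp(int(raw), tz=timezone.utc)
    ts = datetime.fromisoformat(raw)
    if ts.tzinfo is None:  # naive timestamp: assume UTC per this source's docs
        ts = ts.replace(tzinfo=timezone.utc)
    return ts.astimezone(timezone.utc)

for source, raw in RAW_EVENTS:
    print(source, to_utc(source, raw).isoformat())
```

In practice, each source's time zone assumption should be verified (step 3 above) before trusting the normalized output.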
Timeline Construction Process
1. Define Scope
- Establish the investigation time window
- Identify systems and users of interest
- Determine which log sources are available and relevant
2. Collect Data
- Export logs from SIEM or directly from sources
- Preserve original evidence with chain of custody
- Document collection methods and any gaps
3. Parse and Normalize
- Extract timestamps and key fields from each source
- Convert to consistent time zone and format
- Tag events with source type for later filtering
4. Merge and Sort
- Combine events from all sources into a single timeline
- Sort chronologically
- Deduplicate where the same event appears in multiple sources
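The merge-and-sort step can be sketched as follows, assuming events have already been normalized to `(utc_timestamp, source, description)` tuples. The event contents are hypothetical, and deduplication here only removes exact duplicates; near-duplicates across sources usually need analyst judgment.

```python
from datetime import datetime, timezone

# Hypothetical normalized events from two exports.
endpoint_events = [
    (datetime(2024, 2, 28, 12, 0, 5, tzinfo=timezone.utc),
     "sysmon", "powershell.exe spawned by winword.exe"),
]
network_events = [
    (datetime(2024, 2, 28, 12, 0, 0, tzinfo=timezone.utc),
     "proxy", "GET request to 203.0.113.10"),
    (datetime(2024, 2, 28, 12, 0, 5, tzinfo=timezone.utc),
     "sysmon", "powershell.exe spawned by winword.exe"),  # duplicate export
]

# Merge all sources, drop exact duplicates, sort chronologically.
merged = sorted(set(endpoint_events + network_events), key=lambda e: e[0])

for ts, source, desc in merged:
    print(ts.isoformat(), source, desc)
```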
5. Analyze and Annotate
- Walk through events sequentially
- Identify patterns and correlate across sources
- Annotate significant events with findings
- Mark gaps where expected events are missing
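Marking gaps can be partially automated. A simple sketch, assuming a sorted list of timestamps and an illustrative 30-minute threshold (the right threshold depends on how chatty the source normally is):

```python
from datetime import datetime, timedelta, timezone

def find_gaps(timestamps, max_gap=timedelta(minutes=30)):
    """Return (start, end) pairs where consecutive events are further
    apart than max_gap, flagging windows that deserve a closer look."""
    ordered = sorted(timestamps)
    return [(a, b) for a, b in zip(ordered, ordered[1:]) if b - a > max_gap]

# Hypothetical event times with a suspicious quiet period.
times = [
    datetime(2024, 2, 28, 12, 0, tzinfo=timezone.utc),
    datetime(2024, 2, 28, 12, 10, tzinfo=timezone.utc),
    datetime(2024, 2, 28, 14, 0, tzinfo=timezone.utc),
]
for start, end in find_gaps(times):
    print("gap:", start.isoformat(), "->", end.isoformat())
```

A gap is not proof of tampering, but an unexplained quiet period in a normally noisy source is worth annotating.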
Key Analysis Techniques
Pivot Points
Use known indicators to anchor your timeline:
- Alert timestamps from detection systems
- Known malicious file creation times
- Authentication events for compromised accounts
- Network connections to C2 infrastructure
Working Forward and Backward
From each pivot point:
- Work backward: How did this happen? What preceded it?
- Work forward: What happened next? What was the impact?
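A pivot-centered slice of the timeline supports both directions of analysis. The sketch below assumes events are `(utc_timestamp, description)` tuples and uses an illustrative one-hour window; the event contents are hypothetical.

```python
from datetime import datetime, timedelta, timezone

def around_pivot(events, pivot, window=timedelta(hours=1)):
    """Return events whose timestamps fall within +/- window of the pivot."""
    return [e for e in events if abs(e[0] - pivot) <= window]

# Hypothetical merged timeline.
timeline = [
    (datetime(2024, 2, 28, 10, 30, tzinfo=timezone.utc), "VPN logon from new ASN"),
    (datetime(2024, 2, 28, 11, 45, tzinfo=timezone.utc), "payload.ps1 written to disk"),
    (datetime(2024, 2, 28, 12, 0, tzinfo=timezone.utc), "EDR alert fired (pivot)"),
    (datetime(2024, 2, 28, 15, 0, tzinfo=timezone.utc), "outbound transfer observed"),
]

pivot = datetime(2024, 2, 28, 12, 0, tzinfo=timezone.utc)
for ts, desc in around_pivot(timeline, pivot):
    print(ts.isoformat(), desc)
```

Widening the window progressively is a common way to work backward toward initial access without drowning in unrelated events.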
Correlation Across Sources
The same event may appear differently in multiple logs:
- A logon appears in both endpoint and domain controller logs
- A web request appears in proxy logs and endpoint browser history
- A file download appears in network traffic and file system metadata
Correlating these strengthens your timeline and fills gaps.
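One way to find such cross-source pairs is to match events whose timestamps agree within a small tolerance. This is a naive O(n·m) sketch with hypothetical event data and an assumed five-second tolerance (real sources often have a few seconds of clock skew even with NTP):

```python
from datetime import datetime, timedelta, timezone

def correlate(events_a, events_b, tolerance=timedelta(seconds=5)):
    """Pair events from two sources whose timestamps agree within a tolerance."""
    return [
        (a, b)
        for a in events_a
        for b in events_b
        if abs(a[0] - b[0]) <= tolerance
    ]

# Hypothetical logon seen by both the endpoint and the domain controller,
# with a two-second clock skew between the sources.
endpoint = [(datetime(2024, 2, 28, 12, 0, 3, tzinfo=timezone.utc),
             "Event 4624: logon jdoe on WKSTN-07")]
dc = [(datetime(2024, 2, 28, 12, 0, 1, tzinfo=timezone.utc),
       "Event 4768: TGT issued for jdoe")]

for a, b in correlate(endpoint, dc):
    print("correlated:", a[1], "<->", b[1])
```

Timestamp proximity alone is weak evidence; matched pairs should also share an identity, host, or other field before being treated as the same event.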
Common Pitfalls
- Timestamp confusion: Mixing time zones or formats leads to incorrect sequencing
- Missing data: Log retention gaps or collection failures leave blind spots
- Tunnel vision: Focusing only on one system misses lateral movement
- Timestamp manipulation: Sophisticated attackers may alter timestamps
- Over-reliance on a single source: Each log type has limitations
Documentation
A good timeline document includes:
- Timestamp in consistent format
- Source system or log type
- Event description
- Relevant details (user, process, IP, file path)
- Analyst notes and interpretation
- Confidence level for each event
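One way to keep these fields consistent across analysts is a shared record type. The field names and confidence values below are illustrative, not a standard:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TimelineEntry:
    timestamp: datetime                          # normalized UTC
    source: str                                  # e.g. "Sysmon", "proxy", "CloudTrail"
    description: str                             # what happened
    details: dict = field(default_factory=dict)  # user, process, IP, file path
    notes: str = ""                              # analyst interpretation
    confidence: str = "medium"                   # low / medium / high

entry = TimelineEntry(
    timestamp=datetime(2024, 2, 28, 12, 0, 5, tzinfo=timezone.utc),
    source="Sysmon",
    description="powershell.exe spawned by winword.exe",
    details={"user": "jdoe", "host": "WKSTN-07"},
    notes="Consistent with malicious document execution",
    confidence="high",
)
print(entry)
```

Whatever the storage format (spreadsheet, database, case management tool), keeping the same fields for every entry makes the timeline filterable and reviewable.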
Escalation Guidance
Timeline analysis may reveal a need for escalation when:
- Initial access occurred weeks or months earlier than detected
- Multiple systems show compromise indicators
- Sensitive data access is confirmed
- Persistence mechanisms suggest ongoing access
- Scope exceeds initial assumptions
References
- SANS Digital Forensics and Incident Response resources
- NIST SP 800-86: Guide to Integrating Forensic Techniques
- MITRE ATT&CK: attack.mitre.org
- CISA incident response guidance