5.6 Describe concepts as documented in NIST.SP800-86
📘Cisco Certified CyberOps Associate (200-201 CBROPS)
🔹 What is Data Integrity?
Data integrity means that data is:
- Accurate
- Complete
- Not altered or corrupted
In digital forensics and incident response, maintaining data integrity is critical because investigators must ensure that the evidence they collect is exactly the same as the original.
If data is changed—even slightly—it may:
- Become unreliable
- Be rejected as evidence
- Lead to incorrect conclusions
🔹 Why Data Integrity is Important (For the Exam)
In the context of NIST SP 800-86 (Guide to Integrating Forensic Techniques into Incident Response):
Data integrity ensures that:
- Evidence is trustworthy
- Investigations are accurate
- Findings can be defended and verified
- Evidence can be used in legal or organizational decisions
👉 Key exam idea:
If integrity is lost, the evidence loses value.
🔹 Key Concepts of Data Integrity
1. 🔐 Original Data Must Not Be Modified
- Investigators should never work directly on original data
- Instead, they create a forensic copy (image)
✔ Example (IT environment):
- A disk from a compromised server is copied
- Analysis is done on the copy, not the original disk
2. 🧾 Use of Hash Values
A hash is a fixed-length digital fingerprint of data; in practice, any change to the data produces a different hash.
Common hash algorithms:
- MD5 (legacy; collision-prone, no longer recommended)
- SHA-1 (legacy; deprecated)
- SHA-256 (strongest of the three and recommended)
How it works:
- A file is processed through a hash algorithm
- It generates a unique value (hash)
- If the file changes → the hash changes
✔ Example:
- Before imaging a hard drive: calculate the hash of the original
- After copying: calculate the hash of the forensic copy
- If both hashes match → integrity is preserved
👉 Exam tip:
Matching hashes = proof that data has not changed
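The hash-matching workflow above can be sketched with Python's standard `hashlib` module. This is a minimal illustration (the file paths in the comments are hypothetical), not a substitute for validated forensic tooling:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large evidence files do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Illustrative workflow (paths are hypothetical):
# original_hash = sha256_of_file("/evidence/original_disk.img")
# copy_hash     = sha256_of_file("/evidence/working_copy.img")
# original_hash == copy_hash  → integrity preserved
```

If even one byte of the copy differs from the original, the two digests will not match.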
3. 📦 Chain of Custody
A chain of custody is a documented record showing:
- Who collected the data
- When it was collected
- Who accessed it
- What actions were taken
This ensures:
- No unauthorized access
- No tampering
✔ Example:
- Security analyst collects a log file
- Records date, time, and storage location
- Every transfer is logged
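One simple way to keep such a documented record is an append-only log where every entry captures who, when, what, and where. The function and field names below are illustrative assumptions, not a standard format:

```python
import json
from datetime import datetime, timezone

def record_custody_event(log_path: str, evidence_id: str, handler: str,
                         action: str, location: str) -> dict:
    """Append one chain-of-custody entry to a JSON Lines log.
    Each entry records who handled the evidence, what was done,
    where it is stored, and when (UTC timestamp)."""
    entry = {
        "evidence_id": evidence_id,   # e.g. case/item number
        "handler": handler,           # who collected or accessed it
        "action": action,             # e.g. "collected", "transferred"
        "location": location,         # current storage location
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Because the log is append-only, every transfer adds a new line rather than editing an old one, which supports the "no tampering" goal.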
4. 🧪 Forensic Imaging
Forensic imaging is the process of creating an exact, bit-by-bit copy of a storage device or data set.
Important points:
- Includes deleted files, hidden data, and slack space
- Uses specialized tools
- Verified with hashes
✔ Example:
- Imaging a compromised workstation for investigation
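In practice, imaging is done with specialized, validated tools; purely as a sketch of the underlying idea, the copy-then-verify step can be shown in Python, hashing the source while it is copied and then re-hashing the written image:

```python
import hashlib

def image_and_verify(source_path: str, image_path: str,
                     chunk_size: int = 1 << 20) -> bool:
    """Copy the source bit-for-bit to an image file, hashing the
    source as it is read, then re-hash the written image and
    return True if the two SHA-256 digests match."""
    src_hash = hashlib.sha256()
    img_hash = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        while chunk := src.read(chunk_size):
            src_hash.update(chunk)
            img.write(chunk)
    # Re-read the image from disk to hash what was actually written
    with open(image_path, "rb") as img:
        while chunk := img.read(chunk_size):
            img_hash.update(chunk)
    return src_hash.hexdigest() == img_hash.hexdigest()
```

A real acquisition also captures unallocated and slack space, which is why raw device-level tools are used rather than file copies like this one.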
5. 🚫 Write Protection
To maintain integrity:
- Use write blockers (hardware/software)
Purpose:
- Prevent any changes to original data during collection
✔ Example:
- Connecting a disk via a write blocker ensures no accidental writes
6. 🗂️ Logging and Documentation
Every action must be recorded:
- What was collected
- How it was collected
- Tools used
- Hash values
Why it matters:
- Supports transparency
- Allows repeatability
- Builds trust in findings
🔹 Data Integrity in the Forensic Process
According to NIST SP 800-86, data integrity must be maintained during all phases:
1. Collection
- Use write blockers
- Calculate hash values
- Document everything
2. Examination
- Work only on copies
- Verify hash values before analysis
3. Analysis
- Ensure tools do not modify data
- Maintain logs of all actions
4. Reporting
- Include hash values and procedures
- Prove integrity was maintained
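The "verify hash values before analysis" step in the Examination phase above can be sketched as a simple guard that recomputes the image's hash and refuses to proceed if it no longer matches the value recorded at collection. The function name is illustrative:

```python
import hashlib

def verify_before_analysis(image_path: str, recorded_sha256: str) -> None:
    """Recompute the image's SHA-256 and compare it with the hash
    recorded at collection time; raise if integrity was lost."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        while chunk := f.read(65536):
            digest.update(chunk)
    if digest.hexdigest() != recorded_sha256:
        raise ValueError("Integrity check failed: evidence may have been altered")
```

Including the recorded hash and this verification result in the final report is what lets the Reporting phase prove integrity was maintained end to end.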
🔹 Common Threats to Data Integrity
Be aware of what can damage integrity:
- Accidental modification
- Malware altering data
- Improper handling of evidence
- Using unverified tools
- Lack of documentation
👉 Exam tip:
Even unintentional changes can break integrity
🔹 Best Practices (Very Important for Exam)
- Always create forensic images
- Never analyze original data
- Use strong hashing algorithms (SHA-256)
- Maintain a chain of custody
- Use write blockers
- Document every step
- Verify integrity at multiple stages
🔹 Quick Exam Summary
- Data Integrity = Data is unchanged and trustworthy
- Hashing is used to verify integrity
- Forensic copies are used instead of originals
- Chain of custody ensures accountability
- Write blockers prevent accidental changes
- Documentation proves integrity was maintained
🔹 Simple Way to Remember
👉 Think of data integrity as:
“Proving that digital evidence is exactly the same from collection to analysis without any changes.”
