Iliyan Angelov
2025-09-19 11:58:53 +03:00
parent 306b20e24a
commit 6b247e5b9f
11423 changed files with 1500615 additions and 778 deletions


@@ -0,0 +1,559 @@
# Compliance & Governance API Documentation
## Overview
The Compliance & Governance module provides functionality for managing regulatory compliance, collecting and tracking evidence, enforcing data retention policies, and generating exports for auditors and regulators. It supports major compliance frameworks, including GDPR, HIPAA, SOX, and ISO 27001.
## Features
- **Regulatory Frameworks**: Manage compliance frameworks (GDPR, HIPAA, SOX, ISO 27001)
- **Compliance Requirements**: Track individual compliance requirements and their status
- **Regulatory Workflows**: Define and execute compliance workflows
- **Evidence Collection**: Collect and manage evidence linked to incidents
- **Retention Policies**: Implement data retention policies with auto-archive/delete
- **Export APIs**: Generate exports for regulators and auditors
- **Compliance Reports**: Create and manage compliance assessment reports
- **Legal Holds**: Manage legal holds to prevent data deletion
## API Endpoints
### Base URL
```
/api/compliance/
```
### Regulatory Frameworks
#### List Frameworks
```http
GET /api/compliance/frameworks/
```
**Query Parameters:**
- `framework_type`: Filter by framework type (GDPR, HIPAA, SOX, ISO27001, etc.)
- `is_active`: Filter by active status (true/false)
- `effective_date_after`: Filter frameworks effective after date
- `effective_date_before`: Filter frameworks effective before date
- `search`: Search in name, description, regions, or sectors
**Response:**
```json
{
"count": 4,
"next": null,
"previous": null,
"results": [
{
"id": "uuid",
"name": "GDPR Compliance Framework",
"framework_type": "GDPR",
"framework_type_display": "General Data Protection Regulation",
"version": "1.0",
"description": "GDPR compliance framework...",
"applicable_regions": ["EU", "EEA", "UK"],
"industry_sectors": ["Technology", "Healthcare"],
"is_active": true,
"effective_date": "2018-05-25",
"review_date": "2024-05-25",
"requirements_count": 12,
"created_at": "2024-01-01T00:00:00Z",
"updated_at": "2024-01-01T00:00:00Z"
}
]
}
```
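For illustration, a minimal Python sketch of a filtered framework listing, assuming the `requests` library and the token authentication described later in this document; the host and token are placeholders:
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token

# List active GDPR frameworks, searching across name/description/regions/sectors.
response = requests.get(
    f"{BASE_URL}/frameworks/",
    headers=HEADERS,
    params={"framework_type": "GDPR", "is_active": "true", "search": "data protection"},
)
response.raise_for_status()
for framework in response.json()["results"]:
    print(framework["name"], framework["requirements_count"])
```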
#### Get Framework Details
```http
GET /api/compliance/frameworks/{id}/
```
#### Get Framework Requirements
```http
GET /api/compliance/frameworks/{id}/requirements/
```
**Query Parameters:**
- `compliance_status`: Filter by compliance status
- `priority`: Filter by priority level
#### Get Framework Compliance Summary
```http
GET /api/compliance/frameworks/{id}/compliance_summary/
```
**Response:**
```json
{
"total_requirements": 12,
"compliant": 8,
"partially_compliant": 2,
"non_compliant": 1,
"not_assessed": 1,
"implemented": 10,
"overdue_assessments": 2,
"compliance_percentage": 80.0
}
```
### Compliance Requirements
#### List Requirements
```http
GET /api/compliance/requirements/
```
**Query Parameters:**
- `framework`: Filter by framework ID
- `compliance_status`: Filter by compliance status
- `requirement_type`: Filter by requirement type
- `priority`: Filter by priority level
- `is_implemented`: Filter by implementation status
- `assigned_to`: Filter by assigned user
- `overdue`: Filter overdue assessments (true/false)
- `search`: Search in title, description, or requirement ID
#### Update Compliance Status
```http
POST /api/compliance/requirements/{id}/update_status/
```
**Request Body:**
```json
{
"compliance_status": "COMPLIANT"
}
```
#### Schedule Assessment
```http
POST /api/compliance/requirements/{id}/schedule_assessment/
```
**Request Body:**
```json
{
"assessment_date": "2024-06-01"
}
```
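A hedged sketch of the two requirement actions above using `requests`; the requirement UUID is a placeholder:
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token
requirement_id = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

# Mark the requirement compliant.
requests.post(
    f"{BASE_URL}/requirements/{requirement_id}/update_status/",
    headers=HEADERS,
    json={"compliance_status": "COMPLIANT"},
).raise_for_status()

# Schedule the next assessment.
requests.post(
    f"{BASE_URL}/requirements/{requirement_id}/schedule_assessment/",
    headers=HEADERS,
    json={"assessment_date": "2024-06-01"},
).raise_for_status()
```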
### Regulatory Workflows
#### List Workflows
```http
GET /api/compliance/workflows/
```
**Query Parameters:**
- `workflow_type`: Filter by workflow type
- `status`: Filter by workflow status
- `is_template`: Filter templates (true/false)
- `applicable_frameworks`: Filter by framework ID
- `search`: Search in name or description
#### Create Workflow Instance
```http
POST /api/compliance/workflows/{id}/create_instance/
```
**Request Body:**
```json
{
"title": "GDPR Breach Response - Incident #12345",
"description": "Response to data breach incident",
"related_incident": "incident-uuid",
"assigned_to": "user-uuid",
"due_date": "2024-02-01T00:00:00Z"
}
```
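For example, instantiating a workflow template from Python (a sketch assuming `requests`; the workflow, incident, and user IDs are placeholders):
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token

payload = {
    "title": "GDPR Breach Response - Incident #12345",
    "description": "Response to data breach incident",
    "related_incident": "incident-uuid",  # placeholder
    "assigned_to": "user-uuid",           # placeholder
    "due_date": "2024-02-01T00:00:00Z",
}
resp = requests.post(
    f"{BASE_URL}/workflows/workflow-uuid/create_instance/",  # placeholder workflow ID
    headers=HEADERS,
    json=payload,
)
resp.raise_for_status()
instance = resp.json()  # the created instance is assumed to be returned in the response body
```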
### Workflow Instances
#### List Workflow Instances
```http
GET /api/compliance/workflow-instances/
```
**Query Parameters:**
- `workflow`: Filter by workflow ID
- `status`: Filter by instance status
- `assigned_to`: Filter by assigned user
- `related_incident`: Filter by related incident
- `overdue`: Filter overdue instances (true/false)
- `search`: Search in title, description, or current step
#### Advance Workflow Step
```http
POST /api/compliance/workflow-instances/{id}/advance_step/
```
**Request Body:**
```json
{
"next_step": "assess",
"step_data": {
"risk_level": "HIGH",
"affected_records": 1000
}
}
```
#### Assign Stakeholder
```http
POST /api/compliance/workflow-instances/{id}/assign_stakeholder/
```
**Request Body:**
```json
{
"stakeholder_id": "user-uuid"
}
```
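The two workflow-instance actions above, sketched with `requests` (instance and user IDs are placeholders):
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token
instance_id = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

# Move the instance to the "assess" step, attaching step data.
requests.post(
    f"{BASE_URL}/workflow-instances/{instance_id}/advance_step/",
    headers=HEADERS,
    json={"next_step": "assess", "step_data": {"risk_level": "HIGH", "affected_records": 1000}},
).raise_for_status()

# Add a stakeholder to the instance.
requests.post(
    f"{BASE_URL}/workflow-instances/{instance_id}/assign_stakeholder/",
    headers=HEADERS,
    json={"stakeholder_id": "user-uuid"},  # placeholder
).raise_for_status()
```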
### Evidence Collection
#### List Evidence
```http
GET /api/compliance/evidence/
```
**Query Parameters:**
- `incident`: Filter by incident ID
- `evidence_type`: Filter by evidence type
- `status`: Filter by evidence status
- `collected_by`: Filter by collector
- `compliance_requirement`: Filter by requirement ID
- `search`: Search in title, description, or notes
#### Add Custody Record
```http
POST /api/compliance/evidence/{id}/add_custody_record/
```
**Request Body:**
```json
{
"action": "TRANSFERRED",
"notes": "Transferred to legal team for review"
}
```
#### Verify Evidence
```http
POST /api/compliance/evidence/{id}/verify_evidence/
```
**Request Body:**
```json
{
"verification_notes": "Evidence verified and authenticated"
}
```
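A sketch of the evidence custody and verification calls above (assuming `requests`; the evidence UUID is a placeholder):
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token
evidence_id = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

# Record a chain-of-custody transfer.
requests.post(
    f"{BASE_URL}/evidence/{evidence_id}/add_custody_record/",
    headers=HEADERS,
    json={"action": "TRANSFERRED", "notes": "Transferred to legal team for review"},
).raise_for_status()

# Mark the evidence as verified.
requests.post(
    f"{BASE_URL}/evidence/{evidence_id}/verify_evidence/",
    headers=HEADERS,
    json={"verification_notes": "Evidence verified and authenticated"},
).raise_for_status()
```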
### Retention Policies
#### List Retention Policies
```http
GET /api/compliance/retention-policies/
```
**Query Parameters:**
- `policy_type`: Filter by policy type
- `is_active`: Filter by active status
- `applicable_frameworks`: Filter by framework ID
- `search`: Search in name or description
### Export Requests
#### List Export Requests
```http
GET /api/compliance/export-requests/
```
**Query Parameters:**
- `request_type`: Filter by request type
- `status`: Filter by request status
- `requester_email`: Filter by requester email
- `overdue`: Filter overdue requests (true/false)
- `search`: Search in title, description, or requester info
#### Approve Export Request
```http
POST /api/compliance/export-requests/{id}/approve/
```
**Request Body:**
```json
{
"approval_notes": "Approved for regulatory submission"
}
```
#### Execute Export Request
```http
POST /api/compliance/export-requests/{id}/execute/
```
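The approve-then-execute flow for an export request, sketched with `requests` (the export UUID is a placeholder):
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token
export_id = "00000000-0000-0000-0000-000000000000"   # placeholder UUID

# Approve the pending export request...
requests.post(
    f"{BASE_URL}/export-requests/{export_id}/approve/",
    headers=HEADERS,
    json={"approval_notes": "Approved for regulatory submission"},
).raise_for_status()

# ...then trigger the export run.
requests.post(
    f"{BASE_URL}/export-requests/{export_id}/execute/",
    headers=HEADERS,
).raise_for_status()
```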
### Compliance Reports
#### List Compliance Reports
```http
GET /api/compliance/reports/
```
**Query Parameters:**
- `framework`: Filter by framework ID
- `report_type`: Filter by report type
- `status`: Filter by report status
- `prepared_by`: Filter by preparer
- `search`: Search in title, description, or summary
### Legal Holds
#### List Legal Holds
```http
GET /api/compliance/legal-holds/
```
**Query Parameters:**
- `status`: Filter by hold status
- `active`: Filter active holds (true/false)
- `search`: Search in case name, number, or description
#### Add Incident to Legal Hold
```http
POST /api/compliance/legal-holds/{id}/add_incident/
```
**Request Body:**
```json
{
"incident_id": "incident-uuid"
}
```
#### Add Evidence to Legal Hold
```http
POST /api/compliance/legal-holds/{id}/add_evidence/
```
**Request Body:**
```json
{
"evidence_id": "evidence-uuid"
}
```
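Placing records under a legal hold, sketched with `requests` (hold, incident, and evidence IDs are placeholders):
```python
import requests

BASE_URL = "https://api.example.com/api/compliance"  # placeholder host
HEADERS = {"Authorization": "Token your-token"}      # placeholder token
hold_id = "00000000-0000-0000-0000-000000000000"     # placeholder UUID

# Place an incident and a piece of evidence under the hold.
requests.post(
    f"{BASE_URL}/legal-holds/{hold_id}/add_incident/",
    headers=HEADERS,
    json={"incident_id": "incident-uuid"},  # placeholder
).raise_for_status()
requests.post(
    f"{BASE_URL}/legal-holds/{hold_id}/add_evidence/",
    headers=HEADERS,
    json={"evidence_id": "evidence-uuid"},  # placeholder
).raise_for_status()
```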
## Data Models
### Regulatory Framework
- **id**: UUID primary key
- **name**: Framework name
- **framework_type**: Type (GDPR, HIPAA, SOX, ISO27001, etc.)
- **version**: Framework version
- **description**: Framework description
- **applicable_regions**: List of applicable regions
- **industry_sectors**: List of applicable sectors
- **compliance_requirements**: List of requirements
- **is_active**: Active status
- **effective_date**: Effective date
- **review_date**: Review date
### Compliance Requirement
- **id**: UUID primary key
- **framework**: Related framework
- **requirement_id**: Unique identifier within framework
- **title**: Requirement title
- **description**: Requirement description
- **requirement_type**: Type (TECHNICAL, ADMINISTRATIVE, etc.)
- **priority**: Priority level (CRITICAL, HIGH, MEDIUM, LOW)
- **compliance_status**: Status (COMPLIANT, PARTIALLY_COMPLIANT, etc.)
- **is_implemented**: Implementation status
- **assigned_to**: Assigned user
- **next_assessment_date**: Next assessment date
### Evidence Collection
- **id**: UUID primary key
- **incident**: Related incident
- **title**: Evidence title
- **description**: Evidence description
- **evidence_type**: Type (LOG_FILE, SCREENSHOT, DOCUMENT, etc.)
- **status**: Status (COLLECTED, VERIFIED, ANALYZED, etc.)
- **file_path**: File path
- **file_hash**: SHA-256 hash for integrity
- **collected_by**: Collector user
- **custody_chain**: Chain of custody records
- **retention_period**: Retention period
### Retention Policy
- **id**: UUID primary key
- **name**: Policy name
- **policy_type**: Type (INCIDENT_DATA, AUDIT_LOGS, etc.)
- **retention_period**: Retention period value
- **retention_unit**: Unit (DAYS, WEEKS, MONTHS, YEARS)
- **auto_archive**: Auto-archive flag
- **auto_delete**: Auto-delete flag
- **data_classification_levels**: Applicable classification levels
- **incident_categories**: Applicable incident categories
- **legal_hold_override**: Legal hold override flag
### Export Request
- **id**: UUID primary key
- **title**: Request title
- **request_type**: Type (REGULATORY, AUDIT, LEGAL, INTERNAL)
- **status**: Status (PENDING, APPROVED, IN_PROGRESS, etc.)
- **requester_name**: Requester name
- **requester_organization**: Requester organization
- **requester_email**: Requester email
- **data_scope**: Scope of data to export
- **date_range_start**: Start date for data
- **date_range_end**: End date for data
- **export_format**: Export format (JSON, CSV, XML, PDF)
- **include_evidence**: Include evidence flag
- **include_audit_trails**: Include audit trails flag
- **redaction_required**: Redaction required flag
## Authentication
All API endpoints require authentication. Use one of the following methods:
1. **Token Authentication**: Include `Authorization: Token <your-token>` header
2. **Session Authentication**: Use Django session authentication
3. **SSO Authentication**: Use configured SSO providers
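For example, token authentication with a reusable `requests` session (a sketch; the token value is a placeholder):
```python
import requests

session = requests.Session()
session.headers.update({
    "Authorization": "Token your-token",  # placeholder token
    "Content-Type": "application/json",
})

# Every request made through the session now carries the auth header.
resp = session.get("https://api.example.com/api/compliance/frameworks/")
resp.raise_for_status()
```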
## Permissions
The API uses role-based access control (RBAC) with the following permission levels:
- **View**: Read-only access to compliance data
- **Edit**: Modify compliance data
- **Admin**: Full administrative access
- **Auditor**: Special access for audit and export functions
## Error Handling
The API returns standard HTTP status codes:
- **200**: Success
- **201**: Created
- **400**: Bad Request
- **401**: Unauthorized
- **403**: Forbidden
- **404**: Not Found
- **500**: Internal Server Error
Error responses include detailed error messages:
```json
{
"error": "Invalid compliance status",
"details": "Status must be one of: COMPLIANT, PARTIALLY_COMPLIANT, NON_COMPLIANT, NOT_ASSESSED"
}
```
## Rate Limiting
API requests are rate-limited to prevent abuse:
- **Authenticated users**: 1000 requests per hour
- **Unauthenticated users**: 100 requests per hour
## Pagination
List endpoints support pagination:
- **page**: Page number (default: 1)
- **page_size**: Items per page (default: 20, max: 100)
## Filtering and Search
Most list endpoints support filtering and search:
- **Exact filters**: Use field names with exact values
- **Date ranges**: Use `_after` and `_before` suffixes
- **Search**: Use `search` parameter for text search across multiple fields
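The sketch below combines pagination, a date-range filter, and text search, assuming `requests` and following the `next` links returned by list endpoints:
```python
import requests

HEADERS = {"Authorization": "Token your-token"}  # placeholder token
url = "https://api.example.com/api/compliance/frameworks/"
params = {
    "page_size": 50,
    "effective_date_after": "2018-01-01",  # date-range filter (_after suffix)
    "search": "data protection",           # free-text search
}

frameworks = []
while url:
    resp = requests.get(url, headers=HEADERS, params=params)
    resp.raise_for_status()
    page = resp.json()
    frameworks.extend(page["results"])
    url, params = page["next"], None  # "next" already carries the query string
print(f"Fetched {len(frameworks)} frameworks")
```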
## Examples
### Creating a GDPR Compliance Framework
```bash
curl -X POST "https://api.example.com/api/compliance/frameworks/" \
-H "Authorization: Token your-token" \
-H "Content-Type: application/json" \
-d '{
"name": "GDPR Compliance Framework",
"framework_type": "GDPR",
"version": "1.0",
"description": "GDPR compliance framework for EU data protection",
"applicable_regions": ["EU", "EEA", "UK"],
"industry_sectors": ["Technology", "Healthcare"],
"is_active": true,
"effective_date": "2018-05-25"
}'
```
### Creating an Export Request
```bash
curl -X POST "https://api.example.com/api/compliance/export-requests/" \
-H "Authorization: Token your-token" \
-H "Content-Type: application/json" \
-d '{
"title": "GDPR Data Export Request",
"description": "Export of personal data for regulatory review",
"request_type": "REGULATORY",
"requester_name": "John Doe",
"requester_organization": "EU Data Protection Authority",
"requester_email": "john.doe@dpa.eu",
"data_scope": {
"incidents": true,
"evidence": true,
"audit_trails": true
},
"date_range_start": "2023-01-01T00:00:00Z",
"date_range_end": "2023-12-31T23:59:59Z",
"export_format": "JSON",
"include_evidence": true,
"include_audit_trails": true,
"redaction_required": false
}'
```
### Updating Compliance Status
```bash
curl -X POST "https://api.example.com/api/compliance/requirements/uuid/update_status/" \
-H "Authorization: Token your-token" \
-H "Content-Type: application/json" \
-d '{
"compliance_status": "COMPLIANT"
}'
```
## Integration with Other Modules
The Compliance & Governance module integrates with:
- **Incident Intelligence**: Links evidence and workflows to incidents
- **Security**: Uses data classification and user permissions
- **Automation Orchestration**: Triggers compliance workflows
- **SLA & On-Call**: Manages compliance deadlines and escalations
## Setup and Configuration
1. **Install the module**: Add `compliance_governance` to `INSTALLED_APPS` (see the settings sketch after this list)
2. **Run migrations**: `python manage.py migrate`
3. **Create sample data**: `python manage.py setup_compliance_governance`
4. **Configure frameworks**: Set up your specific compliance frameworks
5. **Define workflows**: Create regulatory workflows for your processes
6. **Set retention policies**: Configure data retention policies
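A minimal settings sketch for step 1, assuming a standard Django `settings.py` (only the relevant entry is shown):
```python
# settings.py (excerpt)
INSTALLED_APPS = [
    # ... existing Django and project apps ...
    "compliance_governance",  # enables the Compliance & Governance module
]
```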
## Support
For technical support or questions about the Compliance & Governance API:
- **Documentation**: Check this documentation and inline API docs
- **Admin Interface**: Use Django admin for data management
- **Logs**: Check application logs for detailed error information
- **Support Team**: Contact your system administrator


@@ -0,0 +1,266 @@
"""
Django admin configuration for compliance_governance app
"""
from django.contrib import admin
from django.utils.html import format_html
from django.urls import reverse
from django.utils.safestring import mark_safe
from django.db.models import Count, Q
from django.utils import timezone
from datetime import timedelta
from .models import (
RegulatoryFramework,
ComplianceRequirement,
RegulatoryWorkflow,
WorkflowInstance,
EvidenceCollection,
RetentionPolicy,
ExportRequest,
ComplianceReport,
LegalHold,
)
@admin.register(RegulatoryFramework)
class RegulatoryFrameworkAdmin(admin.ModelAdmin):
"""Admin interface for Regulatory Framework"""
list_display = [
'name', 'framework_type', 'version', 'is_active', 'effective_date',
'requirements_count', 'created_at'
]
list_filter = [
'framework_type', 'is_active', 'effective_date', 'created_at'
]
search_fields = ['name', 'description', 'applicable_regions', 'industry_sectors']
readonly_fields = ['id', 'created_at', 'updated_at']
fieldsets = (
('Basic Information', {
'fields': ('id', 'name', 'framework_type', 'version', 'description')
}),
('Framework Details', {
'fields': ('applicable_regions', 'industry_sectors', 'compliance_requirements')
}),
('Status and Dates', {
'fields': ('is_active', 'effective_date', 'review_date')
}),
('Metadata', {
'fields': ('created_by', 'created_at', 'updated_at'),
'classes': ('collapse',)
}),
)
def requirements_count(self, obj):
"""Display count of requirements"""
count = obj.requirements.count()
if count > 0:
url = reverse('admin:compliance_governance_compliancerequirement_changelist')
return format_html('<a href="{}?framework__id__exact={}">{} requirements</a>', url, obj.id, count)
return '0 requirements'
requirements_count.short_description = 'Requirements'
@admin.register(ComplianceRequirement)
class ComplianceRequirementAdmin(admin.ModelAdmin):
"""Admin interface for Compliance Requirement"""
list_display = [
'requirement_id', 'title', 'framework', 'requirement_type', 'priority',
'compliance_status', 'is_implemented', 'assigned_to', 'next_assessment_date'
]
list_filter = [
'framework', 'requirement_type', 'priority', 'compliance_status',
'is_implemented', 'assigned_to', 'next_assessment_date'
]
search_fields = ['requirement_id', 'title', 'description', 'responsible_team']
readonly_fields = ['id', 'created_at', 'updated_at']
fieldsets = (
('Basic Information', {
'fields': ('id', 'framework', 'requirement_id', 'title', 'description')
}),
('Requirement Details', {
'fields': ('requirement_type', 'priority', 'implementation_guidance', 'evidence_requirements', 'testing_procedures')
}),
('Compliance Tracking', {
'fields': ('is_implemented', 'implementation_date', 'last_assessment_date', 'next_assessment_date', 'compliance_status')
}),
('Assignment', {
'fields': ('responsible_team', 'assigned_to')
}),
('Metadata', {
'fields': ('created_at', 'updated_at'),
'classes': ('collapse',)
}),
)
def get_queryset(self, request):
return super().get_queryset(request).select_related('framework', 'assigned_to')
@admin.register(RegulatoryWorkflow)
class RegulatoryWorkflowAdmin(admin.ModelAdmin):
"""Admin interface for Regulatory Workflow"""
list_display = [
'name', 'workflow_type', 'status', 'version', 'is_template',
'instances_count', 'created_by', 'created_at'
]
list_filter = [
'workflow_type', 'status', 'is_template', 'created_at'
]
search_fields = ['name', 'description']
readonly_fields = ['id', 'created_at', 'updated_at']
filter_horizontal = ['applicable_frameworks']
def instances_count(self, obj):
"""Display count of workflow instances"""
count = obj.instances.count()
if count > 0:
url = reverse('admin:compliance_governance_workflowinstance_changelist')
return format_html('<a href="{}?workflow__id__exact={}">{} instances</a>', url, obj.id, count)
return '0 instances'
instances_count.short_description = 'Instances'
@admin.register(WorkflowInstance)
class WorkflowInstanceAdmin(admin.ModelAdmin):
"""Admin interface for Workflow Instance"""
list_display = [
'title', 'workflow', 'status', 'assigned_to', 'related_incident',
'started_at', 'due_date', 'is_overdue'
]
list_filter = [
'workflow', 'status', 'assigned_to', 'started_at', 'due_date'
]
search_fields = ['title', 'description', 'current_step']
readonly_fields = ['id', 'started_at', 'updated_at']
filter_horizontal = ['stakeholders']
def is_overdue(self, obj):
"""Check if workflow instance is overdue"""
if obj.due_date and obj.status in ['PENDING', 'IN_PROGRESS']:
if timezone.now() > obj.due_date:
return format_html('<span style="color: red;">Yes</span>')
return 'No'
is_overdue.short_description = 'Overdue'
@admin.register(EvidenceCollection)
class EvidenceCollectionAdmin(admin.ModelAdmin):
"""Admin interface for Evidence Collection"""
list_display = [
'title', 'evidence_type', 'status', 'incident', 'collected_by',
'collection_timestamp', 'file_size_display'
]
list_filter = [
'evidence_type', 'status', 'collected_by', 'collection_timestamp'
]
search_fields = ['title', 'description', 'collection_notes']
readonly_fields = ['id', 'collection_timestamp', 'created_at', 'updated_at']
def file_size_display(self, obj):
"""Display file size in human readable format"""
if not obj.file_size:
return 'N/A'
size = obj.file_size
for unit in ['B', 'KB', 'MB', 'GB', 'TB']:
if size < 1024.0:
return f"{size:.1f} {unit}"
size /= 1024.0
return f"{size:.1f} PB"
file_size_display.short_description = 'File Size'
@admin.register(RetentionPolicy)
class RetentionPolicyAdmin(admin.ModelAdmin):
"""Admin interface for Retention Policy"""
list_display = [
'name', 'policy_type', 'retention_period', 'retention_unit',
'is_active', 'effective_date', 'created_by'
]
list_filter = [
'policy_type', 'is_active', 'effective_date', 'created_at'
]
search_fields = ['name', 'description']
readonly_fields = ['id', 'created_at', 'updated_at']
filter_horizontal = ['applicable_frameworks']
@admin.register(ExportRequest)
class ExportRequestAdmin(admin.ModelAdmin):
"""Admin interface for Export Request"""
list_display = [
'title', 'request_type', 'status', 'requester_name', 'requester_organization',
'requested_at', 'due_date', 'is_overdue'
]
list_filter = [
'request_type', 'status', 'requester_organization', 'requested_at', 'due_date'
]
search_fields = ['title', 'description', 'requester_name', 'requester_email']
readonly_fields = ['id', 'requested_at', 'updated_at']
filter_horizontal = ['applicable_frameworks']
def is_overdue(self, obj):
"""Check if export request is overdue"""
if obj.due_date and obj.status not in ['COMPLETED', 'CANCELLED']:
if timezone.now() > obj.due_date:
return format_html('<span style="color: red;">Yes</span>')
return 'No'
is_overdue.short_description = 'Overdue'
@admin.register(ComplianceReport)
class ComplianceReportAdmin(admin.ModelAdmin):
"""Admin interface for Compliance Report"""
list_display = [
'title', 'report_type', 'framework', 'status', 'overall_compliance_score',
'report_period_start', 'report_period_end', 'prepared_by'
]
list_filter = [
'report_type', 'status', 'framework', 'report_period_start', 'report_period_end'
]
search_fields = ['title', 'description', 'executive_summary']
readonly_fields = ['id', 'created_at', 'updated_at']
filter_horizontal = ['applicable_requirements']
@admin.register(LegalHold)
class LegalHoldAdmin(admin.ModelAdmin):
"""Admin interface for Legal Hold"""
list_display = [
'case_name', 'case_number', 'status', 'legal_counsel', 'hold_date',
'expiration_date', 'affected_data_count', 'is_active'
]
list_filter = [
'status', 'hold_date', 'expiration_date', 'created_at'
]
search_fields = ['case_name', 'case_number', 'description', 'legal_counsel', 'law_firm']
readonly_fields = ['id', 'created_at', 'updated_at']
filter_horizontal = ['related_incidents', 'related_evidence']
def affected_data_count(self, obj):
"""Display count of affected data items"""
count = obj.get_affected_data_count()
return f"{count} items"
affected_data_count.short_description = 'Affected Data'
def is_active(self, obj):
"""Check if legal hold is active"""
if obj.is_active():
return format_html('<span style="color: green;">Active</span>')
return format_html('<span style="color: red;">Inactive</span>')
is_active.short_description = 'Status'
# Custom admin site configuration
admin.site.site_header = "ETB Compliance & Governance Administration"
admin.site.site_title = "ETB Compliance Admin"
admin.site.index_title = "Compliance & Governance Management"


@@ -0,0 +1,19 @@
"""
Compliance & Governance Django app configuration
"""
from django.apps import AppConfig
class ComplianceGovernanceConfig(AppConfig):
"""Configuration for Compliance & Governance app"""
default_auto_field = 'django.db.models.BigAutoField'
name = 'compliance_governance'
verbose_name = 'Compliance & Governance'
def ready(self):
"""Import signal handlers when the app is ready"""
try:
import compliance_governance.signals
except ImportError:
pass


@@ -0,0 +1,459 @@
"""
Management command to set up Compliance & Governance module with sample data
"""
from django.core.management.base import BaseCommand
from django.contrib.auth import get_user_model
from django.utils import timezone
from datetime import date, timedelta
import json
from compliance_governance.models import (
RegulatoryFramework,
ComplianceRequirement,
RegulatoryWorkflow,
RetentionPolicy,
)
User = get_user_model()
class Command(BaseCommand):
help = 'Set up Compliance & Governance module with sample data'
def add_arguments(self, parser):
parser.add_argument(
'--reset',
action='store_true',
help='Reset existing data before creating new data',
)
def handle(self, *args, **options):
if options['reset']:
self.stdout.write('Resetting existing compliance data...')
self.reset_data()
self.stdout.write('Setting up Compliance & Governance module...')
# Create regulatory frameworks
self.create_regulatory_frameworks()
# Create compliance requirements
self.create_compliance_requirements()
# Create regulatory workflows
self.create_regulatory_workflows()
# Create retention policies
self.create_retention_policies()
self.stdout.write(
self.style.SUCCESS('Successfully set up Compliance & Governance module!')
)
def reset_data(self):
"""Reset existing compliance data"""
RetentionPolicy.objects.all().delete()
RegulatoryWorkflow.objects.all().delete()
ComplianceRequirement.objects.all().delete()
RegulatoryFramework.objects.all().delete()
def create_regulatory_frameworks(self):
"""Create sample regulatory frameworks"""
self.stdout.write('Creating regulatory frameworks...')
frameworks_data = [
{
'name': 'GDPR Compliance Framework',
'framework_type': 'GDPR',
'version': '1.0',
'description': 'General Data Protection Regulation compliance framework for EU data protection requirements.',
'applicable_regions': ['EU', 'EEA', 'UK'],
'industry_sectors': ['Technology', 'Healthcare', 'Finance', 'Retail'],
'compliance_requirements': [
'Data Protection Impact Assessment',
'Privacy by Design',
'Data Subject Rights',
'Data Breach Notification',
'Consent Management',
'Data Processing Records'
],
'is_active': True,
'effective_date': date(2018, 5, 25),
'review_date': date(2024, 5, 25),
},
{
'name': 'HIPAA Compliance Framework',
'framework_type': 'HIPAA',
'version': '1.0',
'description': 'Health Insurance Portability and Accountability Act compliance framework for healthcare data protection.',
'applicable_regions': ['US'],
'industry_sectors': ['Healthcare', 'Health Insurance', 'Healthcare Technology'],
'compliance_requirements': [
'Administrative Safeguards',
'Physical Safeguards',
'Technical Safeguards',
'Business Associate Agreements',
'Risk Assessment',
'Incident Response'
],
'is_active': True,
'effective_date': date(1996, 8, 21),
'review_date': date(2024, 8, 21),
},
{
'name': 'SOX Compliance Framework',
'framework_type': 'SOX',
'version': '1.0',
'description': 'Sarbanes-Oxley Act compliance framework for financial reporting and internal controls.',
'applicable_regions': ['US'],
'industry_sectors': ['Finance', 'Public Companies', 'Accounting'],
'compliance_requirements': [
'Internal Controls Assessment',
'Financial Reporting Controls',
'IT General Controls',
'Management Assessment',
'External Audit',
'Documentation Requirements'
],
'is_active': True,
'effective_date': date(2002, 7, 30),
'review_date': date(2024, 7, 30),
},
{
'name': 'ISO 27001 Information Security Management',
'framework_type': 'ISO27001',
'version': '1.0',
'description': 'ISO/IEC 27001 Information Security Management System standard.',
'applicable_regions': ['Global'],
'industry_sectors': ['Technology', 'Finance', 'Healthcare', 'Government', 'Manufacturing'],
'compliance_requirements': [
'Information Security Policy',
'Risk Assessment and Treatment',
'Access Control',
'Cryptography',
'Physical Security',
'Operations Security',
'Communications Security',
'System Acquisition and Development',
'Supplier Relationships',
'Information Security Incident Management',
'Business Continuity',
'Compliance'
],
'is_active': True,
'effective_date': date(2013, 10, 1),
'review_date': date(2024, 10, 1),
},
]
for framework_data in frameworks_data:
framework, created = RegulatoryFramework.objects.get_or_create(
name=framework_data['name'],
defaults=framework_data
)
if created:
self.stdout.write(f' Created framework: {framework.name}')
else:
self.stdout.write(f' Framework already exists: {framework.name}')
def create_compliance_requirements(self):
"""Create sample compliance requirements"""
self.stdout.write('Creating compliance requirements...')
# Get frameworks
gdpr_framework = RegulatoryFramework.objects.get(name='GDPR Compliance Framework')
hipaa_framework = RegulatoryFramework.objects.get(name='HIPAA Compliance Framework')
sox_framework = RegulatoryFramework.objects.get(name='SOX Compliance Framework')
iso_framework = RegulatoryFramework.objects.get(name='ISO 27001 Information Security Management')
requirements_data = [
# GDPR Requirements
{
'framework': gdpr_framework,
'requirement_id': 'GDPR-001',
'title': 'Data Protection Impact Assessment (DPIA)',
'description': 'Conduct Data Protection Impact Assessments for high-risk processing activities.',
'requirement_type': 'PROCEDURAL',
'priority': 'HIGH',
'implementation_guidance': 'Implement DPIA process for all new data processing activities that may result in high risk to individuals.',
'evidence_requirements': ['DPIA Documentation', 'Risk Assessment Records', 'Mitigation Plans'],
'testing_procedures': 'Review DPIA documentation and verify implementation of identified controls.',
'responsible_team': 'Privacy Team',
'next_assessment_date': date.today() + timedelta(days=90),
},
{
'framework': gdpr_framework,
'requirement_id': 'GDPR-002',
'title': 'Data Subject Rights Management',
'description': 'Implement processes to handle data subject rights requests (access, rectification, erasure, etc.).',
'requirement_type': 'PROCEDURAL',
'priority': 'CRITICAL',
'implementation_guidance': 'Establish clear procedures for handling data subject requests within 30 days.',
'evidence_requirements': ['Request Handling Procedures', 'Response Templates', 'Processing Records'],
'testing_procedures': 'Test data subject request handling process and verify response times.',
'responsible_team': 'Legal and Privacy Team',
'next_assessment_date': date.today() + timedelta(days=60),
},
# HIPAA Requirements
{
'framework': hipaa_framework,
'requirement_id': 'HIPAA-001',
'title': 'Administrative Safeguards',
'description': 'Implement administrative safeguards including security officer designation and workforce training.',
'requirement_type': 'ADMINISTRATIVE',
'priority': 'CRITICAL',
'implementation_guidance': 'Designate security officer, implement workforce training, and establish access management procedures.',
'evidence_requirements': ['Security Officer Documentation', 'Training Records', 'Access Management Procedures'],
'testing_procedures': 'Review training records and verify access management implementation.',
'responsible_team': 'Security Team',
'next_assessment_date': date.today() + timedelta(days=120),
},
{
'framework': hipaa_framework,
'requirement_id': 'HIPAA-002',
'title': 'Technical Safeguards',
'description': 'Implement technical safeguards including access control, audit controls, and encryption.',
'requirement_type': 'TECHNICAL',
'priority': 'CRITICAL',
'implementation_guidance': 'Implement access controls, audit logging, and encryption for PHI.',
'evidence_requirements': ['Access Control Documentation', 'Audit Logs', 'Encryption Implementation'],
'testing_procedures': 'Test access controls and verify audit logging functionality.',
'responsible_team': 'IT Security Team',
'next_assessment_date': date.today() + timedelta(days=90),
},
# SOX Requirements
{
'framework': sox_framework,
'requirement_id': 'SOX-001',
'title': 'Internal Controls Assessment',
'description': 'Assess and document internal controls over financial reporting.',
'requirement_type': 'ADMINISTRATIVE',
'priority': 'CRITICAL',
'implementation_guidance': 'Document and test internal controls related to financial reporting processes.',
'evidence_requirements': ['Control Documentation', 'Testing Results', 'Remediation Plans'],
'testing_procedures': 'Perform walkthroughs and test controls for effectiveness.',
'responsible_team': 'Internal Audit',
'next_assessment_date': date.today() + timedelta(days=180),
},
# ISO 27001 Requirements
{
'framework': iso_framework,
'requirement_id': 'ISO-001',
'title': 'Information Security Policy',
'description': 'Establish and maintain information security policies and procedures.',
'requirement_type': 'DOCUMENTATION',
'priority': 'HIGH',
'implementation_guidance': 'Develop comprehensive information security policies covering all aspects of the ISMS.',
'evidence_requirements': ['Security Policy Document', 'Policy Review Records', 'Approval Documentation'],
'testing_procedures': 'Review policy documentation and verify implementation across organization.',
'responsible_team': 'Information Security Team',
'next_assessment_date': date.today() + timedelta(days=365),
},
{
'framework': iso_framework,
'requirement_id': 'ISO-002',
'title': 'Risk Assessment and Treatment',
'description': 'Conduct regular risk assessments and implement appropriate risk treatment measures.',
'requirement_type': 'PROCEDURAL',
'priority': 'CRITICAL',
'implementation_guidance': 'Implement systematic risk assessment process and risk treatment plans.',
'evidence_requirements': ['Risk Assessment Reports', 'Risk Treatment Plans', 'Risk Register'],
'testing_procedures': 'Review risk assessment methodology and verify implementation of treatment measures.',
'responsible_team': 'Risk Management Team',
'next_assessment_date': date.today() + timedelta(days=180),
},
]
for req_data in requirements_data:
requirement, created = ComplianceRequirement.objects.get_or_create(
framework=req_data['framework'],
requirement_id=req_data['requirement_id'],
defaults=req_data
)
if created:
self.stdout.write(f' Created requirement: {requirement.requirement_id} - {requirement.title}')
else:
self.stdout.write(f' Requirement already exists: {requirement.requirement_id}')
def create_regulatory_workflows(self):
"""Create sample regulatory workflows"""
self.stdout.write('Creating regulatory workflows...')
# Get frameworks
gdpr_framework = RegulatoryFramework.objects.get(name='GDPR Compliance Framework')
hipaa_framework = RegulatoryFramework.objects.get(name='HIPAA Compliance Framework')
workflows_data = [
{
'name': 'GDPR Data Breach Response Workflow',
'workflow_type': 'DATA_BREACH',
'description': 'Workflow for handling GDPR data breach notifications and response.',
'applicable_frameworks': [gdpr_framework],
'workflow_definition': {
'steps': [
{'id': 'detect', 'name': 'Breach Detection', 'assignee': 'Security Team'},
{'id': 'assess', 'name': 'Risk Assessment', 'assignee': 'Privacy Team'},
{'id': 'notify_dpa', 'name': 'DPA Notification', 'assignee': 'Legal Team'},
{'id': 'notify_subjects', 'name': 'Data Subject Notification', 'assignee': 'Privacy Team'},
{'id': 'remediate', 'name': 'Remediation', 'assignee': 'IT Team'},
{'id': 'document', 'name': 'Documentation', 'assignee': 'Compliance Team'},
],
'transitions': [
{'from': 'detect', 'to': 'assess', 'condition': 'breach_confirmed'},
{'from': 'assess', 'to': 'notify_dpa', 'condition': 'high_risk'},
{'from': 'assess', 'to': 'remediate', 'condition': 'low_risk'},
{'from': 'notify_dpa', 'to': 'notify_subjects', 'condition': 'dpa_notified'},
{'from': 'notify_subjects', 'to': 'remediate', 'condition': 'subjects_notified'},
{'from': 'remediate', 'to': 'document', 'condition': 'remediation_complete'},
],
'end_steps': ['document']
},
'triggers': ['data_breach_detected', 'gdpr_incident_created'],
'conditions': {'framework': 'GDPR', 'severity': ['HIGH', 'CRITICAL']},
'status': 'ACTIVE',
'version': '1.0',
'notification_rules': [
{'event': 'workflow_started', 'recipients': ['privacy_team', 'legal_team']},
{'event': 'step_completed', 'recipients': ['assigned_user']},
{'event': 'workflow_completed', 'recipients': ['compliance_team']},
],
'escalation_rules': [
{'step': 'notify_dpa', 'timeout': 24, 'escalate_to': 'legal_director'},
{'step': 'notify_subjects', 'timeout': 48, 'escalate_to': 'privacy_officer'},
],
},
{
'name': 'HIPAA Incident Response Workflow',
'workflow_type': 'INCIDENT_RESPONSE',
'description': 'Workflow for handling HIPAA security incidents and breaches.',
'applicable_frameworks': [hipaa_framework],
'workflow_definition': {
'steps': [
{'id': 'detect', 'name': 'Incident Detection', 'assignee': 'Security Team'},
{'id': 'contain', 'name': 'Containment', 'assignee': 'IT Team'},
{'id': 'investigate', 'name': 'Investigation', 'assignee': 'Security Team'},
{'id': 'assess', 'name': 'Risk Assessment', 'assignee': 'Privacy Team'},
{'id': 'notify', 'name': 'Notification', 'assignee': 'Legal Team'},
{'id': 'recover', 'name': 'Recovery', 'assignee': 'IT Team'},
{'id': 'lessons', 'name': 'Lessons Learned', 'assignee': 'Security Team'},
],
'transitions': [
{'from': 'detect', 'to': 'contain', 'condition': 'incident_confirmed'},
{'from': 'contain', 'to': 'investigate', 'condition': 'contained'},
{'from': 'investigate', 'to': 'assess', 'condition': 'investigation_complete'},
{'from': 'assess', 'to': 'notify', 'condition': 'breach_confirmed'},
{'from': 'assess', 'to': 'recover', 'condition': 'no_breach'},
{'from': 'notify', 'to': 'recover', 'condition': 'notifications_sent'},
{'from': 'recover', 'to': 'lessons', 'condition': 'recovery_complete'},
],
'end_steps': ['lessons']
},
'triggers': ['hipaa_incident_created', 'phi_breach_detected'],
'conditions': {'framework': 'HIPAA'},
'status': 'ACTIVE',
'version': '1.0',
'notification_rules': [
{'event': 'workflow_started', 'recipients': ['security_team', 'privacy_officer']},
{'event': 'breach_confirmed', 'recipients': ['legal_team', 'executive_team']},
],
'escalation_rules': [
{'step': 'notify', 'timeout': 12, 'escalate_to': 'legal_director'},
],
},
]
for workflow_data in workflows_data:
applicable_frameworks = workflow_data.pop('applicable_frameworks')
workflow, created = RegulatoryWorkflow.objects.get_or_create(
name=workflow_data['name'],
defaults=workflow_data
)
if created:
workflow.applicable_frameworks.set(applicable_frameworks)
self.stdout.write(f' Created workflow: {workflow.name}')
else:
self.stdout.write(f' Workflow already exists: {workflow.name}')
def create_retention_policies(self):
"""Create sample retention policies"""
self.stdout.write('Creating retention policies...')
# Get frameworks
gdpr_framework = RegulatoryFramework.objects.get(name='GDPR Compliance Framework')
hipaa_framework = RegulatoryFramework.objects.get(name='HIPAA Compliance Framework')
sox_framework = RegulatoryFramework.objects.get(name='SOX Compliance Framework')
policies_data = [
{
'name': 'GDPR Personal Data Retention Policy',
'description': 'Retention policy for personal data under GDPR requirements.',
'policy_type': 'INCIDENT_DATA',
'applicable_frameworks': [gdpr_framework],
'retention_period': 7,
'retention_unit': 'YEARS',
'auto_archive': True,
'auto_delete': False,
'data_classification_levels': ['CONFIDENTIAL', 'RESTRICTED'],
'incident_categories': ['Data Breach', 'Privacy Incident', 'GDPR Violation'],
'legal_hold_override': True,
'is_active': True,
'effective_date': date.today(),
},
{
'name': 'HIPAA PHI Retention Policy',
'description': 'Retention policy for Protected Health Information under HIPAA.',
'policy_type': 'INCIDENT_DATA',
'applicable_frameworks': [hipaa_framework],
'retention_period': 6,
'retention_unit': 'YEARS',
'auto_archive': True,
'auto_delete': False,
'data_classification_levels': ['RESTRICTED', 'TOP_SECRET'],
'incident_categories': ['HIPAA Breach', 'PHI Incident', 'Security Incident'],
'legal_hold_override': True,
'is_active': True,
'effective_date': date.today(),
},
{
'name': 'SOX Financial Records Retention Policy',
'description': 'Retention policy for financial records and audit trails under SOX.',
'policy_type': 'AUDIT_LOGS',
'applicable_frameworks': [sox_framework],
'retention_period': 7,
'retention_unit': 'YEARS',
'auto_archive': True,
'auto_delete': False,
'data_classification_levels': ['CONFIDENTIAL', 'RESTRICTED'],
'incident_categories': ['Financial Incident', 'Audit Finding', 'Control Failure'],
'legal_hold_override': True,
'is_active': True,
'effective_date': date.today(),
},
{
'name': 'General System Logs Retention Policy',
'description': 'Retention policy for general system logs and audit trails.',
'policy_type': 'SYSTEM_LOGS',
'applicable_frameworks': [],
'retention_period': 1,
'retention_unit': 'YEARS',
'auto_archive': True,
'auto_delete': True,
'data_classification_levels': ['PUBLIC', 'INTERNAL'],
'incident_categories': ['System Incident', 'Performance Issue', 'General Security'],
'legal_hold_override': True,
'is_active': True,
'effective_date': date.today(),
},
]
for policy_data in policies_data:
applicable_frameworks = policy_data.pop('applicable_frameworks')
policy, created = RetentionPolicy.objects.get_or_create(
name=policy_data['name'],
defaults=policy_data
)
if created:
policy.applicable_frameworks.set(applicable_frameworks)
self.stdout.write(f' Created retention policy: {policy.name}')
else:
self.stdout.write(f' Retention policy already exists: {policy.name}')


@@ -0,0 +1,366 @@
# Generated by Django 5.2.6 on 2025-09-18 16:41
import django.core.validators
import django.db.models.deletion
import uuid
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('incident_intelligence', '0004_incident_oncall_assignment_incident_sla_override_and_more'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='RegulatoryFramework',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('name', models.CharField(max_length=100, unique=True)),
('framework_type', models.CharField(choices=[('GDPR', 'General Data Protection Regulation'), ('HIPAA', 'Health Insurance Portability and Accountability Act'), ('SOX', 'Sarbanes-Oxley Act'), ('ISO27001', 'ISO/IEC 27001'), ('PCI_DSS', 'Payment Card Industry Data Security Standard'), ('NIST', 'NIST Cybersecurity Framework'), ('CUSTOM', 'Custom Framework')], max_length=20)),
('version', models.CharField(default='1.0', max_length=20)),
('description', models.TextField()),
('applicable_regions', models.JSONField(default=list, help_text='List of applicable regions/countries')),
('industry_sectors', models.JSONField(default=list, help_text='List of applicable industry sectors')),
('compliance_requirements', models.JSONField(default=list, help_text='List of compliance requirements')),
('is_active', models.BooleanField(default=True)),
('effective_date', models.DateField()),
('review_date', models.DateField(blank=True, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
],
options={
'ordering': ['name', 'version'],
},
),
migrations.CreateModel(
name='ExportRequest',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('title', models.CharField(max_length=200)),
('description', models.TextField()),
('request_type', models.CharField(choices=[('REGULATORY', 'Regulatory Request'), ('AUDIT', 'Audit Request'), ('LEGAL', 'Legal Request'), ('INTERNAL', 'Internal Request')], max_length=20)),
('status', models.CharField(choices=[('PENDING', 'Pending'), ('APPROVED', 'Approved'), ('IN_PROGRESS', 'In Progress'), ('COMPLETED', 'Completed'), ('REJECTED', 'Rejected'), ('CANCELLED', 'Cancelled')], default='PENDING', max_length=20)),
('requester_name', models.CharField(max_length=200)),
('requester_organization', models.CharField(blank=True, max_length=200, null=True)),
('requester_email', models.EmailField(max_length=254)),
('requester_phone', models.CharField(blank=True, max_length=20, null=True)),
('legal_authority', models.CharField(blank=True, max_length=200, null=True)),
('data_scope', models.JSONField(help_text='Scope of data to be exported')),
('date_range_start', models.DateTimeField(blank=True, null=True)),
('date_range_end', models.DateTimeField(blank=True, null=True)),
('data_classification_levels', models.JSONField(default=list)),
('incident_categories', models.JSONField(default=list)),
('export_format', models.CharField(default='JSON', help_text='Export format (JSON, CSV, XML, PDF)', max_length=20)),
('include_evidence', models.BooleanField(default=True)),
('include_audit_trails', models.BooleanField(default=True)),
('redaction_required', models.BooleanField(default=False)),
('approval_required', models.BooleanField(default=True)),
('approved_at', models.DateTimeField(blank=True, null=True)),
('approval_notes', models.TextField(blank=True, null=True)),
('export_file_path', models.CharField(blank=True, max_length=500, null=True)),
('export_file_hash', models.CharField(blank=True, max_length=64, null=True)),
('export_file_size', models.BigIntegerField(blank=True, null=True)),
('requested_at', models.DateTimeField(auto_now_add=True)),
('due_date', models.DateTimeField(blank=True, null=True)),
('completed_at', models.DateTimeField(blank=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('approved_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='approved_exports', to=settings.AUTH_USER_MODEL)),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='created_exports', to=settings.AUTH_USER_MODEL)),
('executed_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='executed_exports', to=settings.AUTH_USER_MODEL)),
('applicable_frameworks', models.ManyToManyField(blank=True, to='compliance_governance.regulatoryframework')),
],
options={
'ordering': ['-requested_at'],
},
),
migrations.CreateModel(
name='ComplianceRequirement',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('requirement_id', models.CharField(help_text='Unique identifier within framework', max_length=50)),
('title', models.CharField(max_length=200)),
('description', models.TextField()),
('requirement_type', models.CharField(choices=[('TECHNICAL', 'Technical Control'), ('ADMINISTRATIVE', 'Administrative Control'), ('PHYSICAL', 'Physical Control'), ('PROCEDURAL', 'Procedural Control'), ('DOCUMENTATION', 'Documentation Requirement')], max_length=20)),
('priority', models.CharField(choices=[('CRITICAL', 'Critical'), ('HIGH', 'High'), ('MEDIUM', 'Medium'), ('LOW', 'Low')], default='MEDIUM', max_length=10)),
('implementation_guidance', models.TextField(blank=True, null=True)),
('evidence_requirements', models.JSONField(default=list, help_text='Types of evidence required')),
('testing_procedures', models.TextField(blank=True, null=True)),
('is_implemented', models.BooleanField(default=False)),
('implementation_date', models.DateField(blank=True, null=True)),
('last_assessment_date', models.DateField(blank=True, null=True)),
('next_assessment_date', models.DateField(blank=True, null=True)),
('compliance_status', models.CharField(choices=[('COMPLIANT', 'Compliant'), ('PARTIALLY_COMPLIANT', 'Partially Compliant'), ('NON_COMPLIANT', 'Non-Compliant'), ('NOT_ASSESSED', 'Not Assessed')], default='NOT_ASSESSED', max_length=20)),
('responsible_team', models.CharField(blank=True, max_length=100, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('assigned_to', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
('framework', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='requirements', to='compliance_governance.regulatoryframework')),
],
options={
'ordering': ['framework', 'requirement_id'],
},
),
migrations.CreateModel(
name='ComplianceReport',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('title', models.CharField(max_length=200)),
('report_type', models.CharField(choices=[('ASSESSMENT', 'Compliance Assessment'), ('AUDIT', 'Audit Report'), ('REMEDIATION', 'Remediation Report'), ('STATUS', 'Status Report'), ('EXCEPTION', 'Exception Report')], max_length=20)),
('description', models.TextField()),
('status', models.CharField(choices=[('DRAFT', 'Draft'), ('REVIEW', 'Under Review'), ('APPROVED', 'Approved'), ('PUBLISHED', 'Published'), ('ARCHIVED', 'Archived')], default='DRAFT', max_length=20)),
('executive_summary', models.TextField(blank=True, null=True)),
('findings', models.JSONField(default=list, help_text='List of findings and observations')),
('recommendations', models.JSONField(default=list, help_text='List of recommendations')),
('action_items', models.JSONField(default=list, help_text='List of action items')),
('overall_compliance_score', models.FloatField(blank=True, help_text='Overall compliance score (0-100)', null=True, validators=[django.core.validators.MinValueValidator(0.0), django.core.validators.MaxValueValidator(100.0)])),
('compliance_gaps', models.JSONField(default=list, help_text='List of compliance gaps')),
('risk_assessment', models.JSONField(default=dict, help_text='Risk assessment details')),
('report_period_start', models.DateField()),
('report_period_end', models.DateField()),
('report_file_path', models.CharField(blank=True, max_length=500, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('published_at', models.DateTimeField(blank=True, null=True)),
('approved_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='approved_reports', to=settings.AUTH_USER_MODEL)),
('prepared_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='prepared_reports', to=settings.AUTH_USER_MODEL)),
('reviewed_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='reviewed_reports', to=settings.AUTH_USER_MODEL)),
('applicable_requirements', models.ManyToManyField(blank=True, to='compliance_governance.compliancerequirement')),
('framework', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='reports', to='compliance_governance.regulatoryframework')),
],
options={
'ordering': ['-created_at'],
},
),
migrations.CreateModel(
name='RegulatoryWorkflow',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('name', models.CharField(max_length=200)),
('workflow_type', models.CharField(choices=[('INCIDENT_RESPONSE', 'Incident Response'), ('DATA_BREACH', 'Data Breach Response'), ('AUDIT', 'Audit Process'), ('ASSESSMENT', 'Compliance Assessment'), ('REMEDIATION', 'Remediation Process'), ('DOCUMENTATION', 'Documentation Process')], max_length=30)),
('description', models.TextField()),
('workflow_definition', models.JSONField(help_text='JSON definition of workflow steps and transitions')),
('triggers', models.JSONField(default=list, help_text='Events that trigger this workflow')),
('conditions', models.JSONField(default=dict, help_text='Conditions for workflow execution')),
('status', models.CharField(choices=[('DRAFT', 'Draft'), ('ACTIVE', 'Active'), ('SUSPENDED', 'Suspended'), ('ARCHIVED', 'Archived')], default='DRAFT', max_length=20)),
('version', models.CharField(default='1.0', max_length=20)),
('is_template', models.BooleanField(default=False)),
('notification_rules', models.JSONField(default=list, help_text='Notification rules for workflow events')),
('escalation_rules', models.JSONField(default=list, help_text='Escalation rules for workflow delays')),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('applicable_frameworks', models.ManyToManyField(blank=True, to='compliance_governance.regulatoryframework')),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
],
options={
'ordering': ['name', 'version'],
},
),
migrations.CreateModel(
name='RetentionPolicy',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('name', models.CharField(max_length=200, unique=True)),
('description', models.TextField()),
('policy_type', models.CharField(choices=[('INCIDENT_DATA', 'Incident Data'), ('AUDIT_LOGS', 'Audit Logs'), ('EVIDENCE', 'Evidence'), ('USER_DATA', 'User Data'), ('SYSTEM_LOGS', 'System Logs'), ('BACKUP_DATA', 'Backup Data'), ('DOCUMENTATION', 'Documentation')], max_length=30)),
('retention_period', models.PositiveIntegerField(help_text='Retention period value')),
('retention_unit', models.CharField(choices=[('DAYS', 'Days'), ('WEEKS', 'Weeks'), ('MONTHS', 'Months'), ('YEARS', 'Years')], default='YEARS', max_length=10)),
('auto_archive', models.BooleanField(default=True, help_text='Automatically archive after retention period')),
('auto_delete', models.BooleanField(default=False, help_text='Automatically delete after retention period')),
('data_classification_levels', models.JSONField(default=list, help_text='Data classification levels this policy applies to')),
('incident_categories', models.JSONField(default=list, help_text='Incident categories this policy applies to')),
('custom_filters', models.JSONField(default=dict, help_text='Custom filters for policy application')),
('legal_hold_override', models.BooleanField(default=True, help_text='Whether legal holds can override this policy')),
('exception_conditions', models.JSONField(default=list, help_text='Conditions that create exceptions to this policy')),
('is_active', models.BooleanField(default=True)),
('effective_date', models.DateField()),
('review_date', models.DateField(blank=True, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('applicable_frameworks', models.ManyToManyField(blank=True, to='compliance_governance.regulatoryframework')),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
],
options={
'ordering': ['name'],
},
),
migrations.CreateModel(
name='WorkflowInstance',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('title', models.CharField(max_length=200)),
('description', models.TextField(blank=True, null=True)),
('status', models.CharField(choices=[('PENDING', 'Pending'), ('IN_PROGRESS', 'In Progress'), ('COMPLETED', 'Completed'), ('SUSPENDED', 'Suspended'), ('CANCELLED', 'Cancelled'), ('FAILED', 'Failed')], default='PENDING', max_length=20)),
('current_step', models.CharField(blank=True, max_length=100, null=True)),
('execution_data', models.JSONField(default=dict, help_text='Runtime data for workflow execution')),
('completed_steps', models.JSONField(default=list, help_text='List of completed workflow steps')),
('started_at', models.DateTimeField(auto_now_add=True)),
('completed_at', models.DateTimeField(blank=True, null=True)),
('due_date', models.DateTimeField(blank=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('assigned_to', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='created_workflows', to=settings.AUTH_USER_MODEL)),
('related_incident', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='regulatory_workflows', to='incident_intelligence.incident')),
('related_requirement', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='workflow_instances', to='compliance_governance.compliancerequirement')),
('stakeholders', models.ManyToManyField(blank=True, related_name='workflow_stakeholders', to=settings.AUTH_USER_MODEL)),
('workflow', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='instances', to='compliance_governance.regulatoryworkflow')),
],
options={
'ordering': ['-started_at'],
},
),
migrations.CreateModel(
name='EvidenceCollection',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('title', models.CharField(max_length=200)),
('description', models.TextField()),
('evidence_type', models.CharField(choices=[('LOG_FILE', 'Log File'), ('SCREENSHOT', 'Screenshot'), ('DOCUMENT', 'Document'), ('EMAIL', 'Email'), ('DATABASE_RECORD', 'Database Record'), ('CONFIGURATION', 'Configuration File'), ('AUDIT_TRAIL', 'Audit Trail'), ('TESTIMONY', 'Testimony/Statement'), ('PHYSICAL_EVIDENCE', 'Physical Evidence'), ('DIGITAL_FORENSICS', 'Digital Forensics')], max_length=30)),
('status', models.CharField(choices=[('COLLECTED', 'Collected'), ('VERIFIED', 'Verified'), ('ANALYZED', 'Analyzed'), ('ARCHIVED', 'Archived'), ('DESTROYED', 'Destroyed')], default='COLLECTED', max_length=20)),
('file_path', models.CharField(blank=True, max_length=500, null=True)),
('file_hash', models.CharField(blank=True, help_text='SHA-256 hash for integrity', max_length=64, null=True)),
('file_size', models.BigIntegerField(blank=True, null=True)),
('mime_type', models.CharField(blank=True, max_length=100, null=True)),
('collection_method', models.CharField(blank=True, max_length=100, null=True)),
('collection_timestamp', models.DateTimeField(auto_now_add=True)),
('collection_location', models.CharField(blank=True, max_length=200, null=True)),
('collection_notes', models.TextField(blank=True, null=True)),
('custody_chain', models.JSONField(default=list, help_text='Chain of custody records')),
('retention_period', models.DurationField(blank=True, null=True)),
('disposal_date', models.DateTimeField(blank=True, null=True)),
('disposal_method', models.CharField(blank=True, max_length=100, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('collected_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='collected_evidence', to=settings.AUTH_USER_MODEL)),
('compliance_requirement', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='evidence_collection', to='compliance_governance.compliancerequirement')),
('incident', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='evidence_collection', to='incident_intelligence.incident')),
('verified_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='verified_evidence', to=settings.AUTH_USER_MODEL)),
('workflow_instance', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='evidence_collection', to='compliance_governance.workflowinstance')),
],
options={
'ordering': ['-collection_timestamp'],
},
),
migrations.CreateModel(
name='LegalHold',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('case_name', models.CharField(max_length=200)),
('case_number', models.CharField(blank=True, max_length=100, null=True)),
('description', models.TextField()),
('status', models.CharField(choices=[('ACTIVE', 'Active'), ('SUSPENDED', 'Suspended'), ('RELEASED', 'Released'), ('EXPIRED', 'Expired')], default='ACTIVE', max_length=20)),
('legal_counsel', models.CharField(blank=True, max_length=200, null=True)),
('law_firm', models.CharField(blank=True, max_length=200, null=True)),
('court_jurisdiction', models.CharField(blank=True, max_length=200, null=True)),
('data_scope', models.JSONField(help_text='Scope of data covered by this hold')),
('custodian_list', models.JSONField(default=list, help_text='List of data custodians')),
('search_criteria', models.JSONField(default=dict, help_text='Search criteria for data collection')),
('hold_date', models.DateField(help_text='Date the legal hold was issued')),
('release_date', models.DateField(blank=True, null=True)),
('expiration_date', models.DateField(blank=True, null=True)),
('notification_sent', models.BooleanField(default=False)),
('notification_date', models.DateTimeField(blank=True, null=True)),
('reminder_sent', models.BooleanField(default=False)),
('reminder_date', models.DateTimeField(blank=True, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('created_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
('related_evidence', models.ManyToManyField(blank=True, related_name='legal_holds', to='compliance_governance.evidencecollection')),
('related_incidents', models.ManyToManyField(blank=True, related_name='legal_holds', to='incident_intelligence.incident')),
],
options={
'ordering': ['-hold_date'],
'indexes': [models.Index(fields=['status', 'hold_date'], name='compliance__status_33a9b6_idx'), models.Index(fields=['expiration_date'], name='compliance__expirat_3d41f9_idx')],
},
),
migrations.AddIndex(
model_name='regulatoryframework',
index=models.Index(fields=['framework_type', 'is_active'], name='compliance__framewo_a67fc4_idx'),
),
migrations.AddIndex(
model_name='regulatoryframework',
index=models.Index(fields=['effective_date'], name='compliance__effecti_42b42f_idx'),
),
migrations.AddIndex(
model_name='exportrequest',
index=models.Index(fields=['request_type', 'status'], name='compliance__request_8dd8b7_idx'),
),
migrations.AddIndex(
model_name='exportrequest',
index=models.Index(fields=['status', 'due_date'], name='compliance__status_dadc69_idx'),
),
migrations.AddIndex(
model_name='exportrequest',
index=models.Index(fields=['requester_email'], name='compliance__request_a06e63_idx'),
),
migrations.AddIndex(
model_name='compliancerequirement',
index=models.Index(fields=['framework', 'compliance_status'], name='compliance__framewo_72cd4e_idx'),
),
migrations.AddIndex(
model_name='compliancerequirement',
index=models.Index(fields=['requirement_type', 'priority'], name='compliance__require_6eb886_idx'),
),
migrations.AddIndex(
model_name='compliancerequirement',
index=models.Index(fields=['next_assessment_date'], name='compliance__next_as_0fca2f_idx'),
),
migrations.AlterUniqueTogether(
name='compliancerequirement',
unique_together={('framework', 'requirement_id')},
),
migrations.AddIndex(
model_name='compliancereport',
index=models.Index(fields=['framework', 'report_type'], name='compliance__framewo_04b02e_idx'),
),
migrations.AddIndex(
model_name='compliancereport',
index=models.Index(fields=['status', 'report_period_end'], name='compliance__status_adc275_idx'),
),
migrations.AddIndex(
model_name='regulatoryworkflow',
index=models.Index(fields=['workflow_type', 'status'], name='compliance__workflo_ceb72e_idx'),
),
migrations.AddIndex(
model_name='regulatoryworkflow',
index=models.Index(fields=['is_template'], name='compliance__is_temp_0fed32_idx'),
),
migrations.AddIndex(
model_name='retentionpolicy',
index=models.Index(fields=['policy_type', 'is_active'], name='compliance__policy__c9edd3_idx'),
),
migrations.AddIndex(
model_name='retentionpolicy',
index=models.Index(fields=['effective_date'], name='compliance__effecti_6aa9ee_idx'),
),
migrations.AddIndex(
model_name='workflowinstance',
index=models.Index(fields=['workflow', 'status'], name='compliance__workflo_96e550_idx'),
),
migrations.AddIndex(
model_name='workflowinstance',
index=models.Index(fields=['status', 'due_date'], name='compliance__status_96c3cc_idx'),
),
migrations.AddIndex(
model_name='workflowinstance',
index=models.Index(fields=['assigned_to', 'status'], name='compliance__assigne_96daf0_idx'),
),
migrations.AddIndex(
model_name='evidencecollection',
index=models.Index(fields=['incident', 'evidence_type'], name='compliance__inciden_b30335_idx'),
),
migrations.AddIndex(
model_name='evidencecollection',
index=models.Index(fields=['status', 'collection_timestamp'], name='compliance__status_515c91_idx'),
),
migrations.AddIndex(
model_name='evidencecollection',
index=models.Index(fields=['compliance_requirement'], name='compliance__complia_5d81da_idx'),
),
]

View File

@@ -0,0 +1,708 @@
"""
Compliance & Governance models for Enterprise Incident Management API
Implements regulatory workflows, evidence collection, retention policies, and export APIs
"""
import uuid
from datetime import timedelta
from typing import Optional
from django.db import models
from django.contrib.auth import get_user_model
from django.core.validators import MinValueValidator, MaxValueValidator
from django.utils import timezone
User = get_user_model()
class RegulatoryFramework(models.Model):
"""Regulatory frameworks and standards (GDPR, HIPAA, SOX, ISO27001)"""
FRAMEWORK_TYPES = [
('GDPR', 'General Data Protection Regulation'),
('HIPAA', 'Health Insurance Portability and Accountability Act'),
('SOX', 'Sarbanes-Oxley Act'),
('ISO27001', 'ISO/IEC 27001'),
('PCI_DSS', 'Payment Card Industry Data Security Standard'),
('NIST', 'NIST Cybersecurity Framework'),
('CUSTOM', 'Custom Framework'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(max_length=100, unique=True)
framework_type = models.CharField(max_length=20, choices=FRAMEWORK_TYPES)
version = models.CharField(max_length=20, default='1.0')
description = models.TextField()
# Framework details
applicable_regions = models.JSONField(default=list, help_text="List of applicable regions/countries")
industry_sectors = models.JSONField(default=list, help_text="List of applicable industry sectors")
compliance_requirements = models.JSONField(default=list, help_text="List of compliance requirements")
# Status and dates
is_active = models.BooleanField(default=True)
effective_date = models.DateField()
review_date = models.DateField(null=True, blank=True)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
class Meta:
ordering = ['name', 'version']
indexes = [
models.Index(fields=['framework_type', 'is_active']),
models.Index(fields=['effective_date']),
]
def __str__(self):
return f"{self.name} v{self.version}"
class ComplianceRequirement(models.Model):
"""Individual compliance requirements within frameworks"""
REQUIREMENT_TYPES = [
('TECHNICAL', 'Technical Control'),
('ADMINISTRATIVE', 'Administrative Control'),
('PHYSICAL', 'Physical Control'),
('PROCEDURAL', 'Procedural Control'),
('DOCUMENTATION', 'Documentation Requirement'),
]
PRIORITY_LEVELS = [
('CRITICAL', 'Critical'),
('HIGH', 'High'),
('MEDIUM', 'Medium'),
('LOW', 'Low'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
framework = models.ForeignKey(RegulatoryFramework, on_delete=models.CASCADE, related_name='requirements')
# Requirement details
requirement_id = models.CharField(max_length=50, help_text="Unique identifier within framework")
title = models.CharField(max_length=200)
description = models.TextField()
requirement_type = models.CharField(max_length=20, choices=REQUIREMENT_TYPES)
priority = models.CharField(max_length=10, choices=PRIORITY_LEVELS, default='MEDIUM')
# Implementation details
implementation_guidance = models.TextField(blank=True, null=True)
evidence_requirements = models.JSONField(default=list, help_text="Types of evidence required")
testing_procedures = models.TextField(blank=True, null=True)
# Compliance tracking
is_implemented = models.BooleanField(default=False)
implementation_date = models.DateField(null=True, blank=True)
last_assessment_date = models.DateField(null=True, blank=True)
next_assessment_date = models.DateField(null=True, blank=True)
compliance_status = models.CharField(
max_length=20,
choices=[
('COMPLIANT', 'Compliant'),
('PARTIALLY_COMPLIANT', 'Partially Compliant'),
('NON_COMPLIANT', 'Non-Compliant'),
('NOT_ASSESSED', 'Not Assessed'),
],
default='NOT_ASSESSED'
)
# Relationships
responsible_team = models.CharField(max_length=100, blank=True, null=True)
assigned_to = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ['framework', 'requirement_id']
unique_together = ['framework', 'requirement_id']
indexes = [
models.Index(fields=['framework', 'compliance_status']),
models.Index(fields=['requirement_type', 'priority']),
models.Index(fields=['next_assessment_date']),
]
def __str__(self):
return f"{self.framework.name} - {self.requirement_id}: {self.title}"
class RegulatoryWorkflow(models.Model):
"""Workflow definitions for regulatory compliance processes"""
WORKFLOW_TYPES = [
('INCIDENT_RESPONSE', 'Incident Response'),
('DATA_BREACH', 'Data Breach Response'),
('AUDIT', 'Audit Process'),
('ASSESSMENT', 'Compliance Assessment'),
('REMEDIATION', 'Remediation Process'),
('DOCUMENTATION', 'Documentation Process'),
]
STATUS_CHOICES = [
('DRAFT', 'Draft'),
('ACTIVE', 'Active'),
('SUSPENDED', 'Suspended'),
('ARCHIVED', 'Archived'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(max_length=200)
workflow_type = models.CharField(max_length=30, choices=WORKFLOW_TYPES)
description = models.TextField()
# Framework association
applicable_frameworks = models.ManyToManyField(RegulatoryFramework, blank=True)
# Workflow definition
workflow_definition = models.JSONField(help_text="JSON definition of workflow steps and transitions")
triggers = models.JSONField(default=list, help_text="Events that trigger this workflow")
conditions = models.JSONField(default=dict, help_text="Conditions for workflow execution")
# Status and lifecycle
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='DRAFT')
version = models.CharField(max_length=20, default='1.0')
is_template = models.BooleanField(default=False)
# Notifications and escalations
notification_rules = models.JSONField(default=list, help_text="Notification rules for workflow events")
escalation_rules = models.JSONField(default=list, help_text="Escalation rules for workflow delays")
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
class Meta:
ordering = ['name', 'version']
indexes = [
models.Index(fields=['workflow_type', 'status']),
models.Index(fields=['is_template']),
]
def __str__(self):
return f"{self.name} v{self.version}"
class WorkflowInstance(models.Model):
"""Instance of a regulatory workflow execution"""
STATUS_CHOICES = [
('PENDING', 'Pending'),
('IN_PROGRESS', 'In Progress'),
('COMPLETED', 'Completed'),
('SUSPENDED', 'Suspended'),
('CANCELLED', 'Cancelled'),
('FAILED', 'Failed'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
workflow = models.ForeignKey(RegulatoryWorkflow, on_delete=models.CASCADE, related_name='instances')
# Instance details
title = models.CharField(max_length=200)
description = models.TextField(blank=True, null=True)
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='PENDING')
# Related objects
related_incident = models.ForeignKey(
'incident_intelligence.Incident',
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='regulatory_workflows'
)
related_requirement = models.ForeignKey(
ComplianceRequirement,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='workflow_instances'
)
# Execution tracking
current_step = models.CharField(max_length=100, blank=True, null=True)
execution_data = models.JSONField(default=dict, help_text="Runtime data for workflow execution")
completed_steps = models.JSONField(default=list, help_text="List of completed workflow steps")
# Assignments and responsibilities
assigned_to = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
stakeholders = models.ManyToManyField(User, blank=True, related_name='workflow_stakeholders')
# Timestamps
started_at = models.DateTimeField(auto_now_add=True)
completed_at = models.DateTimeField(null=True, blank=True)
due_date = models.DateTimeField(null=True, blank=True)
# Metadata
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='created_workflows')
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ['-started_at']
indexes = [
models.Index(fields=['workflow', 'status']),
models.Index(fields=['status', 'due_date']),
models.Index(fields=['assigned_to', 'status']),
]
def __str__(self):
return f"{self.workflow.name} - {self.title} ({self.status})"
class EvidenceCollection(models.Model):
"""Evidence collection and linking to incidents for compliance"""
EVIDENCE_TYPES = [
('LOG_FILE', 'Log File'),
('SCREENSHOT', 'Screenshot'),
('DOCUMENT', 'Document'),
('EMAIL', 'Email'),
('DATABASE_RECORD', 'Database Record'),
('CONFIGURATION', 'Configuration File'),
('AUDIT_TRAIL', 'Audit Trail'),
('TESTIMONY', 'Testimony/Statement'),
('PHYSICAL_EVIDENCE', 'Physical Evidence'),
('DIGITAL_FORENSICS', 'Digital Forensics'),
]
STATUS_CHOICES = [
('COLLECTED', 'Collected'),
('VERIFIED', 'Verified'),
('ANALYZED', 'Analyzed'),
('ARCHIVED', 'Archived'),
('DESTROYED', 'Destroyed'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
# Evidence details
title = models.CharField(max_length=200)
description = models.TextField()
evidence_type = models.CharField(max_length=30, choices=EVIDENCE_TYPES)
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='COLLECTED')
# Related objects
incident = models.ForeignKey(
'incident_intelligence.Incident',
on_delete=models.CASCADE,
related_name='evidence_collection'
)
workflow_instance = models.ForeignKey(
WorkflowInstance,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='evidence_collection'
)
compliance_requirement = models.ForeignKey(
ComplianceRequirement,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='evidence_collection'
)
# Evidence metadata
file_path = models.CharField(max_length=500, blank=True, null=True)
file_hash = models.CharField(max_length=64, blank=True, null=True, help_text="SHA-256 hash for integrity")
file_size = models.BigIntegerField(null=True, blank=True)
mime_type = models.CharField(max_length=100, blank=True, null=True)
# Collection details
collection_method = models.CharField(max_length=100, blank=True, null=True)
collection_timestamp = models.DateTimeField(auto_now_add=True)
collection_location = models.CharField(max_length=200, blank=True, null=True)
collection_notes = models.TextField(blank=True, null=True)
# Chain of custody
collected_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='collected_evidence')
verified_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='verified_evidence')
custody_chain = models.JSONField(default=list, help_text="Chain of custody records")
# Retention and disposal
retention_period = models.DurationField(null=True, blank=True)
disposal_date = models.DateTimeField(null=True, blank=True)
disposal_method = models.CharField(max_length=100, blank=True, null=True)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ['-collection_timestamp']
indexes = [
models.Index(fields=['incident', 'evidence_type']),
models.Index(fields=['status', 'collection_timestamp']),
models.Index(fields=['compliance_requirement']),
]
def __str__(self):
return f"{self.title} ({self.evidence_type})"
    def add_custody_record(self, user: User, action: str, notes: Optional[str] = None):
"""Add a custody record to the chain of custody"""
custody_record = {
'timestamp': timezone.now().isoformat(),
'user_id': str(user.id),
'user_name': user.get_full_name() or user.username,
'action': action,
'notes': notes or ''
}
self.custody_chain.append(custody_record)
self.save()
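    # Illustrative usage (a sketch, not part of the model's API surface): each
    # hand-off appends a timestamped entry to `custody_chain` and persists it, e.g.
    #   evidence.add_custody_record(
    #       user=request.user,
    #       action='TRANSFERRED',
    #       notes='Handed off to the forensics team',
    #   )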
class RetentionPolicy(models.Model):
"""Data retention policies for compliance and governance"""
POLICY_TYPES = [
('INCIDENT_DATA', 'Incident Data'),
('AUDIT_LOGS', 'Audit Logs'),
('EVIDENCE', 'Evidence'),
('USER_DATA', 'User Data'),
('SYSTEM_LOGS', 'System Logs'),
('BACKUP_DATA', 'Backup Data'),
('DOCUMENTATION', 'Documentation'),
]
RETENTION_UNITS = [
('DAYS', 'Days'),
('WEEKS', 'Weeks'),
('MONTHS', 'Months'),
('YEARS', 'Years'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(max_length=200, unique=True)
description = models.TextField()
policy_type = models.CharField(max_length=30, choices=POLICY_TYPES)
# Framework association
applicable_frameworks = models.ManyToManyField(RegulatoryFramework, blank=True)
# Retention rules
retention_period = models.PositiveIntegerField(help_text="Retention period value")
retention_unit = models.CharField(max_length=10, choices=RETENTION_UNITS, default='YEARS')
auto_archive = models.BooleanField(default=True, help_text="Automatically archive after retention period")
auto_delete = models.BooleanField(default=False, help_text="Automatically delete after retention period")
# Conditions and filters
data_classification_levels = models.JSONField(
default=list,
help_text="Data classification levels this policy applies to"
)
incident_categories = models.JSONField(
default=list,
help_text="Incident categories this policy applies to"
)
custom_filters = models.JSONField(
default=dict,
help_text="Custom filters for policy application"
)
# Legal holds and exceptions
legal_hold_override = models.BooleanField(
default=True,
help_text="Whether legal holds can override this policy"
)
exception_conditions = models.JSONField(
default=list,
help_text="Conditions that create exceptions to this policy"
)
# Status and lifecycle
is_active = models.BooleanField(default=True)
effective_date = models.DateField()
review_date = models.DateField(null=True, blank=True)
# Metadata
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
class Meta:
ordering = ['name']
indexes = [
models.Index(fields=['policy_type', 'is_active']),
models.Index(fields=['effective_date']),
]
def __str__(self):
return f"{self.name} ({self.retention_period} {self.retention_unit})"
def get_retention_duration(self):
"""Get retention duration as timedelta"""
unit_multipliers = {
'DAYS': 1,
'WEEKS': 7,
'MONTHS': 30,
'YEARS': 365
}
days = self.retention_period * unit_multipliers.get(self.retention_unit, 365)
return timedelta(days=days)
    def applies_to_data(self, data_classification: Optional[str] = None, incident_category: Optional[str] = None) -> bool:
"""Check if this policy applies to given data"""
if not self.is_active:
return False
# Check data classification
if data_classification and self.data_classification_levels:
if data_classification not in self.data_classification_levels:
return False
# Check incident category
if incident_category and self.incident_categories:
if incident_category not in self.incident_categories:
return False
return True
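    # Illustrative usage (a sketch; `incident` is a hypothetical object with the
    # attributes shown): combine both helpers to derive a disposal date.
    #   if policy.applies_to_data(data_classification='CONFIDENTIAL',
    #                             incident_category='DATA_BREACH'):
    #       disposal_date = incident.created_at + policy.get_retention_duration()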
class ExportRequest(models.Model):
"""Export requests for regulators and auditors"""
REQUEST_TYPES = [
('REGULATORY', 'Regulatory Request'),
('AUDIT', 'Audit Request'),
('LEGAL', 'Legal Request'),
('INTERNAL', 'Internal Request'),
]
STATUS_CHOICES = [
('PENDING', 'Pending'),
('APPROVED', 'Approved'),
('IN_PROGRESS', 'In Progress'),
('COMPLETED', 'Completed'),
('REJECTED', 'Rejected'),
('CANCELLED', 'Cancelled'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
# Request details
title = models.CharField(max_length=200)
description = models.TextField()
request_type = models.CharField(max_length=20, choices=REQUEST_TYPES)
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='PENDING')
# Requester information
requester_name = models.CharField(max_length=200)
requester_organization = models.CharField(max_length=200, blank=True, null=True)
requester_email = models.EmailField()
requester_phone = models.CharField(max_length=20, blank=True, null=True)
legal_authority = models.CharField(max_length=200, blank=True, null=True)
# Request parameters
data_scope = models.JSONField(help_text="Scope of data to be exported")
date_range_start = models.DateTimeField(null=True, blank=True)
date_range_end = models.DateTimeField(null=True, blank=True)
data_classification_levels = models.JSONField(default=list)
incident_categories = models.JSONField(default=list)
# Framework association
applicable_frameworks = models.ManyToManyField(RegulatoryFramework, blank=True)
# Export details
export_format = models.CharField(max_length=20, default='JSON', help_text="Export format (JSON, CSV, XML, PDF)")
include_evidence = models.BooleanField(default=True)
include_audit_trails = models.BooleanField(default=True)
redaction_required = models.BooleanField(default=False)
# Approval workflow
approval_required = models.BooleanField(default=True)
approved_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='approved_exports')
approved_at = models.DateTimeField(null=True, blank=True)
approval_notes = models.TextField(blank=True, null=True)
# Execution details
executed_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='executed_exports')
export_file_path = models.CharField(max_length=500, blank=True, null=True)
export_file_hash = models.CharField(max_length=64, blank=True, null=True)
export_file_size = models.BigIntegerField(null=True, blank=True)
# Timestamps
requested_at = models.DateTimeField(auto_now_add=True)
due_date = models.DateTimeField(null=True, blank=True)
completed_at = models.DateTimeField(null=True, blank=True)
# Metadata
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='created_exports')
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ['-requested_at']
indexes = [
models.Index(fields=['request_type', 'status']),
models.Index(fields=['status', 'due_date']),
models.Index(fields=['requester_email']),
]
def __str__(self):
return f"{self.title} - {self.requester_organization or 'Internal'}"
def is_overdue(self) -> bool:
"""Check if the export request is overdue"""
if not self.due_date:
return False
return timezone.now() > self.due_date and self.status not in ['COMPLETED', 'CANCELLED']
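    # Illustrative query (a sketch mirroring is_overdue() at the queryset level):
    #   ExportRequest.objects.filter(due_date__lt=timezone.now()).exclude(
    #       status__in=['COMPLETED', 'CANCELLED'])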
class ComplianceReport(models.Model):
"""Compliance reports and assessments"""
REPORT_TYPES = [
('ASSESSMENT', 'Compliance Assessment'),
('AUDIT', 'Audit Report'),
('REMEDIATION', 'Remediation Report'),
('STATUS', 'Status Report'),
('EXCEPTION', 'Exception Report'),
]
STATUS_CHOICES = [
('DRAFT', 'Draft'),
('REVIEW', 'Under Review'),
('APPROVED', 'Approved'),
('PUBLISHED', 'Published'),
('ARCHIVED', 'Archived'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
# Report details
title = models.CharField(max_length=200)
report_type = models.CharField(max_length=20, choices=REPORT_TYPES)
description = models.TextField()
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='DRAFT')
# Framework association
framework = models.ForeignKey(RegulatoryFramework, on_delete=models.CASCADE, related_name='reports')
applicable_requirements = models.ManyToManyField(ComplianceRequirement, blank=True)
# Report content
executive_summary = models.TextField(blank=True, null=True)
findings = models.JSONField(default=list, help_text="List of findings and observations")
recommendations = models.JSONField(default=list, help_text="List of recommendations")
action_items = models.JSONField(default=list, help_text="List of action items")
# Assessment details
overall_compliance_score = models.FloatField(
null=True, blank=True,
validators=[MinValueValidator(0.0), MaxValueValidator(100.0)],
help_text="Overall compliance score (0-100)"
)
compliance_gaps = models.JSONField(default=list, help_text="List of compliance gaps")
risk_assessment = models.JSONField(default=dict, help_text="Risk assessment details")
# Report metadata
report_period_start = models.DateField()
report_period_end = models.DateField()
report_file_path = models.CharField(max_length=500, blank=True, null=True)
# Approval and review
prepared_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='prepared_reports')
reviewed_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='reviewed_reports')
approved_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True, related_name='approved_reports')
# Timestamps
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
published_at = models.DateTimeField(null=True, blank=True)
class Meta:
ordering = ['-created_at']
indexes = [
models.Index(fields=['framework', 'report_type']),
models.Index(fields=['status', 'report_period_end']),
]
def __str__(self):
return f"{self.title} - {self.framework.name}"
class LegalHold(models.Model):
"""Legal holds to prevent data deletion during litigation or investigations"""
STATUS_CHOICES = [
('ACTIVE', 'Active'),
('SUSPENDED', 'Suspended'),
('RELEASED', 'Released'),
('EXPIRED', 'Expired'),
]
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
# Hold details
case_name = models.CharField(max_length=200)
case_number = models.CharField(max_length=100, blank=True, null=True)
description = models.TextField()
status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='ACTIVE')
# Legal information
legal_counsel = models.CharField(max_length=200, blank=True, null=True)
law_firm = models.CharField(max_length=200, blank=True, null=True)
court_jurisdiction = models.CharField(max_length=200, blank=True, null=True)
# Scope and criteria
data_scope = models.JSONField(help_text="Scope of data covered by this hold")
custodian_list = models.JSONField(default=list, help_text="List of data custodians")
search_criteria = models.JSONField(default=dict, help_text="Search criteria for data collection")
# Dates
hold_date = models.DateField(help_text="Date the legal hold was issued")
release_date = models.DateField(null=True, blank=True)
expiration_date = models.DateField(null=True, blank=True)
# Related objects
related_incidents = models.ManyToManyField(
'incident_intelligence.Incident',
blank=True,
related_name='legal_holds'
)
related_evidence = models.ManyToManyField(
EvidenceCollection,
blank=True,
related_name='legal_holds'
)
# Notifications
notification_sent = models.BooleanField(default=False)
notification_date = models.DateTimeField(null=True, blank=True)
reminder_sent = models.BooleanField(default=False)
reminder_date = models.DateTimeField(null=True, blank=True)
# Metadata
created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True, blank=True)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ['-hold_date']
indexes = [
models.Index(fields=['status', 'hold_date']),
models.Index(fields=['expiration_date']),
]
def __str__(self):
return f"{self.case_name} - {self.case_number or 'No Case Number'}"
def is_active(self) -> bool:
"""Check if the legal hold is currently active"""
if self.status != 'ACTIVE':
return False
if self.expiration_date and timezone.now().date() > self.expiration_date:
return False
return True
def get_affected_data_count(self) -> int:
"""Get count of data items affected by this legal hold"""
incident_count = self.related_incidents.count()
evidence_count = self.related_evidence.count()
return incident_count + evidence_count
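    # Illustrative usage (a sketch): gate destructive operations on active holds,
    # mirroring the pre_delete guards in signals.py.
    #   if any(hold.is_active() for hold in incident.legal_holds.all()):
    #       ...  # block deletion / archival of the incident's data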

View File

@@ -0,0 +1,343 @@
"""
Serializers for Compliance & Governance API endpoints
"""
from rest_framework import serializers
from django.contrib.auth import get_user_model
from ..models import (
RegulatoryFramework,
ComplianceRequirement,
RegulatoryWorkflow,
WorkflowInstance,
EvidenceCollection,
RetentionPolicy,
ExportRequest,
ComplianceReport,
LegalHold,
)
User = get_user_model()
class RegulatoryFrameworkSerializer(serializers.ModelSerializer):
"""Serializer for Regulatory Framework"""
requirements_count = serializers.SerializerMethodField()
framework_type_display = serializers.CharField(source='get_framework_type_display', read_only=True)
class Meta:
model = RegulatoryFramework
fields = [
'id', 'name', 'framework_type', 'framework_type_display', 'version', 'description',
'applicable_regions', 'industry_sectors', 'compliance_requirements',
            'is_active', 'effective_date', 'review_date',
'created_at', 'updated_at', 'created_by', 'requirements_count'
]
read_only_fields = ['id', 'created_at', 'updated_at']
def get_requirements_count(self, obj):
return obj.requirements.count()
class ComplianceRequirementSerializer(serializers.ModelSerializer):
"""Serializer for Compliance Requirement"""
framework_name = serializers.CharField(source='framework.name', read_only=True)
requirement_type_display = serializers.CharField(source='get_requirement_type_display', read_only=True)
priority_display = serializers.CharField(source='get_priority_display', read_only=True)
compliance_status_display = serializers.CharField(source='get_compliance_status_display', read_only=True)
assigned_to_name = serializers.CharField(source='assigned_to.get_full_name', read_only=True)
class Meta:
model = ComplianceRequirement
fields = [
'id', 'framework', 'framework_name', 'requirement_id', 'title', 'description',
'requirement_type', 'requirement_type_display', 'priority', 'priority_display',
'implementation_guidance', 'evidence_requirements', 'testing_procedures',
'is_implemented', 'implementation_date', 'last_assessment_date', 'next_assessment_date',
'compliance_status', 'compliance_status_display', 'responsible_team', 'assigned_to',
'assigned_to_name', 'created_at', 'updated_at'
]
read_only_fields = ['id', 'created_at', 'updated_at']
class RegulatoryWorkflowSerializer(serializers.ModelSerializer):
"""Serializer for Regulatory Workflow"""
workflow_type_display = serializers.CharField(source='get_workflow_type_display', read_only=True)
status_display = serializers.CharField(source='get_status_display', read_only=True)
applicable_frameworks_names = serializers.StringRelatedField(
source='applicable_frameworks', many=True, read_only=True
)
instances_count = serializers.SerializerMethodField()
created_by_name = serializers.CharField(source='created_by.get_full_name', read_only=True)
class Meta:
model = RegulatoryWorkflow
fields = [
'id', 'name', 'workflow_type', 'workflow_type_display', 'description',
'applicable_frameworks', 'applicable_frameworks_names', 'workflow_definition',
'triggers', 'conditions', 'status', 'status_display', 'version', 'is_template',
'notification_rules', 'escalation_rules', 'created_at', 'updated_at',
'created_by', 'created_by_name', 'instances_count'
]
read_only_fields = ['id', 'created_at', 'updated_at']
def get_instances_count(self, obj):
return obj.instances.count()
class WorkflowInstanceSerializer(serializers.ModelSerializer):
"""Serializer for Workflow Instance"""
workflow_name = serializers.CharField(source='workflow.name', read_only=True)
status_display = serializers.CharField(source='get_status_display', read_only=True)
related_incident_title = serializers.CharField(source='related_incident.title', read_only=True)
related_requirement_title = serializers.CharField(source='related_requirement.title', read_only=True)
assigned_to_name = serializers.CharField(source='assigned_to.get_full_name', read_only=True)
created_by_name = serializers.CharField(source='created_by.get_full_name', read_only=True)
stakeholders_names = serializers.StringRelatedField(
source='stakeholders', many=True, read_only=True
)
is_overdue = serializers.SerializerMethodField()
class Meta:
model = WorkflowInstance
fields = [
'id', 'workflow', 'workflow_name', 'title', 'description', 'status', 'status_display',
'related_incident', 'related_incident_title', 'related_requirement', 'related_requirement_title',
'current_step', 'execution_data', 'completed_steps', 'assigned_to', 'assigned_to_name',
'stakeholders', 'stakeholders_names', 'started_at', 'completed_at', 'due_date',
'is_overdue', 'created_by', 'created_by_name', 'updated_at'
]
read_only_fields = ['id', 'started_at', 'updated_at']
def get_is_overdue(self, obj):
if not obj.due_date:
return False
from django.utils import timezone
return timezone.now() > obj.due_date and obj.status not in ['COMPLETED', 'CANCELLED']
class EvidenceCollectionSerializer(serializers.ModelSerializer):
"""Serializer for Evidence Collection"""
evidence_type_display = serializers.CharField(source='get_evidence_type_display', read_only=True)
status_display = serializers.CharField(source='get_status_display', read_only=True)
incident_title = serializers.CharField(source='incident.title', read_only=True)
workflow_instance_title = serializers.CharField(source='workflow_instance.title', read_only=True)
compliance_requirement_title = serializers.CharField(source='compliance_requirement.title', read_only=True)
collected_by_name = serializers.CharField(source='collected_by.get_full_name', read_only=True)
verified_by_name = serializers.CharField(source='verified_by.get_full_name', read_only=True)
file_size_display = serializers.SerializerMethodField()
class Meta:
model = EvidenceCollection
fields = [
'id', 'title', 'description', 'evidence_type', 'evidence_type_display', 'status', 'status_display',
'incident', 'incident_title', 'workflow_instance', 'workflow_instance_title',
'compliance_requirement', 'compliance_requirement_title', 'file_path', 'file_hash',
'file_size', 'file_size_display', 'mime_type', 'collection_method', 'collection_timestamp',
'collection_location', 'collection_notes', 'collected_by', 'collected_by_name',
'verified_by', 'verified_by_name', 'custody_chain', 'retention_period', 'disposal_date',
'disposal_method', 'created_at', 'updated_at'
]
read_only_fields = ['id', 'collection_timestamp', 'created_at', 'updated_at']
    def get_file_size_display(self, obj):
        if not obj.file_size:
            return None
        # Convert bytes to a human-readable format without mutating the instance
        size = obj.file_size
        for unit in ['B', 'KB', 'MB', 'GB', 'TB']:
            if size < 1024.0:
                return f"{size:.1f} {unit}"
            size /= 1024.0
        return f"{size:.1f} PB"
class RetentionPolicySerializer(serializers.ModelSerializer):
"""Serializer for Retention Policy"""
policy_type_display = serializers.CharField(source='get_policy_type_display', read_only=True)
retention_unit_display = serializers.CharField(source='get_retention_unit_display', read_only=True)
applicable_frameworks_names = serializers.StringRelatedField(
source='applicable_frameworks', many=True, read_only=True
)
created_by_name = serializers.CharField(source='created_by.get_full_name', read_only=True)
retention_duration_display = serializers.SerializerMethodField()
class Meta:
model = RetentionPolicy
fields = [
'id', 'name', 'description', 'policy_type', 'policy_type_display',
'applicable_frameworks', 'applicable_frameworks_names', 'retention_period',
'retention_unit', 'retention_unit_display', 'retention_duration_display',
'auto_archive', 'auto_delete', 'data_classification_levels', 'incident_categories',
'custom_filters', 'legal_hold_override', 'exception_conditions', 'is_active',
'effective_date', 'review_date', 'created_at', 'updated_at', 'created_by', 'created_by_name'
]
read_only_fields = ['id', 'created_at', 'updated_at']
def get_retention_duration_display(self, obj):
duration = obj.get_retention_duration()
days = duration.days
if days >= 365:
years = days // 365
return f"{years} year{'s' if years > 1 else ''}"
elif days >= 30:
months = days // 30
return f"{months} month{'s' if months > 1 else ''}"
elif days >= 7:
weeks = days // 7
return f"{weeks} week{'s' if weeks > 1 else ''}"
else:
return f"{days} day{'s' if days > 1 else ''}"
class ExportRequestSerializer(serializers.ModelSerializer):
"""Serializer for Export Request"""
request_type_display = serializers.CharField(source='get_request_type_display', read_only=True)
status_display = serializers.CharField(source='get_status_display', read_only=True)
applicable_frameworks_names = serializers.StringRelatedField(
source='applicable_frameworks', many=True, read_only=True
)
approved_by_name = serializers.CharField(source='approved_by.get_full_name', read_only=True)
executed_by_name = serializers.CharField(source='executed_by.get_full_name', read_only=True)
created_by_name = serializers.CharField(source='created_by.get_full_name', read_only=True)
is_overdue = serializers.SerializerMethodField()
export_file_size_display = serializers.SerializerMethodField()
class Meta:
model = ExportRequest
fields = [
'id', 'title', 'description', 'request_type', 'request_type_display', 'status', 'status_display',
'requester_name', 'requester_organization', 'requester_email', 'requester_phone', 'legal_authority',
'data_scope', 'date_range_start', 'date_range_end', 'data_classification_levels', 'incident_categories',
'applicable_frameworks', 'applicable_frameworks_names', 'export_format', 'include_evidence',
'include_audit_trails', 'redaction_required', 'approval_required', 'approved_by', 'approved_by_name',
'approved_at', 'approval_notes', 'executed_by', 'executed_by_name', 'export_file_path',
'export_file_hash', 'export_file_size', 'export_file_size_display', 'requested_at', 'due_date',
'is_overdue', 'completed_at', 'created_by', 'created_by_name', 'updated_at'
]
read_only_fields = ['id', 'requested_at', 'updated_at']
def get_is_overdue(self, obj):
return obj.is_overdue()
def get_export_file_size_display(self, obj):
if not obj.export_file_size:
return None
# Convert bytes to human readable format
size = obj.export_file_size
for unit in ['B', 'KB', 'MB', 'GB', 'TB']:
if size < 1024.0:
return f"{size:.1f} {unit}"
size /= 1024.0
return f"{size:.1f} PB"
class ComplianceReportSerializer(serializers.ModelSerializer):
"""Serializer for Compliance Report"""
report_type_display = serializers.CharField(source='get_report_type_display', read_only=True)
status_display = serializers.CharField(source='get_status_display', read_only=True)
framework_name = serializers.CharField(source='framework.name', read_only=True)
applicable_requirements_titles = serializers.StringRelatedField(
source='applicable_requirements', many=True, read_only=True
)
prepared_by_name = serializers.CharField(source='prepared_by.get_full_name', read_only=True)
reviewed_by_name = serializers.CharField(source='reviewed_by.get_full_name', read_only=True)
approved_by_name = serializers.CharField(source='approved_by.get_full_name', read_only=True)
class Meta:
model = ComplianceReport
fields = [
'id', 'title', 'report_type', 'report_type_display', 'description', 'status', 'status_display',
'framework', 'framework_name', 'applicable_requirements', 'applicable_requirements_titles',
'executive_summary', 'findings', 'recommendations', 'action_items', 'overall_compliance_score',
'compliance_gaps', 'risk_assessment', 'report_period_start', 'report_period_end',
'report_file_path', 'prepared_by', 'prepared_by_name', 'reviewed_by', 'reviewed_by_name',
'approved_by', 'approved_by_name', 'created_at', 'updated_at', 'published_at'
]
read_only_fields = ['id', 'created_at', 'updated_at']
class LegalHoldSerializer(serializers.ModelSerializer):
"""Serializer for Legal Hold"""
status_display = serializers.CharField(source='get_status_display', read_only=True)
related_incidents_titles = serializers.StringRelatedField(
source='related_incidents', many=True, read_only=True
)
related_evidence_titles = serializers.StringRelatedField(
source='related_evidence', many=True, read_only=True
)
created_by_name = serializers.CharField(source='created_by.get_full_name', read_only=True)
is_active = serializers.SerializerMethodField()
affected_data_count = serializers.SerializerMethodField()
class Meta:
model = LegalHold
fields = [
'id', 'case_name', 'case_number', 'description', 'status', 'status_display',
'legal_counsel', 'law_firm', 'court_jurisdiction', 'data_scope', 'custodian_list',
'search_criteria', 'hold_date', 'release_date', 'expiration_date',
'related_incidents', 'related_incidents_titles', 'related_evidence', 'related_evidence_titles',
'notification_sent', 'notification_date', 'reminder_sent', 'reminder_date',
'is_active', 'affected_data_count', 'created_by', 'created_by_name', 'created_at', 'updated_at'
]
read_only_fields = ['id', 'created_at', 'updated_at']
def get_is_active(self, obj):
return obj.is_active()
def get_affected_data_count(self, obj):
return obj.get_affected_data_count()
# Nested serializers for detailed views
class ComplianceRequirementDetailSerializer(ComplianceRequirementSerializer):
    """Detailed serializer for Compliance Requirement with related data"""
    framework = RegulatoryFrameworkSerializer(read_only=True)
    evidence_collection = EvidenceCollectionSerializer(many=True, read_only=True)
    workflow_instances = WorkflowInstanceSerializer(many=True, read_only=True)
    class Meta(ComplianceRequirementSerializer.Meta):
        fields = ComplianceRequirementSerializer.Meta.fields + ['evidence_collection', 'workflow_instances']
class WorkflowInstanceDetailSerializer(WorkflowInstanceSerializer):
    """Detailed serializer for Workflow Instance with related data"""
    workflow = RegulatoryWorkflowSerializer(read_only=True)
    evidence_collection = EvidenceCollectionSerializer(many=True, read_only=True)
    class Meta(WorkflowInstanceSerializer.Meta):
        fields = WorkflowInstanceSerializer.Meta.fields + ['evidence_collection']
class EvidenceCollectionDetailSerializer(EvidenceCollectionSerializer):
"""Detailed serializer for Evidence Collection with related data"""
incident = serializers.StringRelatedField(read_only=True)
workflow_instance = WorkflowInstanceSerializer(read_only=True)
compliance_requirement = ComplianceRequirementSerializer(read_only=True)
class ExportRequestDetailSerializer(ExportRequestSerializer):
"""Detailed serializer for Export Request with related data"""
applicable_frameworks = RegulatoryFrameworkSerializer(many=True, read_only=True)
class ComplianceReportDetailSerializer(ComplianceReportSerializer):
"""Detailed serializer for Compliance Report with related data"""
framework = RegulatoryFrameworkSerializer(read_only=True)
applicable_requirements = ComplianceRequirementSerializer(many=True, read_only=True)
class LegalHoldDetailSerializer(LegalHoldSerializer):
"""Detailed serializer for Legal Hold with related data"""
related_incidents = serializers.StringRelatedField(many=True, read_only=True)
related_evidence = EvidenceCollectionSerializer(many=True, read_only=True)

View File

@@ -0,0 +1,222 @@
"""
Signals for Compliance & Governance module
Handles integration with other modules
"""
from django.core.exceptions import ValidationError
from django.db.models.signals import post_save, pre_delete
from django.dispatch import receiver
from django.utils import timezone
from datetime import timedelta
from .models import (
RegulatoryWorkflow,
WorkflowInstance,
EvidenceCollection,
RetentionPolicy,
LegalHold,
)
@receiver(post_save, sender='incident_intelligence.Incident')
def create_compliance_workflows(sender, instance, created, **kwargs):
"""Create compliance workflows when incidents are created"""
if not created:
return
# Check if incident triggers any compliance workflows
applicable_workflows = RegulatoryWorkflow.objects.filter(
status='ACTIVE',
triggers__contains=['incident_created']
)
for workflow in applicable_workflows:
# Check if workflow conditions are met
conditions = workflow.conditions
# Check framework conditions
if 'framework' in conditions:
# This would need to be implemented based on incident classification
# For now, we'll create workflows for all applicable frameworks
pass
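            # A minimal sketch, assuming `conditions['framework']` lists framework
            # type codes and the incident exposes an applicable framework type
            # (hypothetical attribute shown for illustration only):
            #   if getattr(instance, 'framework_type', None) not in conditions['framework']:
            #       continue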
# Check severity conditions
if 'severity' in conditions:
if instance.severity not in conditions['severity']:
continue
# Create workflow instance
WorkflowInstance.objects.create(
workflow=workflow,
title=f"{workflow.name} - {instance.title}",
description=f"Compliance workflow triggered by incident: {instance.title}",
related_incident=instance,
status='PENDING',
created_by=instance.reporter
)
@receiver(post_save, sender='incident_intelligence.Incident')
def check_retention_policies(sender, instance, created, **kwargs):
"""Check if incident data should be subject to retention policies"""
if not created:
return
# Get applicable retention policies
policies = RetentionPolicy.objects.filter(
is_active=True,
policy_type='INCIDENT_DATA'
)
for policy in policies:
# Check if policy applies to this incident
if policy.applies_to_data(
data_classification=instance.data_classification.name if instance.data_classification else None,
incident_category=instance.category
):
# This would trigger retention policy application
# Implementation would depend on specific retention logic
pass
@receiver(post_save, sender='security.AuditLog')
def create_evidence_from_audit_log(sender, instance, created, **kwargs):
"""Create evidence collection entries for security-relevant audit logs"""
if not created:
return
# Only create evidence for certain types of audit logs
evidence_types = [
'DATA_ACCESS',
'DATA_MODIFIED',
'DATA_DELETED',
'LOGIN_FAILED',
'ACCOUNT_LOCKED',
]
if instance.action_type in evidence_types:
# Check if there's a related incident
# This would need to be implemented based on your incident correlation logic
related_incident = None
# Only create evidence collection entry if there's a related incident
if related_incident:
EvidenceCollection.objects.create(
title=f"Audit Log Evidence - {instance.action_type}",
description=f"Audit log entry for {instance.action_type} action",
evidence_type='AUDIT_TRAIL',
incident=related_incident,
status='COLLECTED',
collection_method='AUTOMATED',
collection_notes=f"Automatically collected from audit log: {instance.id}",
collected_by=None, # System collected
)
@receiver(pre_delete, sender='incident_intelligence.Incident')
def check_legal_holds_before_deletion(sender, instance, **kwargs):
"""Check for active legal holds before deleting incident data"""
# Check if there are any active legal holds that might affect this incident
active_legal_holds = LegalHold.objects.filter(
status='ACTIVE',
related_incidents=instance
)
if active_legal_holds.exists():
# Prevent deletion if legal hold is active
        raise ValidationError(
            f"Cannot delete incident {instance.id} - it is subject to active legal holds: "
            f"{', '.join([hold.case_name for hold in active_legal_holds])}"
        )
@receiver(pre_delete, sender=EvidenceCollection)
def check_legal_holds_before_evidence_deletion(sender, instance, **kwargs):
"""Check for active legal holds before deleting evidence"""
# Check if there are any active legal holds that might affect this evidence
active_legal_holds = LegalHold.objects.filter(
status='ACTIVE',
related_evidence=instance
)
if active_legal_holds.exists():
# Prevent deletion if legal hold is active
        raise ValidationError(
            f"Cannot delete evidence {instance.id} - it is subject to active legal holds: "
            f"{', '.join([hold.case_name for hold in active_legal_holds])}"
        )
@receiver(post_save, sender=WorkflowInstance)
def send_workflow_notifications(sender, instance, created, **kwargs):
"""Send notifications for workflow events"""
if not created:
return
# Get notification rules from workflow
notification_rules = instance.workflow.notification_rules
for rule in notification_rules:
if rule.get('event') == 'workflow_started':
# Send notification to specified recipients
recipients = rule.get('recipients', [])
# Implementation would depend on your notification system
# This could integrate with email, Slack, or other notification channels
pass
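            # A minimal sketch, assuming `recipients` holds plain email addresses
            # (a real deployment would route through the project's notification
            # service instead of sending mail directly):
            #   from django.core.mail import send_mail
            #   send_mail(
            #       subject=f"Compliance workflow started: {instance.title}",
            #       message=instance.description or '',
            #       from_email=None,  # falls back to DEFAULT_FROM_EMAIL
            #       recipient_list=recipients,
            #   )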
@receiver(post_save, sender=RetentionPolicy)
def apply_retention_policy_to_existing_data(sender, instance, created, **kwargs):
"""Apply new retention policy to existing data"""
if not created:
return
# This would trigger a background task to apply the retention policy
# to existing data that matches the policy criteria
# Implementation would depend on your task queue system (Celery, etc.)
pass
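    # A minimal sketch, assuming a Celery task `apply_retention_policy` is defined
    # in a hypothetical compliance_governance/tasks.py module:
    #   from .tasks import apply_retention_policy
    #   apply_retention_policy.delay(str(instance.id))
    # The task would then scan matching records and archive/delete per the policy.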
@receiver(post_save, sender=LegalHold)
def notify_legal_hold_stakeholders(sender, instance, created, **kwargs):
"""Notify stakeholders when legal hold is created"""
if not created:
return
# Send notifications to custodians and stakeholders
custodians = instance.custodian_list
# Implementation would depend on your notification system
pass
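    # A minimal sketch: after dispatching notifications to `custodians`, record it
    # on the hold so reminders can be scheduled.
    #   instance.notification_sent = True
    #   instance.notification_date = timezone.now()
    #   instance.save(update_fields=['notification_sent', 'notification_date'])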
# Integration with SLA & On-Call module
@receiver(post_save, sender=WorkflowInstance)
def create_sla_tracking(sender, instance, created, **kwargs):
"""Create SLA tracking for compliance workflows"""
if not created:
return
# Check if workflow has SLA requirements
if instance.due_date:
# This would integrate with the SLA & On-Call module
# to create SLA tracking entries for compliance deadlines
pass
# Integration with Automation Orchestration module
@receiver(post_save, sender='automation_orchestration.RunbookExecution')
def create_evidence_from_automation(sender, instance, created, **kwargs):
"""Create evidence collection entries for automation executions"""
if not created:
return
    # Create evidence for automation executions that are compliance-related.
    # EvidenceCollection.incident is a required foreign key, so skip executions
    # that are not linked to an incident.
    if hasattr(instance, 'runbook') and instance.runbook and instance.related_incident:
        EvidenceCollection.objects.create(
            title=f"Automation Evidence - {instance.runbook.name}",
            description=f"Evidence from runbook execution: {instance.runbook.name}",
            evidence_type='AUDIT_TRAIL',
            incident=instance.related_incident,
            status='COLLECTED',
            collection_method='AUTOMATED',
            collection_notes=f"Automatically collected from runbook execution: {instance.id}",
            collected_by=None,  # System collected
        )

View File

@@ -0,0 +1,147 @@
"""
Tests for Compliance & Governance module
"""
from django.test import TestCase
from django.contrib.auth import get_user_model
from django.utils import timezone
from datetime import date, timedelta
from .models import (
RegulatoryFramework,
ComplianceRequirement,
EvidenceCollection,
RetentionPolicy,
ExportRequest,
LegalHold,
)
User = get_user_model()
class RegulatoryFrameworkModelTest(TestCase):
"""Test cases for RegulatoryFramework model"""
def setUp(self):
self.user = User.objects.create_user(
username='testuser',
email='test@example.com',
password='testpass123'
)
def test_create_framework(self):
"""Test creating a regulatory framework"""
framework = RegulatoryFramework.objects.create(
name='Test GDPR Framework',
framework_type='GDPR',
version='1.0',
description='Test framework',
applicable_regions=['EU', 'UK'],
industry_sectors=['Technology'],
is_active=True,
effective_date=date.today(),
created_by=self.user
)
self.assertEqual(framework.name, 'Test GDPR Framework')
self.assertEqual(framework.framework_type, 'GDPR')
self.assertTrue(framework.is_active)
self.assertEqual(framework.created_by, self.user)
def test_framework_str_representation(self):
"""Test string representation of framework"""
framework = RegulatoryFramework.objects.create(
name='Test Framework',
framework_type='GDPR',
version='1.0',
description='Test',
is_active=True,
effective_date=date.today()
)
self.assertEqual(str(framework), 'Test Framework v1.0')
class ComplianceRequirementModelTest(TestCase):
"""Test cases for ComplianceRequirement model"""
def setUp(self):
self.user = User.objects.create_user(
username='testuser',
email='test@example.com',
password='testpass123'
)
self.framework = RegulatoryFramework.objects.create(
name='Test Framework',
framework_type='GDPR',
version='1.0',
description='Test',
is_active=True,
effective_date=date.today()
)
def test_create_requirement(self):
"""Test creating a compliance requirement"""
requirement = ComplianceRequirement.objects.create(
framework=self.framework,
requirement_id='GDPR-001',
title='Test Requirement',
description='Test requirement description',
requirement_type='TECHNICAL',
priority='HIGH',
responsible_team='Security Team',
assigned_to=self.user
)
self.assertEqual(requirement.requirement_id, 'GDPR-001')
self.assertEqual(requirement.framework, self.framework)
self.assertEqual(requirement.assigned_to, self.user)
self.assertEqual(requirement.compliance_status, 'NOT_ASSESSED')
class RetentionPolicyModelTest(TestCase):
"""Test cases for RetentionPolicy model"""
def setUp(self):
self.user = User.objects.create_user(
username='testuser',
email='test@example.com',
password='testpass123'
)
def test_create_retention_policy(self):
"""Test creating a retention policy"""
policy = RetentionPolicy.objects.create(
name='Test Retention Policy',
description='Test policy description',
policy_type='INCIDENT_DATA',
retention_period=7,
retention_unit='YEARS',
auto_archive=True,
auto_delete=False,
is_active=True,
effective_date=date.today(),
created_by=self.user
)
self.assertEqual(policy.name, 'Test Retention Policy')
self.assertEqual(policy.retention_period, 7)
self.assertEqual(policy.retention_unit, 'YEARS')
self.assertTrue(policy.auto_archive)
self.assertFalse(policy.auto_delete)
def test_get_retention_duration(self):
"""Test getting retention duration as timedelta"""
policy = RetentionPolicy.objects.create(
name='Test Policy',
description='Test',
policy_type='INCIDENT_DATA',
retention_period=2,
retention_unit='YEARS',
is_active=True,
effective_date=date.today()
)
duration = policy.get_retention_duration()
expected_days = 2 * 365 # 2 years
self.assertEqual(duration.days, expected_days)
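    # Illustrative addition (a sketch, not part of the original suite): exercises
    # the scoping logic of RetentionPolicy.applies_to_data with explicit lists.
    def test_applies_to_data_scoping(self):
        """Policy applies only to configured classifications and categories"""
        policy = RetentionPolicy.objects.create(
            name='Scoped Policy',
            description='Test',
            policy_type='INCIDENT_DATA',
            retention_period=1,
            retention_unit='YEARS',
            data_classification_levels=['CONFIDENTIAL'],
            incident_categories=['DATA_BREACH'],
            is_active=True,
            effective_date=date.today()
        )
        self.assertTrue(policy.applies_to_data('CONFIDENTIAL', 'DATA_BREACH'))
        self.assertFalse(policy.applies_to_data('PUBLIC', 'DATA_BREACH'))
        self.assertFalse(policy.applies_to_data('CONFIDENTIAL', 'SERVICE_OUTAGE'))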

View File

@@ -0,0 +1,38 @@
"""
URL configuration for compliance_governance app
"""
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .views.compliance import (
RegulatoryFrameworkViewSet,
ComplianceRequirementViewSet,
RegulatoryWorkflowViewSet,
WorkflowInstanceViewSet,
EvidenceCollectionViewSet,
RetentionPolicyViewSet,
ExportRequestViewSet,
ComplianceReportViewSet,
LegalHoldViewSet,
)
# Create router and register viewsets
router = DefaultRouter()
router.register(r'frameworks', RegulatoryFrameworkViewSet, basename='regulatory-framework')
router.register(r'requirements', ComplianceRequirementViewSet, basename='compliance-requirement')
router.register(r'workflows', RegulatoryWorkflowViewSet, basename='regulatory-workflow')
router.register(r'workflow-instances', WorkflowInstanceViewSet, basename='workflow-instance')
router.register(r'evidence', EvidenceCollectionViewSet, basename='evidence-collection')
router.register(r'retention-policies', RetentionPolicyViewSet, basename='retention-policy')
router.register(r'export-requests', ExportRequestViewSet, basename='export-request')
router.register(r'reports', ComplianceReportViewSet, basename='compliance-report')
router.register(r'legal-holds', LegalHoldViewSet, basename='legal-hold')
app_name = 'compliance_governance'
urlpatterns = [
# API endpoints
path('', include(router.urls)),
# Additional custom endpoints can be added here
# path('custom-endpoint/', CustomView.as_view(), name='custom-endpoint'),
]
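
# Project-level wiring is assumed and not shown in this commit; e.g.:
# path('api/compliance/', include('compliance_governance.urls'))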

View File

@@ -0,0 +1,3 @@
from django.shortcuts import render
# Create your views here.

View File

@@ -0,0 +1,732 @@
"""
Views for Compliance & Governance API endpoints
"""
from rest_framework import viewsets, status, permissions
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.pagination import PageNumberPagination
from django_filters.rest_framework import DjangoFilterBackend
from django_filters import rest_framework as filters
from django.db.models import Q, Count, Avg
from django.utils import timezone
from datetime import timedelta
from ..models import (
RegulatoryFramework,
ComplianceRequirement,
RegulatoryWorkflow,
WorkflowInstance,
EvidenceCollection,
RetentionPolicy,
ExportRequest,
ComplianceReport,
LegalHold,
)
from ..serializers.compliance import (
RegulatoryFrameworkSerializer,
ComplianceRequirementSerializer,
ComplianceRequirementDetailSerializer,
RegulatoryWorkflowSerializer,
WorkflowInstanceSerializer,
WorkflowInstanceDetailSerializer,
EvidenceCollectionSerializer,
EvidenceCollectionDetailSerializer,
RetentionPolicySerializer,
ExportRequestSerializer,
ExportRequestDetailSerializer,
ComplianceReportSerializer,
ComplianceReportDetailSerializer,
LegalHoldSerializer,
LegalHoldDetailSerializer,
)
class CompliancePagination(PageNumberPagination):
"""Custom pagination for compliance endpoints"""
page_size = 20
page_size_query_param = 'page_size'
max_page_size = 100
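    # Clients can request e.g. ?page=2&page_size=50; page_size is capped at max_page_size.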
class RegulatoryFrameworkFilter(filters.FilterSet):
"""Filter for Regulatory Framework"""
framework_type = filters.ChoiceFilter(choices=RegulatoryFramework.FRAMEWORK_TYPES)
is_active = filters.BooleanFilter()
effective_date_after = filters.DateFilter(field_name='effective_date', lookup_expr='gte')
effective_date_before = filters.DateFilter(field_name='effective_date', lookup_expr='lte')
search = filters.CharFilter(method='filter_search')
class Meta:
model = RegulatoryFramework
fields = ['framework_type', 'is_active', 'effective_date_after', 'effective_date_before', 'search']
def filter_search(self, queryset, name, value):
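        # Caution (assumption): applicable_regions and industry_sectors look like JSON/list fields;
        # icontains relies on backend string matching and may need adjusting for native JSON fields.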
return queryset.filter(
Q(name__icontains=value) |
Q(description__icontains=value) |
Q(applicable_regions__icontains=value) |
Q(industry_sectors__icontains=value)
)
class RegulatoryFrameworkViewSet(viewsets.ModelViewSet):
"""ViewSet for Regulatory Framework"""
queryset = RegulatoryFramework.objects.all()
serializer_class = RegulatoryFrameworkSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = RegulatoryFrameworkFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return RegulatoryFramework.objects.select_related('created_by').prefetch_related('requirements')
@action(detail=True, methods=['get'])
def requirements(self, request, pk=None):
"""Get all requirements for a framework"""
framework = self.get_object()
requirements = framework.requirements.all()
# Apply filters if provided
status_filter = request.query_params.get('compliance_status')
if status_filter:
requirements = requirements.filter(compliance_status=status_filter)
priority_filter = request.query_params.get('priority')
if priority_filter:
requirements = requirements.filter(priority=priority_filter)
serializer = ComplianceRequirementSerializer(requirements, many=True)
return Response(serializer.data)
@action(detail=True, methods=['get'])
def compliance_summary(self, request, pk=None):
"""Get compliance summary for a framework"""
framework = self.get_object()
requirements = framework.requirements.all()
summary = {
'total_requirements': requirements.count(),
'compliant': requirements.filter(compliance_status='COMPLIANT').count(),
'partially_compliant': requirements.filter(compliance_status='PARTIALLY_COMPLIANT').count(),
'non_compliant': requirements.filter(compliance_status='NON_COMPLIANT').count(),
'not_assessed': requirements.filter(compliance_status='NOT_ASSESSED').count(),
'implemented': requirements.filter(is_implemented=True).count(),
'overdue_assessments': requirements.filter(
next_assessment_date__lt=timezone.now().date()
).count(),
}
# Calculate compliance percentage
assessed_count = summary['compliant'] + summary['partially_compliant'] + summary['non_compliant']
if assessed_count > 0:
summary['compliance_percentage'] = (summary['compliant'] / assessed_count) * 100
else:
summary['compliance_percentage'] = 0
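        # compliance_percentage is computed over assessed requirements only;
        # NOT_ASSESSED items are excluded from the denominator.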
return Response(summary)
class ComplianceRequirementFilter(filters.FilterSet):
"""Filter for Compliance Requirement"""
framework = filters.UUIDFilter()
compliance_status = filters.ChoiceFilter(choices=ComplianceRequirement._meta.get_field('compliance_status').choices)
requirement_type = filters.ChoiceFilter(choices=ComplianceRequirement.REQUIREMENT_TYPES)
priority = filters.ChoiceFilter(choices=ComplianceRequirement.PRIORITY_LEVELS)
is_implemented = filters.BooleanFilter()
assigned_to = filters.UUIDFilter()
overdue = filters.BooleanFilter(method='filter_overdue')
search = filters.CharFilter(method='filter_search')
class Meta:
model = ComplianceRequirement
fields = ['framework', 'compliance_status', 'requirement_type', 'priority', 'is_implemented', 'assigned_to', 'overdue', 'search']
def filter_overdue(self, queryset, name, value):
if value:
return queryset.filter(next_assessment_date__lt=timezone.now().date())
return queryset
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(title__icontains=value) |
Q(description__icontains=value) |
Q(requirement_id__icontains=value) |
Q(responsible_team__icontains=value)
)
class ComplianceRequirementViewSet(viewsets.ModelViewSet):
"""ViewSet for Compliance Requirement"""
queryset = ComplianceRequirement.objects.all()
serializer_class = ComplianceRequirementSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = ComplianceRequirementFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return ComplianceRequirement.objects.select_related(
'framework', 'assigned_to'
).prefetch_related('evidence_collection', 'workflow_instances')
def get_serializer_class(self):
if self.action == 'retrieve':
return ComplianceRequirementDetailSerializer
return ComplianceRequirementSerializer
@action(detail=True, methods=['post'])
def update_status(self, request, pk=None):
"""Update compliance status"""
requirement = self.get_object()
new_status = request.data.get('compliance_status')
if new_status not in dict(ComplianceRequirement._meta.get_field('compliance_status').choices):
return Response(
{'error': 'Invalid compliance status'},
status=status.HTTP_400_BAD_REQUEST
)
requirement.compliance_status = new_status
requirement.last_assessment_date = timezone.now().date()
requirement.save()
serializer = self.get_serializer(requirement)
return Response(serializer.data)
@action(detail=True, methods=['post'])
def schedule_assessment(self, request, pk=None):
"""Schedule next assessment"""
requirement = self.get_object()
assessment_date = request.data.get('assessment_date')
if not assessment_date:
return Response(
{'error': 'Assessment date is required'},
status=status.HTTP_400_BAD_REQUEST
)
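        # assessment_date is stored as received (expected 'YYYY-MM-DD');
        # a malformed value surfaces as a validation error on save().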
requirement.next_assessment_date = assessment_date
requirement.save()
serializer = self.get_serializer(requirement)
return Response(serializer.data)
class RegulatoryWorkflowFilter(filters.FilterSet):
"""Filter for Regulatory Workflow"""
workflow_type = filters.ChoiceFilter(choices=RegulatoryWorkflow.WORKFLOW_TYPES)
status = filters.ChoiceFilter(choices=RegulatoryWorkflow.STATUS_CHOICES)
is_template = filters.BooleanFilter()
applicable_frameworks = filters.UUIDFilter(field_name='applicable_frameworks')
search = filters.CharFilter(method='filter_search')
class Meta:
model = RegulatoryWorkflow
fields = ['workflow_type', 'status', 'is_template', 'applicable_frameworks', 'search']
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(name__icontains=value) |
Q(description__icontains=value)
)
class RegulatoryWorkflowViewSet(viewsets.ModelViewSet):
"""ViewSet for Regulatory Workflow"""
queryset = RegulatoryWorkflow.objects.all()
serializer_class = RegulatoryWorkflowSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = RegulatoryWorkflowFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return RegulatoryWorkflow.objects.select_related('created_by').prefetch_related(
'applicable_frameworks', 'instances'
)
@action(detail=True, methods=['post'])
def create_instance(self, request, pk=None):
"""Create a new workflow instance"""
workflow = self.get_object()
if workflow.status != 'ACTIVE':
return Response(
{'error': 'Cannot create instance from inactive workflow'},
status=status.HTTP_400_BAD_REQUEST
)
instance_data = request.data.copy()
instance_data['workflow'] = workflow.id
instance_data['created_by'] = request.user.id
serializer = WorkflowInstanceSerializer(data=instance_data)
if serializer.is_valid():
instance = serializer.save()
return Response(
WorkflowInstanceSerializer(instance).data,
status=status.HTTP_201_CREATED
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class WorkflowInstanceFilter(filters.FilterSet):
"""Filter for Workflow Instance"""
workflow = filters.UUIDFilter()
status = filters.ChoiceFilter(choices=WorkflowInstance.STATUS_CHOICES)
assigned_to = filters.UUIDFilter()
related_incident = filters.UUIDFilter()
overdue = filters.BooleanFilter(method='filter_overdue')
search = filters.CharFilter(method='filter_search')
class Meta:
model = WorkflowInstance
fields = ['workflow', 'status', 'assigned_to', 'related_incident', 'overdue', 'search']
def filter_overdue(self, queryset, name, value):
if value:
return queryset.filter(
due_date__lt=timezone.now(),
status__in=['PENDING', 'IN_PROGRESS']
)
return queryset
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(title__icontains=value) |
Q(description__icontains=value) |
Q(current_step__icontains=value)
)
class WorkflowInstanceViewSet(viewsets.ModelViewSet):
"""ViewSet for Workflow Instance"""
queryset = WorkflowInstance.objects.all()
serializer_class = WorkflowInstanceSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = WorkflowInstanceFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return WorkflowInstance.objects.select_related(
'workflow', 'assigned_to', 'created_by', 'related_incident', 'related_requirement'
).prefetch_related('stakeholders', 'evidence_collection')
def get_serializer_class(self):
if self.action == 'retrieve':
return WorkflowInstanceDetailSerializer
return WorkflowInstanceSerializer
@action(detail=True, methods=['post'])
def advance_step(self, request, pk=None):
"""Advance workflow to next step"""
instance = self.get_object()
next_step = request.data.get('next_step')
step_data = request.data.get('step_data', {})
if not next_step:
return Response(
{'error': 'Next step is required'},
status=status.HTTP_400_BAD_REQUEST
)
# Add current step to completed steps
if instance.current_step:
instance.completed_steps.append({
'step': instance.current_step,
'completed_at': timezone.now().isoformat(),
'completed_by': request.user.id
})
# Update current step and execution data
instance.current_step = next_step
instance.execution_data.update(step_data)
instance.updated_at = timezone.now()
# Check if workflow is complete
workflow_definition = instance.workflow.workflow_definition
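        # Assumed workflow_definition shape (JSON), e.g. {"steps": [...], "end_steps": ["closed"]}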
if next_step in workflow_definition.get('end_steps', []):
instance.status = 'COMPLETED'
instance.completed_at = timezone.now()
instance.save()
serializer = self.get_serializer(instance)
return Response(serializer.data)
@action(detail=True, methods=['post'])
def assign_stakeholder(self, request, pk=None):
"""Assign stakeholder to workflow instance"""
instance = self.get_object()
stakeholder_id = request.data.get('stakeholder_id')
if not stakeholder_id:
return Response(
{'error': 'Stakeholder ID is required'},
status=status.HTTP_400_BAD_REQUEST
)
from django.contrib.auth import get_user_model
User = get_user_model()
try:
stakeholder = User.objects.get(id=stakeholder_id)
instance.stakeholders.add(stakeholder)
return Response({'message': 'Stakeholder assigned successfully'})
except User.DoesNotExist:
return Response(
{'error': 'User not found'},
status=status.HTTP_404_NOT_FOUND
)
class EvidenceCollectionFilter(filters.FilterSet):
"""Filter for Evidence Collection"""
incident = filters.UUIDFilter()
evidence_type = filters.ChoiceFilter(choices=EvidenceCollection.EVIDENCE_TYPES)
status = filters.ChoiceFilter(choices=EvidenceCollection.STATUS_CHOICES)
collected_by = filters.UUIDFilter()
compliance_requirement = filters.UUIDFilter()
search = filters.CharFilter(method='filter_search')
class Meta:
model = EvidenceCollection
fields = ['incident', 'evidence_type', 'status', 'collected_by', 'compliance_requirement', 'search']
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(title__icontains=value) |
Q(description__icontains=value) |
Q(collection_notes__icontains=value)
)
class EvidenceCollectionViewSet(viewsets.ModelViewSet):
"""ViewSet for Evidence Collection"""
queryset = EvidenceCollection.objects.all()
serializer_class = EvidenceCollectionSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = EvidenceCollectionFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return EvidenceCollection.objects.select_related(
'incident', 'collected_by', 'verified_by', 'workflow_instance', 'compliance_requirement'
)
def get_serializer_class(self):
if self.action == 'retrieve':
return EvidenceCollectionDetailSerializer
return EvidenceCollectionSerializer
@action(detail=True, methods=['post'])
def add_custody_record(self, request, pk=None):
"""Add custody record to evidence"""
evidence = self.get_object()
        custody_action = request.data.get('action')
        notes = request.data.get('notes', '')
        if not custody_action:
            return Response(
                {'error': 'Action is required'},
                status=status.HTTP_400_BAD_REQUEST
            )
        evidence.add_custody_record(request.user, custody_action, notes)
serializer = self.get_serializer(evidence)
return Response(serializer.data)
@action(detail=True, methods=['post'])
def verify_evidence(self, request, pk=None):
"""Verify evidence"""
evidence = self.get_object()
verification_notes = request.data.get('verification_notes', '')
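        # verification_notes are captured only in the custody record added below,
        # not on a dedicated model field.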
evidence.status = 'VERIFIED'
evidence.verified_by = request.user
evidence.updated_at = timezone.now()
evidence.save()
# Add custody record
evidence.add_custody_record(request.user, 'VERIFIED', verification_notes)
serializer = self.get_serializer(evidence)
return Response(serializer.data)
class RetentionPolicyFilter(filters.FilterSet):
"""Filter for Retention Policy"""
policy_type = filters.ChoiceFilter(choices=RetentionPolicy.POLICY_TYPES)
is_active = filters.BooleanFilter()
applicable_frameworks = filters.UUIDFilter(field_name='applicable_frameworks')
search = filters.CharFilter(method='filter_search')
class Meta:
model = RetentionPolicy
fields = ['policy_type', 'is_active', 'applicable_frameworks', 'search']
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(name__icontains=value) |
Q(description__icontains=value)
)
class RetentionPolicyViewSet(viewsets.ModelViewSet):
"""ViewSet for Retention Policy"""
queryset = RetentionPolicy.objects.all()
serializer_class = RetentionPolicySerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = RetentionPolicyFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return RetentionPolicy.objects.select_related('created_by').prefetch_related('applicable_frameworks')
class ExportRequestFilter(filters.FilterSet):
"""Filter for Export Request"""
request_type = filters.ChoiceFilter(choices=ExportRequest.REQUEST_TYPES)
status = filters.ChoiceFilter(choices=ExportRequest.STATUS_CHOICES)
requester_email = filters.CharFilter(lookup_expr='icontains')
overdue = filters.BooleanFilter(method='filter_overdue')
search = filters.CharFilter(method='filter_search')
class Meta:
model = ExportRequest
fields = ['request_type', 'status', 'requester_email', 'overdue', 'search']
def filter_overdue(self, queryset, name, value):
if value:
return queryset.filter(
due_date__lt=timezone.now(),
status__in=['PENDING', 'APPROVED', 'IN_PROGRESS']
)
return queryset
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(title__icontains=value) |
Q(description__icontains=value) |
Q(requester_name__icontains=value) |
Q(requester_organization__icontains=value)
)
class ExportRequestViewSet(viewsets.ModelViewSet):
"""ViewSet for Export Request"""
queryset = ExportRequest.objects.all()
serializer_class = ExportRequestSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = ExportRequestFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return ExportRequest.objects.select_related(
'approved_by', 'executed_by', 'created_by'
).prefetch_related('applicable_frameworks')
def get_serializer_class(self):
if self.action == 'retrieve':
return ExportRequestDetailSerializer
return ExportRequestSerializer
@action(detail=True, methods=['post'])
def approve(self, request, pk=None):
"""Approve export request"""
export_request = self.get_object()
approval_notes = request.data.get('approval_notes', '')
if export_request.status != 'PENDING':
return Response(
{'error': 'Only pending requests can be approved'},
status=status.HTTP_400_BAD_REQUEST
)
export_request.status = 'APPROVED'
export_request.approved_by = request.user
export_request.approved_at = timezone.now()
export_request.approval_notes = approval_notes
export_request.save()
serializer = self.get_serializer(export_request)
return Response(serializer.data)
@action(detail=True, methods=['post'])
def execute(self, request, pk=None):
"""Execute export request"""
export_request = self.get_object()
if export_request.status != 'APPROVED':
return Response(
{'error': 'Only approved requests can be executed'},
status=status.HTTP_400_BAD_REQUEST
)
# This would typically trigger an async task to generate the export
export_request.status = 'IN_PROGRESS'
export_request.executed_by = request.user
export_request.save()
# TODO: Implement actual export generation logic
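        # e.g. with an assumed Celery task (not defined in this commit):
        # generate_compliance_export.delay(str(export_request.id))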
serializer = self.get_serializer(export_request)
return Response(serializer.data)
class ComplianceReportFilter(filters.FilterSet):
"""Filter for Compliance Report"""
framework = filters.UUIDFilter()
report_type = filters.ChoiceFilter(choices=ComplianceReport.REPORT_TYPES)
status = filters.ChoiceFilter(choices=ComplianceReport.STATUS_CHOICES)
prepared_by = filters.UUIDFilter()
search = filters.CharFilter(method='filter_search')
class Meta:
model = ComplianceReport
fields = ['framework', 'report_type', 'status', 'prepared_by', 'search']
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(title__icontains=value) |
Q(description__icontains=value) |
Q(executive_summary__icontains=value)
)
class ComplianceReportViewSet(viewsets.ModelViewSet):
"""ViewSet for Compliance Report"""
queryset = ComplianceReport.objects.all()
serializer_class = ComplianceReportSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = ComplianceReportFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return ComplianceReport.objects.select_related(
'framework', 'prepared_by', 'reviewed_by', 'approved_by'
).prefetch_related('applicable_requirements')
def get_serializer_class(self):
if self.action == 'retrieve':
return ComplianceReportDetailSerializer
return ComplianceReportSerializer
class LegalHoldFilter(filters.FilterSet):
"""Filter for Legal Hold"""
status = filters.ChoiceFilter(choices=LegalHold.STATUS_CHOICES)
active = filters.BooleanFilter(method='filter_active')
search = filters.CharFilter(method='filter_search')
class Meta:
model = LegalHold
fields = ['status', 'active', 'search']
def filter_active(self, queryset, name, value):
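        # Assumption: active holds always carry an expiration_date; holds with a
        # null expiration_date are excluded by the __gt lookup below.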
if value:
return queryset.filter(
status='ACTIVE',
expiration_date__gt=timezone.now().date()
)
return queryset
def filter_search(self, queryset, name, value):
return queryset.filter(
Q(case_name__icontains=value) |
Q(case_number__icontains=value) |
Q(description__icontains=value) |
Q(legal_counsel__icontains=value) |
Q(law_firm__icontains=value)
)
class LegalHoldViewSet(viewsets.ModelViewSet):
"""ViewSet for Legal Hold"""
queryset = LegalHold.objects.all()
serializer_class = LegalHoldSerializer
pagination_class = CompliancePagination
filter_backends = [DjangoFilterBackend]
filterset_class = LegalHoldFilter
permission_classes = [permissions.IsAuthenticated]
def get_queryset(self):
return LegalHold.objects.select_related('created_by').prefetch_related(
'related_incidents', 'related_evidence'
)
def get_serializer_class(self):
if self.action == 'retrieve':
return LegalHoldDetailSerializer
return LegalHoldSerializer
@action(detail=True, methods=['post'])
def add_incident(self, request, pk=None):
"""Add incident to legal hold"""
legal_hold = self.get_object()
incident_id = request.data.get('incident_id')
if not incident_id:
return Response(
{'error': 'Incident ID is required'},
status=status.HTTP_400_BAD_REQUEST
)
from incident_intelligence.models import Incident
try:
incident = Incident.objects.get(id=incident_id)
legal_hold.related_incidents.add(incident)
return Response({'message': 'Incident added to legal hold successfully'})
except Incident.DoesNotExist:
return Response(
{'error': 'Incident not found'},
status=status.HTTP_404_NOT_FOUND
)
@action(detail=True, methods=['post'])
def add_evidence(self, request, pk=None):
"""Add evidence to legal hold"""
legal_hold = self.get_object()
evidence_id = request.data.get('evidence_id')
if not evidence_id:
return Response(
{'error': 'Evidence ID is required'},
status=status.HTTP_400_BAD_REQUEST
)
try:
evidence = EvidenceCollection.objects.get(id=evidence_id)
legal_hold.related_evidence.add(evidence)
return Response({'message': 'Evidence added to legal hold successfully'})
except EvidenceCollection.DoesNotExist:
return Response(
{'error': 'Evidence not found'},
status=status.HTTP_404_NOT_FOUND
)