Overview

Formal can forward all activity logs to your SIEM, data lake, or observability platform. This enables centralized log management, long-term retention, compliance reporting, and integration with your existing security tools.

Supported Platforms

Formal supports the following log destinations:
  • AWS S3
  • Datadog
  • Splunk
  • Elastic
  • Sumo Logic
By default, Formal forwards all logs to your configured destination.

Setup

AWS S3

AWS S3 log integration requires an AWS Cloud Integration with S3 access enabled.
1. Set Up AWS Integration. Configure an AWS Cloud Integration with:
     • AllowS3Access: true
     • S3BucketARN: Your S3 bucket ARN
2. Navigate to the Log Integrations page.
3. Create Integration. Click Create Log Integration.
4. Select AWS S3. Choose AWS S3 as your provider.
5. Configure the integration:
     • S3 Bucket Name: Your bucket name
     • Cloud Integration: Select your AWS Cloud Integration

Terraform

# First, create S3 bucket
resource "aws_s3_bucket" "formal_logs" {
  bucket = "formal-connector-logs"
}

# Create AWS Cloud Integration
resource "formal_integration_cloud" "aws" {
  name         = "aws-integration"
  cloud_region = "us-east-1"

  aws {
    template_version    = "1.2.0"
    allow_s3_access     = true
    s3_bucket_arn       = "${aws_s3_bucket.formal_logs.arn}/*"
  }
}

# Deploy CloudFormation stack (see AWS Integration guide)
resource "aws_cloudformation_stack" "formal" {
  # ... (see AWS Integration docs)
}

# Create log integration
resource "formal_integration_log" "s3" {
  name = "s3-log-drain"

  s3 {
    s3_bucket_name       = aws_s3_bucket.formal_logs.bucket
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}

Datadog

1. Get Datadog Credentials. From your Datadog account, retrieve:
     • Account ID
     • API Key
     • Site (e.g., datadoghq.com, datadoghq.eu)
2. Navigate to the Log Integrations page.
3. Create Integration. Click Create Log Integration.
4. Select Datadog. Choose Datadog as your provider.
5. Enter Credentials:
     • Account ID: Your Datadog account ID
     • API Key: Your Datadog API key
     • Site: Your Datadog site

Terraform

resource "formal_integration_log" "datadog" {
  name = "datadog-log-drain"

  datadog {
    account_id = var.datadog_account_id
    api_key    = var.datadog_api_key
    site       = "datadoghq.com"  # or datadoghq.eu, etc.
  }
}
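
The snippet above references var.datadog_account_id and var.datadog_api_key without declaring them. A minimal sketch of the declarations, with the API key marked sensitive so it is redacted from CLI output (the same pattern applies to the Splunk HEC token used later on this page):

# Hypothetical variable declarations for the snippet above; adjust the
# names and descriptions to match your own module.
variable "datadog_account_id" {
  type        = string
  description = "Datadog account ID"
}

variable "datadog_api_key" {
  type        = string
  description = "Datadog API key used by the Formal log integration"
  sensitive   = true  # redact the key from plan/apply output
}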

Splunk

1. Create Splunk HEC Token. In Splunk, create a new HTTP Event Collector (HEC) token.
2. Navigate to the Log Integrations page.
3. Create Integration. Click Create Log Integration.
4. Select Splunk. Choose Splunk as your provider.
5. Enter Configuration:
     • Access Token: Your HEC token
     • Host: Your Splunk instance hostname
     • Port: HEC port (usually 8088)

Terraform

resource "formal_integration_log" "splunk" {
  name = "splunk-log-drain"

  splunk {
    access_token = var.splunk_hec_token
    host         = "splunk.example.com"
    port         = 8088
  }
}

Use Cases

Compliance and Auditing

Forward logs to long-term storage for compliance requirements:
# Archive to S3 for 7 years (SOC 2, HIPAA, etc.)
resource "formal_integration_log" "compliance_archive" {
  name = "compliance-s3-archive"

  s3 {
    s3_bucket_name       = aws_s3_bucket.compliance_logs.bucket
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}

# Configure S3 lifecycle policy
resource "aws_s3_bucket_lifecycle_configuration" "compliance" {
  bucket = aws_s3_bucket.compliance_logs.id

  rule {
    id     = "archive-old-logs"
    status = "Enabled"

    # Apply the rule to all objects in the bucket
    filter {}

    transition {
      days          = 90
      storage_class = "GLACIER"
    }

    expiration {
      days = 2555  # 7 years
    }
  }
}

Real-Time Security Monitoring

Send logs to your SIEM for real-time threat detection:
# Send to Datadog for real-time monitoring
resource "formal_integration_log" "security_monitoring" {
  name = "datadog-security"

  datadog {
    account_id = var.datadog_account_id
    api_key    = var.datadog_api_key
    site       = "datadoghq.com"
  }
}
Create alerts in Datadog for the following (an example monitor sketch follows the list):
  • Failed authentication attempts
  • Policy violations
  • Unusual query patterns
  • Off-hours access
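
As an illustration, a monitor for failed authentication attempts could itself be managed in Terraform with the Datadog provider. This is a sketch only: the source:formal tag, the @evt.outcome:failure attribute, and the threshold are assumptions to adapt to the fields actually present in your forwarded logs:

# Sketch of a Datadog log monitor (Datadog Terraform provider).
resource "datadog_monitor" "formal_failed_auth" {
  name    = "Formal: spike in failed authentication attempts"
  type    = "log alert"
  message = "More than 10 failed authentications through Formal in 5 minutes. Notify @security-team"

  # Assumed log facets; replace with the attributes your forwarded logs carry.
  query = "logs(\"source:formal @evt.outcome:failure\").index(\"*\").rollup(\"count\").last(\"5m\") > 10"

  monitor_thresholds {
    critical = 10
  }

  tags = ["team:security", "source:formal"]
}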

Data Lake Integration

Forward logs to your data lake for analytics:
# Send to S3 data lake
resource "formal_integration_log" "data_lake" {
  name = "data-lake-integration"

  s3 {
    s3_bucket_name       = "my-data-lake-formal-logs"
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}
Then use Athena, Redshift Spectrum, or Databricks to analyze the following (an example Athena setup follows the list):
  • User access patterns
  • Query performance
  • Policy effectiveness
  • Resource utilization
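
For instance, Athena can be pointed at the log bucket from Terraform. The resources below are illustrative only: the bucket names are placeholders, and the formal_logs table is hypothetical and would need to be defined (for example via a Glue crawler) to match the objects Formal actually writes to the bucket:

# Bucket for Athena query results (separate from the log bucket)
resource "aws_s3_bucket" "athena_results" {
  bucket = "my-data-lake-athena-results"
}

# Athena database whose query results land in the bucket above
resource "aws_athena_database" "formal" {
  name   = "formal_logs_db"
  bucket = aws_s3_bucket.athena_results.bucket
}

# Saved query; "formal_logs" is a placeholder table you would define
# over the log bucket before running it.
resource "aws_athena_named_query" "recent_activity" {
  name     = "formal-recent-activity"
  database = aws_athena_database.formal.name
  query    = "SELECT * FROM formal_logs LIMIT 100"
}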

Multi-Destination Forwarding

Send logs to multiple destinations:
# Real-time monitoring
resource "formal_integration_log" "splunk_realtime" {
  name = "splunk-realtime"
  splunk {
    access_token = var.splunk_hec_token
    host         = "splunk.example.com"
    port         = 8088
  }
}

# Long-term archive
resource "formal_integration_log" "s3_archive" {
  name = "s3-archive"
  s3 {
    s3_bucket_name       = "formal-logs-archive"
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}

# Security analytics
resource "formal_integration_log" "datadog_security" {
  name = "datadog-security"
  datadog {
    account_id = var.datadog_account_id
    api_key    = var.datadog_api_key
    site       = "datadoghq.com"
  }
}

Next Steps