Event-Based Real-Time Cloud Functions Examples

Make sure that APIENDPOINT, APIKEY, and STORAGECLIENTID are configured as environment variables accessible to the cloud function.

Refer to the Event-based handling documentation for details on these variables.

AWS Lambda Function with S3 Trigger Setup Guide

Before proceeding with the setup, ensure the following requirements are met:

1. IAM User Permissions

  • Your IAM user must have the necessary permissions to:
    • Create Lambda functions
    • Assign IAM roles
    • Configure S3 triggers
    • Access the target S3 buckets

2. Account-Level Verification

  • Confirm there are no account-level naming restrictions that would prevent your chosen Lambda function name
  • Verify your AWS account has sufficient permissions and service quotas to create new Lambda functions in the desired region
  • Ensure you have access to the target AWS region where the Lambda function will be deployed

3. S3 Bucket Configuration

  • Verify that S3 bucket policies are configured to allow access from the Lambda function's execution role
  • Required S3 permissions may include:
    • s3:GetObject - to read objects from the bucket
    • s3:PutObject - to write objects to the bucket (if applicable)
    • s3:DeleteObject - to delete objects from the bucket (if applicable)
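For reference, the permissions above can be granted to the Lambda execution role with a policy statement like the following (the bucket name is a placeholder; omit s3:PutObject and s3:DeleteObject if they do not apply to your workflow):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::example-target-bucket/*"
    }
  ]
}
```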

4. IAM Execution Role

  • Ensure you have a pre-configured IAM role with:
    • Lambda execution permissions
    • S3 access permissions for your target bucket(s)
    • CloudWatch Logs permissions for function monitoring

Step 1: Creating the Lambda Function

1.1 Navigate to Lambda Service

  1. Open the AWS Management Console
  2. Navigate to the Lambda service

1.2 Initialize Function Creation

  1. Click "Create function"
  2. Select "Author from scratch"

1.3 Configure Basic Settings

  1. Function name: Enter a descriptive name for your Lambda function

    • Example: MetaDefenderStorageSecurityProcessor
    • Ensure the name complies with AWS naming conventions and any account-specific policies
  2. Runtime: Select Python (choose the latest compatible version)

1.4 Configure Permissions

  1. Under "Change default execution role", select "Use an existing role"
  2. From the "Existing role" dropdown, choose your pre-configured IAM role
  3. Verify role permissions:
    • Navigate to the IAM console and review the selected role
    • Confirm Permission policies include necessary access to S3, CloudWatch Logs, and other required services
    • Verify Trust relationships allow the Lambda service (lambda.amazonaws.com) to assume the role
1.5 Add Resource Tags (Optional)

  1. Expand "Advanced configuration"
  2. Enable "Tags"
  3. Add relevant tags for resource organization:
    • Key: Purpose | Value: MetaDefenderScan
    • Key: Environment | Value: Production/Development
    • Key: Owner | Value: [Your Team/Department]

1.6 Create the Function

  1. Click "Create function"
  2. Wait for the function to be successfully created

Step 2: Adding S3 Trigger

2.1 Add Trigger Configuration

  1. In your newly created Lambda function, click "Add trigger"
  2. Select "S3" as the trigger source

2.2 Configure S3 Trigger Settings

  1. Bucket: Select your target S3 bucket from the dropdown menu

  2. Event types: Choose the appropriate event type

    • Default: "All object create events"
    • Alternative options: Object create, delete, or restore events based on your requirements
  3. Prefix (Optional): Specify a prefix to filter objects by path

    • Example: uploads/ to only trigger on objects in the uploads folder
  4. Suffix (Optional): Specify a suffix to filter objects by file extension

    • Example: .pdf to only trigger on PDF files
  5. Recursive invocation: Check the acknowledgement box to confirm you understand the risk of recursive invocations (for example, if the function writes results back to the same bucket it monitors)

2.3 Finalize Trigger Setup

  1. Review your trigger configuration
  2. Click "Add" to create the S3 trigger

Step 3: Adding the Function Code

3.1 Navigate to Code Section

  1. In your Lambda function console, navigate to the "Code" tab
  2. Replace the default code with the following implementation:
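The original code listing was not preserved here. The following is a minimal sketch of a handler that forwards each S3 record to the MDSS real-time webhook using the environment variables from Step 4; the payload field names and the apikey header are assumptions, so check your MDSS version's webhook schema for the exact request body.

```python
import json
import os
import urllib.parse
import urllib.request


def lambda_handler(event, context):
    """Forward each S3 record in the event to the MDSS real-time webhook."""
    api_endpoint = os.environ["APIENDPOINT"]          # e.g. https://mdss-example.com/api/webhook/realtime
    api_key = os.environ["APIKEY"]                    # MDSS user API key
    storage_client_id = os.environ["STORAGECLIENTID"]

    for record in event.get("Records", []):
        payload = {
            # Field names are illustrative; check your MDSS version's
            # webhook schema for the exact request body.
            "storageClientId": storage_client_id,
            "bucket": record["s3"]["bucket"]["name"],
            # S3 object keys arrive URL-encoded in event notifications.
            "objectKey": urllib.parse.unquote_plus(record["s3"]["object"]["key"]),
        }
        request = urllib.request.Request(
            api_endpoint,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json", "apikey": api_key},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            print(f"MDSS responded with HTTP {response.status}")

    return {"statusCode": 200}
```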

3.2 Deploy the Code

  1. Click "Deploy" to save and deploy your function code

Step 4: Configuring Environment Variables

4.1 Navigate to Configuration

  1. In your Lambda function console, click on the "Configuration" tab
  2. Select "Environment variables" from the left sidebar

4.2 Add Required Environment Variables

Click "Edit" and add the following environment variables:

  1. APIENDPOINT

    • Key: APIENDPOINT
    • Value: Your MDSS URL + /api/webhook/realtime
    • Example: https://mdss-example.com/api/webhook/realtime
  2. APIKEY

    • Key: APIKEY
    • Value: Your MDSS user API key
    • Note: Ensure this key has appropriate permissions for webhook operations
  3. STORAGECLIENTID

    • Key: STORAGECLIENTID
    • Value: Your storage client ID from MDSS
    • To obtain: Navigate to your desired storage configuration and copy the storageClientId

4.3 Save Configuration

  1. Click "Save" to apply the environment variable changes

Step 5: Testing and Validation

5.1 Test the Function

  1. Upload a test file to your configured S3 bucket
  2. Monitor the Lambda function's execution in the "Monitor" tab
  3. Check CloudWatch Logs for any execution errors or successful processing

5.2 Verify Integration

  1. Confirm that file events are being sent to your MDSS instance
  2. Verify that scans are initiated as expected
  3. Review MDSS logs for successful webhook reception

Azure Blob function app setup

  1. Deploy the Azure function app using the Terraform script: https://github.com/OPSWAT/metadefender-k8s/tree/main/terraform/azure-function-docker
  2. Configure the STORAGECLIENTID, APIKEY, and APIENDPOINT variables in the .tfvars file:
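The original listing was not preserved. A minimal sketch of the .tfvars values follows; the variable names and casing here are assumptions, so check the Terraform script's variables.tf for the exact definitions:

```hcl
# Placeholder values; replace with your MDSS details
APIENDPOINT     = "https://mdss-example.com/api/webhook/realtime"
APIKEY          = "your-mdss-user-api-key"
STORAGECLIENTID = "your-storage-client-id"
```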

Azure Blob Event Grid RTP configuration

Refer to the example for detailed configuration: https://github.com/OPSWAT/metadefender-k8s/tree/main/terraform/CloudFunctions/Azure/webhook-notification

Event Notifications for Page and Append blob are NOT supported.

Events for these blob types are triggered upon the first block commit, potentially before the upload is complete.

Google Cloud Function Setup

  1. Configure the Cloud Run function with the google.cloud.storage.object.v1.finalized trigger to process newly added objects.
  2. Python Function Example:
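The original listing was not preserved. The sketch below forwards a finalized-object event to the MDSS webhook using the same environment variables as the other examples; the payload field names and the apikey header are assumptions, so verify them against your MDSS version's webhook schema.

```python
import json
import os
import urllib.request


def build_payload(event_data: dict) -> dict:
    """Map a finalized-object event to the MDSS webhook body.

    For google.cloud.storage.object.v1.finalized events, the CloudEvent
    data contains at least "bucket" and "name".
    """
    return {
        "storageClientId": os.environ["STORAGECLIENTID"],
        "bucket": event_data["bucket"],
        "objectKey": event_data["name"],
    }


# In a deployed Cloud Run function, register this handler with the Functions
# Framework (@functions_framework.cloud_event) and pass it cloud_event.data;
# the logic is kept framework-free here so the sketch stays self-contained.
def handle_finalized(event_data: dict) -> None:
    request = urllib.request.Request(
        os.environ["APIENDPOINT"],
        data=json.dumps(build_payload(event_data)).encode("utf-8"),
        headers={"Content-Type": "application/json", "apikey": os.environ["APIKEY"]},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(f"MDSS responded with HTTP {response.status}")
```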
  3. requirements.txt example:
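The original requirements listing was not preserved. Assuming a sketch that uses only the standard library plus the Functions Framework entry point, a minimal requirements.txt could be:

```
functions-framework==3.*
```

Any additional third-party libraries your function imports (for example, requests) would be listed here as well.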

Alibaba Cloud Function Setup

  1. Follow the official Alibaba Cloud documentation to create a compute function with an OSS trigger: https://www.alibabacloud.com/help/en/function-compute/latest/configure-an-oss-trigger
  2. Specify the bucket to monitor and subscribe to the oss:ObjectCreated:* event.
  3. Python Function Example:
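The original listing was not preserved. The sketch below handles an oss:ObjectCreated:* trigger and forwards each affected object to the MDSS webhook; as with the other examples, the payload field names and the apikey header are assumptions to verify against your MDSS version.

```python
import json
import os
import urllib.request


def handler(event, context):
    """Function Compute entry point for an oss:ObjectCreated:* trigger.

    The OSS trigger delivers a JSON document whose "events" array lists
    the affected objects.
    """
    for record in json.loads(event).get("events", []):
        payload = {
            # Field names are illustrative; check your MDSS version's
            # webhook schema for the exact request body.
            "storageClientId": os.environ["STORAGECLIENTID"],
            "bucket": record["oss"]["bucket"]["name"],
            "objectKey": record["oss"]["object"]["key"],
        }
        request = urllib.request.Request(
            os.environ["APIENDPOINT"],
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json", "apikey": os.environ["APIKEY"]},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            print(f"MDSS responded with HTTP {response.status}")
    return "OK"
```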

Wasabi Function Setup

  1. Follow the official Wasabi documentation to create an event notification: https://docs.wasabi.com/v1/docs/event-notifications-bucket
  2. Establish a connection with a service capable of sending requests to MetaDefender Storage Security (MDSS).
  3. The Wasabi documentation example uses AWS SNS, which can be integrated with AWS Lambda (see the AWS Lambda Function with S3 Trigger Setup Guide above and the Wasabi article "How do I configure Event Notifications on my Wasabi bucket using AWS SNS?").

S3 Compatible function setup

  • Event-based real-time processing configuration varies for different S3-compatible services.
  • Most S3-compatible services offer event notifications similar to Wasabi's.
  • The service must send a request to the MDSS endpoint: http(s)://{baseurl}/api/webhook/realtime with the appropriate request body:
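The original example body was not preserved. The exact schema depends on your MDSS version, but a minimal illustrative body using the values configured in the earlier examples might look like this (field names are assumptions):

```json
{
  "storageClientId": "your-storage-client-id",
  "bucket": "bucket-name",
  "objectKey": "path/to/uploaded/object"
}
```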