  1. Log in to Amazon Web Services (AWS) and go to SNS.
  2. Create a new topic, and take note of the Topic ARN.
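    If you prefer to script this step, here is a minimal sketch using the AWS SDK for JavaScript (v2); the topic name and region are examples, and the SDK must be able to find credentials for your account:
    // Sketch: create the SNS topic from code instead of the console.
    const aws = require('aws-sdk');
    const sns = new aws.SNS({region: 'us-east-1'});
    sns.createTopic({Name: 'workload-security-events'}, function(err, data) {
        if (err) console.log(err, err.stack);
        else console.log('Topic ARN:', data.TopicArn); // take note of this ARN
    });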
  3. Go to S3.
  4. Create a bucket, and take note of the Bucket Name. This is where your events will be stored.
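    This step can also be scripted; a minimal sketch (the bucket name is an example and must be globally unique, and buckets outside us-east-1 also need a CreateBucketConfiguration):
    // Sketch: create the S3 bucket from code instead of the console.
    const aws = require('aws-sdk');
    const s3 = new aws.S3({apiVersion: '2006-03-01'});
    s3.createBucket({Bucket: 'my-workload-security-events'}, function(err, data) {
        if (err) console.log(err, err.stack);
        else console.log('Bucket created:', data.Location);
    });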
  5. Go to Identity and Access Management.
  6. Create a policy that allows publishing to SNS. Select Create your own policy and enter the following code. Make sure to replace "SNS-TOPIC-ARN" with your SNS Topic ARN from Step 2.
    {
        "Version""2012-10-17",
        "Statement": [
            {
                "Action": [
                    "sns:Publish"
                ],
                "Effect""Allow",
                "Resource""SNS-TOPIC-ARN"
            }
        ]
    }
  7. Create another policy that will allow your Lambda function to save objects to your S3 bucket. Select Create your own policy and enter the following code. Make sure to replace "S3-BUCKET-NAME" with your bucket name.
    {
        "Version""2012-10-17",
        "Statement": [
            {
                "Effect""Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:PutObjectAcl"
                ],
                "Resource": [
                    "arn:aws:s3:::S3-BUCKET-NAME",
                    "arn:aws:s3:::S3-BUCKET-NAME/*"
                ]
            }
        ]
    }
  8. Create a Role. For the role type, under AWS Service Role, select AWS Lambda. Attach the policy you just created in Step 7.
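    For reference, selecting AWS Lambda as the role type gives the role the standard Lambda trust policy, which lets the Lambda service assume the role:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": { "Service": "lambda.amazonaws.com" },
                "Action": "sts:AssumeRole"
            }
        ]
    }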
  9. Create a User, and take note of the Access Key ID and Secret Access Key. It is recommended to download the credentials as a .csv file.
    If you already have a user without keys, you can create a new set by clicking on your User > Security Credentials > Create Access Key.
  10. Edit your User's permissions to attach the policy created in Step 6, allowing the user to publish to SNS.
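    If you want to confirm that the user can now publish, a minimal sketch using the AWS SDK for JavaScript (the keys, region, and Topic ARN below are placeholders for the values from Steps 2 and 9):
    // Sketch: publish a test message with the user's credentials to verify the SNS policy.
    const aws = require('aws-sdk');
    const sns = new aws.SNS({
        region: 'us-east-1',
        accessKeyId: 'ACCESS-KEY-ID',
        secretAccessKey: 'SECRET-ACCESS-KEY'
    });
    sns.publish({TopicArn: 'SNS-TOPIC-ARN', Message: 'test'}, function(err, data) {
        if (err) console.log(err, err.stack);
        else console.log('Published, MessageId:', data.MessageId);
    });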
  11. Go to Lambda and create a Lambda function.
    1. Select the blueprint "sns-message" as the script template.
    2. Configure the trigger to use the SNS topic you created in Step 2 (a scripted alternative is sketched after the sample code below).
    3. For the Lambda function code, select the language you want. This tutorial provides code samples for Node.js 14.x.
    4. For the role, choose an existing role and pick the one you created in Step 8. This gives your Lambda function permission to save objects to S3.
    5. Copy the sample script and modify it to suit your needs. Replace "S3-BUCKET-NAME" with your bucket name.
      const aws = require('aws-sdk');
      const s3 = new aws.S3({apiVersion: '2006-03-01'});
      const bucket = 'S3-BUCKET-NAME';
      const s3prefix = 'CloudOneWorkloadSecurityEvents/';
      const ext = '.json';
        
      exports.handler = (sns, context) => {
          //retrieve the events from the sns json
          var events = sns.Records[0].Sns.Message;
            
          //extract the date to use in s3 directory
          var timestamp = sns.Records[0].Sns.Timestamp;
          var date = getFormattedDate(new Date(timestamp));
          //add 5 random digits to prevent overwriting an existing file if two messages are received at the same ms.
          var random5digits = Math.floor(Math.random()*90000) + 10000;
        
          //e.g. CloudOneWorkloadSecurityEvents/2016/07/20/2016-07-20T15:30:00.000Z12345.json
          var filename = s3prefix + date + '/' + timestamp + random5digits + ext;
        
          sendToS3(events, filename);
      };
        
      function getFormattedDate(d) {
          //returns yyyy/MM/dd
          return d.getFullYear() + '/' + twoDigits(d.getMonth() + 1) + '/' + twoDigits(d.getDate());
      }
        
      function twoDigits(n) {
          return n < 10 ? '0' + n : n;
      }
        
      function sendToS3(content, filename) {
          var params = {
              Bucket: bucket,
              Key: filename,
              Body: content
          };
          s3.putObject(params, function(err) {
              if (err) console.log(err, err.stack); // an error occurred
          });
      }
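      If you prefer to wire up the SNS trigger from code rather than through the console (see item 2 of this list), here is a minimal sketch to run once from your workstation; the Lambda function ARN below is a placeholder:
      // Sketch: allow SNS to invoke the function, then subscribe the function to the topic.
      const aws = require('aws-sdk');
      const sns = new aws.SNS();
      const lambda = new aws.Lambda();
      const topicArn = 'SNS-TOPIC-ARN';
      const functionArn = 'arn:aws:lambda:us-east-1:123456789012:function:storeEventsInS3';

      lambda.addPermission({
          FunctionName: functionArn,
          StatementId: 'sns-invoke',
          Action: 'lambda:InvokeFunction',
          Principal: 'sns.amazonaws.com',
          SourceArn: topicArn
      }, function(err) {
          if (err) return console.log(err, err.stack);
          sns.subscribe({TopicArn: topicArn, Protocol: 'lambda', Endpoint: functionArn}, function(err, data) {
              if (err) console.log(err, err.stack);
              else console.log('Subscribed:', data.SubscriptionArn);
          });
      });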
  12. Enable the option to Publish Events to Amazon Simple Notification Service (in Administration tab > System Settings > Event Forwarding > Amazon SNS).
    • Enter your Access Key and your Secret Key (from Step 9) and your SNS Topic ARN (from Step 2).
    • Select which event types you want to forward.

Verify the Lambda function using a test event, which can be configured under Actions > Configure test event. Either use the SNS sample event from the drop-down list or the test event below:

{
  "Records": [
    {
      "Sns": {
        "Message""[{\"EventID\":1,\"EventType\":\"TestEvent\"}]",
        "Timestamp""2016-07-29T17:40:00.000Z"
      }
    }
  ]
}
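
You can also run the handler locally against this test event before deploying. A minimal sketch, assuming the handler above is saved as index.js (the file name is an assumption) and that your local AWS credentials allow s3:PutObject on the bucket:

// Sketch: invoke the Lambda handler locally with the test event above.
const handler = require('./index').handler;
const testEvent = {
    Records: [
        {
            Sns: {
                Message: '[{"EventID":1,"EventType":"TestEvent"}]',
                Timestamp: '2016-07-29T17:40:00.000Z'
            }
        }
    ]
};
handler(testEvent, {});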

More about SNS

Deep Security sends events to SNS in bulk, as a JSON array limited to 110 KB in size [1]. If there are many events at once, multiple SNS messages may be sent.

The format of the SNS messages will therefore be the following:

[{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"},{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"},...,{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"}]

[1] SNS has a payload limit of 256 KB per message. However, since Lambda only has a limit of 128 KB for the entire message, we have reduced our batching to a payload limit of 110 KB.

More about Lambda

By default, Lambda has a maximum of 100 concurrent executions, but this limit can be increased. The formula for estimating the number of concurrent executions you need is:

Concurrency = lambda triggers per second * function duration (in seconds).
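
For example, if Deep Security publishes 20 SNS messages per second and the function takes 0.3 seconds on average, you need roughly 20 * 0.3 = 6 concurrent executions.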

The sample code in Step 11 saves each event batch as a single S3 object, under the following path (the Key parameter in the code):

CloudOneWorkloadSecurityEvents/yyyy/MM/dd/timestamp.json

If you want to save each event individually, you need to:

  1. Parse the events.
  2. Loop through them.
  3. Publish them individually to S3. Make sure to send strings to S3, not objects (use JSON.stringify()). At this point you have access to each event's properties, so it may be interesting to change the S3 path to something like DeepSecurityEvents/EventType/yyyy/MM/dd/EventID.json or DeepSecurityEvents/yyyy/MM/dd/EventType/EventID.json.

The following sample code illustrates this approach:

const aws = require('aws-sdk');
const s3 = new aws.S3({apiVersion: '2006-03-01'});  
const bucket = 'S3-BUCKET-NAME';
const s3prefix = 'DeepSecurityEvents/';
const ext = '.json';
 
exports.handler = (sns, context) => {
    //retrieve the events from the sns json
    var events = JSON.parse(sns.Records[0].Sns.Message);
 
    var date = getFormattedDate(new Date(sns.Records[0].Sns.Timestamp));
 
    for(var i = 0; i < events.length; i++) {
        sendToS3(events[i], date);
    }
};
 
function getFormattedDate(d) {
    //returns yyyy/MM/dd
    return d.getFullYear() + '/' + twoDigits(d.getMonth() + 1) + '/' + twoDigits(d.getDate());
}
 
function twoDigits(n) {
    return n < 10 ? '0' + n : n;
}
 
function sendToS3(event, date) {
    var params = {
        Bucket: bucket,
        Key: s3prefix + event.EventType + '/' + date + '/' + event.EventID + ext,
        Body: JSON.stringify(event)
    };
    s3.putObject(params, function(err) {
        if (err) console.log(err, err.stack); // an error occurred
    });
}
 
This will slightly increase the execution time of the Lambda function. Make sure you do not exceed the function's timeout, or increase it if necessary.
 

More about S3

It is possible to create a rule in your S3 bucket that will automatically delete objects after a certain number of days. See S3 Lifecycle Configuration.
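
For example, a rule that expires forwarded events after 90 days can also be set from code. A minimal sketch using the AWS SDK for JavaScript (the bucket name, prefix, and retention period are examples):

// Sketch: expire forwarded event objects after 90 days.
const aws = require('aws-sdk');
const s3 = new aws.S3({apiVersion: '2006-03-01'});
s3.putBucketLifecycleConfiguration({
    Bucket: 'S3-BUCKET-NAME',
    LifecycleConfiguration: {
        Rules: [
            {
                ID: 'expire-forwarded-events',
                Status: 'Enabled',
                Filter: { Prefix: 'DeepSecurityEvents/' },
                Expiration: { Days: 90 }
            }
        ]
    }
}, function(err) {
    if (err) console.log(err, err.stack);
});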

Other Resources