Publishing Deep Security events to Amazon S3 Bucket using SNS and Lambda

    • Updated: 18 Jul 2019
    • Product/Version: Deep Security 11.0, Deep Security 12.0
    • Platform: Linux - All, Windows
Summary

Learn how to store Deep Security events to an Amazon S3 Bucket as a long-term storage option.

 
The user will be responsible for the costs of SNS, Lambda, and S3.
Details
  1. Log in to Amazon Web Services (AWS) and go to SNS.
  2. Create a new topic, and take note of the Topic ARN.
  3. Go to S3.
  4. Create a bucket, and take note of the Bucket Name. This is where your events will be stored.
  5. Go to Identity and Access Management.
  6. Create a policy that grants permission to publish to SNS. Select Create your own policy and enter the following code. Make sure to replace "SNS-TOPIC-ARN" with your SNS Topic ARN from Step 2.
    {      "Version": "2012-10-17",      "Statement": [          {              "Action": [                  "sns:Publish"              ],              "Effect": "Allow",              "Resource": "SNS-TOPIC-ARN"          }      ]  }
  7. Create another policy that will allow your Lambda function to save objects to your S3 bucket. Select Create your own policy and enter the following code. Make sure to replace "S3-BUCKET-NAME" with your bucket name.
    {      "Version": "2012-10-17",      "Statement": [          {              "Effect": "Allow",              "Action": [                  "s3:PutObject",                  "s3:PutObjectAcl"              ],              "Resource": [                  "arn:aws:s3:::S3-BUCKET-NAME",                  "arn:aws:s3:::S3-BUCKET-NAME/*"              ]          }      ]  }
  8. Create a Role. For the role type, under AWS Service Role, select AWS Lambda. Attach the policy you just created in Step 7.
  9. Create a User, and take note of the Access Key ID and Secret Access Key. It is recommended to download the credentials as a .csv file.
    If you already have a user without keys, you can create a new set by clicking on your User > Security Credentials > Create Access Key.
  10. Edit your User's permissions to attach the policy created in Step 6, allowing it to publish to SNS.
  11. Go to Lambda and create a Lambda function.
    1. Select the "sns-message" blueprint as the script template.
    2. Configure the trigger to use the SNS topic you created in Step 2.
    3. Choose whether to enable the trigger now or later. If you're following this guide, it is okay to enable the trigger now since Deep Security Manager is not publishing to SNS yet.
    4. For the Lambda function code, select your preferred language (i.e., Java 8, Node.js, Node.js 4.3, or Python 2.7). A code sample for Node.js 4.3 is provided below.
    5. For the role, click Choose an existing one and select the role you created in Step 8. This grants your Lambda function permission to save objects in S3.

    Below is a sample script that you can copy and modify to suit your needs. Replace "S3-BUCKET-NAME" with your bucket name.

    var aws = require('aws-sdk');
    var s3 = new aws.S3({apiVersion: '2006-03-01'});
    var bucket = 'S3-BUCKET-NAME';
    var acl = 'public-read';
    var s3prefix = 'DeepSecurityEvents/';
    var ext = '.json';

    exports.handler = (sns, context) => {
        //retrieve the events from the sns json
        var events = sns.Records[0].Sns.Message;

        //extract the date to use in s3 directory
        var timestamp = sns.Records[0].Sns.Timestamp;
        var date = getFormattedDate(new Date(timestamp));
        //add 5 random digits to prevent overwriting an existing file if two messages are received at the same ms.
        var random5digits = Math.floor(Math.random()*90000) + 10000;

        //ie. DeepSecurityEvents/2016/07/20/2016-07-20T15:30:00.000Z12345.json
        var filename = s3prefix + date + '/' + timestamp + random5digits + ext;

        sendToS3(events, filename);
    };

    function getFormattedDate(d) {
        //returns yyyy/MM/dd
        return d.getFullYear() + '/' + twoDigits(d.getMonth() + 1) + '/' + twoDigits(d.getDate());
    }

    function twoDigits(n) {
        return n < 10 ? '0' + n : n;
    }

    function sendToS3(content, filename) {
        var params = {
            Bucket: bucket,
            Key: filename,
            ACL: acl,
            Body: content
        };
        s3.putObject(params, function(err) {
            if (err) console.log(err, err.stack); // an error occurred
        });
    }
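For clarity, the date formatting and object-key construction used in the script can be exercised on their own. In the sketch below, the timestamp is taken from the example comment in the script and the five random digits are fixed at an illustrative value:

```javascript
// Standalone sketch of the S3 key-building logic from the handler above.
var s3prefix = 'DeepSecurityEvents/';
var ext = '.json';

function twoDigits(n) {
    return n < 10 ? '0' + n : n;
}

function getFormattedDate(d) {
    // returns yyyy/MM/dd (note: uses the local time zone, as in the script above)
    return d.getFullYear() + '/' + twoDigits(d.getMonth() + 1) + '/' + twoDigits(d.getDate());
}

var timestamp = '2016-07-20T15:30:00.000Z';
var random5digits = 12345; // fixed here for illustration
var filename = s3prefix + getFormattedDate(new Date(timestamp)) + '/' + timestamp + random5digits + ext;
console.log(filename);
// e.g. DeepSecurityEvents/2016/07/20/2016-07-20T15:30:00.000Z12345.json
```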
  12. On the Deep Security Manager console, go to Administration > System Settings > Event Forwarding.
  13. Under Amazon SNS, enter the Access Key and Secret Key generated in Step 9, and the SNS Topic ARN from Step 2.
  14. Enable Publish Events to Amazon Simple Notification Service.
  15. Select which event types you want to forward.

    Event Types

Verify the Lambda function using a test event, which can be configured under Actions > Configure test event. Either use the sample event for SNS from the drop-down list, or use the test event below:

{         "Records": [           {           "Sns": {           "Message": "[{\"EventID\":1,\"EventType\":\"TestEvent\"}]",           "Timestamp": "2016-07-29T17:40:00.000Z"           }          }         ]        }

More about SNS

Deep Security sends events to SNS in bulk, as a JSON array with a size limit of 110 KB.¹ If many events occur at once, multiple SNS messages may be sent.

The format of the SNS messages will therefore be the following:

[{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"},{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"},...,{"Property1":"Value1","Property2":"Value2",...,"PropertyN":"ValueN"}]

¹ SNS has a payload limit of 256 KB per message. However, since Lambda has a limit of 128 KB for the entire message, we have reduced our batching to a payload limit of 110 KB.

More about Lambda

By default, Lambda allows a maximum of 100 concurrent executions, but this limit can be increased. The formula for estimating the number of concurrent executions you need is:

Concurrency = lambda triggers per second * function duration (in seconds).
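As a worked example of this formula (the traffic figures are illustrative):

```javascript
// Concurrency estimate: 20 SNS-triggered invocations per second,
// each running 0.5 s on average.
var triggersPerSecond = 20;
var functionDurationSeconds = 0.5;
var concurrency = triggersPerSecond * functionDurationSeconds;
console.log(concurrency); // 10
```

With these figures the function stays well under the default limit of 100; recompute with your own event rate and measured function duration.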

The sample code publishes each event batch as a single object in S3, using the following path (the Key parameter in the code):

DeepSecurityEvents/yyyy/MM/dd/timestamp.json

If you want to save each event individually, refer to the following sample code.

var aws = require('aws-sdk');
var s3 = new aws.S3({apiVersion: '2006-03-01'});

var bucket = 'S3-BUCKET-NAME';
var acl = 'public-read';
var s3prefix = 'DeepSecurityEvents/';
var ext = '.json';
var date;

exports.handler = (sns, context) => {
    //retrieve the events from the sns json
    var events = JSON.parse(sns.Records[0].Sns.Message);

    date = getFormattedDate(new Date(sns.Records[0].Sns.Timestamp));

    for(var i = 0; i < events.length; i++) {
        sendToS3(events[i]);
    }
};

function getFormattedDate(d) {
    //returns yyyy/MM/dd
    return d.getFullYear() + '/' + twoDigits(d.getMonth() + 1) + '/' + twoDigits(d.getDate());
}

function twoDigits(n) {
    return n < 10 ? '0' + n : n;
}

function sendToS3(event) {
    var params = {
        Bucket: bucket,
        Key: s3prefix + event.EventType + '/' + date + '/' + event.EventID + ext,
        ACL: acl,
        Body: JSON.stringify(event)
    };
    s3.putObject(params, function(err) {
        if (err) console.log(err, err.stack); // an error occurred
    });
}
This script does the following:
  1. Parses the events.
  2. Loops through them.
  3. Publishes them individually to S3. Make sure to send strings to S3, not objects (use JSON.stringify()). At this point you have access to each event's properties, so you may want to change the S3 path to something like DeepSecurityEvents/EventType/yyyy/MM/dd/EventID.json or DeepSecurityEvents/yyyy/MM/dd/EventType/EventID.json.
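The per-event key construction can be checked in isolation (the date and event values below are illustrative):

```javascript
// Standalone sketch of the per-event S3 key used in the script above.
var s3prefix = 'DeepSecurityEvents/';
var ext = '.json';
var date = '2016/07/29'; // derived from the SNS timestamp in the full script
var event = { EventID: 1, EventType: 'TestEvent' };

var key = s3prefix + event.EventType + '/' + date + '/' + event.EventID + ext;
console.log(key); // DeepSecurityEvents/TestEvent/2016/07/29/1.json
```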
 
This will slightly increase the execution time of the Lambda function. Make sure not to exceed the concurrency limit, or request an increase if needed.

More about S3

It is possible to create a rule in your S3 bucket that will automatically delete objects after a certain number of days. See S3 Lifecycle Configuration.
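For example, a lifecycle configuration that expires objects under the DeepSecurityEvents/ prefix after a set number of days might look like the following sketch (the rule ID and the 90-day retention are illustrative; see the S3 Lifecycle Configuration documentation for the full syntax):

```json
{
    "Rules": [
        {
            "ID": "expire-deep-security-events",
            "Filter": { "Prefix": "DeepSecurityEvents/" },
            "Status": "Enabled",
            "Expiration": { "Days": 90 }
        }
    ]
}
```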

Solution Id: 1123035