
...

  • A CloudWatch log group is created from which the data will be exported.

  • An S3 bucket is created into which you want to export the data.

Use Case 1

The following configuration is for the use case where the CloudWatch log group and the S3 bucket are in the same account:

Steps for Configuration

The first step is to create a Lambda function that houses the source code for receiving CloudWatch events and storing them in our S3 bucket.

...

  1. Once the Lambda function is created, navigate to Code, then copy and paste the following code (a reference sketch is provided below).

...
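The code itself is elided in this export. For reference, the following is a minimal sketch of the function's flow, reconstructed from fragments of the source: it keeps an incremental-export marker object (lastTimestamp.txt) in the bucket, pages through filterLogEvents, and writes each batch of messages under the configured prefix. It assumes the callback-style aws-sdk v2 API and the environment variables configured in the later steps; treat it as a sketch, not the authoritative code, which ships in the zip provided below.

Code Block
const AWS = require('aws-sdk');

const logGroupName = process.env.LOGGROUP_NAME;
const s3BucketName = process.env.S3_BUCKET_NAME;
const s3KeyPrefix = process.env.PREFIX;

const cloudwatchlogs = new AWS.CloudWatchLogs({ region: process.env.MY_AWS_REGION });
const s3 = new AWS.S3({ region: process.env.MY_AWS_REGION });

exports.handler = (event, context, callback) => {
    // Read the timestamp of the last exported event, then process newer events.
    getLastExportedTimestamp((err, lastTimestamp) => {
        if (err) return callback(err);
        processLogs(lastTimestamp, callback);
    });
};

function getLastExportedTimestamp(callback) {
    s3.getObject({ Bucket: s3BucketName, Key: 'lastTimestamp.txt' }, (err, data) => {
        if (err) {
            if (err.code === 'NoSuchKey') {
                callback(null, 0); // Start from the beginning if the file doesn't exist
            } else {
                callback(err);
            }
        } else {
            callback(null, parseInt(data.Body.toString('utf-8').trim(), 10));
        }
    });
}

function processLogs(lastTimestamp, callback) {
    let latestTimestamp = lastTimestamp;
    function retrieveLogs(token) {
        getLogEvents(logGroupName, lastTimestamp, token, (err, data) => {
            if (err) {
                console.error('Error getting log events:', err);
                return callback(err);
            }
            const logEvents = data.events;
            const nextToken = data.nextToken;
            if (logEvents.length > 0) {
                exportToS3(logEvents, (exportErr) => {
                    if (exportErr) {
                        console.error('Error exporting log events to S3:', exportErr);
                        return callback(exportErr);
                    }
                    // Track the newest timestamp seen so the next run resumes after it.
                    const streamLatestTimestamp = logEvents[logEvents.length - 1].timestamp;
                    latestTimestamp = Math.max(latestTimestamp, streamLatestTimestamp);
                    if (nextToken) {
                        retrieveLogs(nextToken);
                    } else {
                        updateLastExportedTimestamp(latestTimestamp, callback);
                    }
                });
            } else if (nextToken) {
                retrieveLogs(nextToken);
            } else {
                updateLastExportedTimestamp(latestTimestamp, callback);
            }
        });
    }
    retrieveLogs(null); // Start the recursive retrieval
}

function getLogEvents(logGroupName, startTime, token, callback) {
    const params = { logGroupName: logGroupName, startTime: startTime };
    if (token) params.nextToken = token;
    cloudwatchlogs.filterLogEvents(params, callback);
}

function exportToS3(logEvents, callback) {
    // Concatenate the batch of messages into one object under the configured prefix.
    const logData = logEvents.map(event => event.message).join('\n');
    const s3Key = `${s3KeyPrefix}${Date.now()}.txt`;
    s3.putObject({ Bucket: s3BucketName, Key: s3Key, Body: logData }, (err) => callback(err));
}

function updateLastExportedTimestamp(timestamp, callback) {
    s3.putObject({ Bucket: s3BucketName, Key: 'lastTimestamp.txt', Body: timestamp.toString() }, (err) => callback(err));
}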

  1. Navigate to Configuration > Environment Variables and click Edit.

...

Please enter the keys below exactly as shown; they are used in the Lambda function, and any change to a key name will directly impact its functioning.

Key | Value
LOGGROUP_NAME | <Cloudwatch log group name>
MY_AWS_REGION | <AWS Region>
PREFIX | <S3 Prefix>
S3_BUCKET_NAME | <S3 bucket name>

...

  1. Download the following zip file. Under Code, select Upload from the dropdown, then upload the zip and save. Note that after uploading you might not be able to see the source code.

File: awslambdasdk.zip

...

When using Node.js 20, the runtime no longer bundles the aws-sdk v2 package, so those dependencies are included in the zip along with the function code.
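If you build your own bundle instead of using the provided zip, the dependency can be declared in the bundle's package.json. A hypothetical example (the package name and version range are illustrative, not taken from the zip):

Code Block
{
  "name": "cloudwatch-to-s3-export",
  "dependencies": {
    "aws-sdk": "^2.0.0"
  }
}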

  1. If you are creating a new function and an EventBridge rule is not available as a trigger, click Add Trigger and choose EventBridge.

image-20240704-073138.png

  1. Select Create new rule.

  2. Provide the rule name.

  3. State the description.

  4. The schedule expression acts as a cron schedule that automatically triggers the event whenever the expression matches. We are going to set a 2-minute rate, which invokes the Lambda function every 2 minutes. You can specify any interval you want the Lambda to execute at, per your organization's policy.

Valid values: minute | minutes | hour | hours | day | days

Syntax:

rate(value unit)
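For example, the 2-minute schedule used in this guide is written as:

Code Block
rate(2 minutes)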

image-20240704-073220.png

Make sure your S3 bucket has the following policy:

Code Block
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "<LAMBDA_EXECUTION_IAM_ROLE_ARN>"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<S3_BUCKET_NAME>",
                "arn:aws:s3:::<S3_BUCKET_NAME>/*"
            ]
        }
    ]
}

Replace <S3_BUCKET_NAME> and <LAMBDA_EXECUTION_IAM_ROLE_ARN> with your actual values.

You can get <LAMBDA_EXECUTION_IAM_ROLE_ARN> from: Lambda > Configuration > Permissions > click the Role name > copy the ARN.

...

After completing these configurations, data will be exported from CloudWatch to S3.
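To verify the export, you can list the objects written under the configured prefix. This is a minimal sketch, assuming the aws-sdk v2 client and the same placeholder values used in the environment variables:

Code Block
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: '<AWS Region>' });

// List the exported log objects under the configured prefix.
s3.listObjectsV2({ Bucket: '<S3 bucket name>', Prefix: '<S3 Prefix>' }, (err, data) => {
    if (err) throw err;
    data.Contents.forEach(obj => console.log(obj.Key, obj.LastModified));
});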

Use Case 2

The following configuration is for the use case where the destination S3 bucket is in a different account:

Configuration

The first step is to create a Lambda function that houses the source code for receiving CloudWatch events and storing them in our S3 bucket.

  1. Search for the Lambda service in your AWS account, navigate to Functions, and select Create Function or choose an existing function.

  2. Under Basic Information, we need to provide:

    1. Function name

    2. Runtime (Node.js 20.x)

    3. Instruction set architecture (x86_64, the default)

image-20240808-073732.png

Set Up the Role in the Target Account

Create a role in the target account that your Lambda function will assume.

  1. Navigate to AWS services in the account of the destination S3 bucket.

  2. Go to Services > IAM > Roles.

  3. Click Create role. (For future reference in this document, we will call it CrossAcS3Write.)

...

  1. Select the trusted entity type Custom trust policy.

  2. Provide the policy below under the custom trust policy.

Code Block
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<Lambda_Execution_Role_ARN>"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

You can find the <Lambda_Execution_Role_ARN> in the source account where you created the Lambda function.

Lambda > Configuration > Permissions > click on the Role name

...

It will open in a new tab. Copy the role ARN and use it, as shown in the policy above, when creating the trust policy in the target account.

...

  1. Click Next.

  2. In Add permissions, do not add any permissions for now; click Next.

  3. Provide the role details: name it CrossAcS3Write, add a description, and click Create role. (CrossAcS3Write is just for reference in this document; you can provide any name.)

  4. Open the newly created role to grant permissions to it.

  5. In Permissions > Add permissions, provide the permission below.

Code Block
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<TARGET_BUCKET_NAME>",
                "arn:aws:s3:::<TARGET_BUCKET_NAME>/*"
            ]
        }
    ]
}

Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account. 

...

Set up the target S3 bucket policy

  1. Navigate to the target S3 bucket

  2. Open the bucket

  3. Navigate to Permissions > Bucket policy

  4. Provide the policy below.

Code Block
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "<ROLE_ARN_CrossAcS3Write>"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<TARGET_BUCKET_NAME>",
                "arn:aws:s3:::<TARGET_BUCKET_NAME>/*"
            ]
        }
    ]
}

<ROLE_ARN_CrossAcS3Write> is the ARN of the role we created in the previous steps. You can copy it from the target account > IAM > Roles: open the newly created CrossAcS3Write role and copy its ARN.

Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account.

Set up the Lambda execution role policy

In the source account where the Lambda function is created, make sure your execution role (newly created or existing) has the following policy:

You can check the policy in Lambda > Configuration > Permissions; click on the Role name.

Code Block
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:GetLogEvents",
                "logs:DescribeLogStreams",
                "logs:CreateExportTask",
                "logs:FilterLogEvents"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<TARGET_BUCKET_NAME>",
                "arn:aws:s3:::<TARGET_BUCKET_NAME>/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "<ROLE_ARN_CrossAcS3Write>"
        }
    ]
}



<ROLE_ARN_CrossAcS3Write> is the role we created in the previous steps in the target S3 bucket account. You can copy the ARN from the target account > IAM > Roles: open the newly created CrossAcS3Write role and copy its ARN.

Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account.

Set up code and configurations in Lambda

  1. In the Lambda function, navigate to Configuration > Environment Variables and click Edit.

...

  1. Enter the keys below exactly as shown; they are used in the Lambda function, and any change to a key name will directly impact its functioning. (A sketch of how the function uses TARGET_ACCOUNT_ROLE_ARN follows the table.)

Key | Value
LOGGROUP_NAME | <Cloudwatch log group name>
MY_AWS_REGION | <AWS Region>
PREFIX | <Target S3 Bucket Prefix>
S3_BUCKET_NAME | <Target S3 Bucket Name>
TARGET_ACCOUNT_ROLE_ARN | <ROLE_ARN_CrossAcS3Write>
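For reference, the cross-account write path works by calling sts:AssumeRole on TARGET_ACCOUNT_ROLE_ARN and using the temporary credentials it returns for the S3 client. The sketch below illustrates the mechanism only, assuming the aws-sdk v2 API; the helper name writeViaTargetRole is illustrative, and the zip provided in the next step contains the authoritative code.

Code Block
const AWS = require('aws-sdk');

// Illustrative helper: assume the CrossAcS3Write role in the target account,
// then write the object using the temporary credentials it returns.
function writeViaTargetRole(key, body, callback) {
    const sts = new AWS.STS({ region: process.env.MY_AWS_REGION });
    sts.assumeRole({
        RoleArn: process.env.TARGET_ACCOUNT_ROLE_ARN,
        RoleSessionName: 'CloudWatchToS3Export'
    }, (err, data) => {
        if (err) return callback(err);
        const s3 = new AWS.S3({
            region: process.env.MY_AWS_REGION,
            accessKeyId: data.Credentials.AccessKeyId,
            secretAccessKey: data.Credentials.SecretAccessKey,
            sessionToken: data.Credentials.SessionToken
        });
        s3.putObject({ Bucket: process.env.S3_BUCKET_NAME, Key: key, Body: body }, callback);
    });
}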

...

  1. Download the following zip file. Go to Lambda > Code and select Upload from the dropdown, then upload the zip and save. Note that after uploading you might not be able to see the source code.

File: lambda.zip

...

When using Node.js 20, the runtime no longer bundles the aws-sdk v2 package, so those dependencies are included in the zip along with the function code.

  1. If you are creating a new function and an EventBridge rule is not available as a trigger, click Add Trigger and choose EventBridge.

...

  1. Select Create new rule.

  2. Provide the rule name.

  3. State the description.

  4. The schedule expression acts as a cron schedule that automatically triggers the event whenever the expression matches. We are going to set a 2-minute rate, which invokes the Lambda function every 2 minutes. You can specify any interval you want the Lambda to execute at, per your organization's policy.

Valid values: minute | minutes | hour | hours | day | days

Syntax: rate(value unit), for example rate(2 minutes)

image-20240808-094126.png

After completing these configurations, data will be exported from CloudWatch to the target S3 bucket in the other account.