CloudWatch Log Group to S3 Bucket Export Automation using Lambda
Prerequisite
A CloudWatch log group exists from which the data will be exported.
An S3 bucket exists to which the data will be exported.
Use Case 1
The configuration below covers the use case where the CloudWatch log group and the S3 bucket are in the same account:
Steps for Configuration
The first step is to create a Lambda function that houses the source code for receiving CloudWatch log events and storing them in the S3 bucket.
Search for the Lambda service in your AWS account, navigate to functions, and select Create Function.
Under Basic Information, provide:
Function name
Runtime (Node.js 20.x)
Instruction set architecture (x86_64, the default)
Make sure your newly created execution role has the following policy.
You can check the policy under Lambda > Configuration > Permissions; click the role name.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"logs:CreateLogGroup",
"logs:CreateLogStream",
"logs:PutLogEvents",
"logs:GetLogEvents",
"logs:DescribeLogStreams",
"logs:CreateExportTask",
"logs:FilterLogEvents"
],
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:PutObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<S3_BUCKET_NAME>",
"arn:aws:s3:::<S3_BUCKET_NAME>/*"
]
}
]
}
Replace <S3_BUCKET_NAME> with your actual bucket name.
Once the Lambda function is created, navigate to Configuration > Environment variables and click Edit.
Enter the keys below exactly as shown; they are referenced inside the Lambda function, and any change to a key name will break the function.
Key | Value |
---|---|
LOGGROUP_NAME | <Cloudwatch log group name> |
MY_AWS_REGION | <AWS Region> |
PREFIX | <S3 Prefix> |
S3_BUCKET_NAME | <S3 bucket name> |
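As a sketch of how these variables feed the export, a function like the one in the zip would typically build the parameters for the CloudWatch Logs CreateExportTask API from them. The helper below is illustrative only (the actual zipped source may differ); the two-minute window matches the EventBridge rate configured later:

```javascript
// Sketch: build the CreateExportTask parameters from the environment
// variables above. Assumes the bundled function exports the most recent
// interval of logs; the actual code in the zip may differ.
function buildExportTaskParams(env, now = Date.now(), intervalMs = 2 * 60 * 1000) {
  for (const key of ["LOGGROUP_NAME", "MY_AWS_REGION", "PREFIX", "S3_BUCKET_NAME"]) {
    if (!env[key]) throw new Error(`Missing environment variable: ${key}`);
  }
  return {
    taskName: `export-${env.LOGGROUP_NAME}-${now}`,
    logGroupName: env.LOGGROUP_NAME,
    from: now - intervalMs,          // window start (epoch milliseconds)
    to: now,                         // window end (epoch milliseconds)
    destination: env.S3_BUCKET_NAME, // bucket name only, not an ARN
    destinationPrefix: env.PREFIX,   // folder-like prefix inside the bucket
  };
}

// The handler would then send these parameters, e.g.:
//   new CloudWatchLogsClient({ region: env.MY_AWS_REGION })
//     .send(new CreateExportTaskCommand(params));
```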
Download the following zip file. Under Code, choose Upload from > .zip file, upload the zip, and save. Note that after uploading you might not be able to view the source code inline.
When using Node.js 20, the AWS SDK dependencies the function needs are bundled in the zip along with the function code.
If you are creating a new function and an EventBridge rule is not yet attached as a trigger: click Add trigger and choose EventBridge.
Create new rule
Provide the rule name
State the description
The schedule expression acts like a cron schedule, automatically triggering the event whenever the expression matches. We will set a 2-minute rate, which invokes the Lambda function every 2 minutes; you can choose a different interval per your organization's policy.
Valid values: minute | minutes | hour | hours | day | days
Syntax:
rate(value unit)
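For example, `rate(2 minutes)` fires every two minutes and `rate(1 hour)` every hour. Note that EventBridge requires the singular unit when the value is 1 and the plural unit otherwise. The small helper below (illustrative only, not part of any AWS tooling) builds a valid expression from a value and unit:

```javascript
// Illustrative helper: build an EventBridge rate() schedule expression.
// EventBridge accepts "rate(1 minute)" but rejects "rate(1 minutes)",
// and likewise requires the plural form for values greater than 1.
function rateExpression(value, unit) {
  if (!Number.isInteger(value) || value < 1) {
    throw new Error("rate value must be a positive integer");
  }
  const base = unit.replace(/s$/, ""); // normalize to singular
  if (!["minute", "hour", "day"].includes(base)) {
    throw new Error(`unknown unit: ${unit}`);
  }
  return `rate(${value} ${value === 1 ? base : base + "s"})`;
}
```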
Make sure your S3 bucket has the following bucket policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "<LAMBDA_EXECUTION_IAM_ROLE_ARN>"
},
"Action": [
"s3:GetObject",
"s3:PutObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<S3_BUCKET_NAME>",
"arn:aws:s3:::<S3_BUCKET_NAME>/*"
]
}
]
}
Replace <S3_BUCKET_NAME> and <LAMBDA_EXECUTION_IAM_ROLE_ARN> with your actual values.
You can get the <LAMBDA_EXECUTION_IAM_ROLE_ARN> from Lambda > Configuration > Permissions > click the role name > copy the ARN.
After completing these configurations, data will be exported from CloudWatch to S3.
Use Case 2
The configuration below covers the use case where the destination S3 bucket is in a different account:
Configuration:
The first step is to create a Lambda function that houses the source code for receiving CloudWatch log events and storing them in the S3 bucket.
Search for the Lambda service in your AWS account, navigate to Functions, and select Create function, or select an existing function.
Under Basic information, provide:
Function name
Runtime (Node.js 20.x)
Instruction set architecture (x86_64, the default)
Set Up the Role in the Target Account
Create a role in the target account that your Lambda function will assume.
Sign in to the AWS account that owns the destination S3 bucket.
In Services > IAM > Roles
Click Create role. (For future reference in this document, we will call it CrossAcS3Write.)
Select Custom trust policy as the trusted entity type.
Provide the policy below as the custom trust policy.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "<Lambda_Execution_Role_ARN>"
},
"Action": "sts:AssumeRole"
}
]
}
You can find the <Lambda_Execution_Role_ARN> in the source account where you created the Lambda function:
Lambda > Configuration > Permissions > click the role name.
The role opens in a new tab. Copy the role ARN and use it, as shown in the policy above, when creating the trust policy in the target account.
Click Next
On the Add permissions step, do not add any permissions for now; click Next.
Under Role details, enter the name CrossAcS3Write and a description, then click Create role. (CrossAcS3Write is just for reference in this document; you can use any name.)
Open the newly created role to grant permissions to it.
Under Permissions > Add permissions, attach the permissions policy for the target bucket.
Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account.
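The permissions policy itself does not appear above. A likely version, modeled on the Use Case 1 S3 policy (an assumption; verify it against your environment), is:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<TARGET_BUCKET_NAME>",
                "arn:aws:s3:::<TARGET_BUCKET_NAME>/*"
            ]
        }
    ]
}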
Set up target S3 bucket policy
Navigate to the target S3 bucket
Open the bucket
Navigate to Permissions > Bucket policy
Provide the bucket policy below.
<ROLE_ARN_CrossAcS3Write> is the role we created in the previous steps. You can copy the ARN from the target account > IAM > Roles: open the newly created CrossAcS3Write role and copy its ARN.
Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account.
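The bucket policy itself is not reproduced above. A likely version, modeled on the Use Case 1 bucket policy with the CrossAcS3Write role as principal (an assumption; verify it against your environment), is:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "<ROLE_ARN_CrossAcS3Write>"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<TARGET_BUCKET_NAME>",
                "arn:aws:s3:::<TARGET_BUCKET_NAME>/*"
            ]
        }
    ]
}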
Set up lambda execution role policy
In the source account where the Lambda function is created, make sure your newly created or existing execution role has the following policy.
You can check the policy under Lambda > Configuration > Permissions; click the role name.
<ROLE_ARN_CrossAcS3Write> is the role we created in the previous steps in the target S3 bucket's account. You can copy the ARN from the target account > IAM > Roles: open the newly created CrossAcS3Write role and copy its ARN.
Replace <TARGET_BUCKET_NAME> with the name of the S3 bucket in the target account.
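The execution role policy itself is not reproduced above. A likely version (an assumption; verify it against your environment) combines the CloudWatch Logs permissions from Use Case 1 with permission to assume the cross-account role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:GetLogEvents",
                "logs:DescribeLogStreams",
                "logs:CreateExportTask",
                "logs:FilterLogEvents"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "<ROLE_ARN_CrossAcS3Write>"
        }
    ]
}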
Set up code and configurations in Lambda
In the Lambda function, navigate to Configuration > Environment variables and click Edit.
Enter the keys below exactly as shown; they are referenced inside the Lambda function, and any change to a key name will break the function.
Key | Value |
---|---|
LOGGROUP_NAME | <Cloudwatch log group name> |
MY_AWS_REGION | <AWS Region> |
PREFIX | <Target S3 Bucket Prefix> |
S3_BUCKET_NAME | <Target S3 Bucket Name> |
TARGET_ACCOUNT_ROLE_ARN | <ROLE_ARN_CrossAcS3Write> |
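In this cross-account variant, the function would first call STS AssumeRole with TARGET_ACCOUNT_ROLE_ARN and use the temporary credentials to write to the target bucket. The helper below sketches the AssumeRole parameters it might build (an assumption; the actual code in the zip may differ):

```javascript
// Sketch: parameters the function might pass to STS AssumeRole before
// writing to the target bucket. Assumes the bundled code uses the AWS
// SDK STS client; the session name and duration here are illustrative.
function buildAssumeRoleParams(env, sessionName = "cw-export-session") {
  if (!env.TARGET_ACCOUNT_ROLE_ARN) {
    throw new Error("Missing environment variable: TARGET_ACCOUNT_ROLE_ARN");
  }
  return {
    RoleArn: env.TARGET_ACCOUNT_ROLE_ARN, // the CrossAcS3Write role ARN
    RoleSessionName: sessionName,
    DurationSeconds: 900,                 // STS minimum; enough for one export run
  };
}

// The handler would then create an S3 client with the returned temporary
// credentials, e.g.:
//   const { Credentials } = await sts.send(new AssumeRoleCommand(params));
//   new S3Client({ region: env.MY_AWS_REGION, credentials: { ... } });
```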
Download the following zip file. Go to the Lambda Code tab, choose Upload from > .zip file, upload the zip, and save. Note that after uploading you might not be able to view the source code inline.
When using Node.js 20, the AWS SDK dependencies the function needs are bundled in the zip along with the function code.
If you are creating a new function and an EventBridge rule is not yet attached as a trigger: click Add trigger and choose EventBridge.
Create new rule
Provide the rule name
State the description.
The schedule expression acts like a cron schedule, automatically triggering the event whenever the expression matches. We will set a 2-minute rate, which invokes the Lambda function every 2 minutes; you can choose a different interval per your organization's policy.
Valid values: minute | minutes | hour | hours | day | days
Syntax: rate(value unit)
After completing these configurations, data will be exported from CloudWatch to the target S3 bucket in the other account.
About Accenture:
Accenture is a leading global professional services company that helps the world’s leading businesses, governments and other organizations build their digital core, optimize their operations, accelerate revenue growth and enhance citizen services—creating tangible value at speed and scale. We are a talent and innovation led company with 738,000 people serving clients in more than 120 countries. Technology is at the core of change today, and we are one of the world’s leaders in helping drive that change, with strong ecosystem relationships. We combine our strength in technology with unmatched industry experience, functional expertise and global delivery capability. We are uniquely able to deliver tangible outcomes because of our broad range of services, solutions and assets across Strategy & Consulting, Technology, Operations, Industry X and Accenture Song. These capabilities, together with our culture of shared success and commitment to creating 360° value, enable us to help our clients succeed and build trusted, lasting relationships. We measure our success by the 360° value we create for our clients, each other, our shareholders, partners and communities. Visit us at www.accenture.com.
About Accenture Security
Accenture Security is a leading provider of end-to-end cybersecurity services, including advanced cyber defense, applied cybersecurity solutions and managed security operations. We bring security innovation, coupled with global scale and a worldwide delivery capability through our network of Advanced Technology and Intelligent Operations centers. Helped by our team of highly skilled professionals, we enable clients to innovate safely, build cyber resilience and grow with confidence. Follow us @AccentureSecure on Twitter or visit us at www.accenture.com/security.
Legal notice: Accenture, the Accenture logo, and other trademarks, service marks, and designs are registered or unregistered trademarks of Accenture and its subsidiaries in the United States and in foreign countries. All trademarks are properties of their respective owners. This document is intended for general informational purposes only and does not take into account the reader’s specific circumstances, and may not reflect the most current developments. Accenture disclaims, to the fullest extent permitted by applicable law, any and all liability for the accuracy and completeness of the information in this presentation and for any acts or omissions made based on such information. Accenture does not provide legal, regulatory, audit, or tax advice. Readers are responsible for obtaining such advice from their own legal counsel or other licensed professionals.