...

Log Type | Ingestion label | Preferred Logging Protocol - Format | Log Collection Method | Data Source
---------|-----------------|-------------------------------------|-----------------------|------------
AppOmni  | APPOMNI JSON    | Webhook - JSON                      | C2C - Push            | N/A

Device Configuration

1 - AWS S3 Configuration

Creation of S3 bucket

  1. Navigate to https://s3.console.aws.amazon.com/

  2. Click Create Bucket

  3. Provide a name for the S3 bucket

  4. Select a Region

  5. Click Create Bucket

...

Setup API Gateway Role

  1. Navigate to https://console.aws.amazon.com/iam

  2. Click Roles

  3. Click Create role

  4. Select S3 as your use case

  5. Click Next: Permissions

...

  1. Select AmazonS3FullAccess

  2. Click Next: Tags

  3. Click Next: Review

  4. Provide a Role Name

  5. Click Create Role

...

  1. Click to open the role you created in the previous step

  2. Click Trust relationships tab

  3. Click Edit trust relationship

  4. Paste the following into the policy document

Code Block
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "apigateway.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
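The trust policy above is plain JSON and can be generated or sanity-checked with a short script before pasting it into the policy document editor; a minimal sketch in Python (no AWS access required):

```python
import json

# The trust policy from the step above, built as a Python dict.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "apigateway.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

# Serialize it for pasting into the IAM trust relationship editor.
policy_json = json.dumps(trust_policy, indent=2)

# Quick check: API Gateway must be the trusted service principal,
# and the only allowed action is sts:AssumeRole.
statement = trust_policy["Statement"][0]
assert statement["Principal"]["Service"] == "apigateway.amazonaws.com"
assert statement["Action"] == "sts:AssumeRole"
```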

...

  1. Click Save or Update Trust Policy

  2. Copy the Role ARN for this role; you’ll need it later

Create API GW

  1. Navigate to API Management - Amazon API Gateway - AWS

  2. Click Create API

  3. Click the Build button in the Rest API section

  4. Select New API

  5. Provide an API name and Description

  6. Click Create API

...

  1. Click Actions dropdown

  2. Select Create Method

  3. Select PUT, and click the check box to save

  4. Select AWS Service

  5. Set AWS Region to the region where the S3 bucket was created

  6. Set AWS Service to Simple Storage Service (S3)

  7. Leave AWS Subdomain empty

  8. HTTP method PUT

  9. Set Action Type to Use path override

  10. Set Path override (optional) to the following:

Code Block
<name_of_your_s3_bucket>/{key}
  1. Set Execution Role to the ARN of the role created in the Setup API Gateway Role step above

  2. Click Save
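The path override simply prepends your bucket name to the `{key}` path parameter; the literal braces stay in the string you paste into the console. A quick illustration (the bucket name below is a placeholder):

```python
# Hypothetical bucket name; substitute your own.
bucket_name = "appomni-logs-example"

# API Gateway substitutes {key} at request time, so the braces are
# kept literally in the path override value.
path_override = f"{bucket_name}/{{key}}"

print(path_override)  # appomni-logs-example/{key}
```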

...

  1. Click Integration Request

  2. Expand Mapping Templates

  3. Select When there are no templates defined (recommended)

  4. Click Add Mapping template

  5. Set it to application/json, and click the save icon

  6. Insert the below into the Template

Code Block
#set($context.requestOverride.path.key = $context.requestId)
$input.body
  1. Click Save
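The mapping template writes each incoming payload to S3 under the unique API Gateway request ID. The effect can be sketched in Python (a `uuid4` string stands in for `$context.requestId`, which is an assumption about its format):

```python
import uuid

# $context.requestId is unique per API Gateway request (assumed here to
# be UUID-formatted); the template assigns it to the {key} path
# parameter, so every AppOmni delivery lands as a distinct S3 object.
request_id = str(uuid.uuid4())             # stands in for $context.requestId
payload = '[{"event": "appomni-sample"}]'  # stands in for $input.body

# What the integration ends up writing:
object_key = request_id   # from $context.requestOverride.path.key
object_body = payload     # $input.body passes the body through unchanged

assert len(object_key) == 36  # UUID string length
assert object_body == payload
```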

...

  1. Scroll to the top

  2. Click the <- Method Execution

  3. Select your PUT method

  4. Click on Actions drop down

  5. Click Deploy API

  6. Set Deployment stage to [New Stage]

  7. Set Stage Name to the name of your S3 bucket

  8. Set Stage Description to the name of your S3 bucket

  9. Click Deploy

...

  1. Click Usage Plans on the left side

  2. Click Create

  3. Provide Name

  4. Uncheck Enable throttling

  5. Uncheck Enable quota

  6. Click Next

...

  1. Click Add API Stage

  2. In the API dropdown list select the API you created above

  3. In the Stage dropdown list select the Stage you created above

  4. Click Save

  5. Click Next

  6. Click Done

...

  1. Click API Keys

  2. Click Create API Key and add to Usage Plan

  3. Provide an API Key Name

  4. Select Auto Generate

  5. Click Save

...

  1. Click Key to open the key

  2. Click the Show option and copy the key (you’ll need it later)

  3. Navigate back to the API PUT method

  4. Click the Method Request

  5. Set API Key Required to True

...

  1. Expand the HTTP Request Headers.

  2. Insert x-api-key, select Required and click Save

...

  1. Select your PUT method

  2. Select Actions - Deploy API dropdown

  3. Select the Deployment Stage you created earlier

  4. Click Deploy

...

  1. Click Stages on the left and select the Stage you configured above.

  2. Copy the Invoke URL, which you’ll need later

...

  1. Navigate back to the API PUT method

  2. Click Test

  3. Insert x-api-key:<insert API key from previous steps>

  4. Click Test

  5. Verify that you get Status 200 response (test succeeded)

...

2. Configure S3 Bucket in AppOmni - Alerts Dashboard

  1. Log into your AppOmni console

  2. Navigate to Monitor - Alert Dashboard

...

  1. Click the + sign to add a new AppOmni Data Sink.

...

  1. Provide a name for the Data Sink

  2. Select Event Format as Condensed ECS

  3. Set Max Event Size in Bytes to 0 (zero)

  4. Data Sink = HTTP

  5. Insert the Invoke URL from Create API GW step above in the Endpoint section

  6. Set the Token Header to x-api-key

  7. Insert the API key from Create API GW step above in the Token Section

  8. Method = PUT

  9. Delivery Format = JSON Array

...

  1. Click Test Data Sink Configuration to ensure all parameters are correct.

  2. Click Create New Data Sink to save the Data Sink settings

...

  1. Review the JSON files in the S3 bucket; they contain the normalized AppOmni SaaS audit logs and detection alerts.

...

3. Configure S3 Bucket in AppOmni - Workflows

  1. Log into your AppOmni console

  2. Navigate to Monitor - Workflows

  3. Click the Add Workflow +

  4. Provide a Workflow Name

  5. Select the Environment Tags to include in the Workflow

  6. Click Webhook Notifications for New/Reopened Events on Policy Scans (PUT)

  7. Insert the Invoke URL from Create API GW step above in the Endpoint section

...

  1. Click Next

  2. Set HeaderName to x-api-key

  3. Insert the API key from Create API GW step above in the HeaderValue Section

  4. Click Finish

...

  1. Click Add Workflow

...

4. Configure AppOmni policies to leverage the S3 bucket workflow

  1. Open your AppOmni policy

  2. Click Manage

  3. Select the configured S3 Bucket in the workflow section

  4. Click Add to Policy

...

  1. AppOmni findings for this policy will now be sent to the configured S3 bucket

  2. The JSON files in the S3 bucket contain the AppOmni findings from the policy/policies configured to send findings to the S3 bucket.

...

5. Create SQS

For Internal Reference: Configuring Amazon Simple Queue Service (SQS) with S3 Storage

6. Attaching Created SQS with S3

  1. Navigate to the S3 Dashboard and select the S3 bucket created in Section 1 above that needs to be monitored for notifications.

...

  1. Navigate to Properties Tab

...

  1. Under Event notifications, click Create event notification

...

  1. Provide a name for the notification and select the event types to be notified on.

...

Select All object create events, as we need to be notified of every new entry in the bucket.

  1. Add a Prefix

A prefix is needed only when a single directory has to be monitored; the prefix is the name of the directory created in the bucket. If every directory in the bucket should be monitored for notifications, leave the prefix blank.

  1. Select SQS queue as the destination and select Choose from your SQS queues.

...

  1. From the dropdown, select the SQS queue you created earlier

  2. Click Save
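For reference, the steps above correspond to the following notification configuration, shown as the payload that boto3's `put_bucket_notification_configuration` accepts (the queue ARN and notification name are placeholders):

```python
# Sketch of the event notification created above; no prefix filter,
# so every new object in the bucket notifies the SQS queue.
notification_config = {
    "QueueConfigurations": [
        {
            "Id": "appomni-new-object-notification",  # placeholder name
            # Placeholder ARN; use the SQS queue created in Section 5.
            "QueueArn": "arn:aws:sqs:us-east-1:111111111111:example-queue",
            "Events": ["s3:ObjectCreated:*"],  # "All object create events"
        }
    ]
}

assert notification_config["QueueConfigurations"][0]["Events"] == ["s3:ObjectCreated:*"]
```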

AppOmni Integration with Webhook (Recommended)

  1. Log into your AppOmni console

  2. Navigate to Threat detection > Destinations

  3. Click Add Destination

...

  1. Search for Webhook and select it.

...

  1. Click Continue.

...

  1. Fill in the details below; everything else can keep its default value.

a. Name - Provide a name

b. Description (Optional) - Provide a description

c. URL Endpoint - Contact your onboarding engineer for this URL and API key. You need to append the API key to the URL. E.g.: https://us-chronicle.googleapis.com/v1alpha/projects/12345/locations/us/instances/123-abc/feeds/abc123-123:importPushLogs?key={API Key}

d. HTTP Method - Select POST

e. Delivery Format - Select JSON Lines

f. Token Header - Enter “X-Webhook-Access-Key”

g. Token - Contact your onboarding engineer for the token (secret key)
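The relationship between the URL endpoint, API key, and token header in steps c, f, and g can be summarized as follows (all values are placeholders; your onboarding engineer supplies the real ones):

```python
# Placeholder endpoint and credentials for illustration only.
base_url = ("https://us-chronicle.googleapis.com/v1alpha/projects/12345/"
            "locations/us/instances/123-abc/feeds/abc123-123:importPushLogs")
api_key = "example-api-key"
secret_key = "example-secret"

# The API key is appended to the endpoint as a query parameter (step c),
# while the secret travels in the X-Webhook-Access-Key header (steps f, g).
url_endpoint = f"{base_url}?key={api_key}"
headers = {"X-Webhook-Access-Key": secret_key}

assert url_endpoint.endswith(f"?key={api_key}")
assert "X-Webhook-Access-Key" in headers
```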

...

  1. Click Save.

  2. Once the destination is created, check if it is enabled. If it is not enabled, enable it.

AppOmni Integration with AWS S3 (Alternative)

 Create S3 Bucket

  1. Go to https://s3.console.aws.amazon.com/

  2. Click Create Bucket

  3. Provide a name for the S3 bucket

  4. Select a Region

  5. Click Create Bucket

...

Create an IAM User

Create an IAM user with upload access to the bucket. Please reference the AWS documentation, Controlling access to a bucket with user policies, for instructions on performing this step.

Follow the document below for IAM user permissions:
Accenture MDR Quick Start Guide in Configuring IAM User and KMS Key Policies

Retrieve access key and access key ID

  1. Log in to the AWS console with the user created in the step above.

  2. In the AWS console, select your account name and, in the drop-down menu, select Security credentials

...

  1. In the Access keys panel, select Create access key. After creation, copy the keys from the Access key and Secret access key fields.

...

Configure destination as AWS S3 in AppOmni console

  1. Log into your AppOmni console

  2. Navigate to Threat detection > Destinations

  3. Click on Add Destination

...

  1. Search for Amazon Web Services and select it

...

  1. Click Continue

...

  1. Fill in the details below; everything else can keep its default value.

a. Name - Provide a name

b. Description (Optional) - Provide a description

c. AWS region name - Provide the AWS S3 bucket region

d. AWS access key - Provide the Secret Access Key created in the previous steps

e. AWS access key ID - Provide the Access Key ID created in the previous steps

f. AWS bucket name - Provide the S3 bucket name where you want to store the logs

g. Delivery Format - Select JSON Lines

  1. Click Save

...

  1. Once the destination has been created, check whether it is enabled. If it is not, enable it.

  2. Review the JSON files in the S3 bucket.

Create SQS and attach it to S3:

Refer to this document: Configuring AWS Simple Queue Service (SQS) with S3 Storage

Follow the page below for the IAM user policies required for SQS and S3:

IAM User and KMS Key Policies Required for AWS

If the SQS queue is encrypted, add the policy below in KMS. This policy is required so that S3 event notifications can reach the encrypted SQS queue:

Code Block
KMS Policy:
***********
{
  "Sid": "example-statement-ID",
  "Effect": "Allow",
  "Principal": {
    "Service": "s3.amazonaws.com"
  },
  "Action": [
    "kms:GenerateDataKey"
  ],
  "Resource": "*"
}

Integration Parameters

Parameters required from the customer for integration.

Integration via Webhook

Configure the webhook on the Google Chronicle instance and copy the endpoint URL and secret key.

Integration via AWS SQS

Property

Default Value

Description

S3 URI

Custom Value

The S3 URI to ingest.

REGION

Custom Value

The region where the S3 bucket resides; select the region of your S3 bucket. For a list of regions, see Amazon S3 regions.

QUEUE NAME

Custom Value

The SQS queue name.

ACCOUNT NUMBER

Custom Value

The account number for the SQS queue and S3 bucket.

QUEUE ACCESS KEY ID

Custom Value

The value obtained from the configuration above. This is the 20 character ID associated with your Amazon IAM account.

QUEUE SECRET ACCESS KEY

Custom Value

The value obtained from the configuration above. This is the 40 character access key associated with your Amazon IAM account.

SOURCE DELETION OPTION

Custom Value

Whether to delete source files after they have been transferred to Chronicle. This reduces storage costs. Valid values are:

  • SOURCE_DELETION_NEVER: Never delete files from the source.

  • SOURCE_DELETION_ON_SUCCESS: Delete files and empty directories from the source after successful ingestion.

  • SOURCE_DELETION_ON_SUCCESS_FILES_ONLY: Delete files from the source after successful ingestion.

S3 BUCKET ACCESS KEY ID

N/A

This is the 20 character ID associated with your Amazon IAM account. Only specify if using a different access key for the S3 bucket.

S3 BUCKET SECRET ACCESS KEY

N/A

This is the 40 character access key associated with your Amazon IAM account. Only specify if using a different access key for the S3 bucket.

ASSET NAMESPACE

N/A

To assign an asset namespace to all events that are ingested from a particular feed, set the "namespace" field within details. The namespace field is a string.