...
Log Type | Ingestion label | Preferred Logging Protocol - Format | Log Collection Method | Data Source |
---|---|---|---|---|
Symantec Event export | SYMANTEC_EVENT_EXPORT | JSON | C2C | https://cloud.google.com/chronicle/docs/reference/feed-management-api#symantec-event-export |
...
Device Configuration
Events can be streamed to cloud storage data buckets. You can add or edit a Data Bucket stream type to stream and export events into a cloud storage bucket; a sketch for pre-creating the destination bucket follows this procedure.
1. Log in to the cloud console.
2. Navigate to Integration > Event Stream.
3. Click Add to add a new event stream, or select an existing event stream in the grid and edit the fields in the event stream details flyout.
4. In Add Event Stream, select Data Bucket as the Stream Type.
5. Type a Name for the event stream that you are configuring for the cloud storage.
6. In Data Bucket, configure the following options:
...
Streaming into data buckets is not supported for tenants of the India data center.
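If AWS S3 will be the storage provider, the destination bucket should already exist before you select it in the Data Bucket options. Below is a minimal sketch, assuming boto3 and placeholder bucket and region values (not taken from this integration), of pre-creating that bucket:

```python
import boto3

REGION = "us-east-2"                     # placeholder: the region your bucket will live in
BUCKET_NAME = "example-symantec-events"  # placeholder: destination bucket for the event stream

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket that the Data Bucket event stream will write into.
# Note: for us-east-1, omit CreateBucketConfiguration entirely.
s3.create_bucket(
    Bucket=BUCKET_NAME,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)
```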
If you used AWS S3 as the storage provider, follow the steps below to attach an SQS queue to the S3 bucket (a sketch illustrating this setup follows these references):
1. Follow all the steps provided above to store logs in the S3 bucket.
2. Create an SQS queue and attach it to the S3 bucket. Refer to Configuring AWS Simple Queue Service (SQS) with S3 Storage.
Refer to the following page for the required IAM user policies:
IAM User and KMS Key Policies Required for AWS
For details on how to obtain the credentials required for the integration parameters, refer to:
Get Credentials for AWS Storage
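The referenced pages cover the exact SQS and IAM setup. As a rough illustration of what attaching SQS to S3 involves, the sketch below creates a queue, grants the S3 bucket permission to publish object-creation notifications to it, and attaches the queue to the bucket. All names and the account number are placeholders, and the policy shown is only an example, not a substitute for the IAM User and KMS Key Policies page above.

```python
import json
import boto3

REGION = "us-east-2"                     # placeholder region
ACCOUNT_NUMBER = "123456789012"          # placeholder AWS account ID
BUCKET_NAME = "example-symantec-events"  # placeholder S3 bucket already receiving the event stream
QUEUE_NAME = "example-symantec-queue"    # placeholder SQS queue name

sqs = boto3.client("sqs", region_name=REGION)
s3 = boto3.client("s3", region_name=REGION)

# 1. Create the SQS queue that the integration will poll.
queue_url = sqs.create_queue(QueueName=QUEUE_NAME)["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# 2. Allow the S3 bucket to send event notifications to the queue.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "s3.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": queue_arn,
        "Condition": {
            "ArnEquals": {"aws:SourceArn": f"arn:aws:s3:::{BUCKET_NAME}"},
            "StringEquals": {"aws:SourceAccount": ACCOUNT_NUMBER},
        },
    }],
}
sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

# 3. Attach the queue to the bucket so new objects generate SQS messages.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET_NAME,
    NotificationConfiguration={
        "QueueConfigurations": [{
            "QueueArn": queue_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```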
Integration Parameters
Amazon SQS:
Parameter Display Name | Default Value | Description |
---|---|---|
REGION | N/A | Select the region of your S3 bucket. |
QUEUE NAME | N/A | The SQS queue name. |
ACCOUNT NUMBER | N/A | The account number for the SQS queue and S3 bucket. |
QUEUE ACCESS KEY ID | N/A | The 20-character ID associated with your Amazon IAM account. |
QUEUE SECRET ACCESS KEY | N/A | The 40-character access key associated with your Amazon IAM account. |
SOURCE DELETION OPTION | N/A | Whether to delete source files after they have been transferred to Chronicle. This reduces storage costs. Valid values are: |
S3 BUCKET ACCESS KEY ID | N/A | The 20-character ID associated with your Amazon IAM account. Specify only if using a different access key for the S3 bucket. |
S3 BUCKET SECRET ACCESS KEY | N/A | The 40-character access key associated with your Amazon IAM account. Specify only if using a different access key for the S3 bucket. |
ASSET NAMESPACE | N/A | To assign an asset namespace to all events that are ingested from a particular feed, set the |
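As a quick sanity check before saving the feed, you can confirm that the values you plan to enter for REGION, QUEUE NAME, ACCOUNT NUMBER, and the queue access keys resolve to a reachable queue. A minimal sketch with boto3 and placeholder values:

```python
import boto3

REGION = "us-east-2"                   # value for REGION
QUEUE_NAME = "example-symantec-queue"  # value for QUEUE NAME (placeholder)
ACCOUNT_NUMBER = "123456789012"        # value for ACCOUNT NUMBER (placeholder)
QUEUE_ACCESS_KEY_ID = "AKIA..."        # value for QUEUE ACCESS KEY ID (20 characters)
QUEUE_SECRET_ACCESS_KEY = "..."        # value for QUEUE SECRET ACCESS KEY (40 characters)

sqs = boto3.client(
    "sqs",
    region_name=REGION,
    aws_access_key_id=QUEUE_ACCESS_KEY_ID,
    aws_secret_access_key=QUEUE_SECRET_ACCESS_KEY,
)

# Resolves the queue URL using the same queue name and account number the feed will use.
queue_url = sqs.get_queue_url(
    QueueName=QUEUE_NAME,
    QueueOwnerAWSAccountId=ACCOUNT_NUMBER,
)["QueueUrl"]
print("Queue reachable:", queue_url)
```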
Microsoft Azure Blob Storage:
Parameter Display Name | Default Value | Description |
---|---|---|
AZURE URI | N/A | The URI pointing to an Azure Blob Storage blob or container. Container names are |
URI IS A | Directory which includes subdirectories | The type of object indicated by the URI. Valid values are: |
SOURCE DELETION OPTION | Never delete files | Source file deletion is not supported in Azure. This field's value must be set to Never delete files. |
Shared Key OR SAS Token | N/A | A shared key, a 512-bit random string in base64 encoding, authorized to access Azure Blob Storage. Required if not specifying a SAS token. |
ASSET NAMESPACE | N/A | To assign an asset namespace to all events that are ingested from a particular feed, set the |
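Likewise, a minimal sketch (azure-storage-blob SDK, with placeholder account, container, and credential values) to confirm that the AZURE URI and the shared key or SAS token you plan to enter can list blobs in the container:

```python
from azure.storage.blob import ContainerClient

# Placeholder values mirroring the feed parameters above.
AZURE_URI = "https://exampleaccount.blob.core.windows.net/example-container"  # AZURE URI
SHARED_KEY_OR_SAS = "<shared key or SAS token>"                               # Shared Key OR SAS Token

# Both an account shared key and a SAS token can be passed as the credential string.
container = ContainerClient.from_container_url(AZURE_URI, credential=SHARED_KEY_OR_SAS)

# Listing a single blob is enough to prove the URI and credential work.
for blob in container.list_blobs():
    print("Readable:", blob.name)
    break
```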
Google Cloud Storage:
Parameter Display Name | Default Value | Description |
---|---|---|
STORAGE BUCKET URI | N/A | The URI which corresponds to the Google Cloud Storage bucket. The format is the same format used by |
URI IS A | N/A | The type of object indicated by the URI. |
SOURCE DELETION OPTION | N/A | Whether to delete source files after they have been transferred to Google Security Operations. This reduces storage costs. Valid values are: |
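And a similar sketch (google-cloud-storage client with Application Default Credentials, placeholder URI) to confirm that the STORAGE BUCKET URI points at a readable bucket before configuring the feed:

```python
from google.cloud import storage

STORAGE_BUCKET_URI = "gs://example-symantec-events/exports/"  # placeholder STORAGE BUCKET URI

# Split the gs:// URI into bucket name and object prefix.
bucket_name, _, prefix = STORAGE_BUCKET_URI.removeprefix("gs://").partition("/")

# Uses Application Default Credentials locally; the feed itself authenticates separately.
client = storage.Client()
for blob in client.list_blobs(bucket_name, prefix=prefix, max_results=5):
    print("Readable:", blob.name)
```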