...
Log Type | Ingestion label | Preferred Logging Protocol - Format | Log Collection Method | Data Source |
---|---|---|---|---|
Symantec Event export | SYMANTEC_EVENT_EXPORT | JSON | C2C, C2C - Storage | https://cloud.google.com/chronicle/docs/reference/feed-management-api#symantec-event-export |
Device Configuration
Sign in to the SEP 15/14.2 console.
Select Integration.
Click Client Application and copy the Customer ID and Domain ID, which are used when you create a Chronicle feed.
Click + Add, provide an application name, and then click Add.
Navigate to the Details page and perform the following actions:
In the Devices Group Management section, select View.
In the Alerts & Events Rule Management section, select View.
In the Investigation Incident section, select View.
Click Save.
Click the menu (vertical ellipsis) located at the end of the application name and click Client Secret.
...
Copy the CLIENT ID, CLIENT SECRET & OAUTH CREDENTIALS (OAUTH REFRESH TOKEN), which are required when you configure the Chronicle feed.
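Chronicle performs the OAuth exchange itself when the feed runs, but it can help to see what these three values are for. The sketch below only builds the form body of a standard OAuth 2.0 refresh_token grant (RFC 6749, section 6); the token endpoint that accepts it is deployment-specific and therefore not shown, and all credential values are placeholders.

```python
import urllib.parse

def build_refresh_grant(client_id: str, client_secret: str, refresh_token: str) -> str:
    """Form-encoded body of a standard OAuth 2.0 refresh_token grant.

    The three inputs correspond to the CLIENT ID, CLIENT SECRET, and
    OAUTH REFRESH TOKEN copied from the SEP console.
    """
    return urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
    })

# Placeholder values, for illustration only.
print(build_refresh_grant("my-client-id", "my-client-secret", "my-refresh-token"))
```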
...
Events can be streamed to cloud storage data buckets. You can add or edit a Data Bucket stream type to stream and export events into the cloud storage buckets.
Log in to the cloud console.
Navigate to Integration > Event Stream.
Click Add to add a new event stream. Otherwise, select an existing event stream in the grid and edit the fields in the event stream details flyout.
In the Add Event Stream flyout, select Data Bucket as the Stream Type.
Type a Name for the event stream that you are configuring for the cloud storage.
In Data Bucket, configure the following options:
Field | Description |
---|---|
Bucket | Type the bucket name that you have already created for your cloud storage. |
Provider | Select a cloud platform from the options provided. |
Region | Type the region of the data center where your cloud storage account is created. |
Bucket State | Enable the event stream to initiate streaming of events into your selected cloud storage bucket. |
Directory | Type the directory location of your cloud storage bucket where the event files are stored. |
Log Rotation | Set the Time and Size as per your organization's retention policy. |
Type the KEY and SECRET, the unique identifiers of your storage account on the cloud platform.
These identifiers are used to authorize Symantec Endpoint Security to stream events into the cloud storage.
The unique identifier is known by different names across the cloud platforms listed in the PROVIDER drop-down menu.
Refer to this KB for details on configuring the cloud platforms.
Uncheck the COMPRESSION checkbox.
In the Query Filter section, search for and select event type_ids from the EVENT TYPE ID list to include the corresponding events in the event stream.
If the event type_ids that you select are already queried by other streams, enable the event stream that you are creating so that those events continue to stream with no data loss.
Keep OCSF schema unchecked.
Click Test Connection to verify that the connection with the cloud storage account is established.
Click Create.
Streaming into data buckets is not supported for tenants of the India data center.
If you used AWS S3 as the storage provider, follow the steps below to attach SQS to S3:
Follow all the steps provided above to store logs in the S3 bucket.
Create an SQS queue and attach it to S3. Refer to Configuring AWS Simple Queue Service (SQS) with S3 Storage.
Refer to the following page for the required IAM user policies:
IAM User and KMS Key Policies Required for AWS
For details on how to get the credentials required for the integration parameters, refer to:
Get Credentials for AWS Storage
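Once SQS is attached to the bucket, each object the event stream writes produces an S3 event notification on the queue; a collector reads the message, fetches the object, and then deletes the message. The sketch below covers only the parsing step, assuming the standard S3 event notification JSON shape; bucket and key names are illustrative.

```python
import json
from urllib.parse import unquote_plus

def extract_s3_objects(message_body: str):
    """Extract (bucket, key) pairs from the body of an SQS message
    carrying an S3 event notification.

    S3 URL-encodes object keys in notifications, so keys are decoded
    before being returned.
    """
    records = json.loads(message_body).get("Records", [])
    return [
        (r["s3"]["bucket"]["name"], unquote_plus(r["s3"]["object"]["key"]))
        for r in records
        if r.get("eventSource") == "aws:s3"
    ]

# Abbreviated example of the notification shape (illustrative names).
body = json.dumps({
    "Records": [{
        "eventSource": "aws:s3",
        "eventName": "ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "sep-events"},
            "object": {"key": "sep/2024/06/events-00001.jsonl"},
        },
    }]
})
print(extract_s3_objects(body))  # [('sep-events', 'sep/2024/06/events-00001.jsonl')]
```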
Integration Parameters
Amazon SQS:
Parameter Display Name | Default Value | Description |
---|---|---|
REGION | N/A | Select the region of your S3 bucket. |
QUEUE NAME | N/A | The SQS queue name. |
ACCOUNT NUMBER | N/A | The account number for the SQS queue and S3 bucket. |
QUEUE ACCESS KEY ID | N/A | The 20-character ID associated with your Amazon IAM account. |
QUEUE SECRET ACCESS KEY | N/A | The 40-character access key associated with your Amazon IAM account. |
SOURCE DELETION OPTION | N/A | Whether to delete source files after they have been transferred to Chronicle. This reduces storage costs. |
S3 BUCKET ACCESS KEY ID | N/A | The 20-character ID associated with your Amazon IAM account. Specify only if using a different access key for the S3 bucket. |
S3 BUCKET SECRET ACCESS KEY | N/A | The 40-character access key associated with your Amazon IAM account. Specify only if using a different access key for the S3 bucket. |
ASSET NAMESPACE | N/A | To assign an asset namespace to all events that are ingested from a particular feed, set the |
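The IAM user whose keys go into the table above needs permission to consume the queue and read the bucket. The fragment below is only an illustrative minimal policy, not the authoritative one from the linked KB; the region, account number, queue, and bucket names are placeholders, and `s3:DeleteObject` is needed only if SOURCE DELETION OPTION deletes transferred files.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
      "Resource": "arn:aws:sqs:us-east-1:123456789012:sep-events-queue"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::sep-events",
        "arn:aws:s3:::sep-events/*"
      ]
    }
  ]
}
```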
Microsoft Azure Blob Storage:
Parameter Display Name | Default Value | Description |
---|---|---|
AZURE URI | N/A | The URI pointing to an Azure Blob Storage blob or container. Container names are |
URI IS A | Directory which includes subdirectories | The type of object indicated by the URI. |
SOURCE DELETION OPTION | Never delete files | Source file deletion is not supported in Azure. This field's value must be set to Never delete files. |
Shared Key OR SAS Token | | A shared key, a 512-bit random string in base64 encoding, authorized to access Azure Blob Storage. Required if not specifying an SAS token. |
ASSET NAMESPACE | | To assign an asset namespace to all events that are ingested from a particular feed, set the |
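An Azure Blob Storage URI typically follows the shape `https://<account>.blob.core.windows.net/<container>/<path>`. The sketch below shows how a collector might split such a URI into its parts; the account, container, and path names are placeholders.

```python
from urllib.parse import urlparse

def parse_azure_blob_uri(uri: str):
    """Split an Azure Blob Storage URI of the form
    https://<account>.blob.core.windows.net/<container>[/<path>]
    into (account, container, path).
    """
    parsed = urlparse(uri)
    if not parsed.netloc.endswith(".blob.core.windows.net"):
        raise ValueError(f"not an Azure Blob Storage URI: {uri}")
    account = parsed.netloc.split(".")[0]
    # Everything after the container name is the blob/directory path.
    container, _, path = parsed.path.lstrip("/").partition("/")
    return account, container, path

print(parse_azure_blob_uri("https://myacct.blob.core.windows.net/sep-events/2024/06"))
# ('myacct', 'sep-events', '2024/06')
```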
Google Cloud Storage:
Parameter Display Name | Default Value | Description |
---|---|---|
STORAGE BUCKET URI | N/A | The URI which corresponds to the Google Cloud Storage bucket. The format is the same format used by |
URI IS A | N/A | The type of object indicated by the URI. |
SOURCE DELETION OPTION | N/A | Whether to delete source files after they have been transferred to Google Security Operations. This reduces storage costs. |
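Cloud Storage bucket URIs are typically written in the `gs://bucket/prefix` form. A minimal sketch of splitting one into bucket and prefix; the names below are placeholders.

```python
def parse_gs_uri(uri: str):
    """Split a gs://bucket/prefix URI into (bucket, prefix)."""
    if not uri.startswith("gs://"):
        raise ValueError(f"expected a gs:// URI, got: {uri}")
    # The first path segment is the bucket; the rest is the object prefix.
    bucket, _, prefix = uri[len("gs://"):].partition("/")
    return bucket, prefix

print(parse_gs_uri("gs://sep-events/exports/2024/"))
# ('sep-events', 'exports/2024/')
```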