
About The Device

VPC Flow Logs records a sample of network flows sent from and received by VM instances. These logs can be used for network monitoring, forensics, real-time security analysis, and cost optimization.

You can view flow logs in Cloud Logging, and you can export logs to any destination that Cloud Logging export supports. 
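For context, below is an abridged, illustrative example of a single VPC flow log record as it appears in Cloud Logging. The project name and all field values are placeholders, and real records carry additional source/destination instance and VPC metadata.

  {
    "jsonPayload": {
      "connection": {
        "src_ip": "10.128.0.2",
        "src_port": 45632,
        "dest_ip": "203.0.113.7",
        "dest_port": 443,
        "protocol": 6
      },
      "bytes_sent": "2841",
      "packets_sent": "18",
      "reporter": "SRC",
      "start_time": "2023-01-01T00:00:00.000000000Z",
      "end_time": "2023-01-01T00:00:04.000000000Z"
    },
    "logName": "projects/my-project/logs/compute.googleapis.com%2Fvpc_flows"
  }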

Device Information

Vendor Name: Google
Product Name: Cloud VPC Flow
Type of Device: Cloud

Collection Method

Log Type: VPC Flow Logs
Ingestion Label: GCP_VPC_FLOW (Raw log telemetry)
Preferred Logging Protocol - Format: Prop Vendor API - JSON
Log Collection Method: C2C - Storage
Data Source: https://cloud.google.com/chronicle/docs/reference/feed-management-api#gc-storage

Device Configuration

Please follow the steps below to enable raw log telemetry.

Pre-Requisites:

  1. Create a Cloud Storage bucket. (See the command-line sketch after this list.)

  2. Obtain the Chronicle service account, which will be provided by the Adaptive MXDR team.
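As a minimal sketch, the bucket from pre-requisite 1 can also be created from the command line. The bucket name and location below are placeholders, not values supplied by the Adaptive MXDR team.

  # Hypothetical bucket name and location; choose your own globally unique name
  gsutil mb -l us-central1 gs://test-vpc-flow-bucket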

Configure VPC flow logs:

  1. Log in to your Google Cloud account using your credentials.

  2. On the Welcome page, click VPC Network.

  3. In VPC networks, click Default.

  4. In Subnets, select all subnets and click Flow Logs > Configure.

  5. Select an Aggregation Interval and enter a Sample Rate. (For example, enter 50%.)

  6. Click Save. After saving, VPC flow logs start flowing into Cloud Logging. (See the command-line sketch after this list.)

  7. Next, type Logging in the search bar at the top and press Enter. By default, this takes you to the Logs Explorer.

  8. The Logs Explorer shows logs from multiple sources. Filter the logs by choosing vpc_flows under Log name and click Apply. All VPC flow logs are then listed on the page.

  9. Click More Actions and select Create Sink.

  10. On the Logs Router screen, in the Create logs routing sink window, fill in the following details:

  • In Sink details, enter a Name and Description and click Next. (For example, test_gcp_vpc_flows and GCP Flows.)

  • In Sink destination, under Select sink service, select Cloud Storage bucket, and under Cloud Storage bucket, select the bucket you created as a pre-requisite.

  • In Choose logs to include in sink, a default filter is populated once you select a bucket; click Next.

  • (Optional) In Choose logs to filter out of sink, choose any logs that you do not want to sink.

  • Click Create Sink. All matching logs will be routed to and stored in the Cloud Storage bucket.
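For reference, steps 4-6 (enabling flow logs on a subnet) and steps 9-10 (creating the sink) can also be performed with the gcloud CLI. This is a minimal sketch: the subnet, region, bucket name, and sink name are illustrative, and the log filter assumes the standard vpc_flows log ID.

  # Enable flow logs on the default subnet (region, interval, and 50% sampling are examples)
  gcloud compute networks subnets update default \
      --region=us-central1 \
      --enable-flow-logs \
      --logging-aggregation-interval=interval-5-sec \
      --logging-flow-sampling=0.5

  # Route VPC flow logs to the pre-created bucket (names are placeholders)
  gcloud logging sinks create test_gcp_vpc_flows \
      storage.googleapis.com/test-vpc-flow-bucket \
      --log-filter='resource.type="gce_subnetwork" AND log_id("compute.googleapis.com/vpc_flows")'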

Get the gsutil URL of Storage Bucket:

In order to ingest VPC logs into Chronicle, you must copy the gsutil URL from the storage bucket's configuration and paste it into the Input Parameters of the Chronicle feed.
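The gsutil URL always has the form gs://BUCKET_NAME[/PREFIX]. A quick way to confirm the URL and verify that sink output is arriving is to list the bucket; the bucket name below is a placeholder.

  # List objects in the hypothetical bucket; the gs:// path is the URL to paste
  gsutil ls gs://test-vpc-flow-bucket/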

Configuring a feed in Chronicle Instance:

  1. From your Chronicle instance page, select Settings.

  2. Click Feeds, which lists the data feeds that you have configured as well as the default feeds that Google provides.

  3. Click ADD NEW.

  4. In ADD FEED > Set Properties, set SOURCE TYPE to Google Cloud Storage.

  5. Set the Log Type to GCP VPC Flow.

  6. Under Chronicle Service Account, click GET A SERVICE ACCOUNT. This creates a service account for you.

  7. Click Next.

  8. In Input Parameters, paste the gsutil URL that you copied from the storage bucket's configuration.

  9. Click Next.

  10. In Finalize, click Submit. (An API-based alternative is sketched after this list.)
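The same feed can also be created programmatically through the Feed Management API referenced in the Data Source link above. The sketch below is hedged: the endpoint and body follow that reference as we read it, $TOKEN stands for an OAuth access token derived from the Chronicle service-account credentials, and the bucket URI is a placeholder.

  # Create a Cloud Storage feed for GCP_VPC_FLOW (illustrative values throughout)
  curl -X POST https://backstory.googleapis.com/v1/feeds \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -d '{
        "details": {
          "logType": "GCP_VPC_FLOW",
          "feedSourceType": "GOOGLE_CLOUD_STORAGE",
          "gcsSettings": {
            "bucketUri": "gs://test-vpc-flow-bucket/",
            "bucketSourceType": "FOLDERS_RECURSIVE",
            "sourceDeletionOption": "SOURCE_DELETION_NEVER"
          }
        }
      }'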

Viewing VPC logs in Cloud Storage Bucket:

To view the VPC logs that are synchronized to the Cloud Storage bucket, you must first grant Chronicle access. You perform the following actions from the Cloud Storage section of the Google Cloud Console.

  • To grant read permission to a specific file, edit access on that file and grant the Chronicle service account email Reader access. This can only be done if you have not enabled uniform bucket-level access.

  • If you configure the feed to delete source files (see below for how to do this), you must add the Chronicle service account email as a principal on your bucket and grant it the IAM role of Storage Object Admin.

  • To grant read permission to multiple files, you must grant access at the bucket level. Specifically, you must add the Chronicle service account email as a principal on your storage bucket and grant it the IAM role of Storage Object Viewer.

To enable permission to multiple files in a single bucket at a time, follow these steps (a command-line equivalent follows this list):

  1. Click the bucket on which you would like to enable permissions.

  2. In Permissions, click ADD.

  3. In New Principals, enter the CHRONICLE SERVICE ACCOUNT provided by the Adaptive MXDR team.

  4. In Role, select Storage Object Viewer.

  5. Click Save. A gsutil URL is generated for the storage bucket on which you have enabled permissions.
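Equivalently, the bucket-level grant from the steps above can be applied with gsutil iam. The service-account address below is a placeholder for the one the Adaptive MXDR team provides; swap roles/storage.objectViewer for roles/storage.objectAdmin if the feed is configured to delete source files.

  # Grant the Chronicle service account read access at the bucket level
  gsutil iam ch \
      serviceAccount:chronicle-sa@example-project.iam.gserviceaccount.com:roles/storage.objectViewer \
      gs://test-vpc-flow-bucket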

Integration Parameters

Property: STORAGE BUCKET URI
Default Value: N/A
Description: The URI which corresponds to the Google Cloud Storage bucket. The format is the same format used by gsutil to specify a resource.

Property: URI IS A
Default Value: N/A
Description: The type of object indicated by bucketUri. Valid values are:

  • FILES: The URI points to a single file which will be ingested with each execution of the feed.

  • FOLDERS: The URI points to a directory. All files contained within the directory will be ingested with each execution of the feed.

  • FOLDERS_RECURSIVE: The URI points to a directory. All files and directories contained within the indicated directory will be ingested, including all files and directories within those directories, and so on.

Property: SOURCE DELETION OPTION
Default Value: N/A
Description: Whether to delete source files after they have been transferred to Google Security Operations. This reduces storage costs. Valid values are:

  • SOURCE_DELETION_NEVER: Never delete files from the source.

  • SOURCE_DELETION_ON_SUCCESS: Delete files and empty directories from the source after successful ingestion.

  • SOURCE_DELETION_ON_SUCCESS_FILES_ONLY: Delete files from the source after successful ingestion.
