Accenture MDR Quick Start Guide for Microsoft® Azure Linux - MSAzureStorage

This quick start guide will help Accenture MDR customers configure Microsoft® Azure Linux to allow log collection from the Log Collection Platform (LCP).

The document includes the following topics:

  • Supported Versions

  • Port Requirements

  • Configuring Microsoft Azure Linux

  • LCP Configuration Parameters

Supported Versions

A list of supported versions is available in the Accenture MDR Supported Products List document (Accenture_MDR_Supported_Products_List.xlsx), which can be found in the Accenture MDR Portal.

Port Requirements

Table 1-1: Port requirements for LCP communication.

Source | Destination                  | Port        | Description
LCP    | Azure Blob storage/Event Hub | 443 (HTTPS) | Default port

Configuring Microsoft Azure Linux

Configure Microsoft Azure Linux for log collection using one of the following setups:

Log Collection using Linux Diagnostic Extension

Prerequisites

  • An Azure subscription that you can sign in to.

  • A user with the Global Administrator role.

  • An Azure Storage Account to store the logs, or an Event Hub to stream the logs.

  • Azure Linux Diagnostic Extension 3.0 installed on the virtual machines.

Reference URLs

How to create a storage account?
Create a storage account - Azure Storage

How to configure an Event Hub?
Azure Quickstart - Create an event hub using the Azure portal - Azure Event Hubs

How to create a SAS URL?

  1. Create an Azure Event Hubs namespace called contosohub.

  2. Create an event hub in the namespace called syslogmsgs.

  3. Create a shared access policy on the event hub that enables the send claim. Name the policy writer. (A CLI sketch of these steps follows below.)
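If you prefer the Azure CLI, the three steps above can be performed with commands along these lines (a minimal sketch; the resource group and region are placeholders):

    # Create the namespace, the event hub, and a send-only policy named writer.
    az eventhubs namespace create --resource-group <resource-group> --name contosohub --location <region>
    az eventhubs eventhub create --resource-group <resource-group> --namespace-name contosohub --name syslogmsgs
    az eventhubs eventhub authorization-rule create --resource-group <resource-group> \
        --namespace-name contosohub --eventhub-name syslogmsgs --name writer --rights Send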
If your SAS is good until midnight UTC on January 1, 2018, the sasURL value might be like this example:

https://contosohub.servicebus.windows.net/syslogmsgs?sr=contosohub.servicebus.windows.net%2fsyslogmsgs&sig=xxxxxxxxxxxxxxxxxxxxxxxxx&se=1514764800&skn=writer

For more information about generating SAS tokens for Event Hubs, see Generate SAS token.
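A sasURL in the query-string form shown above can also be derived from the writer policy's shared access key using the standard Service Bus SAS recipe: HMAC-SHA256 over the URL-encoded resource URI plus an expiry timestamp. The following is a minimal sketch, assuming a bash shell with openssl and jq available; the key value is a placeholder and the output should be verified against the Generate SAS token reference:

    # Build a sasURL for the syslogmsgs event hub (sketch).
    NAMESPACE="contosohub"
    HUB="syslogmsgs"
    POLICY="writer"
    KEY="<shared-access-key-of-the-writer-policy>"   # placeholder
    EXPIRY=$(( $(date +%s) + 86400 ))                # expire one day from now
    URI="${NAMESPACE}.servicebus.windows.net/${HUB}"
    ENCODED_URI=$(printf '%s' "$URI" | jq -sRr @uri)
    # Sign "<encoded-uri>\n<expiry>" with the policy key, then URL-encode the signature.
    SIG=$(printf '%s\n%s' "$ENCODED_URI" "$EXPIRY" | openssl dgst -sha256 -hmac "$KEY" -binary | base64)
    ENCODED_SIG=$(printf '%s' "$SIG" | jq -sRr @uri)
    echo "https://${URI}?sr=${ENCODED_URI}&sig=${ENCODED_SIG}&se=${EXPIRY}&skn=${POLICY}"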

Configuration Steps:

  1. Log in to the Azure portal at https://portal.azure.com/.

  2. Open Cloud Shell.

  3. A new window opens at the bottom of the portal. Select Azure CLI as the scripting language. If this is your first time using Cloud Shell, you will be prompted to create a storage space for this user; create one.
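If your account has access to more than one subscription, you may first need to select the subscription that contains the virtual machines to be configured; for example:

    # Target the subscription that owns the VMs (placeholder ID).
    az account set --subscription "<subscription-id>"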

In Case of Storing Logs in a Blob Container

  • Open the attached protectedfile.json and publicfile.json files in a text editor, provide the appropriate details, and save the files (a sketch of their possible contents follows these steps).

Note: On Ubuntu and Debian, the /var/log/auth.log file is monitored, so specify this file path in publicfile.json.

The file path is the full path name of the log file to be watched and captured. It must name a single file; it can't name a directory or contain wildcard characters. The omsagent user account must have read access to the file path.

  • Upload the saved files to your Cloud Shell storage.

  • Run the following command:

    az vm extension set --publisher Microsoft.Azure.Diagnostics --name LinuxDiagnostic --version 3.0 --resource-group ________________ --vm-name ________________ --protected-settings protectedfile.json --settings publicfile.json
  • This command configures the extension to automatically push logs to Blob storage.
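For reference, the attached settings files follow the Linux Diagnostic Extension 3.0 schema. A rough, hedged sketch of what they might contain is shown below, trimmed to the file-log parts; the attached files may include additional required elements such as ladCfg, and all bracketed values, as well as the FilelogJsonBlob sink name, are placeholders. The /var/log/auth.log path is the Ubuntu/Debian case from the note above.

protectedfile.json (sketch):

    {
      "storageAccountName": "<your-storage-account-name>",
      "storageAccountSasToken": "<account-sas-token>",
      "sinksConfig": {
        "sink": [
          { "name": "FilelogJsonBlob", "type": "JsonBlob" }
        ]
      }
    }

publicfile.json (sketch):

    {
      "StorageAccount": "<your-storage-account-name>",
      "fileLogs": [
        { "file": "/var/log/auth.log", "sinks": "FilelogJsonBlob" }
      ]
    }

A JsonBlob sink named FilelogJsonBlob appears to correspond to the example blob container name filelogjsonblobver2v0 in Table 1-2.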

In Case of Streaming Logs to an Event Hub

  • Open the attached protectedevent.json and publicevent.json files in a text editor, provide the appropriate details, and save the files (a sketch of their possible contents follows these steps).

Note: On Ubuntu and Debian, the /var/log/auth.log file is monitored, so specify this file path in publicevent.json.

The file path is the full path name of the log file to be watched and captured. It must name a single file; it can't name a directory or contain wildcard characters. The omsagent user account must have read access to the file path.

  • Upload the saved files to your Cloud Shell storage.

  • Run the following command:

    az vm extension set --publisher Microsoft.Azure.Diagnostics --name LinuxDiagnostic --version 3.0 --resource-group ________________ --vm-name ________________ --protected-settings protectedevent.json --settings publicevent.json
  • This command configures the extension to automatically stream logs to the event hub.
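Similarly, a hedged sketch of the Event Hub variants of the settings files (again trimmed; the bracketed values and the SyslogEventHub sink name are placeholders, and the sasURL is the one created earlier; a storage account is still required by the extension):

protectedevent.json (sketch):

    {
      "storageAccountName": "<your-storage-account-name>",
      "storageAccountSasToken": "<account-sas-token>",
      "sinksConfig": {
        "sink": [
          {
            "name": "SyslogEventHub",
            "type": "EventHub",
            "sasURL": "https://contosohub.servicebus.windows.net/syslogmsgs?sr=...&sig=...&se=...&skn=writer"
          }
        ]
      }
    }

publicevent.json (sketch):

    {
      "StorageAccount": "<your-storage-account-name>",
      "fileLogs": [
        { "file": "/var/log/auth.log", "sinks": "SyslogEventHub" }
      ]
    }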

Log Collection using Azure Monitor Agent

Prerequisites

  • An Azure subscription that you can sign in to.

  • A user with the Global Administrator role.

  • Azure Storage Account to store the logs or an Event Hub to stream the logs. 

  • Azure Log Analytics Workspace

  • A data collection rule for log collection, created in the same region as your Log Analytics workspace.

Reference URLs

How to create a storage account?
Create a storage account - Azure Storage

How to configure an Event Hub?
Azure Quickstart - Create an event hub using the Azure portal - Azure Event Hubs

Create an Azure Log Analytics Workspace -

This step is optional. If you already have an existing Log Analytics Workspace, you can use it.
When you collect logs and data, the information is stored in a workspace. Once you create a workspace, configure data sources and solutions to store their data there.
Follow these steps to create a Log Analytics Workspace:

  1. Log in to the Azure portal at https://portal.azure.com/.

  2. In the Azure portal, enter Log Analytics in the search box. As you begin typing, the list filters based on your input. Select Log Analytics workspaces. This opens the Log Analytics workspaces home page, which lists your existing workspaces and offers an option to create a new one.

  3. Click Create.

  4. Select a Subscription from the dropdown.

  5. Use an existing Resource Group or create a new one.

  6. Provide a name for the new Log Analytics workspace. This name must be unique per resource group.

  7. Optionally, select a Pricing tier and create Tags as per your requirements.

  8. Click Review + Create to review the settings, then click Create to create the workspace.
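The workspace can also be created from the Azure CLI; a minimal sketch with placeholder names:

    # Create a Log Analytics workspace (all names are placeholders).
    az monitor log-analytics workspace create \
        --resource-group <resource-group> \
        --workspace-name <workspace-name> \
        --location <region>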

Create a Data collection rule -

A Data collection rule is an Azure resource that allows you to define the way data should be handled as it's ingested into the workspace. Create a data collection rule in the same region as your Log Analytics workspace.
Follow these steps to create a Data collection rule (a CLI sketch follows the steps):

  • Under Settings, select Data Collection Rules, and select Create to create a new data collection rule and associations.

  • On the Basics tab, enter a Rule name and specify a Subscription, Resource Group, Region, and Platform Type:

    • Region specifies where the Data Collection Rule will be created. The virtual machines and their associations can be in any subscription or resource group in the tenant.

    • Platform Type specifies the type of resources this rule can apply to. Select Platform Type as Linux.

  • On the Resources tab, click + Add resources. Use the filters to find the Linux virtual machines for which you want to collect logs, select them under the scope of the data collection rule, and then click Apply. The Azure portal installs Azure Monitor Agent on resources that don't already have it installed.

  • On the Collect and deliver tab, click Add data source to add a Data source and set a Destination.

a) Set the Data source type to Linux Syslog.

b) For the LOG_AUTH and LOG_AUTHPRIV syslog facilities, set the Minimum log level to LOG_INFO. For the rest of the facilities, set the Minimum log level to None.

c) Click Next: Destination.

d) Enter the following values:

i) Destination type - Azure Monitor Logs

ii) Subscription - Select the appropriate subscription

iii) Account or namespace - Select the appropriate Log Analytics workspace

iv) Click Add data source.

e) Click Next and then Review + create.
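For scripted deployments, the rule and its VM association can also be created with the Azure CLI's monitor-control-service extension; a hedged sketch, where the rule definition file and all resource IDs are placeholders:

    # One-time: add the CLI extension that provides the data-collection commands.
    az extension add --name monitor-control-service
    # Create the data collection rule from a JSON definition file.
    az monitor data-collection rule create \
        --resource-group <resource-group> --location <region> \
        --name <rule-name> --rule-file <rule-definition.json>
    # Associate the rule with a Linux VM.
    az monitor data-collection rule association create \
        --name <association-name> \
        --rule-id <data-collection-rule-resource-id> \
        --resource <vm-resource-id>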

Azure Monitor supports collection of messages sent by rsyslog or syslog-ng, where rsyslog is the default daemon. The default syslog daemon on version 5 of Red Hat Enterprise Linux, CentOS, and Oracle Linux (sysklog) isn't supported for syslog event collection. To collect syslog data from that version of these distributions, the rsyslog daemon should be installed and configured to replace sysklog. If your VM doesn't have the Azure Monitor Agent installed, the data collection rule deployment triggers the installation of the agent on the VM.
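On those older distributions, replacing sysklog with rsyslog might look like the following sketch (assumes a yum-based system with SysV init; package and service names can vary):

    # Install rsyslog and swap it in for the legacy sysklog daemon.
    sudo yum install rsyslog
    sudo service syslog stop
    sudo chkconfig syslog off
    sudo service rsyslog start
    sudo chkconfig rsyslog on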

Create Data Export Rule in Log Analytics workspace

Data export in a Log Analytics workspace lets you continuously export data for selected tables in your workspace. You can export to an Azure Storage account or Azure Event Hubs as the data arrives in an Azure Monitor pipeline.
Follow these steps to create it (a CLI sketch follows the steps):

  1. From the Azure Portal, navigate to the Log Analytics Workspace.

  2. On the Log Analytics workspace home page, select the Log Analytics workspace that you configured in the data collection rule.

  3. Click Data Export under the Settings section. Click New export rule at the top of the pane.

  4. On the Basics tab, enter a Rule name and keep the Enable Upon Creation setting checked; then click Next.

  5. On the Source tab, select the Table name as Syslog. You can also use the search bar to find the Syslog table in the list; then click Next.

  6. On the Destination tab, you can either store logs in a Storage Account or stream them to an Event Hub, as per your preference. Select either method under Destination type. MxDR supports log collection from both options.

i) Storage account

To store logs in a Storage Account, select Storage account as the Destination type. Choose an existing Subscription and a Storage account; you can either create a new storage account or use an existing one to store the log information.

Note: MxDR recommends a minimum of 1 day of log retention; the exact value can be set based on the organization's policies.

ii) Stream logs to an event hub

To stream logs to an Event Hub, select Event hub as the Destination type. Select the Subscription, event hub namespace, event hub name, and event hub policy name created during the Event Hub prerequisite.

iii) Click Next: Review + create.
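The export rule can also be created from the Azure CLI, as mentioned above; a hedged sketch, where the destination is the resource ID of either the storage account or the event hub (all names are placeholders):

    # Continuously export the Syslog table to a storage account or event hub.
    az monitor log-analytics workspace data-export create \
        --resource-group <resource-group> \
        --workspace-name <workspace-name> \
        --name <export-rule-name> \
        --tables Syslog \
        --destination <storage-account-or-event-hub-resource-id>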

LCP Configuration Parameters

Table 1-2: The Microsoft Azure Linux event collector (API – 3982) properties to be configured by MDR are shown in the table.

Property                 | Value if EventHub                                               | Value if Storage
Logging Source           | Select EventHub                                                 | Select Storage
eventHubConnectionString | Event hub connection string                                     | N/A (keep blank)
consumerGroupName        | Optional; used if the consumer group is other than the default | N/A (keep blank)
Account Key              | Access key to access the storage account                       | Access key to access the storage account
Blob Container           | Storage blob container name (for example, filelogjsonblobver2v0 or am-syslog) | Storage blob container name (for example, filelogjsonblobver2v0 or am-syslog)
Storage Account Name     | Azure storage account name                                     | Azure storage account name
Subscription             | Set Event hub name                                             | Subscription ID that the customer wants to be monitored

Note: In the case of the EventHub logging source, the storage Account Key/SAS Token, Blob Container, and Storage Account Name are still required because the marker for the event hub is stored in the storage account.

Note: In the case of the Storage logging source, a separate sensor configuration must be created in the Log Collection Platform for each blob container created under the storage account.

 

Legal Notice

Copyright © 2021 Accenture. All rights reserved.

Accenture, the Accenture Logo, and DeepSight Intelligence are trademarks or registered trademarks of Accenture in the U.S. and other countries. Other names may be trademarks of their respective owners.

The product described in this document is distributed under licenses restricting its use, copying, distribution, and decompilation/reverse engineering. No part of this document may be reproduced in any form by any means without prior written authorization of Accenture and its licensors, if any.

THE DOCUMENTATION IS PROVIDED "AS IS" AND ALL EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR NON-INFRINGEMENT, ARE DISCLAIMED, EXCEPT TO THE EXTENT THAT SUCH DISCLAIMERS ARE HELD TO BE LEGALLY INVALID. ACCENTURE SHALL NOT BE LIABLE FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES IN CONNECTION WITH THE FURNISHING, PERFORMANCE, OR USE OF THIS DOCUMENTATION. THE INFORMATION CONTAINED IN THIS DOCUMENTATION IS SUBJECT TO CHANGE WITHOUT NOTICE.

The Licensed Software and Documentation are deemed to be commercial computer software as defined in FAR 12.212 and subject to restricted rights as defined in FAR Section 52.227-19 "Commercial Computer Software - Restricted Rights" and DFARS 227.7202, et seq. "Commercial Computer Software and Commercial Computer Software Documentation," as applicable, and any successor regulations, whether delivered by Accenture as on premises or hosted services. Any use, modification, reproduction, release, performance, display or disclosure of the Licensed Software and Documentation by the U.S. Government shall be solely in accordance with the terms of this Agreement.