Netskope Help

Microsoft Azure Monitor Plugin for Log Shipper

This document explains how to configure the Microsoft Azure Monitor integration with the Cloud Log Shipper module of the Netskope Cloud Exchange platform.

Event Support: Yes

Alert Support: Yes

WebTx Support: No

Prerequisites
  • A Netskope Tenant (or multiple, for example, production and development/test instances).

  • A Netskope Cloud Exchange tenant with the Log Shipper module already configured.

  • Microsoft Azure application's Tenant ID, Client ID, and Client Secret.

  • A Microsoft Azure Log Analytics Workspace.

  • A Microsoft Azure Monitor Data Collection Endpoint.

  • A Microsoft Azure Monitor Data Collection Rule.

  • Connectivity to the following host: https://portal.azure.com/

Workflow
  1. Configure a Log Analytics Workspace.

  2. Configure an Application and get your Tenant ID, Application ID, and Client Secret.

  3. Configure a Data Collection Endpoint and get your DCE URI.

  4. Configure a Basic Table in Log Analytics Workspace and get your Data Collection Rule Immutable ID.

  5. Assign permissions to the DCR and DCE.

  6. Configure the Microsoft Azure Monitor plugin.

  7. Configure a Log Shipper Business Rule for Microsoft Azure Monitor.

  8. Configure Log Shipper SIEM mappings for Microsoft Azure Monitor.

  9. Validate the Microsoft Azure Monitor plugin.

Configure a Log Analytics Workspace

  1. Log in to the Azure portal and select Log Analytics Workspace.

  2. Click Create at the top.

  3. Select Subscription, and then select an existing Resource Group (or create a new one).

  4. Enter a name for your Log Analytics Workspace, select a region, and then select Next > Next > Create.

Configure an Application

  1. Log in to Azure with an account that has the Global Administrator role.

  2. Go to Azure App Registration > New Registration.

  3. In the registration form, enter a name for your application, and then click Register.

  4. Make a copy of the Tenant ID and Application (client) ID on the application page.

  5. Click Certificates & secrets, and then click New client secret to generate a client secret. Add a description and an expiry time, and then click Add.

  6. Copy the secret's Value, as it will only be displayed once.

Configure a Data Collection Endpoint

  1. Go to Azure Home and select Monitor from the Azure services.

  2. Select Data Collection Endpoints on the left panel, and then select Create.

  3. Enter a name for the Data Collection Endpoint, select a Subscription and Resource Group, select a region (make sure it is the same region as your Log Analytics Workspace), and click Review + create.

  4. From the Overview tab, copy the Logs Ingestion URI; this is your Data Collection Endpoint (DCE) URI.
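As a quick sanity check, a Logs Ingestion URI is an HTTPS endpoint on the *.ingest.monitor.azure.com domain. A minimal sketch (the endpoint value below is a made-up example, not a real DCE):

```python
import re

# Hypothetical example value; replace with the Logs Ingestion URI
# copied from your Data Collection Endpoint's Overview tab.
dce_uri = "https://my-dce-abcd.eastus-1.ingest.monitor.azure.com"

# Logs Ingestion URIs are HTTPS endpoints under ingest.monitor.azure.com.
pattern = r"^https://[\w-]+\.[\w-]+\.ingest\.monitor\.azure\.com$"
print(bool(re.match(pattern, dce_uri)))
```

If your copied value does not look like this, you likely copied a different field from the Overview tab.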

Configure a Basic Table in Log Analytics Workspace

  1. A custom Log Analytics table requires sample data to be uploaded during creation, so create a JSON file on your system with the following content:

    [
      {
        "RawData": {},
        "Application": "",
        "DataType": "",
        "SubType": "",
        "TimeGenerated": "2022-11-01 12:00:00.576165"
      }
    ]
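If you prefer to generate this file programmatically, a minimal sketch (the file name sample_data.json is arbitrary):

```python
import json

# One sample record matching the schema the custom table expects;
# TimeGenerated is a timestamp string, the other fields are placeholders.
sample = [
    {
        "RawData": {},
        "Application": "",
        "DataType": "",
        "SubType": "",
        "TimeGenerated": "2022-11-01 12:00:00.576165",
    }
]

# Write the sample file to upload on the Schema and Transformation tab.
with open("sample_data.json", "w") as f:
    json.dump(sample, f, indent=4)
```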
  2. On the Azure home tab, go to Log Analytics Workspace, select the workspace you created previously, and select Tables. Click Create and select New custom log (DCR-based).

  3. Enter a name for the table.

  4. For Data Collection Rule, click Create a new data collection rule, select a Subscription and Resource Group from the dropdown lists, select the region of your Log Analytics Workspace, and click Done.

  5. The new Data Collection Rule is now selected in the Data collection rule field. Click Next.

  6. On the Schema and Transformation tab, click Browse for Files and select the sample data JSON file you created previously.

  7. Click Next and then click Create.

  8. A custom log table is created with the suffix _CL. By default, the table plan is Analytics. To convert it to a Basic table, search for your table and click the three dots on the right.

  9. Select Manage Table, and in the table plan field, select Basic.

  10. Select the retention period as per your requirement and click Save.


    Note

    • The table plan is changed from Analytics to Basic because the Basic log data plan reduces the cost of ingesting and storing high-volume verbose logs used for debugging, troubleshooting, and auditing.

    • If the table plan is left as Analytics, logs will still be ingested into the table without any issue.

    • An Analytics table has a configurable retention period of 30 to 730 days. A Basic table has a fixed retention of eight days.

    • When you change an existing table's plan to Basic, Azure archives data that is more than eight days old but still within the table's original retention period.

  11. To get the Data Collection Rule Immutable ID, go to Home, select Monitor from the Azure services > Data Collection Rules, and then select the DCR you created along with the custom table.

  12. In the Overview tab, click JSON View from the top right corner, and copy the immutableId.

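The JSON view shows the DCR's full resource definition; the immutableId sits inside its properties object. A minimal sketch of pulling it out of a saved copy of that JSON (the document below is an abbreviated, fabricated stand-in, and a real immutableId is not all zeros):

```python
import json

# Abbreviated, fabricated stand-in for the JSON view of a Data
# Collection Rule; a real DCR document contains many more fields.
dcr_json = """
{
  "properties": {
    "immutableId": "dcr-00000000000000000000000000000000"
  }
}
"""

dcr = json.loads(dcr_json)
# This value goes into the plugin's DCR Immutable ID field later.
immutable_id = dcr["properties"]["immutableId"]
print(immutable_id)
```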
Assign Permissions to the DCR and DCE

  1. On the Azure Home page, go to Monitor > Data Collection Endpoints and select the endpoint you created previously.

  2. Select Access control (IAM) and click Add role assignment.

  3. From the list of roles, select Monitoring Metrics Publisher and click Next.

  4. For Assign access to, select User, group, or service principal.

  5. Click Select members, search for the application you created, and then select it.

  6. Click Review + assign.

  7. Repeat these same steps to assign permissions to the DCR (Data Collection Rule).

Configure the Microsoft Azure Monitor Plugin

  1. In Cloud Exchange, go to Settings > Plugins.

  2. Search for and select the Microsoft Azure Monitor box to open the plugin configuration page.

  3. Enter a Configuration Name and select a valid Mapping. (Default mappings are available for all plugins. To create a new mapping, go to Settings > Log Shipper > Mapping.)

  4. Transform the raw logs is enabled by default; it transforms the raw data based on the mapping file. Turn it off if you want to send raw data directly to Azure Monitor.

  5. Click Next.

  6. Enter the Tenant ID, App ID, and App Secret obtained while creating the Azure application in the previous steps.

  7. Enter the DCE URI (Logs Ingestion) and DCR Immutable ID obtained previously.

  8. Enter the Custom Log Table Name.

  9. Click Save.

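Behind the scenes, the Tenant ID, App ID, and App Secret are used for an OAuth 2.0 client-credentials grant against Microsoft Entra ID, scoped to Azure Monitor. A minimal sketch of the token request a client would build (all IDs below are placeholders, and no request is actually sent):

```python
from urllib.parse import urlencode

# Placeholder values; use the Tenant ID, Application (client) ID,
# and client secret recorded earlier.
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"
client_secret = "<app-secret>"

# Entra ID token endpoint for the tenant.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Client-credentials grant scoped to the Azure Monitor resource.
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "https://monitor.azure.com/.default",
})

print(token_url)
print(body)
```

A POST of this form-encoded body to the token URL returns the bearer token used to call the Logs Ingestion API.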
Configure a Log Shipper Business Rule

  1. Go to Log Shipper > Business Rules.

  2. Click Create New Rule.

  3. Enter a Rule Name and configure a filter. Enter a Folder Name, if any.

  4. Click Save.

Configure Log Shipper SIEM Mappings

  1. Go to Log Shipper > SIEM Mappings > Add SIEM Mapping.

  2. Select a Source Configuration, Destination Configuration, and Business Rule.

  3. Click Save.


Validate the Microsoft Azure Monitor Plugin

To validate the plugin workflow, you can check from Netskope Cloud Exchange and from the Log Analytics Workspace.

Note

Make sure you have added the SIEM Mapping before confirming your data ingestion.

To validate from Netskope Cloud Exchange, go to Logging and search the logs for messages that contain ingest.


To validate from the Log Analytics Workspace:

  1. In the Azure portal, go to the Log Analytics Workspace service, select the Log Analytics Workspace you created, and select Logs under the General category in the left panel.

  2. Enter the custom log table name in the query editor and click Run. You can select a time range at the top to filter the logs.

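The query can be as simple as the table name itself. As a sketch, assuming a hypothetical table named NetskopeEvents_CL, a time-bounded Kusto (KQL) query string can be assembled like this:

```python
# Hypothetical table name; use the custom log table (suffixed _CL)
# that you created earlier.
table = "NetskopeEvents_CL"
hours = 24

# KQL query restricting results to the last `hours` hours.
query = f"{table} | where TimeGenerated > ago({hours}h)"
print(query)
```

Paste the resulting query into the Logs query editor to see only recently ingested rows.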

If you receive error code 403 with the log error message "Ensure that you have the correct permissions for your application to the DCR.", check that you assigned permissions to the correct Data Collection Endpoint and Data Collection Rule. Newly assigned permissions can take up to 30 minutes to take effect.
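When troubleshooting, it can help to know the endpoint the plugin posts to. The Azure Monitor Logs Ingestion API URL is built from the DCE URI, the DCR immutable ID, and a stream name of the form Custom-<table name>. A sketch with placeholder values (the DCE, DCR ID, and table name below are made up; api-version 2023-01-01 is the GA Logs Ingestion API version):

```python
# Placeholder values gathered in the earlier steps.
dce_uri = "https://my-dce-abcd.eastus-1.ingest.monitor.azure.com"
dcr_immutable_id = "dcr-00000000000000000000000000000000"
table_name = "NetskopeEvents_CL"

# The Logs Ingestion API accepts a POST of a JSON array of records to
# this URL, with an OAuth bearer token in the Authorization header.
ingestion_url = (
    f"{dce_uri}/dataCollectionRules/{dcr_immutable_id}"
    f"/streams/Custom-{table_name}?api-version=2023-01-01"
)
print(ingestion_url)
```

A 403 on this URL with correct credentials usually means the Monitoring Metrics Publisher role assignment is missing on the DCR or DCE, or has not propagated yet.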