Tutorial / Cram Notes
Azure Log Analytics is a powerful monitoring and analytics tool within the Azure platform, offering the ability to collect, analyze, and act on telemetry data from cloud and on-premises environments. Custom logs in Azure Log Analytics provide a flexible approach to capturing custom data that is not automatically collected by Azure Monitor agents. This capability is valuable for security operations analysts who might need to log custom events, performance data, or telemetry from various applications and services for analysis and compliance purposes.
Understanding Custom Logs
Custom logs allow an organization to define a new data type in Azure Log Analytics, to which data from text files can be uploaded. For instance, you may have logs generated by a custom application or any non-standard logs you’d like to analyze alongside your standard Azure telemetry. These logs can then be used in queries, alert rules, and other Log Analytics features.
Creating Custom Logs
To create custom logs in Azure Log Analytics, follow these key steps:
- Define a Custom Log:
- Access your Log Analytics workspace in the Azure Portal.
- In the Workspace Data Sources section, select ‘Custom Logs’.
- Click on the ‘Add+’ button to start the custom log wizard, which will guide you through the process of defining your new log.
- Upload Samples:
- During the wizard, you will be prompted to upload a sample of the log file you wish to collect. This file is used to extract the structure of your log data.
- Azure will parse the sample file to recommend the best timestamp format and record delimiter.
- Define Record Structure:
- Specify a regular expression to split the data into individual records if the auto-detection isn’t suitable.
- Define the timestamp and its format to help Azure understand how to process the timing of the events.
- Finalize and Create:
- Name your custom log (the suffix ‘_CL’ is appended automatically to denote a custom log).
- Review the summary and create your custom log.
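The record-splitting and timestamp parsing the wizard performs can be sketched in Python. The sample content, the delimiter regex, and the timestamp format below are hypothetical stand-ins for whatever your own log file uses:

```python
import re
from datetime import datetime

# Hypothetical sample log content, mirroring what you might upload to the wizard.
sample = (
    "2024-03-01 10:15:02 INFO  Service started\n"
    "2024-03-01 10:15:07 ERROR Connection refused\n"
    "2024-03-01 10:16:11 WARN  Retrying request\n"
)

# A record delimiter like the one the wizard asks for: each record
# begins with a timestamp at the start of a line.
record_start = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

records = []
for line in sample.splitlines():
    match = record_start.match(line)
    if match:
        # Parse the timestamp using the format declared in the wizard.
        ts = datetime.strptime(match.group(0), "%Y-%m-%d %H:%M:%S")
        records.append((ts, line[match.end():].strip()))

for ts, body in records:
    print(ts.isoformat(), "|", body)
```

In the real wizard, these choices (record delimiter and timestamp format) are made interactively rather than in code, but the underlying parsing logic is the same.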
Monitoring Agent Configuration
Once the custom log is defined, configure the Log Analytics agent to collect it:
- Determine which machines will send the custom log data.
- Install and configure the Log Analytics agent on those machines if not already present.
- Define the path where the agent can find the logs to be ingested.
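For example, the collection paths configured for the agent might look like the following (these paths are hypothetical; use the actual locations where your application writes its logs):

```
# Windows
C:\MyAppLogs\*.log

# Linux
/var/log/myapp/*.log
```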
Writing Queries for Custom Logs
Custom logs can be queried just like any other data type in Log Analytics. Here’s an example of how you might write a Kusto Query Language (KQL) query for your custom log data:
CustomLog_CL
| where TimeGenerated > ago(1d)
| where LogLevel == "Error"
| summarize count() by bin(TimeGenerated, 1h), Component
| render timechart
The above query counts the number of error-level logs from the CustomLog_CL custom log over the past day, grouped by hour and component, and displays the results in a timechart.
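The filter-then-summarize logic of that query can be sketched in plain Python; the records, column names, and timestamps below are hypothetical examples, not data from a real workspace:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical records as they might appear in CustomLog_CL:
# (TimeGenerated, LogLevel, Component)
now = datetime(2024, 3, 1, 12, 0, 0)
rows = [
    (now - timedelta(minutes=5),  "Error", "Auth"),
    (now - timedelta(minutes=50), "Error", "Auth"),
    (now - timedelta(hours=2),    "Error", "Billing"),
    (now - timedelta(hours=30),   "Error", "Auth"),   # older than 1 day: filtered out
    (now - timedelta(minutes=10), "Info",  "Auth"),   # not an error: filtered out
]

# where TimeGenerated > ago(1d) | where LogLevel == "Error"
cutoff = now - timedelta(days=1)
errors = [(ts, comp) for ts, level, comp in rows if ts > cutoff and level == "Error"]

# summarize count() by bin(TimeGenerated, 1h), Component
counts = Counter(
    (ts.replace(minute=0, second=0, microsecond=0), comp) for ts, comp in errors
)
```

KQL's `bin(TimeGenerated, 1h)` truncates each timestamp to the start of its hour, which is what the `replace(...)` call models here.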
Security Considerations
As with any data collection, ensure compliance with data governance and security policies:
- Limit access to custom logs to authorized personnel.
- Consider the types of data being collected and avoid logging sensitive information without proper encryption and access controls.
Use Cases
Custom logs can be particularly beneficial in several scenarios, such as:
- Monitoring Custom Applications: Collect logs from custom-built applications that aren’t already supported by Azure Monitor.
- Compliance and Auditing: Track specific actions or events that are critical for compliance reporting.
- Advanced Security Scenarios: Capture detailed log data from security systems for incident investigation and analysis.
By leveraging custom logs in Azure Log Analytics, security operations analysts have a robust tool to extend their monitoring and analytics capabilities, ensuring they can effectively track and respond to potential security incidents in their organization’s IT environment.
Practice Test with Explanation
True or False: Azure Log Analytics can only store logs generated by Azure services.
- A) True
- B) False
Answer: B) False
Explanation: Azure Log Analytics can store logs from Azure services as well as custom logs that include data from other sources.
To create a custom log in Azure Log Analytics, which of the following methods can be used?
- A) REST API
- B) Azure Portal
- C) PowerShell
- D) All of the above
Answer: D) All of the above
Explanation: Custom logs can be created using the Azure Portal, PowerShell, or the REST API.
Which file formats are supported for custom logs in Azure Log Analytics?
- A) JSON
- B) CSV
- C) XML
- D) A and B
Answer: D) A and B
Explanation: Azure Log Analytics supports creating custom logs using data in JSON and CSV formats.
True or False: Custom logs in Azure Log Analytics have instant ingestion and are available for querying immediately after upload.
- A) True
- B) False
Answer: B) False
Explanation: After a custom log is uploaded to Azure Log Analytics, it may take up to one hour for the data to be available for querying.
Which Azure service can be used to collect data from various sources and forward it to Log Analytics?
- A) Azure Functions
- B) Azure Event Hubs
- C) Azure Monitor
- D) Azure Log Analytics Agent
Answer: D) Azure Log Analytics Agent
Explanation: The Azure Log Analytics Agent collects telemetry from a variety of sources and forwards it to Azure Log Analytics.
True or False: PII data can be stored in Azure Log Analytics without any restrictions.
- A) True
- B) False
Answer: B) False
Explanation: Storing PII data in Azure Log Analytics must be done in compliance with data privacy laws and Azure policies. Users should take care to handle such data appropriately.
To create custom logs, what must first be defined in Azure Log Analytics Workspace?
- A) Log Storage Account
- B) Custom Fields
- C) Custom Log Name
- D) Data Source
Answer: C) Custom Log Name
Explanation: Before ingesting custom data, a Custom Log Name, which will act as the target table in Azure Log Analytics, must be defined.
When using the HTTP Data Collector API to send logs to Azure Log Analytics, what is required in the request header?
- A) API Key
- B) Workspace ID
- C) Log Type
- D) Time-Generated Field
Answer: A) API Key
Explanation: When using the HTTP Data Collector API to send logs to Azure Log Analytics, an API Key (along with the Workspace ID) is required for authentication in the request header.
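The Authorization header for the HTTP Data Collector API is built by signing a canonical string with the workspace's shared key (HMAC-SHA256, base64-encoded). The sketch below shows the general shape of that computation; the workspace ID, key, and date values are placeholders, not real credentials:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str, date_rfc1123: str,
                    content_length: int) -> str:
    """Build the SharedKey Authorization header value for the HTTP Data
    Collector API. All inputs here are hypothetical placeholders."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return f"SharedKey {workspace_id}:{signature}"

# Example with placeholder credentials:
header = build_signature(
    workspace_id="00000000-0000-0000-0000-000000000000",
    shared_key=base64.b64encode(b"not-a-real-key").decode("utf-8"),
    date_rfc1123="Mon, 04 Mar 2024 12:00:00 GMT",
    content_length=128,
)
print(header)
```

The resulting value is sent in the `Authorization` header alongside `x-ms-date` and the `Log-Type` header that names the target custom log.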
Which of the following Azure resources can be used to automate the collection and submission of custom logs to Azure Log Analytics?
- A) Azure Automation Account
- B) Azure Logic Apps
- C) Azure Stream Analytics
- D) All of the above
Answer: D) All of the above
Explanation: Azure Automation Accounts, Azure Logic Apps, and Azure Stream Analytics can all be used to automate the process of collecting and submitting custom logs to Log Analytics.
True or False: Once custom data is ingested into Azure Log Analytics, the schema of the custom log cannot be altered.
- A) True
- B) False
Answer: A) True
Explanation: After custom data has been ingested into Azure Log Analytics, the schema (custom log name, fields, etc.) is fixed and cannot be modified.
Interview Questions
What is Azure Log Analytics?
Azure Log Analytics is a service that allows you to collect and analyze log data from a variety of sources in Azure and other platforms.
What is a custom log in Azure Log Analytics?
A custom log is a log that you can create in Log Analytics to store data that is specific to your environment or application.
Why might you want to create a custom log in Azure Log Analytics?
You might want to create a custom log to store data that is not already collected by one of the built-in logs in Log Analytics, or to store data that is unique to your environment or application.
What are the steps to create a custom log in Azure Log Analytics?
To create a custom log, you need to define a data source, create a log definition, and then start sending data to the log. The exact steps will depend on the data source and the type of log you are creating.
What is a data source in Azure Log Analytics?
A data source is a source of log data that you want to collect and analyze in Log Analytics. Data sources can include virtual machines, containers, applications, and more.
How do you define a data source in Azure Log Analytics?
You can define a data source in Azure Log Analytics by creating a data collection rule that specifies the type of data you want to collect, the machines or applications you want to collect it from, and any filters or transformations you want to apply.
What is a log definition in Azure Log Analytics?
A log definition is a schema that defines the structure of a custom log in Azure Log Analytics. The log definition specifies the name and data type of each field in the log.
How do you create a log definition in Azure Log Analytics?
You can create a log definition in Azure Log Analytics by defining a JSON schema that describes the structure of the log. You can then use this schema to create the log in Log Analytics.
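A log definition of this kind might be sketched as the following JSON schema; the table name and columns are hypothetical examples:

```json
{
  "name": "MyAppEvents_CL",
  "columns": [
    { "name": "TimeGenerated", "type": "datetime" },
    { "name": "LogLevel", "type": "string" },
    { "name": "Component", "type": "string" },
    { "name": "Message", "type": "string" }
  ]
}
```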
What are the benefits of creating a custom log in Azure Log Analytics?
Some potential benefits of creating a custom log in Azure Log Analytics include having more control over the data you collect and store, being able to store data that is unique to your environment or application, and being able to query and analyze this data using Log Analytics.
How can you query and analyze data in a custom log in Azure Log Analytics?
You can query and analyze data in a custom log using Log Analytics queries, which allow you to filter, group, and aggregate data in a variety of ways. You can also use Log Analytics workspaces to create dashboards and reports based on the data in your custom log.