Updates from: 07/20/2022 01:12:16
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Javascript And Page Layout https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/javascript-and-page-layout.md
function addTermsOfUseLink() {
In the code, replace `termsOfUseUrl` with the link to your terms of use agreement. For your directory, create a new user attribute called **termsOfUse** and then include **termsOfUse** as a user attribute.
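The function body is truncated above. As a rough sketch (not the article's exact code — the label selector and page structure are assumptions), a function like `addTermsOfUseLink` might turn the **termsOfUse** checkbox label into a link:

```javascript
// Illustrative sketch only — the selector is an assumption about the rendered page.
function addTermsOfUseLink() {
  // Replace termsOfUseUrl with the link to your terms of use agreement.
  var termsOfUseUrl = "https://contoso.com/termsofuse";

  // Find the label the page renders for the termsOfUse claim checkbox.
  var termsLabel = document.querySelector('label[for="termsOfUse_true"]');
  if (!termsLabel) {
    return;
  }

  // Build a link and use it as the label content.
  var termsLink = document.createElement("a");
  termsLink.setAttribute("href", termsOfUseUrl);
  termsLink.setAttribute("target", "_blank");
  termsLink.appendChild(document.createTextNode("Terms of use"));

  termsLabel.textContent = "";
  termsLabel.appendChild(termsLink);
}
```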
+Alternatively, you can add a link at the bottom of self-asserted pages without using JavaScript. Use the following localization:
+
+```xml
+<LocalizedResources Id="api.localaccountsignup.en">
+ <LocalizedStrings>
+ <!-- The following elements will display a link at the bottom of the page. -->
+ <LocalizedString ElementType="UxElement" StringId="disclaimer_link_1_text">Terms of use</LocalizedString>
+ <LocalizedString ElementType="UxElement" StringId="disclaimer_link_1_url">termsOfUseUrl</LocalizedString>
+ </LocalizedStrings>
+</LocalizedResources>
+```
+
+Replace `termsOfUseUrl` with the link to your organization's privacy policy and terms of use.
++ ## Next steps Find more information about how to [Customize the user interface of your application in Azure Active Directory B2C](customize-ui-with-html.md).
active-directory-b2c Localization String Ids https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/localization-string-ids.md
Previously updated : 04/12/2022 Last updated : 04/19/2022
The following are the IDs for a content definition with an ID of `api.localaccou
| **ver_intro_msg** | Verification is necessary. Please click Send button. | | **ver_input** | Verification code |
+### Sign-up and self-asserted pages disclaimer links
+
+The following `UxElement` string IDs display disclaimer links at the bottom of the self-asserted page. The links appear only if they're specified in the localized strings.
+
+| ID | Example value |
+| -- | ------------- |
+| **disclaimer_msg_intro** | By providing your phone number, you consent to receiving a one-time passcode sent by text message to help you sign into {insert your application name}. Standard message and data rates may apply. |
+| **disclaimer_link_1_text** | Privacy Statement |
+| **disclaimer_link_1_url** | {insert your privacy statement URL} |
+| **disclaimer_link_2_text** | Terms and Conditions |
+| **disclaimer_link_2_url** | {insert your terms and conditions URL} |
+ ### Sign-up and self-asserted pages error messages | ID | Default value |
The following example shows the use of some of the user interface elements in th
<LocalizedString ElementType="UxElement" StringId="ver_input">Verification code</LocalizedString> <LocalizedString ElementType="UxElement" StringId="ver_intro_msg">Verification is necessary. Please click Send button.</LocalizedString> <LocalizedString ElementType="UxElement" StringId="ver_success_msg">E-mail address verified. You can now continue.</LocalizedString>
+ <!-- The following elements will display a message and two links at the bottom of the page.
+ For policies that you intend to show to users in the United States, we suggest displaying the following text. Replace the content of the disclaimer_link_X_url elements with links to your organization's privacy statement and terms and conditions.
+ Uncomment any of these lines to display them. -->
+ <!-- <LocalizedString ElementType="UxElement" StringId="disclaimer_msg_intro">By providing your phone number, you consent to receiving a one-time passcode sent by text message to help you sign into {insert your application name}. Standard message and data rates may apply.</LocalizedString> -->
+ <!-- <LocalizedString ElementType="UxElement" StringId="disclaimer_link_1_text">Privacy Statement</LocalizedString>
+ <LocalizedString ElementType="UxElement" StringId="disclaimer_link_1_url">{insert your privacy statement URL}</LocalizedString> -->
+ <!-- <LocalizedString ElementType="UxElement" StringId="disclaimer_link_2_text">Terms and Conditions</LocalizedString>
+ <LocalizedString ElementType="UxElement" StringId="disclaimer_link_2_url">{insert your terms and conditions URL}</LocalizedString> -->
<LocalizedString ElementType="ErrorMessage" StringId="ServiceThrottled">There are too many requests at this moment. Please wait for some time and try again.</LocalizedString> <LocalizedString ElementType="ErrorMessage" StringId="UserMessageIfClaimNotVerified">Claim not verified: {0}</LocalizedString> <LocalizedString ElementType="ErrorMessage" StringId="UserMessageIfClaimsPrincipalAlreadyExists">A user with the specified ID already exists. Please choose a different one.</LocalizedString>
active-directory-b2c Phone Factor Technical Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/phone-factor-technical-profile.md
The **CryptographicKeys** element is not used.
| ManualPhoneNumberEntryAllowed| No | Specify whether or not a user is allowed to manually enter a phone number. Possible values: `true`, or `false` (default).| | setting.authenticationMode | No | The method to validate the phone number. Possible values: `sms`, `phone`, or `mixed` (default).| | setting.autodial| No| Specify whether the technical profile should auto dial or auto send an SMS. Possible values: `true`, or `false` (default). Auto dial requires the `setting.authenticationMode` metadata be set to `sms`, or `phone`. The input claims collection must have a single phone number. |
+| setting.autosubmit | No | Specifies whether the technical profile should auto submit the one-time password entry form. Possible values are `true` (default), or `false`. When auto-submit is turned off, the user needs to select a button to progress the journey. |
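As an illustration (not taken from the article — the technical profile ID and values are placeholders), these settings are typically expressed as metadata items inside the phone factor technical profile of a custom policy:

```xml
<TechnicalProfile Id="PhoneFactor-InputOrVerify">
  <DisplayName>PhoneFactor</DisplayName>
  <Metadata>
    <!-- Validate the phone number by text message only. -->
    <Item Key="setting.authenticationMode">sms</Item>
    <!-- Automatically send the SMS when the input claims contain a single phone number. -->
    <Item Key="setting.autodial">true</Item>
    <!-- Require the user to select a button instead of auto-submitting the one-time password form. -->
    <Item Key="setting.autosubmit">false</Item>
  </Metadata>
</TechnicalProfile>
```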
### UI elements
active-directory Partner Driven Integrations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/partner-driven-integrations.md
If you have built a SCIM Gateway and would like to add it to this list, follow t
1. Review the Azure AD SCIM [documentation](use-scim-to-provision-users-and-groups.md) to understand the Azure AD SCIM implementation. 1. Test compatibility between the Azure AD SCIM client and your SCIM gateway. 1. Click the pencil at the top of this document to edit the article
-1. Once you're redirected to Github, click the pencil at the top of the article to start making changes
+1. Once you're redirected to GitHub, click the pencil at the top of the article to start making changes
1. Make changes in the article using the Markdown language and create a pull request. Make sure to provide a description for the pull request. 1. An admin of the repository will review and merge your changes so that others can view them.
active-directory Howto Authentication Passwordless Phone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-authentication-passwordless-phone.md
Previously updated : 07/15/2022 Last updated : 07/19/2022
Microsoft Authenticator can be used to sign in to any Azure AD account without u
This authentication technology can be used on any device platform, including mobile. This technology can also be used with any app or website that integrates with Microsoft Authentication Libraries. People who enabled phone sign-in from Microsoft Authenticator see a message that asks them to tap a number in their app. No username or password is asked for. To complete the sign-in process in the app, a user must next take the following actions:
People who enabled phone sign-in from Microsoft Authenticator see a message that
1. Choose **Approve**. 1. Provide their PIN or biometric.
+## Multiple accounts on iOS (preview)
+
+You can enable passwordless phone sign-in for multiple accounts in Microsoft Authenticator on any supported iOS device. Consultants, students, and others with multiple accounts in Azure AD can add each account to Microsoft Authenticator and use passwordless phone sign-in for all of them from the same iOS device.
+
+Previously, admins might not have required passwordless sign-in for users with multiple accounts because it would require them to carry more devices for sign-in. By removing the limitation of one user sign-in from a device, admins can more confidently encourage users to register passwordless phone sign-in and use it as their default sign-in method.
+
+The Azure AD accounts can be in the same tenant or different tenants. Guest accounts aren't supported for multiple account sign-in from one device.
+
+>[!NOTE]
+>Multiple accounts on iOS is currently in public preview. Some features might not be supported or have limited capabilities. For more information about previews, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+ ## Prerequisites To use passwordless phone sign-in with Microsoft Authenticator, the following prerequisites must be met: - Recommended: Azure AD Multi-Factor Authentication, with push notifications allowed as a verification method. Push notifications to your smartphone or tablet help the Authenticator app to prevent unauthorized access to accounts and stop fraudulent transactions. The Authenticator app automatically generates codes when set up to do push notifications so a user has a backup sign-in method even if their device doesn't have connectivity. - Latest version of Microsoft Authenticator installed on devices running iOS 12.0 or greater, or Android 6.0 or greater.-- The device that runs Microsoft Authenticator must be registered to an individual user. We're actively working to enable multiple accounts on Android.
+- For Android, the device that runs Microsoft Authenticator must be registered to an individual user. We're actively working to enable multiple accounts on Android.
+- For iOS, the device must be registered with each tenant where it's used to sign in. For example, the following device must be registered with Contoso and Wingtiptoys to allow all accounts to sign in:
+ - balas@contoso.com
+ - balas@wingtiptoys.com and bsandhu@wingtiptoys.com
+- For iOS, we recommend enabling the option in Microsoft Authenticator to allow Microsoft to gather usage data. It's not enabled by default. To enable it in Microsoft Authenticator, go to **Settings** > **Usage Data**.
+
+ :::image type="content" border="true" source="./media/howto-authentication-passwordless-phone/telemetry.png" alt-text="Screenshot of Usage Data in Microsoft Authenticator.":::
To use passwordless authentication in Azure AD, first enable the combined registration experience, then enable users for the passwordless method.
An end user can be enabled for multifactor authentication (MFA) through an on-pr
If the user attempts to upgrade multiple installations (5+) of Microsoft Authenticator with the passwordless phone sign-in credential, this change might result in an error.
-### Device registration
-
-Before you can create this new strong credential, there are prerequisites. One prerequisite is that the device on which Microsoft Authenticator is installed must be registered within the Azure AD tenant to an individual user.
-
-Currently, a device can only be enabled for passwordless sign-in in a single tenant. This limit means that only one work or school account in Microsoft Authenticator can be enabled for phone sign-in.
-
-> [!NOTE]
-> Device registration is not the same as device management or mobile device management (MDM). Device registration only associates a device ID and a user ID together, in the Azure AD directory.
## Next steps
To learn about Azure AD authentication and passwordless methods, see the followi
- [Learn how passwordless authentication works](concept-authentication-passwordless.md) - [Learn about device registration](../devices/overview.md)-- [Learn about Azure AD Multi-Factor Authentication](../authentication/howto-mfa-getstarted.md)
+- [Learn about Azure AD Multi-Factor Authentication](../authentication/howto-mfa-getstarted.md)
active-directory How To Create Group Based Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/cloud-infrastructure-entitlement-management/how-to-create-group-based-permissions.md
This article describes how you can create and manage group-based permissions in Permissions Management with the User management dashboard.
-[!NOTE] The Permissions Management Administrator for all authorization systems will be able to create the new group based permissions.
+> [!NOTE]
+> The Permissions Management Administrator for all authorization systems will be able to create the new group based permissions.
## Select administrative permissions settings for a group
active-directory Concept Conditional Access Cloud Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-cloud-apps.md
Previously updated : 04/19/2022 Last updated : 07/18/2022
The following key applications are included in the Office 365 client app:
- OneDrive - Power Apps - Power Automate-- Security & Compliance Center
+- Security & compliance portal
- SharePoint Online - Skype for Business Online - Skype and Teams Tenant Admin API
A complete list of all services included can be found in the article [Apps inclu
### Microsoft Azure Management
-The Microsoft Azure Management application includes multiple services.
-
- - Azure portal
- - Microsoft Entra admin center
- - Azure Resource Manager provider
- - Classic deployment model APIs
- - Azure PowerShell
- - Azure CLI
- - Azure DevOps
- - Azure Data Factory portal
- - Azure Event Hubs
- - Azure Service Bus
- - [Azure SQL Database](/azure/azure-sql/database/conditional-access-configure)
- - SQL Managed Instance
- - Azure Synapse
- - Visual Studio subscriptions administrator portal
+When a Conditional Access policy targets the Microsoft Azure Management application in the Conditional Access app picker, the policy is enforced for tokens issued to the application IDs of a set of services closely bound to the portal.
+
+- Azure Resource Manager
+- Azure portal, which also covers the Microsoft Entra admin center
+- Azure Data Lake
+- Application Insights API
+- Log Analytics API
+
+Because the policy is applied to the Azure management portal and API, services or clients with an Azure API service dependency can be indirectly impacted. For example:
+
+- Classic deployment model APIs
+- Azure PowerShell
+- Azure CLI
+- Azure DevOps
+- Azure Data Factory portal
+- Azure Event Hubs
+- Azure Service Bus
+- [Azure SQL Database](/azure/azure-sql/database/conditional-access-configure)
+- SQL Managed Instance
+- Azure Synapse
+- Visual Studio subscriptions administrator portal
> [!NOTE] > The Microsoft Azure Management application applies to [Azure PowerShell](/powershell/azure/what-is-azure-powershell), which calls the [Azure Resource Manager API](../../azure-resource-manager/management/overview.md). It does not apply to [Azure AD PowerShell](/powershell/azure/active-directory/overview), which calls the [Microsoft Graph API](/graph/overview). For more information on how to set up a sample policy for Microsoft Azure Management, see [Conditional Access: Require MFA for Azure management](howto-conditional-access-policy-azure-management.md).
->[!NOTE]
->For Azure Government, you should target the Azure Government Cloud Management API application.
+> [!TIP]
+> For Azure Government, you should target the Azure Government Cloud Management API application.
### Other applications
Administrators can add any Azure AD registered application to Conditional Access
> [!NOTE] > Since Conditional Access policy sets the requirements for accessing a service you are not able to apply it to a client (public/native) application. In other words, the policy is not set directly on a client (public/native) application, but is applied when a client calls a service. For example, a policy set on SharePoint service applies to the clients calling SharePoint. A policy set on Exchange applies to the attempt to access the email using Outlook client. That is why client (public/native) applications are not available for selection in the Cloud Apps picker and Conditional Access option is not available in the application settings for the client (public/native) application registered in your tenant.
-Some applications don't appear in the picker at all. The only way to include these applications in a Conditional Access policy is to include **All apps**.
+Some applications don't appear in the picker at all. The only way to include these applications in a Conditional Access policy is to include **All cloud apps**.
+
+### All cloud apps
+
+Applying a Conditional Access policy to **All cloud apps** will result in the policy being enforced for all tokens issued to web sites and services. This option includes applications that aren't individually targetable in Conditional Access policy, such as Azure Active Directory.
+
+In some cases, an **All cloud apps** policy could inadvertently block user access. These cases are excluded from policy enforcement and include:
+
+- Services required to achieve the desired security posture. For example, device enrollment calls are excluded from compliant device policy targeted to All cloud apps.
+
+- Calls to Azure AD Graph and MS Graph, to access user profile, group membership and relationship information that is commonly used by applications excluded from policy. The excluded scopes are listed below. Consent is still required for apps to use these permissions.
+ - For native clients:
+ - Azure AD Graph: User.read
+ - MS Graph: User.read, People.read, and UserProfile.read
+ - For confidential / authenticated clients:
+ - Azure AD Graph: User.read, User.read.all, and User.readbasic.all
+ - MS Graph: User.read, User.read.all, People.read, People.read.all, GroupMember.Read.All, Member.Read.Hidden, and UserProfile.read
## User actions
active-directory Concept Conditional Access Grant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-grant.md
Devices must be registered in Azure AD before they can be marked as compliant. M
> [!NOTE] > On Windows 7, iOS, Android, macOS, and some third-party web browsers Azure AD identifies the device using a client certificate that is provisioned when the device is registered with Azure AD. When a user first signs in through the browser the user is prompted to select the certificate. The end user must select this certificate before they can continue to use the browser.
+You can use the Microsoft Defender for Endpoint app along with the Approved Client app policy in Intune to set device compliance in Conditional Access policies. There's no exclusion required for the Microsoft Defender for Endpoint app while setting up Conditional Access. Although Microsoft Defender for Endpoint on Android & iOS (App ID - dd47d17a-3194-4d86-bfd5-c6ae6f5651e3) isn't an approved app, it has permission to report device security posture. This permission enables the flow of compliance information to Conditional Access.
+ ### Require hybrid Azure AD joined device Organizations can choose to use the device identity as part of their Conditional Access policy. Organizations can require that devices are hybrid Azure AD joined using this checkbox. For more information about device identities, see the article [What is a device identity?](../devices/overview.md).
active-directory Howto Conditional Access Policy Admin Mfa https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-admin-mfa.md
Accounts that are assigned administrative rights are targeted by attackers. Requiring multi-factor authentication (MFA) on those accounts is an easy way to reduce the risk of those accounts being compromised.
-Microsoft recommends you require MFA on the following roles at a minimum:
+Microsoft recommends you require MFA on the following roles at a minimum, based on [identity secure score recommendations](../fundamentals/identity-secure-score.md):
- Global administrator - Application administrator
active-directory Reference Aadsts Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/reference-aadsts-error-codes.md
The `error` field has several possible values - review the protocol documentatio
| AADSTS50178 | SessionControlNotSupportedForPassthroughUsers - Session control isn't supported for passthrough users. | | AADSTS50180 | WindowsIntegratedAuthMissing - Integrated Windows authentication is needed. Enable the tenant for Seamless SSO. | | AADSTS50187 | DeviceInformationNotProvided - The service failed to perform device authentication. |
-| AADSTS50194 | Application '{appId}'({appName}) is n't configured as a multi-tenant application. Usage of the /common endpoint isn't supported for such applications created after '{time}'. Use a tenant-specific endpoint or configure the application to be multi-tenant. |
+| AADSTS50194 | Application '{appId}'({appName}) isn't configured as a multi-tenant application. Usage of the /common endpoint isn't supported for such applications created after '{time}'. Use a tenant-specific endpoint or configure the application to be multi-tenant. |
| AADSTS50196 | LoopDetected - A client loop has been detected. Check the app's logic to ensure that token caching is implemented, and that error conditions are handled correctly. The app has made too many of the same request in too short a period, indicating that it is in a faulty state or is abusively requesting tokens. | | AADSTS50197 | ConflictingIdentities - The user could not be found. Try signing in again. | | AADSTS50199 | CmsiInterrupt - For security reasons, user confirmation is required for this request. Because this is an "interaction_required" error, the client should do interactive auth. This occurs because a system webview has been used to request a token for a native application - the user must be prompted to ask if this was actually the app they meant to sign into. To avoid this prompt, the redirect URI should be part of the following safe list: <br />http://<br />https://<br />msauth://(iOS only)<br />msauthv2://(iOS only)<br />chrome-extension:// (desktop Chrome browser only) |
active-directory Scenario Spa Sign In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-spa-sign-in.md
Previously updated : 02/11/2020 Last updated : 07/19/2022 #Customer intent: As an application developer, I want to know how to write a single-page application by using the Microsoft identity platform.
Learn how to add sign-in to the code for your single-page application.
Before you can get tokens to access APIs in your application, you need an authenticated user context. You can sign in users to your application in MSAL.js in two ways:
-* [Pop-up window](#sign-in-with-a-pop-up-window), by using the `loginPopup` method
-* [Redirect](#sign-in-with-redirect), by using the `loginRedirect` method
+- [Pop-up window](#sign-in-with-a-pop-up-window), by using the `loginPopup` method
+- [Redirect](#sign-in-with-redirect), by using the `loginRedirect` method
You can also optionally pass the scopes of the APIs for which you need the user to consent at the time of sign-in.
-> [!NOTE]
-> If your application already has access to an authenticated user context or ID token, you can skip the login step and directly acquire tokens. For details, see [SSO with user hint](msal-js-sso.md#with-user-hint).
+If your application already has access to an authenticated user context or ID token, you can skip the login step and directly acquire tokens. For details, see [SSO with user hint](msal-js-sso.md#with-user-hint).
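For example, here's a minimal MSAL.js v2 sketch (the client ID and login hint are placeholders) that attempts silent single sign-on with a user hint and falls back to an interactive prompt:

```javascript
import { PublicClientApplication } from "@azure/msal-browser";

const myMsal = new PublicClientApplication({
  auth: { clientId: "your_app_id" },
});

const silentRequest = {
  scopes: ["User.Read"],
  loginHint: "user@contoso.com", // hint from your existing session or ID token
};

myMsal
  .ssoSilent(silentRequest)
  .then((response) => {
    // Tokens were acquired without showing a login prompt
  })
  .catch((error) => {
    // Fall back to an interactive method, for example loginPopup(silentRequest)
    console.log(error);
  });
```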
## Choosing between a pop-up or redirect experience The choice between a pop-up or redirect experience depends on your application flow:
-* If you don't want users to move away from your main application page during authentication, we recommend the pop-up method. Because the authentication redirect happens in a pop-up window, the state of the main application is preserved.
+- If you don't want users to move away from your main application page during authentication, we recommend the pop-up method. Because the authentication redirect happens in a pop-up window, the state of the main application is preserved.
-* If users have browser constraints or policies where pop-up windows are disabled, you can use the redirect method. Use the redirect method with the Internet Explorer browser, because there are [known issues with pop-up windows on Internet Explorer](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/internet-explorer.md#popups).
+- If users have browser constraints or policies where pop-up windows are disabled, you can use the redirect method. Use the redirect method with the Internet Explorer browser, because there are [known issues with pop-up windows on Internet Explorer](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/internet-explorer.md#popups).
## Sign-in with a pop-up window # [JavaScript (MSAL.js v2)](#tab/javascript2) ```javascript- const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const loginRequest = {
- scopes: ["User.ReadWrite"]
-}
+ scopes: ["User.ReadWrite"],
+};
let accountId = ""; const myMsal = new PublicClientApplication(config);
-myMsal.loginPopup(loginRequest)
- .then(function (loginResponse) {
- accountId = loginResponse.account.homeAccountId;
- // Display signed-in user content, call API, etc.
- }).catch(function (error) {
- //login failure
- console.log(error);
- });
+myMsal
+ .loginPopup(loginRequest)
+ .then(function (loginResponse) {
+ accountId = loginResponse.account.homeAccountId;
+ // Display signed-in user content, call API, etc.
+ })
+ .catch(function (error) {
+ //login failure
+ console.log(error);
+ });
``` # [JavaScript (MSAL.js v1)](#tab/javascript1) ```javascript- const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const loginRequest = {
- scopes: ["User.ReadWrite"]
-}
+ scopes: ["User.ReadWrite"],
+};
const myMsal = new UserAgentApplication(config);
-myMsal.loginPopup(loginRequest)
- .then(function (loginResponse) {
- //login success
- }).catch(function (error) {
- //login failure
- console.log(error);
- });
+myMsal
+ .loginPopup(loginRequest)
+ .then(function (loginResponse) {
+ //login success
+ })
+ .catch(function (error) {
+ //login failure
+ console.log(error);
+ });
``` # [Angular (MSAL.js v2)](#tab/angular2)
The MSAL Angular wrapper allows you to secure specific routes in your applicatio
```javascript // In app-routing.module.ts
-import { NgModule } from '@angular/core';
-import { Routes, RouterModule } from '@angular/router';
-import { ProfileComponent } from './profile/profile.component';
-import { MsalGuard } from '@azure/msal-angular';
-import { HomeComponent } from './home/home.component';
+import { NgModule } from "@angular/core";
+import { Routes, RouterModule } from "@angular/router";
+import { ProfileComponent } from "./profile/profile.component";
+import { MsalGuard } from "@azure/msal-angular";
+import { HomeComponent } from "./home/home.component";
const routes: Routes = [
- {
- path: 'profile',
- component: ProfileComponent,
- canActivate: [
- MsalGuard
- ]
- },
- {
- path: '',
- component: HomeComponent
- }
+ {
+ path: "profile",
+ component: ProfileComponent,
+ canActivate: [MsalGuard],
+ },
+ {
+ path: "",
+ component: HomeComponent,
+ },
]; @NgModule({
- imports: [RouterModule.forRoot(routes, { useHash: false })],
- exports: [RouterModule]
+ imports: [RouterModule.forRoot(routes, { useHash: false })],
+ exports: [RouterModule],
})
-export class AppRoutingModule { }
+export class AppRoutingModule {}
``` For a pop-up window experience, set the `interactionType` configuration to `InteractionType.Popup` in the Guard configuration. You can also pass the scopes that require consent as follows: ```javascript // In app.module.ts
-import { PublicClientApplication, InteractionType } from '@azure/msal-browser';
-import { MsalModule } from '@azure/msal-angular';
+import { PublicClientApplication, InteractionType } from "@azure/msal-browser";
+import { MsalModule } from "@azure/msal-angular";
@NgModule({
- imports: [
- MsalModule.forRoot( new PublicClientApplication({
- auth: {
- clientId: 'Enter_the_Application_Id_Here',
- },
- cache: {
- cacheLocation: 'localStorage',
- storeAuthStateInCookie: isIE,
- }
- }), {
- interactionType: InteractionType.Popup, // Msal Guard Configuration
- authRequest: {
- scopes: ['user.read']
- }
- }, null)
- ]
+ imports: [
+ MsalModule.forRoot(
+ new PublicClientApplication({
+ auth: {
+ clientId: "Enter_the_Application_Id_Here",
+ },
+ cache: {
+ cacheLocation: "localStorage",
+ storeAuthStateInCookie: isIE,
+ },
+ }),
+ {
+ interactionType: InteractionType.Popup, // Msal Guard Configuration
+ authRequest: {
+ scopes: ["user.read"],
+ },
+ },
+ null
+ ),
+ ],
})
-export class AppModule { }
+export class AppModule {}
``` # [Angular (MSAL.js v1)](#tab/angular1)+ The MSAL Angular wrapper allows you to secure specific routes in your application by adding `MsalGuard` to the route definition. This guard will invoke the method to sign in when that route is accessed.+ ```javascript // In app-routing.module.ts
-import { NgModule } from '@angular/core';
-import { Routes, RouterModule } from '@angular/router';
-import { ProfileComponent } from './profile/profile.component';
-import { MsalGuard } from '@azure/msal-angular';
-import { HomeComponent } from './home/home.component';
+import { NgModule } from "@angular/core";
+import { Routes, RouterModule } from "@angular/router";
+import { ProfileComponent } from "./profile/profile.component";
+import { MsalGuard } from "@azure/msal-angular";
+import { HomeComponent } from "./home/home.component";
const routes: Routes = [ {
- path: 'profile',
+ path: "profile",
component: ProfileComponent,
- canActivate: [
- MsalGuard
- ]
+ canActivate: [MsalGuard],
}, {
- path: '',
- component: HomeComponent
- }
+ path: "",
+ component: HomeComponent,
+ },
]; @NgModule({ imports: [RouterModule.forRoot(routes, { useHash: false })],
- exports: [RouterModule]
+ exports: [RouterModule],
})
-export class AppRoutingModule { }
+export class AppRoutingModule {}
``` For a pop-up window experience, enable the `popUp` configuration option. You can also pass the scopes that require consent as follows:
For a pop-up window experience, enable the `popUp` configuration option. You can
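The snippet itself isn't reproduced in this digest. As a rough MSAL Angular v1 sketch (the client ID is a placeholder, and the exact option shape should be checked against the MSAL Angular v1 reference), the `popUp` and `consentScopes` options are passed in the Angular-specific configuration:

```javascript
// In app.module.ts — illustrative sketch only, not the article's original snippet
import { NgModule } from "@angular/core";
import { MsalModule } from "@azure/msal-angular";

@NgModule({
  imports: [
    MsalModule.forRoot(
      {
        auth: {
          clientId: "Enter_the_Application_Id_Here",
        },
      },
      {
        popUp: true, // use a pop-up window instead of a redirect
        consentScopes: ["user.read"], // scopes to consent to at sign-in
      }
    ),
  ],
})
export class AppModule {}
```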
# [React](#tab/react)
-The MSAL React wrapper allows you to protect specific components by wrapping them in the `MsalAuthenticationTemplate` component. This component will invoke login if a user is not already signed in or render child components otherwise.
+The MSAL React wrapper allows you to protect specific components by wrapping them in the `MsalAuthenticationTemplate` component. This component will invoke login if a user isn't already signed in or render child components otherwise.
```javascript import { InteractionType } from "@azure/msal-browser"; import { MsalAuthenticationTemplate, useMsal } from "@azure/msal-react"; function WelcomeUser() {
- const { accounts } = useMsal();
- const username = accounts[0].username;
+ const { accounts } = useMsal();
+ const username = accounts[0].username;
- return <p>Welcome, {username}</p>
+ return <p>Welcome, {username}</p>;
} // Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <MsalAuthenticationTemplate interactionType={InteractionType.Popup}>
- <p>This will only render if a user is signed-in.</p>
- <WelcomeUser />
- </MsalAuthenticationTemplate>
- )
-};
+ return (
+ <MsalAuthenticationTemplate interactionType={InteractionType.Popup}>
+ <p>This will only render if a user is signed-in.</p>
+ <WelcomeUser />
+ </MsalAuthenticationTemplate>
+ );
+}
``` You can also use the `@azure/msal-browser` APIs directly to invoke a login paired with the `AuthenticatedTemplate` and/or `UnauthenticatedTemplate` components to render specific contents to signed-in or signed-out users respectively. This is the recommended approach if you need to invoke login as a result of user interaction such as a button click. ```javascript
-import { useMsal, AuthenticatedTemplate, UnauthenticatedTemplate } from "@azure/msal-react";
+import {
+ useMsal,
+ AuthenticatedTemplate,
+ UnauthenticatedTemplate,
+} from "@azure/msal-react";
function signInClickHandler(instance) {
- instance.loginPopup();
+ instance.loginPopup();
} // SignInButton Component returns a button that invokes a popup login when clicked function SignInButton() {
- // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
- const { instance } = useMsal();
+ // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
+ const { instance } = useMsal();
- return <button onClick={() => signInClickHandler(instance)}>Sign In</button>
-};
+ return <button onClick={() => signInClickHandler(instance)}>Sign In</button>;
+}
function WelcomeUser() {
- const { accounts } = useMsal();
- const username = accounts[0].username;
+ const { accounts } = useMsal();
+ const username = accounts[0].username;
- return <p>Welcome, {username}</p>
+ return <p>Welcome, {username}</p>;
} // Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <>
- <AuthenticatedTemplate>
- <p>This will only render if a user is signed-in.</p>
- <WelcomeUser />
- </AuthenticatedTemplate>
- <UnauthenticatedTemplate>
- <p>This will only render if a user is not signed-in.</p>
- <SignInButton />
- </UnauthenticatedTemplate>
- </>
- )
+ return (
+ <>
+ <AuthenticatedTemplate>
+ <p>This will only render if a user is signed-in.</p>
+ <WelcomeUser />
+ </AuthenticatedTemplate>
+ <UnauthenticatedTemplate>
+ <p>This will only render if a user is not signed-in.</p>
+ <SignInButton />
+ </UnauthenticatedTemplate>
+ </>
+ );
} ```
function App() {
# [JavaScript (MSAL.js v2)](#tab/javascript2) ```javascript- const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const loginRequest = {
- scopes: ["User.ReadWrite"]
-}
+ scopes: ["User.ReadWrite"],
+};
let accountId = ""; const myMsal = new PublicClientApplication(config); function handleResponse(response) {
- if (response !== null) {
- accountId = response.account.homeAccountId;
- // Display signed-in user content, call API, etc.
- } else {
- // In case multiple accounts exist, you can select
- const currentAccounts = myMsal.getAllAccounts();
-
- if (currentAccounts.length === 0) {
- // no accounts signed-in, attempt to sign a user in
- myMsal.loginRedirect(loginRequest);
- } else if (currentAccounts.length > 1) {
- // Add choose account code here
- } else if (currentAccounts.length === 1) {
- accountId = currentAccounts[0].homeAccountId;
- }
+ if (response !== null) {
+ accountId = response.account.homeAccountId;
+ // Display signed-in user content, call API, etc.
+ } else {
+ // In case multiple accounts exist, you can select
+ const currentAccounts = myMsal.getAllAccounts();
+
+ if (currentAccounts.length === 0) {
+ // no accounts signed-in, attempt to sign a user in
+ myMsal.loginRedirect(loginRequest);
+ } else if (currentAccounts.length > 1) {
+ // Add choose account code here
+ } else if (currentAccounts.length === 1) {
+ accountId = currentAccounts[0].homeAccountId;
}
+ }
} myMsal.handleRedirectPromise().then(handleResponse);
myMsal.handleRedirectPromise().then(handleResponse);
The redirect methods don't return a promise because of the move away from the main app. To process and access the returned tokens, register success and error callbacks before you call the redirect methods. ```javascript- const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const loginRequest = {
- scopes: ["User.ReadWrite"]
-}
+ scopes: ["User.ReadWrite"],
+};
const myMsal = new UserAgentApplication(config); function authCallback(error, response) {
- //handle redirect response
+ //handle redirect response
} myMsal.handleRedirectCallback(authCallback);
myMsal.loginRedirect(loginRequest);
# [Angular (MSAL.js v2)](#tab/angular2)
-The code here is the same as described earlier in the section about sign-in with a pop-up window, except that the `interactionType` is set to `InteractionType.Redirect` for the MsalGuard Configuration, and the `MsalRedirectComponent` is bootstrapped to handle redirects.
+The code here is the same as described earlier in the section about sign-in with a pop-up window, except that the `interactionType` is set to `InteractionType.Redirect` for the MsalGuard configuration, and the `MsalRedirectComponent` is bootstrapped to handle redirects.
```javascript // In app.module.ts
-import { PublicClientApplication, InteractionType } from '@azure/msal-browser';
-import { MsalModule, MsalRedirectComponent } from '@azure/msal-angular';
+import { PublicClientApplication, InteractionType } from "@azure/msal-browser";
+import { MsalModule, MsalRedirectComponent } from "@azure/msal-angular";
@NgModule({
- imports: [
- MsalModule.forRoot( new PublicClientApplication({
- auth: {
- clientId: 'Enter_the_Application_Id_Here',
- },
- cache: {
- cacheLocation: 'localStorage',
- storeAuthStateInCookie: isIE,
- }
- }), {
- interactionType: InteractionType.Redirect, // Msal Guard Configuration
- authRequest: {
- scopes: ['user.read']
- }
- }, null)
- ],
- bootstrap: [AppComponent, MsalRedirectComponent]
+ imports: [
+ MsalModule.forRoot(
+ new PublicClientApplication({
+ auth: {
+ clientId: "Enter_the_Application_Id_Here",
+ },
+ cache: {
+ cacheLocation: "localStorage",
+ storeAuthStateInCookie: isIE,
+ },
+ }),
+ {
+ interactionType: InteractionType.Redirect, // Msal Guard Configuration
+ authRequest: {
+ scopes: ["user.read"],
+ },
+ },
+ null
+ ),
+ ],
+ bootstrap: [AppComponent, MsalRedirectComponent],
})
-export class AppModule { }
+export class AppModule {}
``` # [Angular (MSAL.js v1)](#tab/angular1)
-The code here is the same as described earlier in the section about sign-in with a pop-up window. The default flow is redirect.
+The code here is the same as described earlier in the section about sign-in with a pop-up window. The default flow is redirect.
# [React](#tab/react)
-The MSAL React wrapper allows you to protect specific components by wrapping them in the `MsalAuthenticationTemplate` component. This component will invoke login if a user is not already signed in or render child components otherwise.
+The MSAL React wrapper allows you to protect specific components by wrapping them in the `MsalAuthenticationTemplate` component. This component will invoke login if a user isn't already signed in or render child components otherwise.
```javascript import { InteractionType } from "@azure/msal-browser"; import { MsalAuthenticationTemplate, useMsal } from "@azure/msal-react"; function WelcomeUser() {
- const { accounts } = useMsal();
- const username = accounts[0].username;
+ const { accounts } = useMsal();
+ const username = accounts[0].username;
- return <p>Welcome, {username}</p>
+ return <p>Welcome, {username}</p>;
} // Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <MsalAuthenticationTemplate interactionType={InteractionType.Redirect}>
- <p>This will only render if a user is signed-in.</p>
- <WelcomeUser />
- </MsalAuthenticationTemplate>
- )
-};
+ return (
+ <MsalAuthenticationTemplate interactionType={InteractionType.Redirect}>
+ <p>This will only render if a user is signed-in.</p>
+ <WelcomeUser />
+ </MsalAuthenticationTemplate>
+ );
+}
``` You can also use the `@azure/msal-browser` APIs directly to invoke a login paired with the `AuthenticatedTemplate` and/or `UnauthenticatedTemplate` components to render specific contents to signed-in or signed-out users respectively. This is the recommended approach if you need to invoke login as a result of user interaction such as a button click. ```javascript
-import { useMsal, AuthenticatedTemplate, UnauthenticatedTemplate } from "@azure/msal-react";
+import {
+ useMsal,
+ AuthenticatedTemplate,
+ UnauthenticatedTemplate,
+} from "@azure/msal-react";
function signInClickHandler(instance) {
- instance.loginRedirect();
+ instance.loginRedirect();
} // SignInButton Component returns a button that invokes a popup login when clicked function SignInButton() {
- // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
- const { instance } = useMsal();
+ // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
+ const { instance } = useMsal();
- return <button onClick={() => signInClickHandler(instance)}>Sign In</button>
-};
+ return <button onClick={() => signInClickHandler(instance)}>Sign In</button>;
+}
function WelcomeUser() {
- const { accounts } = useMsal();
- const username = accounts[0].username;
+ const { accounts } = useMsal();
+ const username = accounts[0].username;
- return <p>Welcome, {username}</p>
+ return <p>Welcome, {username}</p>;
} // Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <>
- <AuthenticatedTemplate>
- <p>This will only render if a user is signed-in.</p>
- <WelcomeUser />
- </AuthenticatedTemplate>
- <UnauthenticatedTemplate>
- <p>This will only render if a user is not signed-in.</p>
- <SignInButton />
- </UnauthenticatedTemplate>
- </>
- )
+ return (
+ <>
+ <AuthenticatedTemplate>
+ <p>This will only render if a user is signed-in.</p>
+ <WelcomeUser />
+ </AuthenticatedTemplate>
+ <UnauthenticatedTemplate>
+ <p>This will only render if a user is not signed-in.</p>
+ <SignInButton />
+ </UnauthenticatedTemplate>
+ </>
+ );
} ```
You can also configure `logoutPopup` to redirect the main window to a different
```javascript const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", // defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", // defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const myMsal = new PublicClientApplication(config); // you can select which account application should sign out const logoutRequest = {
- account: myMsal.getAccountByHomeId(homeAccountId),
- mainWindowRedirectUri: "your_app_main_window_redirect_uri"
-}
+ account: myMsal.getAccountByHomeId(homeAccountId),
+ mainWindowRedirectUri: "your_app_main_window_redirect_uri",
+};
await myMsal.logoutPopup(logoutRequest); ```+ # [JavaScript (MSAL.js v1)](#tab/javascript1)
-Signing out with a pop-up window is not supported in MSAL.js v1
+Signing out with a pop-up window isn't supported in MSAL.js v1.
# [Angular (MSAL.js v2)](#tab/angular2)
logout() {
# [Angular (MSAL.js v1)](#tab/angular1)
-Signing out with a pop-up window is not supported in MSAL Angular v1
+Signing out with a pop-up window isn't supported in MSAL Angular v1.
# [React](#tab/react) ```javascript
-import { useMsal, AuthenticatedTemplate, UnauthenticatedTemplate } from "@azure/msal-react";
+import {
+ useMsal,
+ AuthenticatedTemplate,
+ UnauthenticatedTemplate,
+} from "@azure/msal-react";
function signOutClickHandler(instance) {
- const logoutRequest = {
- account: instance.getAccountByHomeId(homeAccountId),
- mainWindowRedirectUri: "your_app_main_window_redirect_uri",
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
- instance.logoutPopup(logoutRequest);
+ const logoutRequest = {
+ account: instance.getAccountByHomeId(homeAccountId),
+ mainWindowRedirectUri: "your_app_main_window_redirect_uri",
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ };
+ instance.logoutPopup(logoutRequest);
} // SignOutButton Component returns a button that invokes a popup logout when clicked function SignOutButton() {
- // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
- const { instance } = useMsal();
+ // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
+ const { instance } = useMsal();
- return <button onClick={() => signOutClickHandler(instance)}>Sign Out</button>
-};
+ return (
+ <button onClick={() => signOutClickHandler(instance)}>Sign Out</button>
+ );
+}
// Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <>
- <AuthenticatedTemplate>
- <p>This will only render if a user is signed-in.</p>
- <SignOutButton />
- </AuthenticatedTemplate>
- <UnauthenticatedTemplate>
- <p>This will only render if a user is not signed-in.</p>
- </UnauthenticatedTemplate>
- </>
- )
+ return (
+ <>
+ <AuthenticatedTemplate>
+ <p>This will only render if a user is signed-in.</p>
+ <SignOutButton />
+ </AuthenticatedTemplate>
+ <UnauthenticatedTemplate>
+ <p>This will only render if a user is not signed-in.</p>
+ </UnauthenticatedTemplate>
+ </>
+ );
} ```
function App() {
## Sign-out with a redirect
-MSAL.js provides a `logout` method in v1, and `logoutRedirect` method in v2, that clears the cache in browser storage and redirects the window to the Azure Active Directory (Azure AD) sign-out page. After sign-out, Azure AD redirects back to the page that invoked logout by default.
+MSAL.js provides a `logout` method in v1 and a `logoutRedirect` method in v2, which clear the cache in browser storage and redirect the window to the Azure AD sign-out page. After sign-out, Azure AD redirects back to the page that invoked logout by default.
You can configure the URI to which it should redirect after sign-out by setting `postLogoutRedirectUri`. This URI should be registered as a redirect URI in your application registration.
You can configure the URI to which it should redirect after sign-out by setting
```javascript const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const myMsal = new PublicClientApplication(config); // you can select which account application should sign out const logoutRequest = {
- account: myMsal.getAccountByHomeId(homeAccountId)
-}
+ account: myMsal.getAccountByHomeId(homeAccountId),
+};
myMsal.logoutRedirect(logoutRequest); ```
myMsal.logoutRedirect(logoutRequest);
```javascript const config = {
- auth: {
- clientId: 'your_app_id',
- redirectUri: "your_app_redirect_uri", //defaults to application start page
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
-}
+ auth: {
+ clientId: "your_app_id",
+ redirectUri: "your_app_redirect_uri", //defaults to application start page
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ },
+};
const myMsal = new UserAgentApplication(config);
this.authService.logout();
# [React](#tab/react) ```javascript
-import { useMsal, AuthenticatedTemplate, UnauthenticatedTemplate } from "@azure/msal-react";
+import {
+ useMsal,
+ AuthenticatedTemplate,
+ UnauthenticatedTemplate,
+} from "@azure/msal-react";
function signOutClickHandler(instance) {
- const logoutRequest = {
- account: instance.getAccountByHomeId(homeAccountId),
- postLogoutRedirectUri: "your_app_logout_redirect_uri"
- }
- instance.logoutRedirect(logoutRequest);
+ const logoutRequest = {
+ account: instance.getAccountByHomeId(homeAccountId),
+ postLogoutRedirectUri: "your_app_logout_redirect_uri",
+ };
+ instance.logoutRedirect(logoutRequest);
} // SignOutButton Component returns a button that invokes a redirect logout when clicked function SignOutButton() {
- // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
- const { instance } = useMsal();
+ // useMsal hook will return the PublicClientApplication instance you provided to MsalProvider
+ const { instance } = useMsal();
- return <button onClick={() => signOutClickHandler(instance)}>Sign Out</button>
-};
+ return (
+ <button onClick={() => signOutClickHandler(instance)}>Sign Out</button>
+ );
+}
// Remember that MsalProvider must be rendered somewhere higher up in the component tree function App() {
- return (
- <>
- <AuthenticatedTemplate>
- <p>This will only render if a user is signed-in.</p>
- <SignOutButton />
- </AuthenticatedTemplate>
- <UnauthenticatedTemplate>
- <p>This will only render if a user is not signed-in.</p>
- </UnauthenticatedTemplate>
- </>
- )
+ return (
+ <>
+ <AuthenticatedTemplate>
+ <p>This will only render if a user is signed-in.</p>
+ <SignOutButton />
+ </AuthenticatedTemplate>
+ <UnauthenticatedTemplate>
+ <p>This will only render if a user is not signed-in.</p>
+ </UnauthenticatedTemplate>
+ </>
+ );
} ```
active-directory Road To The Cloud Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/road-to-the-cloud-introduction.md
# Introduction
-This content provides guidance to move:
+Some organizations set goals to remove AD and their on-premises IT footprint. Others take advantage of some cloud-based capabilities to reduce the AD footprint, but not to completely remove their on-premises environments. This content provides guidance to move:
* **From** - Active Directory (AD) and other non-cloud based services, either hosted on-premises or Infrastructure-as-a-Service (IaaS), that provide identity management (IDM), identity and access management (IAM) and device management.
This content provides guidance to move:
>[!NOTE] > In this content, when we refer to AD, we are referring to Windows Server Active Directory Domain Services.
-Some organizations set goals to remove AD, and their on-premises IT footprint. Others set goals to take advantage of some cloud-based capabilities, but not to completely remove their on-premises or IaaS environments. Transformation must be aligned with and achieve business objectives including increased productivity, reduced costs and complexity, and improved security posture. To better understand the costs vs. value of moving to the cloud, see [Forrester TEI for Microsoft Azure Active Directory](https://www.microsoft.com/security/business/forrester-tei-study) and other TEI reports and [Cloud economics](https://azure.microsoft.com/overview/cloud-economics/).
+Transformation must be aligned with and achieve business objectives including increased productivity, reduced costs and complexity, and improved security posture. To better understand the costs vs. value of moving to the cloud, see [Forrester TEI for Microsoft Azure Active Directory](https://www.microsoft.com/security/business/forrester-tei-study) and other TEI reports and [Cloud economics](https://azure.microsoft.com/overview/cloud-economics/).
## Next steps
active-directory Create Access Review https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/create-access-review.md
Title: Create an access review of groups and applications - Azure AD
description: Learn how to create an access review of group members or application access in Azure Active Directory. -+ editor: markwahl-msft na Previously updated : 03/22/2022 Last updated : 07/18/2022
If you are reviewing access to an application, then before creating the review,
1. Use the **At end of review, send notification to** option to send notifications to other users or groups with completion updates. This feature allows for stakeholders other than the review creator to be updated on the progress of the review. To use this feature, choose **Select User(s) or Group(s)** and add another user or group for which you want to receive the status of completion.
-1. In the **Enable review decision helpers** section, choose whether you want your reviewer to receive recommendations during the review process. When enabled, users who have signed in during the previous 30-day period are recommended for approval. Users who haven't signed in during the past 30 days are recommended for denial. This 30-day interval is irrespective of whether the sign-ins were interactive or not. The last sign-in date for the specified user will also display along with the recommendation.
+1. In the **Enable review decision helpers** section, choose whether you want your reviewer to receive recommendations during the review process:
+ 1. If you select **No sign-in within 30 days**, users who have signed in during the previous 30-day period are recommended for approval. Users who haven't signed in during the past 30 days are recommended for denial. This 30-day interval is irrespective of whether the sign-ins were interactive or not. The last sign-in date for the specified user will also display along with the recommendation.
+ 1. If you select **Peer outlier**, approvers will be recommended to keep or deny access to users based on the access the users' peers have. If a user doesn't have the same access as their peers, the system will recommend that the reviewer deny them access.
> [!NOTE] > If you create an access review based on applications, your recommendations are based on the 30-day interval period depending on when the user last signed in to the application rather than the tenant.
- ![Screenshot that shows the Enable reviewer decision helpers option.](./media/create-access-review/helpers.png)
+ ![Screenshot that shows the Enable reviewer decision helpers options.](./media/create-access-review/helpers.png)
1. In the **Advanced settings** section, you can choose the following:
active-directory Perform Access Review https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/perform-access-review.md
Title: Review access to groups & applications in access reviews - Azure AD
description: Learn how to review access of group members or application access in Azure Active Directory access reviews. -+ editor: markwahl-msft na Previously updated : 2/18/2022 Last updated : 7/18/2022
There are two ways that you can approve or deny access:
### Review access based on recommendations
-To make access reviews easier and faster for you, we also provide recommendations that you can accept with a single click. The recommendations are generated based on the user's sign-in activity.
+To make access reviews easier and faster for you, we also provide recommendations that you can accept with a single click. Recommendations are generated for the reviewer in two ways. The first is based on the user's sign-in activity: if a user has been inactive for 30 days or more, the reviewer is recommended to deny access. The second is based on the access the user's peers have: if a user doesn't have the same access as their peers, the reviewer is recommended to deny that user access.
+
+If you have **No sign-in within 30 days** or **Peer outlier** enabled, follow the steps below to accept recommendations:
1. Select one or more users and then Click **Accept recommendations**.
active-directory Review Recommendations Group Access Reviews https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/review-recommendations-group-access-reviews.md
+
+ Title: Review recommendations for Access reviews - Azure AD
+description: Learn how to review access of group members with review recommendations in Azure Active Directory access reviews.
+++
+editor: markwahl-msft
++
+ na
++ Last updated : 7/18/2022+++++
+# Review recommendations for Access reviews
+
+Decision makers who review users' access and perform access reviews can use system-based recommendations to help them decide whether to continue or deny users' access to resources. For more information about how to use review recommendations, see [Enable decision helpers](create-access-review.md#next-settings).
+
+## Prerequisites
+
+- Azure AD Premium P2
+
+For more information, see [License requirements](access-reviews-overview.md#license-requirements).
+
+## Peer outlier recommendations
+If review decision helpers are enabled by the creator of the access review, reviewers can receive peer outlier recommendations for group access reviews.
+
+The peer analysis recommendation detects users with outlier access to a group, based on reporting-structure similarity with other group members. The outlier recommendation relies on a score that is calculated as the user's average distance to the remaining users in the group.
+
+A *peer* in an organization's chart is defined as two or more users who share similar characteristics in the organization's reporting structure. Users who are very distant from all the other group members, based on the organization's chart, are considered a "peer outlier" in the group.
+
+> [!NOTE]
+> Currently, this feature is only available for users in your directory. Use of peer outlier recommendations is not supported for guest users.
++
+The following image shows an example of an organization's reporting structure in a cosmetics company:
+
+![Example hierarchical organization chart for a cosmetics company](./media/review-recommendations-group-access-reviews/org-chart-example.png)
+
+Based on the reporting structure in the example image, members outside of the division that is under a group review would be denied access if the reviewer accepts the system's peer outlier recommendation.
+
+For example, Phil, who works in the Personal care division, is in a group with Debby, Irwin, and Emily, who all work in the Cosmetics division. The group is called *Fresh Skin*. If an access review is performed for the Fresh Skin group, Phil would be considered an outlier based on the reporting structure and his distance from the other group members, and the system would create a **Deny** recommendation in the group access review.
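
The exact scoring the service uses isn't published, but the idea of ranking group members by their average distance in the reporting structure can be illustrated with a small, purely conceptual sketch. The names and org chart below are hypothetical and mirror the Fresh Skin example; this is not the algorithm Azure AD runs.

```python
# Conceptual sketch only: approximate "peer outlier" scoring by averaging
# org-chart distances between group members. This is NOT the Azure AD algorithm;
# the reporting structure and names are hypothetical.

# child -> manager
manager = {
    "Debby": "Cosmetics lead",
    "Irwin": "Cosmetics lead",
    "Emily": "Cosmetics lead",
    "Phil": "Personal care lead",
    "Cosmetics lead": "CEO",
    "Personal care lead": "CEO",
}

def chain(person):
    """Management chain from a person up to the top of the org chart."""
    path = [person]
    while path[-1] in manager:
        path.append(manager[path[-1]])
    return path

def distance(a, b):
    """Tree distance between two people in the reporting structure."""
    chain_a, chain_b = chain(a), chain(b)
    common = set(chain_a) & set(chain_b)
    lca = min(common, key=lambda n: chain_a.index(n) + chain_b.index(n))
    return chain_a.index(lca) + chain_b.index(lca)

group = ["Debby", "Irwin", "Emily", "Phil"]
scores = {
    member: sum(distance(member, other) for other in group if other != member)
    / (len(group) - 1)
    for member in group
}
print(scores)                                        # Phil has the largest average distance
print("Peer outlier:", max(scores, key=scores.get))  # -> Phil
```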
+
+## Next steps
+- [Create an access review](create-access-review.md)
+- [Review access to groups or applications](perform-access-review.md)
+
active-directory Tutorial Govern Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/tutorial-govern-monitor.md
Previously updated : 02/24/2022 Last updated : 07/19/2022 # Customer intent: As an administrator of an Azure AD tenant, I want to govern and monitor my applications.
To create an access review:
### Start the access review
-After you've specified the settings for an access review, select **Start**. The access review appears in your list with an indicator of its status.
+The access review starts in a few minutes and it appears in your list with an indicator of its status.
By default, Azure AD sends an email to reviewers shortly after the review starts. If you choose not to have Azure AD send the email, be sure to inform the reviewers that an access review is waiting for them to complete. You can show them the instructions for how to review access to groups or applications. If your review is for guests to review their own access, show them the instructions for how to review access for themselves to groups or applications. If you've assigned guests as reviewers and they haven't accepted their invitation to the tenant, they won't receive an email from access reviews. They must first accept the invitation before they can begin reviewing.
+### View the status of an access review
+
+You can track the progress of access reviews as they are completed.
+
+1. Go to **Azure Active Directory**, and then select **Identity Governance**.
+1. In the left menu, select **Access reviews**.
+1. In the list, select the access review you created.
+1. On the **Overview** page, check the progress of the access review.
+
+The **Results** page provides information on each user under review in the instance, and gives you the ability to stop the review, reset decisions, and download results. To learn more, see the [Complete an access review of groups and applications in Azure AD access reviews](../governance/complete-access-review.md) article.
+ ## Access the audit logs report The audit logs report combines several reports around application activities into a single view for context-based reporting. For more information, see [Audit logs in Azure Active Directory](../reports-monitoring/concept-audit-logs.md).
After about 15 minutes, verify that events are streamed to your Log Analytics wo
Advance to the next article to learn how to... > [!div class="nextstepaction"]
-> [Manage consent to applications and evaluate consent requests](manage-consent-requests.md)
+> [Manage consent to applications and evaluate consent requests](manage-consent-requests.md)
active-directory Permissions Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/roles/permissions-reference.md
Previously updated : 06/27/2022 Last updated : 07/18/2022
Users with this role can't change the credentials or reset MFA for members and o
> | | | > | microsoft.directory/users/authenticationMethods/create | Create authentication methods for users | > | microsoft.directory/users/authenticationMethods/delete | Delete authentication methods for users |
-> | microsoft.directory/users/authenticationMethods/standard/read | Read standard properties of authentication methods for users |
+> | microsoft.directory/users/authenticationMethods/standard/restrictedRead | Read standard properties of authentication methods that do not include personally identifiable information for users |
> | microsoft.directory/users/authenticationMethods/basic/update | Update basic properties of authentication methods for users |
-> | microsoft.directory/deletedItems.users/restore | Restore soft deleted users to original state |
-> | microsoft.directory/users/delete | Delete users |
-> | microsoft.directory/users/disable | Disable users |
-> | microsoft.directory/users/enable | Enable users |
> | microsoft.directory/users/invalidateAllRefreshTokens | Force sign-out by invalidating user refresh tokens |
-> | microsoft.directory/users/restore | Restore deleted users |
-> | microsoft.directory/users/basic/update | Update basic properties on users |
-> | microsoft.directory/users/manager/update | Update manager for users |
> | microsoft.directory/users/password/update | Reset passwords for all users |
-> | microsoft.directory/users/userPrincipalName/update | Update User Principal Name of users |
> | microsoft.azure.serviceHealth/allEntities/allTasks | Read and configure Azure Service Health | > | microsoft.azure.supportTickets/allEntities/allTasks | Create and manage Azure support tickets | > | microsoft.office365.serviceHealth/allEntities/allTasks | Read and configure Service Health in the Microsoft 365 admin center |
Users with this role have read access to recipients and write access to the attr
> | Actions | Description | > | | | > | microsoft.office365.exchange/allRecipients/allProperties/allTasks | Create and delete all recipients, and read and update all properties of recipients in Exchange Online |
-> | microsoft.office365.exchange/messageTracking/allProperties/allTasks | Manage all tasks in message tracking in Exchange Online |
> | microsoft.office365.exchange/migration/allProperties/allTasks | Manage all tasks related to migration of recipients in Exchange Online | ## External ID User Flow Administrator
Do not use. This role has been deprecated and will be removed from Azure AD in t
> | microsoft.directory/contacts/delete | Delete contacts | > | microsoft.directory/contacts/basic/update | Update basic properties on contacts | > | microsoft.directory/deletedItems.groups/restore | Restore soft deleted groups to original state |
+> | microsoft.directory/deletedItems.users/delete | Permanently delete users, which can no longer be restored |
> | microsoft.directory/deletedItems.users/restore | Restore soft deleted users to original state | > | microsoft.directory/groups/create | Create Security groups and Microsoft 365 groups, excluding role-assignable groups | > | microsoft.directory/groups/delete | Delete Security groups and Microsoft 365 groups, excluding role-assignable groups |
Do not use. This role has been deprecated and will be removed from Azure AD in t
> | microsoft.directory/contacts/delete | Delete contacts | > | microsoft.directory/contacts/basic/update | Update basic properties on contacts | > | microsoft.directory/deletedItems.groups/restore | Restore soft deleted groups to original state |
+> | microsoft.directory/deletedItems.users/delete | Permanently delete users, which can no longer be restored |
> | microsoft.directory/deletedItems.users/restore | Restore soft deleted users to original state | > | microsoft.directory/domains/allProperties/allTasks | Create and delete domains, and read and update all properties | > | microsoft.directory/groups/create | Create Security groups and Microsoft 365 groups, excluding role-assignable groups |
The [Authentication Administrator](#authentication-administrator) role has permi
The [Authentication Policy Administrator](#authentication-policy-administrator) role has permissions to set the tenant's authentication method policy that determines which methods each user can register and use. | Role | Manage user's auth methods | Manage per-user MFA | Manage MFA settings | Manage auth method policy | Manage password protection policy | Update sensitive attributes |
-| - | - | - | - | - | - | - |
+| - | - | - | - | - | - | - |
| Authentication Administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No | Yes for some users (see above) | | Privileged Authentication Administrator| Yes for all users | Yes for all users | No | No | No | Yes for all users | | Authentication Policy Administrator | No | No | Yes | Yes | Yes | No |
The [Authentication Policy Administrator](#authentication-policy-administrator)
> | microsoft.directory/users/authenticationMethods/delete | Delete authentication methods for users | > | microsoft.directory/users/authenticationMethods/standard/read | Read standard properties of authentication methods for users | > | microsoft.directory/users/authenticationMethods/basic/update | Update basic properties of authentication methods for users |
-> | microsoft.directory/deletedItems.users/restore | Restore soft deleted users to original state |
-> | microsoft.directory/users/delete | Delete users |
-> | microsoft.directory/users/disable | Disable users |
-> | microsoft.directory/users/enable | Enable users |
> | microsoft.directory/users/invalidateAllRefreshTokens | Force sign-out by invalidating user refresh tokens |
-> | microsoft.directory/users/restore | Restore deleted users |
-> | microsoft.directory/users/basic/update | Update basic properties on users |
-> | microsoft.directory/users/manager/update | Update manager for users |
> | microsoft.directory/users/password/update | Reset passwords for all users |
-> | microsoft.directory/users/userPrincipalName/update | Update User Principal Name of users |
> | microsoft.azure.serviceHealth/allEntities/allTasks | Read and configure Azure Service Health | > | microsoft.azure.supportTickets/allEntities/allTasks | Create and manage Azure support tickets | > | microsoft.office365.serviceHealth/allEntities/allTasks | Read and configure Service Health in the Microsoft 365 admin center |
Users with this role can't change the credentials or reset MFA for members and o
> | microsoft.directory/accessReviews/definitions.groups/create | Create access reviews for membership in Security and Microsoft 365 groups. | > | microsoft.directory/accessReviews/definitions.groups/delete | Delete access reviews for membership in Security and Microsoft 365 groups. | > | microsoft.directory/accessReviews/definitions.groups/allProperties/read | Read all properties of access reviews for membership in Security and Microsoft 365 groups, including role-assignable groups. |
-> | microsoft.directory/users/authenticationMethods/create | Create authentication methods for users |
-> | microsoft.directory/users/authenticationMethods/delete | Delete authentication methods for users |
-> | microsoft.directory/users/authenticationMethods/standard/read | Read standard properties of authentication methods for users |
-> | microsoft.directory/users/authenticationMethods/basic/update | Update basic properties of authentication methods for users |
> | microsoft.directory/contacts/create | Create contacts | > | microsoft.directory/contacts/delete | Delete contacts | > | microsoft.directory/contacts/basic/update | Update basic properties on contacts | > | microsoft.directory/deletedItems.groups/restore | Restore soft deleted groups to original state |
-> | microsoft.directory/deletedItems.users/restore | Restore soft deleted users to original state |
> | microsoft.directory/entitlementManagement/allProperties/allTasks | Create and delete resources, and read and update all properties in Azure AD entitlement management | > | microsoft.directory/groups/assignLicense | Assign product licenses to groups for group-based licensing | > | microsoft.directory/groups/create | Create Security groups and Microsoft 365 groups, excluding role-assignable groups |
active-directory Amms Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/amms-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with AMMS | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with AMMS'
description: Learn how to configure single sign-on between Azure Active Directory and AMMS.
Previously updated : 04/04/2019 Last updated : 07/09/2022
-# Tutorial: Azure Active Directory integration with AMMS
+# Tutorial: Azure AD SSO integration with AMMS
-In this tutorial, you learn how to integrate AMMS with Azure Active Directory (Azure AD).
-Integrating AMMS with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate AMMS with Azure Active Directory (Azure AD). When you integrate AMMS with Azure AD, you can:
-* You can control in Azure AD who has access to AMMS.
-* You can enable your users to be automatically signed-in to AMMS (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to AMMS.
+* Enable your users to be automatically signed-in to AMMS with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites To configure Azure AD integration with AMMS, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/)
-* AMMS single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/).
+* AMMS single sign-on enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* AMMS supports **SP** initiated SSO
+* AMMS supports **SP** initiated SSO.
-## Adding AMMS from the gallery
+## Add AMMS from the gallery
To configure the integration of AMMS into Azure AD, you need to add AMMS from the gallery to your list of managed SaaS apps.
-**To add AMMS from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **AMMS**, select **AMMS** from result panel then click **Add** button to add the application.
-
- ![AMMS in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **AMMS** in the search box.
+1. Select **AMMS** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with AMMS based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in AMMS needs to be established.
+## Configure and test Azure AD SSO for AMMS
-To configure and test Azure AD single sign-on with AMMS, you need to complete the following building blocks:
+Configure and test Azure AD SSO with AMMS using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in AMMS.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure AMMS Single Sign-On](#configure-amms-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create AMMS test user](#create-amms-test-user)** - to have a counterpart of Britta Simon in AMMS that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with AMMS, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure AMMS SSO](#configure-amms-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create AMMS test user](#create-amms-test-user)** - to have a counterpart of B.Simon in AMMS that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with AMMS, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **AMMS** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **AMMS** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
+ ![Screenshot shows to edit Basic S A M L Configuration.](common/edit-urls.png "Basic Configuration")
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. On the **Basic SAML Configuration** section, perform the following steps:
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
-
-4. On the **Basic SAML Configuration** section, perform the following steps:
-
- ![AMMS Domain and URLs single sign-on information](common/sp-identifier.png)
+ a. In the **Identifier (Entity ID)** text box, type a value using the following pattern:
+ `<SUBDOMAIN>.microwestcloud.com/amms`
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
`https://<SUBDOMAIN>.microwestcloud.com/amms/pages/login.aspx`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `<SUBDOMAIN>.microwestcloud.com/amms`
- > [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [AMMS Client support team](mailto:techsupport@microwestsoftware.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [AMMS Client support team](mailto:techsupport@microwestsoftware.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.
- ![The Certificate download link](common/copy-metadataurl.png)
-
-### Configure AMMS Single Sign-On
-
-To configure single sign-on on **AMMS** side, you need to send the **App Federation Metadata Url** to [AMMS support team](mailto:techsupport@microwestsoftware.com). They set this setting to have the SAML SSO connection set properly on both sides.
+ ![Screenshot shows the Certificate download link.](common/copy-metadataurl.png "Certificate")
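
 If you want to see what the **App Federation Metadata Url** actually exposes before sending it to the vendor, you can fetch and inspect it yourself. The sketch below is an optional check rather than part of the documented procedure; the URL is a placeholder for the value copied from the portal, and it assumes standard SAML 2.0 metadata namespaces.

 ```python
 # Optional check (not part of the official steps): download the App Federation
 # Metadata Url and print the entity ID and signing certificate it advertises.
 # Replace the placeholder URL with the value copied from the Azure portal.
 import urllib.request
 import xml.etree.ElementTree as ET

 metadata_url = (
     "https://login.microsoftonline.com/<tenant-id>/federationmetadata/"
     "2007-06/federationmetadata.xml?appid=<app-id>"
 )

 ns = {
     "md": "urn:oasis:names:tc:SAML:2.0:metadata",
     "ds": "http://www.w3.org/2000/09/xmldsig#",
 }

 with urllib.request.urlopen(metadata_url) as resp:
     root = ET.fromstring(resp.read())

 print("Entity ID:", root.attrib.get("entityID"))

 cert = root.find(".//ds:X509Certificate", ns)
 if cert is not None:
     print("Signing certificate (Base64, first 60 chars):", cert.text.strip()[:60])
 ```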
### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type `brittasimon@yourcompanydomain.extension`. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to AMMS.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to AMMS.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **AMMS**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **AMMS**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure AMMS SSO
-2. In the applications list, select **AMMS**.
-
- ![The AMMS link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on **AMMS** side, you need to send the **App Federation Metadata Url** to [AMMS support team](mailto:techsupport@microwestsoftware.com). They set this setting to have the SAML SSO connection set properly on both sides.
### Create AMMS test user In this section, you create a user called Britta Simon in AMMS. Work with [AMMS support team](mailto:techsupport@microwestsoftware.com) to add the users in the AMMS platform. Users must be created and activated before you use single sign-on.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the AMMS tile in the Access Panel, you should be automatically signed in to the AMMS for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click on **Test this application** in Azure portal. This will redirect to AMMS Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to AMMS Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the AMMS tile in the My Apps, this will redirect to AMMS Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure AMMS, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Change Process Management Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/change-process-management-tutorial.md
Previously updated : 05/07/2020 Last updated : 07/09/2022
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with Change Process Management
+# Tutorial: Azure AD SSO integration with Change Process Management
In this tutorial, you'll learn how to integrate Change Process Management with Azure Active Directory (Azure AD). When you integrate Change Process Management with Azure AD, you can:
In this tutorial, you'll learn how to integrate Change Process Management with A
* Enable your users to be automatically signed in to Change Process Management with their Azure AD accounts. * Manage your accounts in one central location: the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [Single sign-on to applications in Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items: * An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). * A Change Process Management subscription with single sign-on (SSO) enabled.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
-## Tutorial description
+## Scenario description
In this tutorial, you'll configure and test Azure AD SSO in a test environment. Change Process Management supports IDP-initiated SSO.
-After you configure Change Process Management, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session controls extend from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
- ## Add Change Process Management from the gallery To configure the integration of Change Process Management into Azure AD, you need to add Change Process Management from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) with a work or school account or with a personal Microsoft account.
+1. Sign in to the Azure portal with a work or school account or with a personal Microsoft account.
1. In the left pane, select **Azure Active Directory**. 1. Go to **Enterprise applications** and then select **All Applications**. 1. To add an application, select **New application**.
To configure and test Azure AD SSO with Change Process Management, you'll take t
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Change Process Management** application integration page, in the **Manage** section, select **single sign-on**.
+1. In the Azure portal, on the **Change Process Management** application integration page, in the **Manage** section, select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**. 1. On the **Set up Single Sign-On with SAML** page, select the pencil button for **Basic SAML Configuration** to edit the settings:
- ![Pencil button for Basic SAML Configuration](common/edit-urls.png)
+ ![Screenshot shows to edit Basic S A M L Configuration.](common/edit-urls.png "Basic Configuration")
-1. On the **Set up Single Sign-On with SAML** page, take these steps:
+1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Identifier** box, enter a URL in the following pattern:
+ a. In the **Identifier** box, type a URL using the following pattern:
`https://<hostname>:8443/`
- b. In the **Reply URL** box, enter a URL in the following pattern:
+ b. In the **Reply URL** box, type a URL using the following pattern:
`https://<hostname>:8443/changepilot/saml/sso` > [!NOTE]
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, select the **Download** link for **Certificate (Base64)** to download the certificate and save it on your computer:
- ![Certificate download link](common/certificatebase64.png)
+ ![Screenshot shows the Certificate download link.](common/certificatebase64.png "Certificate")
1. In the **Set up Change Process Management** section, copy the appropriate URL or URLs, based on your requirements:
- ![Copy configuration URLs](common/copy-configuration-urls.png)
+ ![Screenshot shows to copy configuration appropriate U R L.](common/copy-configuration-urls.png "Metadata")
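
 As an optional sanity check that isn't part of the documented procedure, you can compute the SHA-1 thumbprint of the downloaded Base64 (PEM) certificate locally and compare it with the thumbprint shown in the Azure portal. The file name below is an assumption; use whatever name the portal gave the download.

 ```python
 # Optional sanity check (not part of the official steps): compute the SHA-1
 # thumbprint of the downloaded Certificate (Base64) file and compare it with
 # the thumbprint shown in the Azure portal. The file name is a placeholder.
 import base64
 import hashlib
 import re

 with open("Change Process Management.cer", encoding="utf-8") as f:
     pem = f.read()

 # Strip the PEM header/footer and whitespace, then decode to raw DER bytes.
 body = re.sub(r"-----(BEGIN|END) CERTIFICATE-----|\s", "", pem)
 der = base64.b64decode(body)

 print("SHA-1 thumbprint:", hashlib.sha1(der).hexdigest().upper())
 ```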
### Create an Azure AD test user
In this section, you'll enable B.Simon to use Azure single sign-on by granting t
1. In the Azure portal, select **Enterprise applications**, and then select **All applications**. 1. In the applications list, select **Change Process Management**. 1. In the app's overview page, in the **Manage** section, select **Users and groups**:-
- ![Select Users and groups](common/users-groups-blade.png)
- 1. Select **Add user**, and then select **Users and groups** in the **Add Assignment** dialog box.-
- ![Select Add user](common/add-assign-user.png)
- 1. In the **Users and groups** dialog box, select **B.Simon** in the **Users** list, and then click the **Select** button at the bottom of the screen. 1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog box, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen. 1. In the **Add Assignment** dialog box, select **Assign**.
In this section, you'll enable B.Simon to use Azure single sign-on by granting t
To configure single sign-on on the Change Process Management side, you need to send the downloaded Base64 certificate and the appropriate URLs that you copied from the Azure portal to the [Change Process Management support team](mailto:support@realtech-us.com). They configure the SAML SSO connection to be correct on both sides. ### Create a Change Process Management test user
- Work with the [Change Process Management support team](mailto:support@realtech-us.com) to add a user named B.Simon in Change Process Management. Users must be created and activated before you use single sign-on.
-## Test SSO
-
-In this section, you'll test your Azure AD SSO configuration by using Access Panel.
-
-When you select the Change Process Management tile in Access Panel, you should be automatically signed in to the Change Process Management instance for which you set up SSO. For more information about Access Panel, see [Introduction to Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+Work with the [Change Process Management support team](mailto:support@realtech-us.com) to add a user named B.Simon in Change Process Management. Users must be created and activated before you use single sign-on.
-## Additional resources
--- [Tutorials on how to integrate SaaS apps with Azure Active Directory](./tutorial-list.md)
+## Test SSO
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+* Click on Test this application in Azure portal and you should be automatically signed in to the Change Process Management for which you set up the SSO.
-- [Try Change Process Management with Azure AD](https://aad.portal.azure.com/)
+* You can use Microsoft My Apps. When you click the Change Process Management tile in the My Apps, you should be automatically signed in to the Change Process Management for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is session control in Microsoft Defender for Cloud Apps?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect Change Process Management with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure Change Process Management, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Halosys Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/halosys-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Halosys | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Halosys'
description: Learn how to configure single sign-on between Azure Active Directory and Halosys.
Previously updated : 02/15/2019 Last updated : 07/09/2022
-# Tutorial: Azure Active Directory integration with Halosys
+# Tutorial: Azure AD SSO integration with Halosys
-In this tutorial, you learn how to integrate Halosys with Azure Active Directory (Azure AD).
-Integrating Halosys with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Halosys with Azure Active Directory (Azure AD). When you integrate Halosys with Azure AD, you can:
-* You can control in Azure AD who has access to Halosys.
-* You can enable your users to be automatically signed-in to Halosys (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Halosys.
+* Enable your users to be automatically signed-in to Halosys with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Halosys, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Halosys single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Halosys single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Halosys supports **IDP** initiated SSO
+* Halosys supports **IDP** initiated SSO.
-## Adding Halosys from the gallery
+## Add Halosys from the gallery
To configure the integration of Halosys into Azure AD, you need to add Halosys from the gallery to your list of managed SaaS apps.
-**To add Halosys from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Halosys**, select **Halosys** from result panel then click **Add** button to add the application.
-
- ![Halosys in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Halosys** in the search box.
+1. Select **Halosys** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with Halosys based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Halosys needs to be established.
+## Configure and test Azure AD SSO for Halosys
-To configure and test Azure AD single sign-on with Halosys, you need to complete the following building blocks:
+Configure and test Azure AD SSO with Halosys using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Halosys.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Halosys Single Sign-On](#configure-halosys-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Halosys test user](#create-halosys-test-user)** - to have a counterpart of Britta Simon in Halosys that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with Halosys, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Halosys SSO](#configure-halosys-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Halosys test user](#create-halosys-test-user)** - to have a counterpart of B.Simon in Halosys that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with Halosys, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Halosys** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **Halosys** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
+ ![Screenshot shows to edit Basic S A M L Configuration.](common/edit-urls.png "Basic Configuration")
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
-
-4. On the **Set up Single Sign-On with SAML** page, perform the following steps:
-
- ![Halosys Domain and URLs single sign-on information](common/idp-intiated.png)
+1. On the **Basic SAML Configuration** section, perform the following steps:
a. In the **Identifier** text box, type a URL using the following pattern: `https://<company-name>.halosys.com`
To configure Azure AD single sign-on with Halosys, perform the following steps:
> [!NOTE] > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Halosys Client support team](https://www.sonata-software.com/form/contact) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
-5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
+1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
- ![The Certificate download link](common/metadataxml.png)
+ ![Screenshot shows the Certificate download link.](common/metadataxml.png "Certificate")
-6. On the **Set up Halosys** section, copy the appropriate URL(s) as per your requirement.
+1. On the **Set up Halosys** section, copy the appropriate URL(s) as per your requirement.
- ![Copy configuration URLs](common/copy-configuration-urls.png)
-
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure Halosys Single Sign-On
-
-To configure single sign-on on **Halosys** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Halosys support team](https://www.sonata-software.com/form/contact). They set this setting to have the SAML SSO connection set properly on both sides.
+ ![Screenshot shows to copy configuration appropriate U R L.](common/copy-configuration-urls.png "Metadata")
### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
+In this section, you'll create a test user in the Azure portal called B.Simon.
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Halosys.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Halosys**.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Halosys.
- ![Enterprise applications blade](common/enterprise-applications.png)
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Halosys**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-2. In the applications list, select **Halosys**.
+## Configure Halosys SSO
- ![The Halosys link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on **Halosys** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Halosys support team](https://www.sonata-software.com/form/contact). They set this setting to have the SAML SSO connection set properly on both sides.
### Create Halosys test user In this section, you create a user called Britta Simon in Halosys. Work with [Halosys support team](https://www.sonata-software.com/form/contact) to add the users in the Halosys platform. Users must be created and activated before you use single sign-on.
-### Test single sign-on
-
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+## Test SSO
-When you click the Halosys tile in the Access Panel, you should be automatically signed in to the Halosys for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional Resources
+* Click on Test this application in Azure portal and you should be automatically signed in to the Halosys for which you set up the SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Halosys tile in the My Apps, you should be automatically signed in to the Halosys for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Halosys, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sap Hana Cloud Platform Identity Authentication Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sap-hana-cloud-platform-identity-authentication-tutorial.md
Follow these steps to enable Azure AD SSO in the Azure portal.
`https://<IAS-tenant-id>.accounts.ondemand.com/saml2/idp/acs/<IAS-tenant-id>.accounts.ondemand.com` > [!NOTE]
- > These values are not real. Update these values with the actual Identifier and Reply URL. Contact the [SAP Cloud Identity Services Client support team](https://cloudplatform.sap.com/capabilities/security/trustcenter.html) to get these values. If you don't understand Identifier value, read the SAP Cloud Identity Services documentation about [Tenant SAML 2.0 configuration](https://help.hana.ondemand.com/cloud_identity/frameset.htm?e81a19b0067f4646982d7200a8dab3ca.html).
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact the [SAP Cloud Identity Services Client support team](https://cloudplatform.sap.com/capabilities/security/trustcenter.html) to get these values. If you don't understand Identifier value, read the SAP Cloud Identity Services documentation about [Tenant SAML 2.0 configuration](https://help.sap.com/docs/IDENTITY_AUTHENTICATION/6d6d63354d1242d185ab4830fc04feb1/e81a19b0067f4646982d7200a8dab3ca.html).
+ 5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP**-initiated mode:
active-directory Tableau Online Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tableau-online-provisioning-tutorial.md
This operation starts the initial synchronization cycle of all users and groups
In June 2022, Tableau released a SCIM 2.0 connector. Completing the steps below will update applications configured to use the Tableau API endpoint to use the SCIM 2.0 endpoint. These steps will remove any customizations previously made to the Tableau Cloud application, including:
-* Authentication details
+* Authentication details (credentials used for provisioning, NOT the credentials used for SSO)
* Scoping filters * Custom attribute mappings >[!Note]
active-directory Yuhu Property Management Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/yuhu-property-management-platform-tutorial.md
Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Yuhu Property Management Platform | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Yuhu Property Management Platform'
description: Learn how to configure single sign-on between Azure Active Directory and Yuhu Property Management Platform.
Previously updated : 12/18/2019 Last updated : 07/09/2022
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with Yuhu Property Management Platform
+# Tutorial: Azure AD SSO integration with Yuhu Property Management Platform
In this tutorial, you'll learn how to integrate Yuhu Property Management Platform with Azure Active Directory (Azure AD). When you integrate Yuhu Property Management Platform with Azure AD, you can:
In this tutorial, you'll learn how to integrate Yuhu Property Management Platfor
* Enable your users to be automatically signed-in to Yuhu Property Management Platform with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items: * An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). * Yuhu Property Management Platform single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Yuhu Property Management Platform supports **SP** initiated SSO
+* Yuhu Property Management Platform supports **SP** initiated SSO.
-## Adding Yuhu Property Management Platform from the gallery
+## Add Yuhu Property Management Platform from the gallery
To configure the integration of Yuhu Property Management Platform into Azure AD, you need to add Yuhu Property Management Platform from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**. 1. In the **Add from the gallery** section, type **Yuhu Property Management Platform** in the search box. 1. Select **Yuhu Property Management Platform** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Yuhu Property Management Platform
+## Configure and test Azure AD SSO for Yuhu Property Management Platform
Configure and test Azure AD SSO with Yuhu Property Management Platform using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Yuhu Property Management Platform.
-To configure and test Azure AD SSO with Yuhu Property Management Platform, complete the following building blocks:
+To configure and test Azure AD SSO with Yuhu Property Management Platform, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure Yuhu Property Management Platform SSO](#configure-yuhu-property-management-platform-sso)** - to configure the single sign-on settings on application side.
- * **[Create Yuhu Property Management Platform test user](#create-yuhu-property-management-platform-test-user)** - to have a counterpart of B.Simon in Yuhu Property Management Platform that is linked to the Azure AD representation of user.
+ 1. **[Create Yuhu Property Management Platform test user](#create-yuhu-property-management-platform-test-user)** - to have a counterpart of B.Simon in Yuhu Property Management Platform that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works. ## Configure Azure AD SSO Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Yuhu Property Management Platform** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Yuhu Property Management Platform** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+ ![Screenshot shows to edit Basic S A M L Configuration.](common/edit-urls.png "Basic Configuration")
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://<SUBDOMAIN>.yuhu.io/companies`
+1. On the **Basic SAML Configuration** section, perform the following steps:
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ a. In the **Identifier (Entity ID)** text box, type a value using the following pattern:
`yuhu-<ID>`
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.yuhu.io/companies`
+ > [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Yuhu Property Management Platform Client support team](mailto:hello@yuhu.io) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Yuhu Property Management Platform Client support team](mailto:hello@yuhu.io) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
1. Yuhu Property Management Platform application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
- ![image](common/default-attributes.png)
+ ![Screenshot shows the image of attributes configuration.](common/default-attributes.png "Attributes")
1. In addition to the above, the Yuhu Property Management Platform application expects a few more attributes to be passed back in the SAML response, as shown below. These attributes are also pre-populated, but you can review them as per your requirements.
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Raw)** and select **Download** to download the certificate and save it on your computer.
- ![The Certificate download link](common/certificateraw.png)
+ ![Screenshot shows the Certificate download link.](common/certificateraw.png "Certificate")
1. On the **Set up Yuhu Property Management Platform** section, copy the appropriate URL(s) based on your requirement.
- ![Copy configuration URLs](common/copy-configuration-urls.png)
+ ![Screenshot shows to copy configuration appropriate U R L.](common/copy-configuration-urls.png "Metadata")
### Create an Azure AD test user
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**. 1. In the applications list, select **Yuhu Property Management Platform**. 1. In the app's overview page, find the **Manage** section and select **Users and groups**.-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen. 1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen. 1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you create a user called B.Simon in Yuhu Property Management Pl
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Yuhu Property Management Platform tile in the Access Panel, you should be automatically signed in to the Yuhu Property Management Platform for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+In this section, you test your Azure AD single sign-on configuration with following options.
-## Additional resources
+* Click on **Test this application** in the Azure portal. This will redirect to the Yuhu Property Management Platform Sign-on URL where you can initiate the login flow.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Go to Yuhu Property Management Platform Sign-on URL directly and initiate the login flow from there.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the Yuhu Property Management Platform tile in My Apps, you'll be redirected to the Yuhu Property Management Platform Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try Yuhu Property Management Platform with Azure AD](https://aad.portal.azure.com/)
+Once you configure Yuhu Property Management Platform you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
aks Cluster Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-configuration.md
AKS supports Ubuntu 18.04 as the default node operating system (OS) in general a
## Container runtime configuration
-A container runtime is software that executes containers and manages container images on a node. The runtime helps abstract away sys-calls or operating system (OS) specific functionality to run containers on Linux or Windows. For Linux node pools, `containerd` is used for node pools using Kubernetes version 1.19 and greater. For Windows Server 2019 node pools, `containerd` is generally available and can be used in node pools using Kubernetes 1.20 and greater, but Docker is still used by default.
+A container runtime is software that executes containers and manages container images on a node. The runtime helps abstract away sys-calls or operating system (OS) specific functionality to run containers on Linux or Windows. For Linux node pools, `containerd` is used for node pools using Kubernetes version 1.19 and greater. For Windows Server 2019 node pools, `containerd` is generally available and is used by default in Kubernetes 1.23 and greater. Docker is no longer supported as of September 2022. For more information about this deprecation, see the [AKS release notes][aks-release-notes].
[`Containerd`](https://containerd.io/) is an [OCI](https://opencontainers.org/) (Open Container Initiative) compliant core container runtime that provides the minimum set of required functionality to execute containers and manage images on a node. It was [donated](https://www.cncf.io/announcement/2017/03/29/containerd-joins-cloud-native-computing-foundation/) to the Cloud Native Compute Foundation (CNCF) in March of 2017. The current Moby (upstream Docker) version that AKS uses already leverages and is built on top of `containerd`, as shown above.
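A quick, read-only way to confirm which runtime the nodes in a cluster are actually using is to inspect the node metadata with `kubectl` (a minimal sketch; it assumes your kubeconfig already points at the cluster):

```bash
# The CONTAINER-RUNTIME column reports containerd:// or docker:// for each node
kubectl get nodes -o wide
```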
By using `containerd` for AKS nodes, pod startup latency improves and node resou
`Containerd` works on every GA version of Kubernetes in AKS, and in every upstream kubernetes version above v1.19, and supports all Kubernetes and AKS features. > [!IMPORTANT]
-> Clusters with Linux node pools created on Kubernetes v1.19 or greater default to `containerd` for its container runtime. Clusters with node pools on a earlier supported Kubernetes versions receive Docker for their container runtime. Linux node pools will be updated to `containerd` once the node pool Kubernetes version is updated to a version that supports `containerd`. You can still use Docker node pools and clusters on older supported versions until those fall off support.
+> Clusters with Linux node pools created on Kubernetes v1.19 or greater default to `containerd` for their container runtime. Clusters with node pools on earlier supported Kubernetes versions receive Docker for their container runtime. Linux node pools will be updated to `containerd` once the node pool Kubernetes version is updated to a version that supports `containerd`. You can still use Docker node pools and clusters on versions below 1.23, but Docker is no longer supported as of September 2022.
>
-> Using `containerd` with Windows Server 2019 node pools is generally available, although the default for node pools created on Kubernetes v1.22 and earlier is still Docker. For more details, see [Add a Windows Server node pool with `containerd`][/learn/aks-add-np-containerd].
+> Using `containerd` with Windows Server 2019 node pools is generally available, and is used by default in Kubernetes 1.23 and greater. For more details, see [Add a Windows Server node pool with `containerd`][/learn/aks-add-np-containerd].
> > It is highly recommended to test your workloads on AKS node pools with `containerd` prior to using clusters with a Kubernetes version that supports `containerd` for your node pools.
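One way to do that testing is to add a small, disposable Linux node pool on a Kubernetes version that uses `containerd` and schedule a representative workload onto it. The following is a sketch; the node pool name and Kubernetes version are placeholders, and the version must be valid for your cluster:

```azurecli
# Add a one-node Linux node pool on a containerd-enabled Kubernetes version (>= 1.19)
az aks nodepool add \
    --resource-group myResourceGroup \
    --cluster-name myAKSCluster \
    --name testnp \
    --node-count 1 \
    --kubernetes-version 1.23.8
```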
az aks show -n aks -g myResourceGroup --query "oidcIssuerProfile.issuerUrl" -ots
- Read more about [Ephemeral OS disks](../virtual-machines/ephemeral-os-disks.md).
+<!-- LINKS - external -->
+[aks-release-notes]: https://github.com/Azure/AKS/releases
+ <!-- LINKS - internal --> [azure-cli-install]: /cli/azure/install-azure-cli [az-feature-register]: /cli/azure/feature#az_feature_register
aks Concepts Clusters Workloads https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/concepts-clusters-workloads.md
The Azure VM size for your nodes defines the storage CPUs, memory, size, and typ
In AKS, the VM image for your cluster's nodes is based on Ubuntu Linux or Windows Server 2019. When you create an AKS cluster or scale out the number of nodes, the Azure platform automatically creates and configures the requested number of VMs. Agent nodes are billed as standard VMs, so any VM size discounts (including [Azure reservations][reservation-discounts]) are automatically applied.
+For managed disks, the default disk size and performance will be assigned according to the selected VM SKU and vCPU count. For more information, see [Default OS disk sizing](cluster-configuration.md#default-os-disk-sizing).
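If the SKU-based default doesn't suit your workload, you can request an explicit OS disk size at cluster (or node pool) creation time. This is a sketch; the VM size and disk size shown are examples only:

```azurecli
# Create a cluster with an explicit 128 GiB managed OS disk instead of the SKU-based default
az aks create \
    --resource-group myResourceGroup \
    --name myAKSCluster \
    --node-vm-size Standard_DS3_v2 \
    --node-osdisk-size 128 \
    --generate-ssh-keys
```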
+ If you need advanced configuration and control on your Kubernetes node container runtime and OS, you can deploy a self-managed cluster using [Cluster API Provider Azure][cluster-api-provider-azure]. ### Resource reservations
aks Quick Windows Container Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-windows-container-deploy-cli.md
Beginning in Kubernetes version 1.20 and greater, you can specify `containerd` a
Use the `az aks nodepool add` command to add a node pool that can run Windows Server containers with the `containerd` runtime. > [!NOTE]
-> If you do not specify the *WindowsContainerRuntime=containerd* custom header, the node pool will use Docker as the container runtime.
+> If you do not specify the *WindowsContainerRuntime=containerd* custom header, the node pool will still use `containerd` as the container runtime by default.
```azurecli-interactive az aks nodepool add \
az aks upgrade \
The above command upgrades all Windows Server node pools in the *myAKSCluster* to use the `containerd` runtime. > [!NOTE]
-> After upgrading all existing Windows Server node pools to use the `containerd` runtime, Docker will still be the default runtime when adding new Windows Server node pools.
+> When running the upgrade command, the `--kubernetes-version` specified must be a higher version than the node pool's current version.
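Before choosing a target version, you can check the node pool's current orchestrator version (a sketch; `npwin` is a placeholder for your Windows Server node pool name):

```azurecli
# Show the current Kubernetes version of the Windows Server node pool
az aks nodepool show \
    --resource-group myResourceGroup \
    --cluster-name myAKSCluster \
    --name npwin \
    --query orchestratorVersion \
    --output tsv
```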
## Connect to the cluster
aks Limit Egress Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/limit-egress-traffic.md
description: Learn what ports and addresses are required to control egress traff
Previously updated : 06/27/2022 Last updated : 07/05/2022 #Customer intent: As an cluster operator, I want to restrict egress traffic for nodes to only access defined ports and addresses and improve cluster security.
You'll define the outbound type to use the UDR that already exists on the subnet
> > The AKS feature for [**API server authorized IP ranges**](api-server-authorized-ip-ranges.md) can be added to limit API server access to only the firewall's public endpoint. The authorized IP ranges feature is denoted in the diagram as optional. When enabling the authorized IP range feature to limit API server access, your developer tools must use a jumpbox from the firewall's virtual network or you must add all developer endpoints to the authorized IP range.
+#### Create an AKS cluster with system-assigned identities
+
+> [!NOTE]
+> AKS will create a system-assigned kubelet identity in the Node resource group if you do not [specify your own kubelet managed identity][Use a pre-created kubelet managed identity].
+
+You can create an AKS cluster using a system-assigned managed identity by running the following CLI command.
+ ```azurecli az aks create -g $RG -n $AKSNAME -l $LOC \ --node-count 3 \
az aks create -g $RG -n $AKSNAME -l $LOC \
> [!NOTE] > For creating and using your own VNet and route table where the resources are outside of the worker node resource group, the CLI will add the role assignment automatically. If you are using an ARM template or other client, you need to use the Principal ID of the cluster managed identity to perform a [role assignment.][add role to identity] >
-> If you are not using the CLI but using your own VNet or route table which are outside of the worker node resource group, it's recommended to use [user-assigned control plane identity][Bring your own control plane managed identity]. For system-assigned control plane identity, we cannot get the identity ID before creating cluster, which causes delay for role assignment to take effect.
+> If you are not using the CLI but using your own VNet or route table which are outside of the worker node resource group, it's recommended to use [user-assigned control plane identity][Create an AKS cluster with user-assigned identities]. For system-assigned control plane identity, we cannot get the identity ID before creating cluster, which causes delay for role assignment to take effect.
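For the non-CLI case described in the note above, the manual role assignment can look roughly like the following sketch. The role and scope are assumptions for a custom route table scenario, and with a system-assigned identity the cluster must already exist before its principal ID can be queried:

```azurecli
# Look up the cluster's control plane identity, then grant it access to the custom route table
PRINCIPAL_ID=$(az aks show -g $RG -n $AKSNAME --query identity.principalId -o tsv)

az role assignment create \
    --assignee-object-id $PRINCIPAL_ID \
    --assignee-principal-type ServicePrincipal \
    --role "Network Contributor" \
    --scope <route-table-resource-id>
```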
+#### Create an AKS cluster with user-assigned identities
+
+##### Create user-assigned managed identities
+
+If you don't have a control plane managed identity, you can create one by running the following [az identity create][az-identity-create] command:
+
+```azurecli-interactive
+az identity create --name myIdentity --resource-group myResourceGroup
+```
+
+The output should resemble the following:
+
+```output
+{
+ "clientId": "<client-id>",
+ "clientSecretUrl": "<clientSecretUrl>",
+ "id": "/subscriptions/<subscriptionid>/resourcegroups/myResourceGroup/providers/Microsoft.ManagedIdentity/userAssignedIdentities/myIdentity",
+ "location": "westus2",
+ "name": "myIdentity",
+ "principalId": "<principal-id>",
+ "resourceGroup": "myResourceGroup",
+ "tags": {},
+ "tenantId": "<tenant-id>",
+ "type": "Microsoft.ManagedIdentity/userAssignedIdentities"
+}
+```
+
+If you don't have a kubelet managed identity, you can create one by running the following [az identity create][az-identity-create] command:
+
+```azurecli-interactive
+az identity create --name myKubeletIdentity --resource-group myResourceGroup
+```
+
+The output should resemble the following:
+
+```output
+{
+ "clientId": "<client-id>",
+ "clientSecretUrl": "<clientSecretUrl>",
+ "id": "/subscriptions/<subscriptionid>/resourcegroups/myResourceGroup/providers/Microsoft.ManagedIdentity/userAssignedIdentities/myKubeletIdentity",
+ "location": "westus2",
+ "name": "myKubeletIdentity",
+ "principalId": "<principal-id>",
+ "resourceGroup": "myResourceGroup",
+ "tags": {},
+ "tenantId": "<tenant-id>",
+ "type": "Microsoft.ManagedIdentity/userAssignedIdentities"
+}
+```
+
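The `id` value in each output above is the resource ID you'll pass to the cluster create command in the next step. As a convenience, you can capture both values into variables (a sketch; the variable names are arbitrary):

```azurecli
# Capture the control plane and kubelet identity resource IDs for later use
IDENTITY_ID=$(az identity show --name myIdentity --resource-group myResourceGroup --query id -o tsv)
KUBELET_IDENTITY_ID=$(az identity show --name myKubeletIdentity --resource-group myResourceGroup --query id -o tsv)
```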
+##### Create an AKS cluster with user-assigned identities
+
+Now you can use the following command to create your AKS cluster with your existing identities in the subnet. Provide the control plane identity resource ID via `--assign-identity` and the kubelet managed identity via `--assign-kubelet-identity`:
+
+```azurecli
+az aks create -g $RG -n $AKSNAME -l $LOC \
+ --node-count 3 \
+ --network-plugin $PLUGIN \
+ --outbound-type userDefinedRouting \
+ --vnet-subnet-id $SUBNETID \
+ --api-server-authorized-ip-ranges $FWPUBLIC_IP \
+ --enable-managed-identity \
+ --assign-identity <identity-resource-id> \
+ --assign-kubelet-identity <kubelet-identity-resource-id>
+```
+
+> [!NOTE]
+> For creating and using your own VNet and route table where the resources are outside of the worker node resource group, the CLI will add the role assignment automatically. If you are using an ARM template or other client, you need to use the Principal ID of the cluster managed identity to perform a [role assignment.][add role to identity]
### Enable developer access to the API server
If you want to restrict how pods communicate between themselves and East-West tr
[aks-faq]: faq.md [aks-private-clusters]: private-clusters.md [add role to identity]: use-managed-identity.md#add-role-assignment-for-control-plane-identity
-[Bring your own control plane managed identity]: use-managed-identity.md#bring-your-own-control-plane-managed-identity
+[Create an AKS cluster with user-assigned identities]: limit-egress-traffic.md#create-an-aks-cluster-with-user-assigned-identities
+[Use a pre-created kubelet managed identity]: use-managed-identity.md#use-a-pre-created-kubelet-managed-identity
+[az-identity-create]: /cli/azure/identity#az_identity_create
+[az-aks-get-credentials]: /cli/azure/aks#az_aks_get_credentials
aks Operator Best Practices Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/operator-best-practices-storage.md
In this example, the *Standard_DS2_v2* offers twice as many attached disks, and
Work with your application development team to understand their storage capacity and performance needs. Choose the appropriate VM size for the AKS nodes to meet or exceed their performance needs. Regularly baseline applications to adjust VM size as needed.
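When comparing candidate node sizes, the Azure CLI can list vCPU count, memory, and the maximum number of attached data disks side by side (a sketch; the region and SKU filter are examples):

```azurecli
# Compare vCPU, memory, and maximum data disk counts for candidate node VM sizes
az vm list-sizes --location eastus \
    --query "[?contains(name, 'Standard_DS')].{Name:name, vCPU:numberOfCores, MemoryMB:memoryInMb, MaxDataDisks:maxDataDiskCount}" \
    --output table
```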
+> [!NOTE]
+> By default, disk size and performance for managed disks are assigned according to the selected VM SKU and vCPU count. Default OS disk sizing is only used on new clusters or node pools when Ephemeral OS disks are not supported and a default OS disk size is not specified. For more information, see [Default OS disk sizing](cluster-configuration.md#default-os-disk-sizing).
+ For more information about available VM sizes, see [Sizes for Linux virtual machines in Azure][vm-sizes]. ++ ## Dynamically provision volumes > **Best practice guidance**
aks Web App Routing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/web-app-routing.md
The Web Application Routing solution makes it easy to access applications that a
## Web Application Routing solution overview
-The add-on deploys four components: an [nginx ingress controller][nginx], [Secrets Store CSI Driver][csi-driver], [Open Service Mesh (OSM)][osm], and [External-DNS][external-dns] controller.
+The add-on deploys two components: an [nginx ingress controller][nginx], and [External-DNS][external-dns] controller.
- **Nginx ingress Controller**: The ingress controller exposed to the internet. - **External-DNS controller**: Watches for Kubernetes Ingress resources and creates DNS A records in the cluster-specific DNS zone.-- **CSI driver**: Connector used to communicate with keyvault to retrieve SSL certificates for ingress controller.-- **OSM**: A lightweight, extensible, cloud native service mesh that allows users to uniformly manage, secure, and get out-of-the-box observability features for highly dynamic microservice environments. ## Prerequisites
az extension update --name aks-preview
### Install the `osm` CLI
-Since Web Application Routing uses OSM internally to secure intranet communication, we need to set up the `osm` CLI. This command-line tool contains everything needed to install and configure Open Service Mesh. The binary is available on the [OSM GitHub releases page][osm-release].
+Since Web Application Routing uses OSM internally to secure intranet communication, we need to set up the `osm` CLI. This command-line tool contains everything needed to configure and manage Open Service Mesh. The latest binaries are available on the [OSM GitHub releases page][osm-release].
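A typical Linux install looks something like the following sketch; the version number and archive name are assumptions, so check the releases page for the current asset names:

```bash
# Download the osm CLI from the GitHub releases page and place it on the PATH
OSM_VERSION=v1.2.0
curl -sL "https://github.com/openservicemesh/osm/releases/download/${OSM_VERSION}/osm-${OSM_VERSION}-linux-amd64.tar.gz" | tar -xz
sudo mv ./linux-amd64/osm /usr/local/bin/osm
osm version
```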
-## Deploy Web Application Routing with the Azure CLI
+### Import certificate to Azure Keyvault
-The Web Application Routing routing add-on can be enabled with the Azure CLI when deploying an AKS cluster. To do so, use the [az aks create][az-aks-create] command with the `--enable-addons` argument.
+```bash
+openssl pkcs12 -export -in aks-ingress-tls.crt -inkey aks-ingress-tls.key -out aks-ingress-tls.pfx
+# skip Password prompt
+```
```azurecli
-az aks create --resource-group myResourceGroup --name myAKSCluster --enable-addons web_application_routing
+az keyvault certificate import --vault-name <MY_KEYVAULT> -n <KEYVAULT-CERTIFICATE-NAME> -f aks-ingress-tls.pfx
```
-> [!TIP]
-> If you want to enable multiple add-ons, provide them as a comma-separated list. For example, to enable Web Application Routing routing and monitoring, use the format `--enable-addons web_application_routing,monitoring`.
+## Deploy Web Application Routing with the Azure CLI
+
+The Web Application Routing add-on can be enabled with the Azure CLI when deploying an AKS cluster. To do so, use the [az aks create][az-aks-create] command with the `--enable-addons` argument. However, since Web Application Routing depends on the OSM add-on to secure intranet communication and on the Azure Key Vault secrets provider to retrieve certificates, we must enable them at the same time.
+
+```azurecli
+az aks create --resource-group myResourceGroup --name myAKSCluster --enable-addons azure-keyvault-secrets-provider,open-service-mesh,web_application_routing --generate-ssh-keys
+```
You can also enable Web Application Routing on an existing AKS cluster using the [az aks enable-addons][az-aks-enable-addons] command. To enable Web Application Routing on an existing cluster, add the `--addons` parameter and specify *web_application_routing* as shown in the following example: ```azurecli
-az aks enable-addons --resource-group myResourceGroup --name myAKSCluster --addons web_application_routing
+az aks enable-addons --resource-group myResourceGroup --name myAKSCluster --addons azure-keyvault-secrets-provider,open-service-mesh,web_application_routing
``` ## Connect to your AKS cluster
Copy the identity's object ID:
### Grant access to Azure Key Vault
-Obtain the vault URI for your Azure Key Vault:
-
-```azurecli
-az keyvault show --resource-group myResourceGroup --name myapp-contoso
-```
- Grant `GET` permissions for Web Application Routing to retrieve certificates from Azure Key Vault: ```azurecli
-az keyvault set-policy --name myapp-contoso --object-id <WEB_APP_ROUTING_MSI_OBJECT_ID> --secret-permissions get --certificate-permissions get
+az keyvault set-policy --name myapp-contoso --object-id <WEB_APP_ROUTING_MSI_OBJECT_ID> --secret-permissions get --certificate-permissions get
``` ## Use Web Application Routing
The Web Application Routing solution may only be triggered on service resources
```yaml annotations: kubernetes.azure.com/ingress-host: myapp.contoso.com
- kubernetes.azure.com/tls-cert-keyvault-uri: myapp-contoso.vault.azure.net/certificates/keyvault-certificate-name/keyvault-certificate-name-revision
+ kubernetes.azure.com/tls-cert-keyvault-uri: https://<MY-KEYVAULT>.vault.azure.net/certificates/<KEYVAULT-CERTIFICATE-NAME>/<KEYVAULT-CERTIFICATE-REVISION>
```
-These annotations in the service manifest would direct Web Application Routing to create an ingress servicing `myapp.contoso.com` connected to the keyvault `myapp-contoso` and will retrieve the `keyvault-certificate-name` with `keyvault-certificate-name-revision`
+These annotations in the service manifest would direct Web Application Routing to create an ingress servicing `myapp.contoso.com` connected to the keyvault `<MY-KEYVAULT>` and will retrieve the `<KEYVAULT-CERTIFICATE-NAME>` with `<KEYVAULT-CERTIFICATE-REVISION>`. To obtain the certificate URI within your keyvault run:
+
+```azurecli
+az keyvault certificate show --vault-name <MY_KEYVAULT> --name <KEYVAULT-CERTIFICATE-NAME> -o jsonc | jq .id
+```
-Create a file named **samples-web-app-routing.yaml** and copy in the following YAML. On line 29-31, update `<MY_HOSTNAME>` with your DNS host name and `<MY_KEYVAULT_URI>` with the full certficicate vault URI.
+Create a file named **samples-web-app-routing.yaml** and copy in the following YAML. On line 29-31, update `<MY_HOSTNAME>` with your DNS host name and `<MY_KEYVAULT_CERTIFICATE_URI>` with the ID returned from keyvault.
```yaml apiVersion: apps/v1
metadata:
name: aks-helloworld annotations: kubernetes.azure.com/ingress-host: <MY_HOSTNAME>
- kubernetes.azure.com/tls-cert-keyvault-uri: <MY_KEYVAULT_URI>
+ kubernetes.azure.com/tls-cert-keyvault-uri: <MY_KEYVAULT_CERTIFICATE_URI>
spec: type: ClusterIP ports:
service/aks-helloworld created
## Verify the managed ingress was created ```bash
-$ kubectl get ingress -n hello-web-app-routing
+kubectl get ingress -n hello-web-app-routing
+
+NAME CLASS HOSTS ADDRESS PORTS AGE
+aks-helloworld webapprouting.kubernetes.azure.com myapp.contoso.com 20.51.92.19 80, 443 4m
```
-Open a web browser to *<MY_HOSTNAME>*, for example *myapp.contoso.com* and verify you see the demo application. The application may take a few minutes to appear.
+## Configure external DNS to point to cluster
+
+Now that Web Application Routing is configured within our cluster and we have the external IP address, we can configure our DNS servers to reflect this. As soon as the DNS updates have propagated, open a web browser to *<MY_HOSTNAME>*, for example *myapp.contoso.com* and verify you see the demo application. The application may take a few minutes to appear.
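If the zone happens to be hosted in Azure DNS, the A record can be added with the CLI (a sketch; the zone name, record name, and IP address are placeholders that should match the ingress output above):

```azurecli
# Create an A record that points the application host name at the ingress public IP
az network dns record-set a add-record \
    --resource-group myResourceGroup \
    --zone-name contoso.com \
    --record-set-name myapp \
    --ipv4-address 20.51.92.19
```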
## Remove Web Application Routing
kubectl delete namespace hello-web-app-routing
The Web Application Routing add-on can be removed using the Azure CLI. To do so run the following command, substituting your AKS cluster and resource group name. ```azurecli
-az aks disable-addons --addons web_application_routing --name myAKSCluster --resource-group myResourceGroup --no-wait
+az aks disable-addons --addons azure-keyvault-secrets-provider,open-service-mesh,web_application_routing --name myAKSCluster --resource-group myResourceGroup
``` When the Web Application Routing add-on is disabled, some Kubernetes resources may remain in the cluster. These resources include *configMaps* and *secrets*, and are created in the *app-routing-system* namespace. To maintain a clean cluster, you may want to remove these resources.
api-management Api Management Access Restriction Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-access-restriction-policies.md
To understand the difference between rate limits and quotas, [see Rate limits an
| name | The name of the API or operation for which the quota applies. | Yes | N/A | | bandwidth | The maximum total number of kilobytes allowed during the time interval specified in the `renewal-period`. | Either `calls`, `bandwidth`, or both together must be specified. | N/A | | calls | The maximum total number of calls allowed during the time interval specified in the `renewal-period`. | Either `calls`, `bandwidth`, or both together must be specified. | N/A |
-| renewal-period | The time period in seconds after which the quota resets. When it's set to `0` the period is set to infinite.| Yes | N/A |
+| renewal-period | The length in seconds of the fixed window after which the quota resets. The start of each period is calculated relative to the start time of the subscription. When `renewal-period` is set to `0`, the period is set to infinite.| Yes | N/A |
### Usage
In the following example, the quota is keyed by the caller IP address.
| calls | The maximum total number of calls allowed during the time interval specified in the `renewal-period`. | Either `calls`, `bandwidth`, or both together must be specified. | N/A | | counter-key | The key to use for the quota policy. For each key value, a single counter is used for all scopes at which the policy is configured. | Yes | N/A | | increment-condition | The boolean expression specifying if the request should be counted towards the quota (`true`) | No | N/A |
-| renewal-period | The time period in seconds after which the quota resets. When it's set to `0` the period is set to infinite. | Yes | N/A |
+| renewal-period | The length in seconds of the fixed window after which the quota resets. The start of each period is calculated relative to `first-period-start`. When `renewal-period` is set to `0`, the period is set to infinite. | Yes | N/A |
| first-period-start | The starting date and time for quota renewal periods, in the following format: `yyyy-MM-ddTHH:mm:ssZ` as specified by the ISO 8601 standard. | No | `0001-01-01T00:00:00Z` | > [!NOTE]
api-management Api Management Api Import Restrictions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-api-import-restrictions.md
Title: Restrictions and details of API formats support
-description: Details of known issues and restrictions on Open API, WSDL, and WADL formats support in Azure API Management.
+description: Details of known issues and restrictions on OpenAPI, WSDL, and WADL formats support in Azure API Management.
documentationcenter: ''
If you prefer a different behavior, you can either:
* Manually change via form-based editor, or * Remove the "required" attribute from the OpenAPI definition, thus not converting them to template parameters.
+For GET, HEAD, and OPTIONS operations, API Management discards a request body parameter if defined in the OpenAPI specification.
+ ## <a name="open-api"> </a>OpenAPI/Swagger import limitations If you receive errors while importing your OpenAPI document, make sure you've validated it beforehand by either:
api-management Api Management Howto Aad B2c https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-howto-aad-b2c.md
Previously updated : 09/28/2021 Last updated : 07/12/2022
Azure Active Directory B2C is a cloud identity management solution for consumer-
In this tutorial, you'll learn the configuration required in your API Management service to integrate with Azure Active Directory B2C. As noted later in this article, if you are using the deprecated legacy developer portal, some steps will differ.
+> [!IMPORTANT]
+> * This article has been updated with steps to configure an Azure AD B2C app using the Microsoft Authentication Library ([MSAL](../active-directory/develop/msal-overview.md)) v2.0.
+> * If you previously configured an Azure AD B2C app for user sign-in using the Azure AD Authentication Library (ADAL), we recommend that you [migrate to MSAL](#migrate-to-msal).
+ For information about enabling access to the developer portal by using classic Azure Active Directory, see [How to authorize developer accounts using Azure Active Directory](api-management-howto-aad.md). ## Prerequisites
In this section, you'll create a user flow in your Azure Active Directory B2C te
1. In a separate [Azure portal](https://portal.azure.com) tab, navigate to your API Management instance. 1. Under **Developer portal**, select **Identities** > **+ Add**.
-1. In the **Add identity provider** page, select **Azure Active Directory B2C**.
+1. In the **Add identity provider** page, select **Azure Active Directory B2C**. Once selected, you'll be able to enter other necessary information.
+ * In the **Client library** dropdown, select **MSAL**.
+ * To add other settings, see steps later in the article.
1. In the **Add identity provider** window, copy the **Redirect URL**. :::image type="content" source="media/api-management-howto-aad-b2c/b2c-identity-provider-redirect-url.png" alt-text="Copy redirect URL":::
In this section, you'll create a user flow in your Azure Active Directory B2C te
1. In the **Register an application** page, enter your application's registration information. * In the **Name** section, enter an application name of your choosing. * In the **Supported account types** section, select **Accounts in any identity provider or organizational directory (for authenticating users with user flows)**. For more information, see [Register an application](../active-directory/develop/quickstart-register-app.md#register-an-application).
- * In **Redirect URI**, enter the Redirect URL your copied from your API Management instance.
+ * In **Redirect URI**, select **Single-page application (SPA)** and paste the redirect URL you saved from a previous step.
* In **Permissions**, select **Grant admin consent to openid and offline_access permissions.** * Select **Register** to create the application.
In this section, you'll create a user flow in your Azure Active Directory B2C te
:::image type="content" source="media/api-management-howto-aad-b2c/add-identity-provider.png" alt-text="Active Directory B2c identity provider configuration"::: 1. After you've specified the desired configuration, select **Add**.
+1. Republish the developer portal for the Azure AD B2C configuration to take effect. In the left menu, under **Developer portal**, select **Portal overview** > **Publish**.
After the changes are saved, developers will be able to create new accounts and sign in to the developer portal by using Azure Active Directory B2C.
+## Migrate to MSAL
+
+If you previously configured an Azure AD B2C app for user sign-in using ADAL, you can use the portal to migrate the app to MSAL and update the identity provider in API Management.
+
+### Update Azure AD B2C app for MSAL compatibility
+
+For steps, see [Switch redirect URIs to the single-page application type](../active-directory/develop/migrate-spa-implicit-to-auth-code.md#switch-redirect-uris-to-spa-platform).
+
+### Update identity provider configuration
+
+1. In the left menu of your API Management instance, under **Developer portal**, select **Identities**.
+1. Select **Azure Active Directory B2C** from the list.
+1. In the **Client library** dropdown, select **MSAL**.
+1. Select **Update**.
+1. [Republish your developer portal](api-management-howto-developer-portal-customize.md#publish-from-the-azure-portal).
++ ## Developer portal - add Azure Active Directory B2C account authentication > [!IMPORTANT]
The **Sign-up form: OAuth** widget represents a form used for signing up with OA
* [Azure Active Directory B2C overview] * [Azure Active Directory B2C: Extensible policy framework]
+* Learn more about [MSAL](../active-directory/develop/msal-overview.md) and [migrating to MSAL v2](../active-directory/develop/msal-migration.md)
* [Use a Microsoft account as an identity provider in Azure Active Directory B2C] * [Use a Google account as an identity provider in Azure Active Directory B2C] * [Use a LinkedIn account as an identity provider in Azure Active Directory B2C]
api-management Api Management Howto Aad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-howto-aad.md
description: Learn how to enable user sign-in to the API Management developer po
Previously updated : 05/20/2022 Last updated : 07/12/2022
In this article, you'll learn how to:
> * Enable access to the developer portal for users from Azure Active Directory (Azure AD). > * Manage groups of Azure AD users by adding external groups that contain the users.
+> [!IMPORTANT]
+> * This article has been updated with steps to configure an Azure AD app using the Microsoft Authentication Library ([MSAL](../active-directory/develop/msal-overview.md)).
+> * If you previously configured an Azure AD app for user sign-in using the Azure AD Authentication Library (ADAL), we recommend that you [migrate to MSAL](#migrate-to-msal).
+ ## Prerequisites - Complete the [Create an Azure API Management instance](get-started-create-service-instance.md) quickstart.
After the Azure AD provider is enabled:
1. In the left menu of your API Management instance, under **Developer portal**, select **Identities**. 1. Select **+Add** from the top to open the **Add identity provider** pane to the right.
-1. Under **Type**, select **Azure Active Directory** from the drop-down menu.
- * Once selected, you'll be able to enter other necessary information.
- * Information includes **Client ID** and **Client secret**.
- * See more information about these controls later in the article.
+1. Under **Type**, select **Azure Active Directory** from the drop-down menu. Once selected, you'll be able to enter other necessary information.
+ * In the **Client library** dropdown, select **MSAL**.
+ * To add **Client ID** and **Client secret**, see steps later in the article.
1. Save the **Redirect URL** for later. :::image type="content" source="media/api-management-howto-aad/api-management-with-aad001.png" alt-text="Screenshot of adding identity provider in Azure portal.":::
After the Azure AD provider is enabled:
1. Select **New registration**. On the **Register an application** page, set the values as follows: * Set **Name** to a meaningful name such as *developer-portal*
- * Set **Supported account types** to **Accounts in this organizational directory only**.
- * In **Redirect URI**, select **Web** and paste the redirect URL you saved from a previous step.
+ * Set **Supported account types** to **Accounts in any organizational directory**.
+ * In **Redirect URI**, select **Single-page application (SPA)** and paste the redirect URL you saved from a previous step.
* Select **Register**. 1. After you've registered the application, copy the **Application (client) ID** from the **Overview** page. 1. Switch to the browser tab with your API Management instance. 1. In the **Add identity provider** window, paste the **Application (client) ID** value into the **Client ID** box.
-1. Switch to the browser tab with the App Registration.
+1. Switch to the browser tab with the App registration.
1. Select the appropriate app registration. 1. Under the **Manage** section of the side menu, select **Certificates & secrets**. 1. From the **Certificates & secrets** page, select the **New client secret** button under **Client secrets**.
After the Azure AD provider is enabled:
* Optionally configure other sign-in settings by selecting **Identities** > **Settings**. For example, you might want to redirect anonymous users to the sign-in page. * Republish the developer portal after any configuration change.
+## Migrate to MSAL
+
+If you previously configured an Azure AD app for user sign-in using ADAL, you can use the portal to migrate the app to MSAL and update the identity provider in API Management.
+
+### Update Azure AD app for MSAL compatibility
+
+For steps, see [Switch redirect URIs to the single-page application type](../active-directory/develop/migrate-spa-implicit-to-auth-code.md#switch-redirect-uris-to-spa-platform).
+
+### Update identity provider configuration
+
+1. In the left menu of your API Management instance, under **Developer portal**, select **Identities**.
+1. Select **Azure Active Directory** from the list.
+1. In the **Client library** dropdown, select **MSAL**.
+1. Select **Update**.
+1. [Republish your developer portal](api-management-howto-developer-portal-customize.md#publish-from-the-azure-portal).
++ ## Add an external Azure AD group Now that you've enabled access for users in an Azure AD tenant, you can:
Follow these steps to grant:
1. Update the first 3 lines of the following Azure CLI script to match your environment and run it. ```azurecli
- $subId = "Your Azure subscription ID" #e.g. "1fb8fadf-03a3-4253-8993-65391f432d3a"
- $tenantId = "Your Azure AD Tenant or Organization ID" #e.g. 0e054eb4-e5d0-43b8-ba1e-d7b5156f6da8"
- $appObjectID = "Application Object ID that has been registered in AAD" #e.g. "2215b54a-df84-453f-b4db-ae079c0d2619"
+ $subId = "Your Azure subscription ID" # Example: "1fb8fadf-03a3-4253-8993-65391f432d3a"
+ $tenantId = "Your Azure AD Tenant or Organization ID" # Example: 0e054eb4-e5d0-43b8-ba1e-d7b5156f6da8"
+ $appObjectID = "Application Object ID that has been registered in AAD" # Example: "2215b54a-df84-453f-b4db-ae079c0d2619"
    #Login and Set the Subscription
    az login
    az account set --subscription $subId
Your user is now signed in to the developer portal for your API Management servi
## Next Steps -- Learn how to [Protect your web API backend in API Management by using OAuth 2.0 authorization with Azure AD](./api-management-howto-protect-backend-with-aad.md) - Learn more about [Azure Active Directory and OAuth2.0](../active-directory/develop/authentication-vs-authorization.md).-- Check out more [videos](https://azure.microsoft.com/documentation/videos/index/?services=api-management) about API Management.-- For other ways to secure your back-end service, see [Mutual Certificate authentication](./api-management-howto-mutual-certificates.md).
+- Learn more about [MSAL](../active-directory/develop/msal-overview.md) and [migrating to MSAL](../active-directory/develop/msal-migration.md).
- [Create an API Management service instance](./get-started-create-service-instance.md). - [Manage your first API](./import-and-publish.md).
app-service Quickstart Ruby https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-ruby.md
This quickstart shows how to deploy a Ruby on Rails app to App Service on Linux
![Screenshot of the Create a new fork page in GitHub for creating a new fork of Azure-Samples/ruby-docs-hello-world.](media/quickstart-ruby/fork-details-ruby-docs-hello-world-repo.png) >[!NOTE]
- > This should take you to the new fork. Your fork URL will look something like this: https://github.com/YOUR_GITHUB_ACCOUNT_NAME/ruby-docs-hello-world
+ > This should take you to the new fork. Your fork URL will look something like this: `https://github.com/YOUR_GITHUB_ACCOUNT_NAME/ruby-docs-hello-world`
application-gateway Rewrite Http Headers Url https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/rewrite-http-headers-url.md
With URL rewrite capability in Application Gateway, you can:
* Rewrite the host name, path and query string of the request URL
-* Choose to rewrite the URL of all requests on a listener or only those requests which match one or more of the conditions you set. These conditions are based on the request and response properties (request, header, response header and server variables).
+* Choose to rewrite the URL of all requests on a listener or only those requests which match one or more of the conditions you set. These conditions are based on the request properties (request header and server variables).
* Choose to route the request (select the backend pool) based on either the original URL or the rewritten URL
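If you manage Application Gateway from the Azure CLI, the capabilities above map to a rewrite rule set plus one or more rewrite rules. The following is a hedged sketch: all names are placeholders, and the `--modified-path` parameter for URL rewrite is assumed to be available in your (recent) CLI version:

```azurecli
# Create a rewrite rule set on an existing gateway
az network application-gateway rewrite-rule set create \
    --gateway-name myAppGateway \
    --resource-group myResourceGroup \
    --name myRewriteRuleSet

# Add a rule that rewrites the request path; query string rewrites and conditions can be added similarly
az network application-gateway rewrite-rule create \
    --gateway-name myAppGateway \
    --resource-group myResourceGroup \
    --rule-set-name myRewriteRuleSet \
    --name rewriteApiPath \
    --sequence 100 \
    --modified-path "/api/v2/"
```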
applied-ai-services Form Recognizer Studio Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/form-recognizer-studio-overview.md
+
+ Title: What is Form Recognizer Studio?
+
+description: Learn how to set up and use Form Recognizer Studio to test features of Azure Form Recognizer on the web.
+++++ Last updated : 07/18/2022+
+recommendations: false
++
+<!-- markdownlint-disable MD033 -->
+# What is Form Recognizer Studio?
+
+>[!NOTE]
+> Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
+
+Form Recognizer Studio is an online tool to visually explore, understand, train, and integrate features from the Form Recognizer service into your applications. The studio provides a platform for you to experiment with the different Form Recognizer models and sample their returned data in an interactive manner without the need to write code.
+
+The studio supports all Form Recognizer v3.0 models and v2.1 models with labeled data. Refer to the [REST API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
+
+## Get started using Form Recognizer Studio
+
+1. To use Form Recognizer Studio, you'll need the following assets:
+
+ * **Azure subscription** - [Create one for free](https://azure.microsoft.com/free/cognitive-services/).
+
+ * **Cognitive Services or Form Recognizer resource**. Once you have your Azure subscription, create a [single-service](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [multi-service](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource in the Azure portal to get your key and endpoint. Use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
+
+ > [!TIP]
+ >
+ > * Create a Cognitive Services (multi-service) resource if you plan to access multiple cognitive services under a single endpoint and key.
+ > * Create a single-service resource for Form Recognizer access only. Please note that you'll need a single-service resource if you intend to use [Azure Active Directory authentication](../../active-directory/authentication/overview-authentication.md).
+
+1. Navigate to the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/). If it's your first time logging in, a popup window will appear prompting you to configure your service resource. You have two options:
+
+ **a. Access by Resource**.
+
+ * Choose your existing subscription.
+ * Select an existing resource group within your subscription or create a new one.
+ * Select your existing Form Recognizer or Cognitive services resource.
+
+ :::image type="content" source="media/studio/welcome-to-studio.png" alt-text="Screenshot of the configure service resource window.":::
+
+ **b. Access by API endpoint and key**.
+
+ * Retrieve your endpoint and key from the Azure portal.
+ * Go to the overview page for your resource and select **Keys and Endpoint** from the left navigation bar.
+ * Enter the values in the appropriate fields.
+
+ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
+
+1. Once you've completed configuring your resource, you'll be able to try the different models offered by Form Recognizer Studio. From the front page, select any Form Recognizer model to try using with a no-code approach.
+
+ :::image type="content" source="media/studio/form-recognizer-studio-front.png" alt-text="Screenshot of Form Recognizer Studio front page.":::
+
+1. After you've tried Form Recognizer Studio, use the [**C#**](quickstarts/try-v3-csharp-sdk.md), [**Java**](quickstarts/try-v3-java-sdk.md), [**JavaScript**](quickstarts/try-v3-javascript-sdk.md) or [**Python**](quickstarts/try-v3-python-sdk.md) client libraries or the [**REST API**](quickstarts/try-v3-rest-api.md) to get started incorporating Form Recognizer models into your own applications.
+
+ To learn more about each model, *see* the concepts pages.
+
+ | Model type| Models |
+ |--|--|
+ |Document analysis models| <ul><li>[**Read model**](concept-read.md)</li><li>[**Layout model**](concept-layout.md)</li><li>[**General document model**](concept-general-document.md)</li></ul>.</br></br>
+ |**Prebuilt models**|<ul><li>[**W-2 form model**](concept-w2.md)</li><li>[**Invoice model**](concept-invoice.md)</li><li>[**Receipt model**](concept-receipt.md)</li><li>[**ID document model**](concept-id-document.md)</li><li>[**Business card model**](concept-business-card.md)</li></ul>
+ |Custom models|<ul><li>[**Custom model**](concept-custom.md)</li><ul><li>[**Template model**](concept-custom-template.md)</li><li>[**Neural model**](concept-custom-template.md)</li></ul><li>[**Composed model**](concept-model-overview.md)</li></ul>
+
+### Manage your resource
+
+ To view resource details such as name and pricing tier, select the **Settings** icon in the top-right corner of the Form Recognizer Studio home page and select the **Resource** tab. If you have access to other resources, you can switch resources as well.
++
+With Form Recognizer, you can quickly automate your data processing in applications and workflows, easily enhance data-driven strategies, and skillfully enrich document search capabilities.
+
+## Next steps
+
+* Visit [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio) to begin using the models presented by the service.
+
+* For more information on Form Recognizer capabilities, see [Azure Form Recognizer overview](overview.md).
applied-ai-services Try Sample Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-sample-label-tool.md
Choose the Train icon on the left pane to open the Training page. Then select th
:::image type="content" source="../media/analyze.png" alt-text="Training view.":::
-That's it! You've learned how to use the Form Recognizer sample tool for Form Recognizer prebuilt, layout and custom models. You've also learned to analyze a custom form with manually labeled data. Now you can try a Form Recognizer client library SDK or REST API.
+That's it! You've learned how to use the Form Recognizer sample tool for Form Recognizer prebuilt, layout and custom models. You've also learned to analyze a custom form with manually labeled data.
## Next steps
-> [!div class="nextstepaction"]
-> [Explore Form Recognizer client library SDK and REST API quickstart](../quickstarts/get-started-sdk-rest-api.md)
+>[!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
applied-ai-services Try V3 Csharp Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-csharp-sdk.md
To view the entire output, visit the Azure samples repository on GitHub to view
That's it, congratulations!
-In this quickstart, you used the Form Recognizer C# SDK to analyze various forms and documents in different ways. Next, explore the reference documentation to learn about Form Recognizer API in more depth.
+In this quickstart, you used the Form Recognizer C# SDK to analyze various forms and documents in different ways. Next, explore the Form Recognizer Studio and reference documentation to learn about Form Recognizer API in more depth.
## Next steps
-> [!div class="nextstepaction"]
-> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+>[!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
> [!div class="nextstepaction"]
-> [Form Recognizer .NET/C# reference library](https://azuresdkdocs.blob.core.windows.net/$web/dotnet/Azure.AI.FormRecognizer/4.0.0-beta.4/https://docsupdatetracker.net/index.html)
+> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
applied-ai-services Try V3 Java Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-java-sdk.md
In this quickstart, you used the Form Recognizer Java SDK to analyze various for
## Next steps
-> [!div class="nextstepaction"]
-> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+>[!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
> [!div class="nextstepaction"]
-> [Form Recognizer Java reference library](https://azuresdkdocs.blob.core.windows.net/$web/java/azure-ai-formrecognizer/4.0.0-beta.5/https://docsupdatetracker.net/index.html)
+> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
applied-ai-services Try V3 Javascript Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-javascript-sdk.md
To view the entire output, visit the Azure samples repository on GitHub to view
That's it, congratulations!
-In this quickstart, you used the Form Recognizer JavaScript SDK to analyze various forms in different ways. Next, explore the reference documentation to learn moe about Form Recognizer v3.0 API.
+In this quickstart, you used the Form Recognizer JavaScript SDK to analyze various forms in different ways. Next, explore the Form Recognizer Studio and reference documentation to learn more about the Form Recognizer v3.0 API.
## Next steps
-> [!div class="nextstepaction"]
-> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+> [!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
> [!div class="nextstepaction"]
-> [Form Recognizer JavaScript reference library](https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-ai-form-recognizer/4.0.0-beta.4/https://docsupdatetracker.net/index.html)
+> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
applied-ai-services Try V3 Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-python-sdk.md
To view the entire output, visit the Azure samples repository on GitHub to view
That's it, congratulations!
-In this quickstart, you used the Form Recognizer Python SDK to analyze various forms in different ways. Next, explore the reference documentation to learn more about Form Recognizer v3.0 API.
+In this quickstart, you used the Form Recognizer Python SDK to analyze various forms in different ways. Next, explore the Form Recognizer Studio and reference documentation to learn more about Form Recognizer v3.0 API.
## Next steps
-> [!div class="nextstepaction"]
-> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+> [!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
> [!div class="nextstepaction"]
-> [Form Recognizer Python reference library](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-ai-formrecognizer/3.2.0b5/https://docsupdatetracker.net/index.html)
+> [Form Recognizer REST API v3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
applied-ai-services Try V3 Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-rest-api.md
The prebuilt models extract pre-defined sets of document fields. See [Model data
## Next steps
-In this quickstart, you used the Form Recognizer REST API preview (v3.0) to analyze forms in different ways. Next, further explore the latest reference documentation to learn more about the Form Recognizer API.
+In this quickstart, you used the Form Recognizer REST API preview (v3.0) to analyze forms in different ways. Next, explore the Form Recognizer Studio and the latest reference documentation to learn more about the Form Recognizer API.
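As a rough illustration of what such a REST call looks like from code, here is a hedged C# sketch using `HttpClient`. The route and `api-version` are taken from the preview reference linked below; the endpoint, key, and document URL are placeholders.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class AnalyzeSketch
{
    static async Task Main()
    {
        // Placeholder endpoint and key for your Form Recognizer resource.
        string endpoint = "https://<your-resource>.cognitiveservices.azure.com";
        string key = "<your-key>";

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);

        // Submit a publicly accessible document to the prebuilt layout model.
        var body = new StringContent(
            "{\"urlSource\": \"https://<document-url>\"}",
            Encoding.UTF8,
            "application/json");

        HttpResponseMessage response = await http.PostAsync(
            $"{endpoint}/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2022-06-30-preview",
            body);

        // A successful submission returns 202 Accepted; the Operation-Location header
        // contains the URL to poll (GET) for the analysis result.
        Console.WriteLine(response.StatusCode);
        if (response.Headers.TryGetValues("Operation-Location", out var values))
        {
            Console.WriteLine(string.Join("", values));
        }
    }
}
```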
+
+> [!div class="nextstepaction"]
+> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
> [!div class="nextstepaction"]
> [REST API preview (v3.0) reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)
azure-arc Resource Sync https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/resource-sync.md
https://management.azure.com/subscriptions/{{subscription}}/resourcegroups/{{res
## Limitations
-- Resource sync rule does not hydrate Azure Arc Data controller. The Azure Arc Data controller must be deployed via ARM API.
+- Resource sync rule does not project Azure Arc Data controller. The Azure Arc Data controller must be deployed via ARM API.
- Resource sync only applies to the data services such as Arc enabled SQL managed instance, post deployment of Data controller.
-- Resource sync rule does not hydrate Azure Arc enabled PostgreSQL
-- Resource sync rule does not hydrate Azure Arc Active Directory connector
-- Resource sync rule does not hydrate Azure Arc Instance Failover Groups
+- Resource sync rule does not project Azure Arc enabled PostgreSQL
+- Resource sync rule does not project Azure Arc Active Directory connector
+- Resource sync rule does not project Azure Arc Instance Failover Groups
## Next steps
-[Create Azure Arc-enabled data controller using Kubernetes tools](create-data-controller-using-kubernetes-native-tools.md)
+[Create Azure Arc data controller in direct connectivity mode using CLI](create-data-controller-direct-cli.md)
+
azure-fluid-relay Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-fluid-relay/reference/service-limits.md
This article outlines known limitations of Azure Fluid Relay.
## Distributed Data Structures
-The Azure Fluid Relay doesn't support [experimental distributed data structures (DDSes)](https://fluidframework.com/docs/data-structures/experimental/). These include but are not limited to DDS packages with the `@fluid-experimental` package namespace.
+The Azure Fluid Relay doesn't support [experimental distributed data structures (DDSes)](https://fluidframework.com/docs/data-structures/overview). These include but are not limited to DDS packages with the `@fluid-experimental` package namespace.
## Fluid sessions
azure-functions Create First Function Cli Csharp Ieux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-cli-csharp-ieux.md
- Title: "Create a C# function from the command line - Azure Functions"
-description: "Learn how to create a C# function from the command line, then publish the local project to serverless hosting in Azure Functions."
Previously updated : 10/03/2020-----
-# Quickstart: Create a C# function in Azure from the command line
-
-> [!div class="op_single_selector" title1="Select your function language: "]
-> - [C#](create-first-function-cli-csharp-ieux.md)
-> - [Java](create-first-function-cli-java.md)
-> - [JavaScript](create-first-function-cli-node.md)
-> - [PowerShell](create-first-function-cli-powershell.md)
-> - [Python](create-first-function-cli-python.md)
-> - [TypeScript](create-first-function-cli-typescript.md)
-
-In this article, you use command-line tools to create a C# class library-based function that responds to HTTP requests. After testing the code locally, you deploy it to the <abbr title="A runtime computing environment in which all the details of the server are transparent to application developers, which simplifies the process of deploying and managing code.">serverless</abbr> environment of <abbr title="Azure's service that provides a low-cost serverless computing environment for applications.">Azure Functions</abbr>.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account.
-
-There is also a [Visual Studio Code-based version](create-first-function-vs-code-csharp.md) of this article.
-
-## 1. Prepare your environment
-
-+ Get an Azure <abbr title="The profile that maintains billing information for Azure usage.">account</abbr> with an active <abbr title="The basic organizational structure in which you manage resources on Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ Install [.NET Core 3.1 SDK](https://dotnet.microsoft.com/download)
-
-+ Install [Azure Functions Core Tools](functions-run-local.md#v2) version 3.x.
-
-+ Either the <abbr title="A set of cross-platform command line tools for working with Azure resources from your local development computer, as an alternative to using the Azure portal.">Azure CLI</abbr> or <abbr title="A PowerShell module that provides commands for working with Azure resources from your local development computer, as an alternative to using the Azure portal.">Azure PowerShell</abbr> for creating Azure resources:
-
- + [Azure CLI](/cli/azure/install-azure-cli) version 2.4 or later.
-
- + The Azure [Az PowerShell module](/powershell/azure/install-az-ps) version 5.9.0 or later.
---
-### 2. Verify prerequisites
-
-Verify your prerequisites, which depend on whether you are using the Azure CLI or Azure PowerShell for creating Azure resources:
-
-# [Azure CLI](#tab/azure-cli)
-
-+ In a terminal or command window, run `func --version` to check that the <abbr title="The set of command line tools for working with Azure Functions on your local computer.">Azure Functions Core Tools</abbr> are version 3.x.
-
-+ **Run** `az --version` to check that the Azure CLI version is 2.4 or later.
-
-+ **Run** `az login` to sign in to Azure and verify an active subscription.
-
-+ **Run** `dotnet --list-sdks` to check that .NET Core SDK version 3.1.x is installed
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-+**Run** `func --version` to check that the Azure Functions Core Tools are version 3.x.
-
-+ **Run** `(Get-Module -ListAvailable Az).Version` and verify version 5.0 or later.
-
-+ **Run** `Connect-AzAccount` to sign in to Azure and verify an active subscription.
-
-+ **Run** `dotnet --list-sdks` to check that .NET Core SDK version 3.1.x is installed
---
-## 3. Create a local function project
-
-In this section, you create a local <abbr title="A logical container for one or more individual functions that can be deployed and managed together.">Azure Functions project</abbr> in C#. Each function in the project responds to a specific <abbr title="An event that invokes the functionΓÇÖs code, such as an HTTP request, a queue message, or a specific time.">trigger</abbr>.
-
-1. Run the `func init` command to create a functions project in a folder named *LocalFunctionProj* with the specified runtime:
-
- ```csharp
- func init LocalFunctionProj --dotnet
- ```
-
-1. **Run** 'cd LocalFunctionProj' to navigate to the <abbr title="This folder contains various files for the project, including configurations files named local.settings.json and host.json. Because local.settings.json can contain secrets downloaded from Azure, the file is excluded from source control by default in the .gitignore file.">project folder</abbr>.
-
- ```console
- cd LocalFunctionProj
- ```
- <br/>
-
-1. Add a function to your project by using the following command:
-
- ```console
- func new --name HttpExample --template "HTTP trigger" --authlevel "anonymous"
- ```
- The `--name` argument is the unique name of your function (HttpExample).
-
- The `--template` argument specifies the function's trigger (HTTP).
--
- <br/>
- <details>
- <summary><strong>Optional: Code for HttpExample.cs</strong></summary>
-
- *HttpExample.cs* contains a `Run` method that receives request data in the `req` variable, an [HttpRequest](/dotnet/api/microsoft.aspnetcore.http.httprequest) that's decorated with the **HttpTriggerAttribute**, which defines the trigger behavior.
-
- :::code language="csharp" source="~/functions-docs-csharp/http-trigger-template/HttpExample.cs":::
-
- The return object is an [ActionResult](/dotnet/api/microsoft.aspnetcore.mvc.actionresult) that returns a response message as either an [OkObjectResult](/dotnet/api/microsoft.aspnetcore.mvc.okobjectresult) (200) or a [BadRequestObjectResult](/dotnet/api/microsoft.aspnetcore.mvc.badrequestobjectresult) (400). To learn more, see [Azure Functions HTTP triggers and bindings](./functions-bindings-http-webhook.md?tabs=csharp).
- </details>
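Because the template file is only referenced above rather than rendered inline, here is a rough sketch of what the generated in-process HTTP trigger typically looks like for Functions v3; the namespace and messages are illustrative.

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

namespace LocalFunctionProj
{
    public static class HttpExample
    {
        [FunctionName("HttpExample")]
        public static IActionResult Run(
            // The HttpTrigger attribute defines the trigger behavior: anonymous auth, GET or POST.
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            // Return 200 (OkObjectResult) when a name is supplied, otherwise 400 (BadRequestObjectResult).
            return !string.IsNullOrEmpty(name)
                ? (IActionResult)new OkObjectResult($"Hello, {name}")
                : new BadRequestObjectResult("Pass a name in the query string.");
        }
    }
}
```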
-
-<br/>
---
-## 4. Run the function locally
-
-1. Run your function by starting the local Azure Functions runtime host from the *LocalFunctionProj* folder:
-
- ```
- func start
- ```
-
- Toward the end of the output, the following lines should appear:
-
- <pre class="is-monospace is-size-small has-padding-medium has-background-tertiary has-text-tertiary-invert">
- ...
-
- Now listening on: http://0.0.0.0:7071
- Application started. Press Ctrl+C to shut down.
-
- Http Functions:
-
- HttpExample: [GET,POST] http://localhost:7071/api/HttpExample
- ...
-
- </pre>
-
- <br/>
- <details>
- <summary><strong>I don't see HttpExample in the output</strong></summary>
-
- If HttpExample doesn't appear, you likely started the host from outside the root folder of the project. In that case, use <kbd>Ctrl+C</kbd> to stop the host, navigate to the project's root folder, and run the previous command again.
- </details>
-
-1. Copy the URL of your **HttpExample** function from this output to a browser and append the query string **?name=<YOUR_NAME>**, making the full URL like **http://localhost:7071/api/HttpExample?name=Functions**. The browser should display a message like **Hello Functions**:
-
- ![Result of the function run locally in the browser](../../includes/media/functions-run-function-test-local-cli/function-test-local-browser.png)
--
-1. Select <kbd>Ctrl+C</kbd> and choose <kbd>y</kbd> to stop the functions host.
-
-<br/>
---
-## 5. Create supporting Azure resources for your function
-
-Before you can deploy your function code to Azure, you need to create a <abbr title="A logical container for related Azure resources that you can manage as a unit.">resource group</abbr>, a <abbr title="An account that contains all your Azure storage data objects. The storage account provides a unique namespace for your storage data.">storage account</abbr>, and a <abbr title="The cloud resource that hosts serverless functions in Azure, which provides the underlying compute environment in which functions run.">function app</abbr> by using the following commands:
-
-1. If you haven't done so already, sign in to Azure:
-
- # [Azure CLI](#tab/azure-cli)
- ```azurecli
- az login
- ```
--
- # [Azure PowerShell](#tab/azure-powershell)
- ```azurepowershell
- Connect-AzAccount
- ```
--
-
-
-1. Create a resource group named `AzureFunctionsQuickstart-rg` in the `westeurope` region.
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az group create --name AzureFunctionsQuickstart-rg --location westeurope
- ```
-
- The [az group create](/cli/azure/group#az-group-create) command creates a resource group. You generally create your resource group and resources in a <abbr title="A geographical reference to a specific Azure datacenter in which resources are allocated.">region</abbr> near you, using an available region returned from the `az account list-locations` command.
-
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzResourceGroup -Name AzureFunctionsQuickstart-rg -Location westeurope
- ```
--
-
-
- You can't host Linux and Windows apps in the same resource group. If you have an existing resource group named `AzureFunctionsQuickstart-rg` with a Windows function app or web app, you must use a different resource group.
-
-1. Create a general-purpose Azure Storage account in your resource group and region:
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az storage account create --name <STORAGE_NAME> --location westeurope --resource-group AzureFunctionsQuickstart-rg --sku Standard_LRS
- ```
--
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzStorageAccount -ResourceGroupName AzureFunctionsQuickstart-rg -Name <STORAGE_NAME> -SkuName Standard_LRS -Location westeurope
- ```
--
-
-
- Replace `<STORAGE_NAME>` with a name that is appropriate to you and <abbr title="The name must be unique across all storage accounts used by all Azure customers globally. For example, you can use a combination of your personal or company name, application name, and a numeric identifier, as in contosobizappstorage20">unique in Azure Storage</abbr>. Names must be three to 24 characters long and can contain numbers and lowercase letters only. `Standard_LRS` specifies a general-purpose account, which is [supported by Functions](storage-considerations.md#storage-account-requirements).
--
-1. Create the function app in Azure.
-**Replace** `<STORAGE_NAME>` with the name you used in the previous step.
-**Replace** `<APP_NAME>` with a globally unique name.
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az functionapp create --resource-group AzureFunctionsQuickstart-rg --consumption-plan-location westeurope --runtime dotnet --functions-version 3 --name <APP_NAME> --storage-account <STORAGE_NAME>
- ```
--
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzFunctionApp -Name <APP_NAME> -ResourceGroupName AzureFunctionsQuickstart-rg -StorageAccount <STORAGE_NAME> -Runtime dotnet -FunctionsVersion 3 -Location 'West Europe'
- ```
--
-
-
- Replace `<STORAGE_NAME>` with the name of the account you used in the previous step.
-
- Replace `<APP_NAME>` with a <abbr title="A name that must be unique across all Azure customers globally. For example, you can use a combination of your personal or organization name, application name, and a numeric identifier, as in contoso-bizapp-func-20.">unique name</abbr>. The `<APP_NAME>` is also the default DNS domain for the function app.
-
- <br/>
- <details>
- <summary><strong>What is the cost of the resources provisioned on Azure?</strong></summary>
-
- This command creates a function app running in your specified language runtime under the [Azure Functions Consumption plan](consumption-plan.md), which is free for the amount of usage you incur here. The command also provisions an associated Azure Application Insights instance in the same resource group, with which you can monitor your function app and view logs. For more information, see [Monitor Azure Functions](functions-monitoring.md). The instance incurs no costs until you activate it.
- </details>
-
-<br/>
---
-## 6. Deploy the function project to Azure
--
-**Copy** `func azure functionapp publish <APP_NAME>` into your terminal.
-**Replace** `<APP_NAME>` with the name of your app.
-**Run**
-
-```console
-func azure functionapp publish <APP_NAME>
-```
-
-The `publish` command shows results similar to the following output (truncated for simplicity):
-
-<pre class="is-monospace is-size-small has-padding-medium has-background-tertiary has-text-tertiary-invert">
-...
-
-Getting site publishing info...
-Creating archive for current directory...
-Performing remote build for functions project.
-
-...
-
-Deployment successful.
-Remote build succeeded!
-Syncing triggers...
-Functions in msdocs-azurefunctions-qs:
- HttpExample - [httpTrigger]
- Invoke url: https://msdocs-azurefunctions-qs.azurewebsites.net/api/httpexample
-</pre>
-
-<br/>
---
-## 7. Invoke the function on Azure
-
-Copy the complete **Invoke URL** shown in the output of the `publish` command into a browser address bar. **Append** the query parameter **&name=Functions**.
-
-![The output of the function run on Azure in a browser](../../includes/media/functions-run-remote-azure-cli/function-test-cloud-browser.png)
-
-<br/>
---
-## 8. Clean up resources
-
-If you continue to the [next step](#next-steps) and add an Azure Storage queue output <abbr title="A declarative connection between a function and other resources. An input binding provides data to the function; an output binding provides data from the function to other resources.">binding</abbr>, keep all your resources in place as you'll build on what you've already done.
-
-Otherwise, use the following command to delete the resource group and all its contained resources to avoid incurring further costs.
-
-# [Azure CLI](#tab/azure-cli)
-
-```azurecli
-az group delete --name AzureFunctionsQuickstart-rg
-```
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-```azurepowershell
-Remove-AzResourceGroup -Name AzureFunctionsQuickstart-rg
-```
---
-<br/>
---
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-cli.md?pivots=programming-language-csharp)
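As a preview of that next step, the sketch below extends the same HTTP trigger with a queue output binding in the in-process C# model. It assumes the `Microsoft.Azure.WebJobs.Extensions.Storage` package is installed; the queue name and connection setting are illustrative.

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HttpExampleWithQueue
{
    [FunctionName("HttpExampleWithQueue")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequest req,
        // Output binding: each value added to the collector becomes a message on the
        // "outqueue" queue in the storage account named by the AzureWebJobsStorage setting.
        [Queue("outqueue"), StorageAccount("AzureWebJobsStorage")] ICollector<string> msg)
    {
        string name = req.Query["name"];
        msg.Add(name);
        return new OkObjectResult($"Hello, {name}");
    }
}
```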
azure-functions Create First Function Cli Java Uiex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-cli-java-uiex.md
- Title: Create a Java function from the command line - Azure Functions
-description: Learn how to create a Java function from the command line, then publish the local project to serverless hosting in Azure Functions.
Previously updated : 11/03/2020-----
-# Quickstart: Create a Java function in Azure from the command line
-
-> [!div class="op_single_selector" title1="Select your function language: "]
-> - [Java](create-first-function-cli-java.md)
-> - [Python](create-first-function-cli-python.md)
-> - [C#](create-first-function-cli-csharp.md)
-> - [JavaScript](create-first-function-cli-node.md)
-> - [PowerShell](create-first-function-cli-powershell.md)
-> - [TypeScript](create-first-function-cli-typescript.md)
-
-Use command-line tools to create a Java function that responds to HTTP requests. Test the code locally, then deploy it to the serverless environment of Azure Functions.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your <abbr title="The profile that maintains billing information for Azure usage.">Azure account</abbr>.
-
-If Maven is not your preferred development tool, check out our similar tutorials for Java developers using [Gradle](./functions-create-first-java-gradle.md), [IntelliJ IDEA](/azure/developer/jav).
-
-## 1. Prepare your environment
-
-Before you begin, you must have the following:
-
-+ An Azure account with an active <abbr title="The basic organizational structure in which you manage resources in Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ The [Azure Functions Core Tools](functions-run-local.md#v2) version 3.x.
-
-+ The [Azure CLI](/cli/azure/install-azure-cli) version 2.4 or later.
-
-+ The [Java Developer Kit](/azure/developer/java/fundamentals/java-support-on-azure), version 8 or 11. The `JAVA_HOME` environment variable must be set to the install location of the correct version of the JDK.
-
-+ [Apache Maven](https://maven.apache.org), version 3.0 or above.
-
-### Prerequisite check
-
-+ In a terminal or command window, run `func --version` to check that the <abbr title="The set of command line tools for working with Azure Functions on your local computer.">Azure Functions Core Tools</abbr> are version 3.x.
-
-+ Run `az --version` to check that the Azure CLI version is 2.4 or later.
-
-+ Run `az login` to sign in to Azure and verify an active subscription.
-
-<br>
-<hr/>
-
-## 2. Create a local function project
-
-In Azure Functions, a function project is a container for one or more individual functions that each responds to a specific <abbr title="The type of event that invokes the functionΓÇÖs code, such as an HTTP request, a queue message, or a specific time.">trigger</abbr>. All functions in a project share the same local and hosting configurations. In this section, you create a function project that contains a single function.
-
-1. In an empty folder, run the following command to generate the Functions project from a [Maven archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html).
-
- # [Bash](#tab/bash)
-
- ```bash
- mvn archetype:generate -DarchetypeGroupId=com.microsoft.azure -DarchetypeArtifactId=azure-functions-archetype -DjavaVersion=8
- ```
-
- # [PowerShell](#tab/powershell)
-
- ```powershell
- mvn archetype:generate "-DarchetypeGroupId=com.microsoft.azure" "-DarchetypeArtifactId=azure-functions-archetype" "-DjavaVersion=8"
- ```
-
- # [Cmd](#tab/cmd)
-
- ```cmd
- mvn archetype:generate "-DarchetypeGroupId=com.microsoft.azure" "-DarchetypeArtifactId=azure-functions-archetype" "-DjavaVersion=8"
- ```
-
-
-
- <br/>
- <details>
- <summary><strong>To run functions on Java 11</strong></summary>
-
- Use `-DjavaVersion=11` if you want your functions to run on Java 11. To learn more, see [Java versions](functions-reference-java.md#java-versions).
- </details>
-
-1. Maven asks you for values needed to finish generating the project on deployment.
- Provide the following values when prompted:
-
- | Prompt | Value | Description |
- | | -- | -- |
- | **groupId** | `com.fabrikam` | A value that uniquely identifies your project across all projects, following the [package naming rules](https://docs.oracle.com/javase/specs/jls/se6/html/packages.html#7.7) for Java. |
- | **artifactId** | `fabrikam-functions` | A value that is the name of the jar, without a version number. |
- | **version** | `1.0-SNAPSHOT` | Choose the default value. |
- | **package** | `com.fabrikam` | A value that is the Java package for the generated function code. Use the default. |
-
-1. Type `Y` or press Enter to confirm.
-
- Maven creates the project files in a new folder with a name of _artifactId_, which in this example is `fabrikam-functions`.
-
-1. Navigate into the project folder:
-
- ```console
- cd fabrikam-functions
- ```
-
-<br/>
-<details>
-<summary><strong>What's created in the fabrikam-functions folder?</strong></summary>
-
-This folder contains various files for the project, such as *Function.java*, *FunctionTest.java*, and *pom.xml*. There are also configurations files named
-[local.settings.json](functions-develop-local.md#local-settings-file) and
-[host.json](functions-host-json.md). Because *local.settings.json* can contain secrets
-downloaded from Azure, the file is excluded from source control by default in the *.gitignore*
-file.
-</details>
-
-<br/>
-<details>
-<summary><strong>Code for Function.java</strong></summary>
-
-*Function.java* contains a `run` method that receives request data in the `request` variable, an [HttpRequestMessage](/java/api/com.microsoft.azure.functions.httprequestmessage) that's decorated with the [HttpTrigger](/java/api/com.microsoft.azure.functions.annotation.httptrigger) annotation, which defines the trigger behavior.
--
-The response message is generated by the [HttpResponseMessage.Builder](/java/api/com.microsoft.azure.functions.httpresponsemessage.builder) API.
-
-The archetype also generates a unit test for your function. When you change your function to add bindings or add new functions to the project, you'll also need to modify the tests in the *FunctionTest.java* file.
-</details>
-
-<br/>
-<details>
-<summary><strong>Code for pom.xml</strong></summary>
-
-Settings for the Azure resources created to host your app are defined in the **configuration** element of the plugin with a **groupId** of `com.microsoft.azure` in the generated *pom.xml* file. For example, the configuration element below instructs a Maven-based deployment to create a function app in the `java-functions-group` resource group in the `westus` <abbr title="A geographical reference to a specific Azure datacenter in which resources are allocated.">region</abbr>. The function app itself runs on Windows hosted in the `java-functions-app-service-plan` plan, which by default is a serverless Consumption plan.
--
-You can change these settings to control how resources are created in Azure, such as by changing `runtime.os` from `windows` to `linux` before initial deployment. For a complete list of settings supported by the Maven plug-in, see the [configuration details](https://github.com/microsoft/azure-maven-plugins/wiki/Azure-Functions:-Configuration-Details).
-</details>
-
-<br>
-<hr/>
-
-## 3. Run the function locally
-
-1. **Run your function** by starting the local Azure Functions runtime host from the *fabrikam-functions* project folder:
-
- ```console
- mvn clean package
- mvn azure-functions:run
- ```
-
- Toward the end of the output, the following lines should appear:
-
- <pre class="is-monospace is-size-small has-padding-medium has-background-tertiary has-text-tertiary-invert">
- ...
-
- Now listening on: http://0.0.0.0:7071
- Application started. Press Ctrl+C to shut down.
-
- Http Functions:
-
- HttpExample: [GET,POST] http://localhost:7071/api/HttpExample
- ...
- </pre>
-
- If HttpExample doesn't appear as shown above, you likely started the host from outside the root folder of the project. In that case, use <kbd>Ctrl+C</kbd> to stop the host, navigate to the project's root folder, and run the previous command again.
-
-1. **Copy the URL** of your `HttpExample` function from this output to a browser and append the query string `?name=<YOUR_NAME>`, making the full URL like `http://localhost:7071/api/HttpExample?name=Functions`. The browser should display a message like `Hello Functions`:
-
- ![Result of the function run locally in the browser](./media/functions-create-first-azure-function-azure-cli/function-test-local-browser.png)
-
- The terminal in which you started your project also shows log output as you make requests.
-
-1. When you're done, use <kbd>Ctrl+C</kbd> and choose <kbd>y</kbd> to stop the functions host.
-
-<br>
-<hr/>
-
-## 4. Deploy the function project to Azure
-
-A function app and related resources are created in Azure when you first deploy your functions project. Settings for the Azure resources created to host your app are defined in the *pom.xml* file. In this article, you'll accept the defaults.
-
-<br/>
-<details>
-<summary><strong>To create a function app running on Linux</strong></summary>
-
-To create a function app running on Linux instead of Windows, change the `runtime.os` element in the *pom.xml* file from `windows` to `linux`. Running Linux in a consumption plan is supported in [these regions](https://github.com/Azure/azure-functions-host/wiki/Linux-Consumption-Regions). You can't have apps that run on Linux and apps that run on Windows in the same resource group.
-</details>
-
-1. Before you can deploy, sign in to your Azure subscription using either Azure CLI or Azure PowerShell.
-
- # [Azure CLI](#tab/azure-cli)
- ```azurecli
- az login
- ```
-
- The [az login](/cli/azure/reference-index#az-login) command signs you into your Azure account.
-
- # [Azure PowerShell](#tab/azure-powershell)
- ```azurepowershell
- Connect-AzAccount
- ```
-
- The [Connect-AzAccount](/powershell/module/az.accounts/connect-azaccount) cmdlet signs you into your Azure account.
-
-
-
-1. Use the following command to deploy your project to a new function app.
-
- ```console
- mvn azure-functions:deploy
- ```
-
- The deployment packages the project files and deploys them to the new function app using [zip deployment](functions-deployment-technologies.md#zip-deploy). The code runs from the deployment package in Azure.
-
-<br/>
-<details>
-<summary><strong>What's created in Azure?</strong></summary>
-
-+ Resource group. Named as _java-functions-group_.
-+ Storage account. Required by Functions. The name is generated randomly based on Storage account name requirements.
-+ Hosting plan. Serverless hosting for your function app in the _westus_ region. The name is _java-functions-app-service-plan_.
-+ Function app. A function app is the deployment and execution unit for your functions. The name is randomly generated based on your _artifactId_, appended with a randomly generated number.
-</details>
-
-<br>
-<hr/>
-
-## 5. Invoke the function on Azure
-
-Because your function uses an HTTP trigger, you **invoke it by making an HTTP request to its URL** in the browser or with a tool like <abbr title="A command line tool for generating HTTP requests to a URL; see https://curl.se/">curl</abbr>.
-
-# [Browser](#tab/browser)
-
-Copy the complete **Invoke URL** shown in the output of the `publish` command into a browser address bar, appending the query parameter `&name=Functions`. The browser should display similar output as when you ran the function locally.
-
-![The output of the function run on Azure in a browser](../../includes/media/functions-run-remote-azure-cli/function-test-cloud-browser.png)
-
-# [curl](#tab/curl)
-
-Run [`curl`](https://curl.haxx.se/) with the **Invoke URL**, appending the parameter `&name=Functions`. The output of the command should be the text, "Hello Functions."
-
-![The output of the function run on Azure using curl](../../includes/media/functions-run-remote-azure-cli/function-test-cloud-curl.png)
----
-<br>
-<hr/>
-
-## 6. Clean up resources
-
-If you continue to the [next step](#next-steps) and add an Azure Storage <abbr title="In Azure Storage, a means to associate a function with a storage queue, so that it can create messages on the queue.">queue output binding</abbr>, keep all your resources in place as you'll build on what you've already done.
-
-Otherwise, use the following command to delete the resource group and all its contained resources to avoid incurring further costs.
-
- # [Azure CLI](#tab/azure-cli)
-
-```azurecli
-az group delete --name java-functions-group
-```
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-```azurepowershell
-Remove-AzResourceGroup -Name java-functions-group
-```
---
-<br>
-<hr/>
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-cli.md?pivots=programming-language-java)
azure-functions Create First Function Cli Python Uiex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-cli-python-uiex.md
- Title: Create a Python function from the command line for Azure Functions
-description: Learn how to create a Python function from the command line and publish the local project to serverless hosting in Azure Functions.
Previously updated : 11/03/2020-----
-# Quickstart: Create a Python function in Azure from the command line
-
-> [!div class="op_single_selector" title1="Select your function language: "]
-> - [Python](create-first-function-cli-python.md)
-> - [C#](create-first-function-cli-csharp.md)
-> - [Java](create-first-function-cli-java.md)
-> - [JavaScript](create-first-function-cli-node.md)
-> - [PowerShell](create-first-function-cli-powershell.md)
-> - [TypeScript](create-first-function-cli-typescript.md)
-
-In this article, you use command-line tools to create a Python function that responds to HTTP requests. After testing the code locally, you deploy it to the <abbr title="A runtime computing environment in which all the details of the server are transparent to application developers, which simplifies the process of deploying and managing code.">serverless</abbr> environment of <abbr title="An Azure service that provides a low-cost serverless computing environment for applications.">Azure Functions</abbr>.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account.
-
-There is also a [Visual Studio Code-based version](create-first-function-vs-code-python.md) of this article.
-
-## 1. Configure your environment
-
-Before you begin, you must have the following:
-
-+ An Azure <abbr title="The profile that maintains billing information for Azure usage.">account</abbr> with an active <abbr title="The basic organizational structure in which you manage resources in Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ The [Azure Functions Core Tools](functions-run-local.md#v2) version 3.x.
-
-+ Either the <abbr title="A set of cross-platform command line tools for working with Azure resources from your local development computer, as an alternative to using the Azure portal.">Azure CLI</abbr> or <abbr title="A PowerShell module that provides commands for working with Azure resources from your local development computer, as an alternative to using the Azure portal.">Azure PowerShell</abbr> for creating Azure resources:
-
- + [Azure CLI](/cli/azure/install-azure-cli) version 2.4 or later.
-
- + The Azure [Az PowerShell module](/powershell/azure/install-az-ps) version 5.9.0 or later.
-
-+ [Python 3.8 (64-bit)](https://www.python.org/downloads/release/python-382/), [Python 3.7 (64-bit)](https://www.python.org/downloads/release/python-375/), [Python 3.6 (64-bit)](https://www.python.org/downloads/release/python-368/), which are all supported by version 3.x of Azure Functions.
-
-### 1.1 Prerequisite check
-
-Verify your prerequisites, which depend on whether you are using the Azure CLI or Azure PowerShell for creating Azure resources:
-
-# [Azure CLI](#tab/azure-cli)
-
-+ In a terminal or command window, run `func --version` to check that the <abbr title="The set of command line tools for working with Azure Functions on your local computer.">Azure Functions Core Tools</abbr> are version 3.x.
-
-+ Run `az --version` to check that the Azure CLI version is 2.4 or later.
-
-+ Run `az login` to sign in to Azure and verify an active subscription.
-
-+ Run `python --version` (Linux/macOS) or `py --version` (Windows) to check your Python version reports 3.8.x, 3.7.x or 3.6.x.
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-+ In a terminal or command window, run `func --version` to check that the <abbr title="The set of command line tools for working with Azure Functions on your local computer.">Azure Functions Core Tools</abbr> are version 3.x.
-
-+ Run `(Get-Module -ListAvailable Az).Version` and verify version 5.0 or later.
-
-+ Run `Connect-AzAccount` to sign in to Azure and verify an active subscription.
-
-+ Run `python --version` (Linux/macOS) or `py --version` (Windows) to check your Python version reports 3.8.x, 3.7.x or 3.6.x.
---
-<br/>
---
-## 2. <a name="create-venv"></a>Create and activate a virtual environment
-
-In a suitable folder, run the following commands to create and activate a virtual environment named `.venv`. Be sure to use Python 3.8, 3.7 or 3.6, which are supported by Azure Functions.
-
-# [bash](#tab/bash)
-
-```bash
-python -m venv .venv
-```
-
-```bash
-source .venv/bin/activate
-```
-
-If Python didn't install the venv package on your Linux distribution, run the following command:
-
-```bash
-sudo apt-get install python3-venv
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-py -m venv .venv
-```
-
-```powershell
-.venv\scripts\activate
-```
-
-# [Cmd](#tab/cmd)
-
-```cmd
-py -m venv .venv
-```
-
-```cmd
-.venv\scripts\activate
-```
---
-You run all subsequent commands in this activated virtual environment.
-
-<br/>
---
-## 3. Create a local function project
-
-In this section, you create a local <abbr title="A logical container for one or more individual functions that can be deployed and managed together.">Azure Functions project</abbr> in Python. Each function in the project responds to a specific <abbr title="The type of event that invokes the functionΓÇÖs code, such as an HTTP request, a queue message, or a specific time.">trigger</abbr>.
-
-1. Run the `func init` command to create a functions project in a folder named *LocalFunctionProj* with the specified runtime:
-
- ```console
- func init LocalFunctionProj --python
- ```
-
-1. Navigate into the project folder:
-
- ```console
- cd LocalFunctionProj
- ```
-
- <br/>
- <details>
- <summary><strong>What's created in the LocalFunctionProj folder?</strong></summary>
-
- This folder contains various files for the project, including configurations files named [local.settings.json](functions-develop-local.md#local-settings-file) and [host.json](functions-host-json.md). Because *local.settings.json* can contain secrets downloaded from Azure, the file is excluded from source control by default in the *.gitignore* file.
- </details>
-
-1. Add a function to your project by using the following command:
-
- ```console
- func new --name HttpExample --template "HTTP trigger" --authlevel "anonymous"
- ```
- The `--name` argument is the unique name of your function (HttpExample).
-
- The `--template` argument specifies the function's trigger (HTTP).
-
- `func new` creates a subfolder matching the function name that contains an *\_\_init\_\_.py* file with the function's code and a configuration file named *function.json*.
-
- <br/>
- <details>
- <summary><strong>Code for __init__.py</strong></summary>
-
- *\_\_init\_\_.py* contains a `main()` Python function that's triggered according to the configuration in *function.json*.
-
- :::code language="python" source="~/functions-quickstart-templates/Functions.Templates/Templates/HttpTrigger-Python/__init__.py":::
-
- For an HTTP trigger, the function receives request data in the variable `req` as defined in *function.json*. `req` is an instance of the [azure.functions.HttpRequest class](/python/api/azure-functions/azure.functions.httprequest). The return object, defined as `$return` in *function.json*, is an instance of [azure.functions.HttpResponse class](/python/api/azure-functions/azure.functions.httpresponse). To learn more, see [Azure Functions HTTP triggers and bindings](./functions-bindings-http-webhook.md?tabs=python).
- </details>
-
- <br/>
- <details>
- <summary><strong>Code for function.json</strong></summary>
-
- *function.json* is a configuration file that defines the <abbr title="Declarative connections between a function and other resources. An input binding provides data to the function; an output binding provides data from the function to other resources.">input and output bindings</abbr> for the function, including the trigger type.
-
- You can change `scriptFile` to invoke a different Python file if desired.
-
- :::code language="json" source="~/functions-quickstart-templates/Functions.Templates/Templates/HttpTrigger-Python/function.json":::
-
- Each binding requires a direction, a type, and a unique name. The HTTP trigger has an input binding of type [`httpTrigger`](functions-bindings-http-webhook-trigger.md) and output binding of type [`http`](functions-bindings-http-webhook-output.md).
- </details>
-
-<br/>
---
-## 4. Run the function locally
-
-1. Run your function by starting the local Azure Functions runtime host from the *LocalFunctionProj* folder:
-
- ```
- func start
- ```
-
- Toward the end of the output, the following lines should appear:
-
- <pre class="is-monospace is-size-small has-padding-medium has-background-tertiary has-text-tertiary-invert">
- ...
-
- Now listening on: http://0.0.0.0:7071
- Application started. Press Ctrl+C to shut down.
-
- Http Functions:
-
- HttpExample: [GET,POST] http://localhost:7071/api/HttpExample
- ...
-
- </pre>
-
- <br/>
- <details>
- <summary><strong>I don't see HttpExample in the output</strong></summary>
-
- If HttpExample doesn't appear, you likely started the host from outside the root folder of the project. In that case, use <kbd>Ctrl+C</kbd> to stop the host, navigate to the project's root folder, and run the previous command again.
- </details>
-
-1. Copy the URL of your **HttpExample** function from this output to a browser and append the query string **?name=<YOUR_NAME>**, making the full URL like **http://localhost:7071/api/HttpExample?name=Functions**. The browser should display a message like **Hello Functions**:
-
- ![Result of the function run locally in the browser](../../includes/media/functions-run-function-test-local-cli/function-test-local-browser.png)
-
-1. The terminal in which you started your project also shows log output as you make requests.
-
-1. When you're done, use <kbd>Ctrl+C</kbd> and choose <kbd>y</kbd> to stop the functions host.
-
-<br/>
---
-## 5. Create supporting Azure resources for your function
-
-Before you can deploy your function code to Azure, you need to create a <abbr title="A logical container for related Azure resources that you can manage as a unit.">resource group</abbr>, a <abbr title="An account that contains all your Azure storage data objects. The storage account provides a unique namespace for your storage data.">storage account</abbr>, and a <abbr title="The cloud resource that hosts serverless functions in Azure, which provides the underlying compute environment in which functions run.">function app</abbr> by using the following commands:
-
-1. If you haven't done so already, sign in to Azure:
-
- # [Azure CLI](#tab/azure-cli)
- ```azurecli
- az login
- ```
-
- The [az login](/cli/azure/reference-index#az-login) command signs you into your Azure account.
-
- # [Azure PowerShell](#tab/azure-powershell)
- ```azurepowershell
- Connect-AzAccount
- ```
-
- The [Connect-AzAccount](/powershell/module/az.accounts/connect-azaccount) cmdlet signs you into your Azure account.
-
-
-
-1. Create a resource group named `AzureFunctionsQuickstart-rg` in the `westeurope` region.
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az group create --name AzureFunctionsQuickstart-rg --location westeurope
- ```
-
- The [az group create](/cli/azure/group#az-group-create) command creates a resource group. You generally create your resource group and resources in a <abbr title="A geographical reference to a specific Azure datacenter in which resources are allocated.">region</abbr> near you, using an available region returned from the `az account list-locations` command.
-
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzResourceGroup -Name AzureFunctionsQuickstart-rg -Location westeurope
- ```
-
- The [New-AzResourceGroup](/powershell/module/az.resources/new-azresourcegroup) command creates a resource group. You generally create your resource group and resources in a region near you, using an available region returned from the [Get-AzLocation](/powershell/module/az.resources/get-azlocation) cmdlet.
-
-
-
- You can't host Linux and Windows apps in the same resource group. If you have an existing resource group named `AzureFunctionsQuickstart-rg` with a Windows function app or web app, you must use a different resource group.
-
-1. Create a general-purpose Azure Storage account in your resource group and region:
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az storage account create --name <STORAGE_NAME> --location westeurope --resource-group AzureFunctionsQuickstart-rg --sku Standard_LRS
- ```
-
- The [az storage account create](/cli/azure/storage/account#az-storage-account-create) command creates the storage account.
-
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzStorageAccount -ResourceGroupName AzureFunctionsQuickstart-rg -Name <STORAGE_NAME> -SkuName Standard_LRS -Location westeurope
- ```
-
- The [New-AzStorageAccount](/powershell/module/az.storage/new-azstorageaccount) cmdlet creates the storage account.
-
-
-
- Replace `<STORAGE_NAME>` with a name that is appropriate to you and <abbr title="The name must be unique across all storage accounts used by all Azure customers globally. For example, you can use a combination of your personal or company name, application name, and a numeric identifier, as in contosobizappstorage20.">unique in Azure Storage</abbr>. Names must be three to 24 characters long and can contain numbers and lowercase letters only. `Standard_LRS` specifies a general-purpose account, which is [supported by Functions](storage-considerations.md#storage-account-requirements).
-
- The storage account incurs only a few cents (USD) for this quickstart.
-
-1. Create the function app in Azure:
-
- # [Azure CLI](#tab/azure-cli)
-
- ```azurecli
- az functionapp create --resource-group AzureFunctionsQuickstart-rg --consumption-plan-location westeurope --runtime python --runtime-version 3.8 --functions-version 3 --name <APP_NAME> --storage-account <STORAGE_NAME> --os-type linux
- ```
-
- The [az functionapp create](/cli/azure/functionapp#az-functionapp-create) command creates the function app in Azure. If you are using Python 3.7 or 3.6, change `--runtime-version` to `3.7` or `3.6`, respectively.
-
- # [Azure PowerShell](#tab/azure-powershell)
-
- ```azurepowershell
- New-AzFunctionApp -Name <APP_NAME> -ResourceGroupName AzureFunctionsQuickstart-rg -StorageAccount <STORAGE_NAME> -FunctionsVersion 3 -RuntimeVersion 3.8 -Runtime python -Location 'West Europe'
- ```
-
- The [New-AzFunctionApp](/powershell/module/az.functions/new-azfunctionapp) cmdlet creates the function app in Azure. If you're using Python 3.7 or 3.6, change `-RuntimeVersion` to `3.7` or `3.6`, respectively.
-
-
-
- Replace `<STORAGE_NAME>` with the name of the account you used in the previous step.
-
- Replace `<APP_NAME>` with a <abbr title="A name that must be unique across all Azure customers globally. For example, you can use a combination of your personal or organization name, application name, and a numeric identifier, as in contoso-bizapp-func-20.">globally unique name appropriate to you</abbr>. The `<APP_NAME>` is also the default DNS domain for the function app.
-
- <br/>
- <details>
- <summary><strong>What is the cost of the resources provisioned on Azure?</strong></summary>
-
- This command creates a function app running in your specified language runtime under the [Azure Functions Consumption Plan](functions-scale.md#overview-of-plans), which is free for the amount of usage you incur here. The command also provisions an associated Azure Application Insights instance in the same resource group, with which you can monitor your function app and view logs. For more information, see [Monitor Azure Functions](functions-monitoring.md). The instance incurs no costs until you activate it.
- </details>
-
-<br/>
---
-## 6. Deploy the function project to Azure
-
-After you've successfully created your function app in Azure, you're now ready to **deploy your local functions project** by using the [func azure functionapp publish](functions-run-local.md#project-file-deployment) command.
-
-In the following example, replace `<APP_NAME>` with the name of your app.
-
-```console
-func azure functionapp publish <APP_NAME>
-```
-
-The `publish` command shows results similar to the following output (truncated for simplicity):
-
-<pre class="is-monospace is-size-small has-padding-medium has-background-tertiary has-text-tertiary-invert">
-...
-
-Getting site publishing info...
-Creating archive for current directory...
-Performing remote build for functions project.
-
-...
-
-Deployment successful.
-Remote build succeeded!
-Syncing triggers...
-Functions in msdocs-azurefunctions-qs:
- HttpExample - [httpTrigger]
- Invoke url: https://msdocs-azurefunctions-qs.azurewebsites.net/api/httpexample
-</pre>
-
-<br/>
---
-## 7. Invoke the function on Azure
-
-Because your function uses an HTTP trigger, you invoke it by making an HTTP request to its URL in the browser or with a tool like <abbr title="A command line tool for generating HTTP requests to a URL; see https://curl.se/">curl</abbr>.
-
-# [Browser](#tab/browser)
-
-Copy the complete **Invoke URL** shown in the output of the `publish` command into a browser address bar, appending the query parameter **&name=Functions**. The browser should display similar output as when you ran the function locally.
-
-![The output of the function run on Azure in a browser](../../includes/media/functions-run-remote-azure-cli/function-test-cloud-browser.png)
-
-# [curl](#tab/curl)
-
-Run [`curl`](https://curl.haxx.se/) with the **Invoke URL**, appending the parameter **&name=Functions**. The output of the command should be the text, "Hello Functions."
-
-![The output of the function run on Azure using curl](../../includes/media/functions-run-remote-azure-cli/function-test-cloud-curl.png)
---
-### 7.1 View real-time streaming logs
-
-Run the following command to view near real-time [streaming logs](functions-run-local.md#enable-streaming-logs) in Application Insights in the Azure portal:
-
-```console
-func azure functionapp logstream <APP_NAME> --browser
-```
-
-Replace `<APP_NAME>` with the name of your function app.
-
-In a separate terminal window or in the browser, call the remote function again. A verbose log of the function execution in Azure is shown in the terminal.
-
-<br/>
---
-## 8. Clean up resources
-
-If you continue to the [next step](#next-steps) and add an <abbr title="A means to associate a function with a storage queue, so that it can create messages on the queue. ">Azure Storage queue output binding</abbr>, keep all your resources in place as you'll build on what you've already done.
-
-Otherwise, use the following command to delete the resource group and all its contained resources to avoid incurring further costs.
-
- # [Azure CLI](#tab/azure-cli)
-
-```azurecli
-az group delete --name AzureFunctionsQuickstart-rg
-```
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-```azurepowershell
-Remove-AzResourceGroup -Name AzureFunctionsQuickstart-rg
-```
-
-<br/>
---
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-cli.md?pivots=programming-language-python)
-
-[Having issues? Let us know.](https://aka.ms/python-functions-qs-survey)
azure-functions Create First Function Vs Code Csharp Ieux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-vs-code-csharp-ieux.md
- Title: "Create a C# function using Visual Studio Code - Azure Functions"
-description: "Learn how to create a C# function, then publish the local project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code. "
- Previously updated : 11/03/2020----
-# Quickstart: Create a C# function in Azure using Visual Studio Code
-
-> [!div class="op_single_selector" title1="Select your function language: "]
-> - [C#](create-first-function-vs-code-csharp.md)
-> - [Java](create-first-function-vs-code-java.md)
-> - [JavaScript](create-first-function-vs-code-node.md)
-> - [PowerShell](create-first-function-vs-code-powershell.md)
-> - [Python](create-first-function-vs-code-python.md)
-> - [TypeScript](create-first-function-vs-code-typescript.md)
-> - [Other (Go/Rust)](create-first-function-vs-code-other.md)
-
-In this article, you use Visual Studio Code to create a C# class library-based function that responds to HTTP requests. After testing the code locally, you deploy it to the <abbr title="A runtime computing environment in which all the details of the server are transparent to application developers, simplifying the process of deploying and managing code.">serverless</abbr> environment of <abbr title="Azure's service that provides a low-cost serverless computing environment for applications.">Azure Functions</abbr>.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account.
-
-There's also a [CLI-based version](create-first-function-cli-csharp.md) of this article.
-
-## 1. Configure your environment
-
-Before you get started, make sure you have the following requirements in place:
-
-+ An Azure <abbr title="The profile that maintains billing information for Azure usage.">account</abbr> with an active <abbr title="The basic organizational structure in which you manage resources in Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ The [Azure Functions Core Tools](functions-run-local.md#install-the-azure-functions-core-tools) version 3.x.
-
-+ [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
-
-+ The [C# extension](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.csharp) for Visual Studio Code.
-
-+ The [Azure Functions extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) for Visual Studio Code.
-
-## <a name="create-an-azure-functions-project"></a>2. Create your local project
-
-In this section, you use Visual Studio Code to create a local <abbr title="A logical container for one or more individual functions that can be deployed and managed together.">Azure Functions project</abbr> in C#. Later in this article, you'll publish your function code to Azure.
-
-1. Choose the Azure icon in the <abbr title="The vertical group of icons on the left side of the Visual Studio Code window.">Activity bar</abbr>, then in the **Azure: Functions** area, select the **Create new project...** icon.
-
- ![Choose Create a new project](./media/functions-create-first-function-vs-code/create-new-project.png)
-
-1. Choose a directory location for your project workspace and choose **Select**.
-
- > [!NOTE]
- > These steps were designed to be completed outside of a workspace. In this case, do not select a project folder that is part of a workspace.
-
-1. Provide the following information at the prompts:
-
- + **Select a language for your function project**: Choose `C#`.
-
- + **Select a template for your project's first function**: Choose `HTTP trigger`.
-
- + **Provide a function name**: Type `HttpExample`.
-
- + **Provide a namespace**: Type `My.Functions`.
-
- + **Authorization level**: Choose `Anonymous`, which enables anyone to call your function endpoint. To learn about authorization levels, see [Authorization keys](functions-bindings-http-webhook-trigger.md#authorization-keys).
-
- + **Select how you would like to open your project**: Choose `Add to workspace`.
-
-1. Using this information, Visual Studio Code generates an Azure Functions project with an HTTP <abbr title="The type of event that invokes the functionΓÇÖs code, such as an HTTP request, a queue message, or a specific time.">trigger</abbr>. You can view the local project files in the Explorer. To learn more about files that are created, see [Generated project files](functions-develop-vs-code.md#generated-project-files).
--
-After you've verified that the function runs correctly on your local computer, it's time to use Visual Studio Code to publish the project directly to Azure.
--
-## 5. Publish the project to Azure
-
-In this section, you create a function app and related resources in your Azure subscription and then deploy your code.
-
-> [!IMPORTANT]
-> Publishing to an existing function app overwrites the content of that app in Azure.
-
-1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, choose the **Deploy to function app...** button.
-
-
-
-1. Provide the following information at the prompts:
-
- + **Select folder**: Choose a folder from your workspace or browse to one that contains your function app. You won't see this if you already have a valid function app opened.
-
- + **Select subscription**: Choose the subscription to use. You won't see this if you only have one subscription.
-
- + **Select Function App in Azure**: Choose `+ Create new Function App`. (Don't choose the `Advanced` option, which isn't covered in this article.)
-
- + **Enter a globally unique name for the function app**: Type a name that is valid in a URL path. The name you type is validated to make sure that it's <abbr title="The name must be unique across all Functions projects used by all Azure customers globally. Typically, you use a combination of your personal or company name, application name, and a numeric identifier, as in contoso-bizapp-func-20">unique in Azure Functions</abbr>.
-
- + **Select a location for new resources**: For better performance, choose a [region](https://azure.microsoft.com/regions/) near you.
-
- The extension shows the status of individual resources as they are being created in Azure in the notification area.
-
- :::image type="content" source="../../includes/media/functions-publish-project-vscode/resource-notification.png" alt-text="Notification of Azure resource creation":::
-
-1. When completed, the following Azure resources are created in your subscription, using names based on your function app name:
-
- + A **resource group**, which is a logical container for related resources.
- + A standard **Azure Storage account**, which maintains state and other information about your projects.
- + A **consumption plan**, which defines the underlying host for your serverless function app.
- + A **function app**, which provides the environment for executing your function code. A function app lets you group functions as a logical unit for easier management, deployment, and sharing of resources within the same hosting plan.
- + An **Application Insights instance** connected to the function app, which tracks usage of your serverless function.
-
- A notification is displayed after your function app is created and the deployment package is applied.
-
- > [!TIP]
- > By default, the Azure resources required by your function app are created based on the function app name you provide. By default, they are also created in the same new resource group with the function app. If you want to either customize the names of these resources or reuse existing resources, you need to instead [publish the project with advanced create options](functions-develop-vs-code.md#enable-publishing-with-advanced-create-options).
--
-1. Select **View Output** in this notification to view the creation and deployment results, including the Azure resources that you created. If you miss the notification, select the bell icon in the lower right corner to see it again.
-
- ![Create complete notification](./media/functions-create-first-function-vs-code/function-create-notifications.png)
-
-## 6. Run the function in Azure
-
-1. Back in the **Azure: Functions** area in the side bar, expand your subscription, your new function app, and **Functions**. Right-click (Windows) or <kbd>Ctrl -</kbd> click (macOS) the `HttpExample` function and choose **Execute Function Now...**.
-
- :::image type="content" source="../../includes/media/functions-vs-code-run-remote/execute-function-now.png" alt-text="Execute function now in Azure from Visual Studio Code":::
-
-1. In **Enter request body** you see the request message body value of `{ "name": "Azure" }`.
-
- Press Enter to send this request message to your function.
-
-1. When the function executes in Azure and returns a response, a notification is raised in Visual Studio Code.
-
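If you prefer to test the deployed endpoint from a terminal instead of using **Execute Function Now...**, you can call it over HTTP. The following is a minimal PowerShell sketch; the app name in the URL is a placeholder for the globally unique name you chose when publishing, and it assumes the default `api` route prefix.

```powershell
# Placeholder: replace <APP_NAME> with the function app name you created earlier.
$uri = "https://<APP_NAME>.azurewebsites.net/api/HttpExample?name=Azure"

# The HTTP trigger uses the Anonymous authorization level, so no function key is required.
Invoke-RestMethod -Method Get -Uri $uri
```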
-## 7. Clean up resources
-
-When you continue to the [next step](#next-steps) and add an <abbr title="A means to associate a function with a storage queue, so that it can create messages on the queue.">Azure Storage queue output binding</abbr> to your function, you'll need to keep all your resources in place to build on what you've already done.
-
-Otherwise, you can use the following steps to delete the function app and its related resources to avoid incurring any further costs.
--
-To learn more about Functions costs, see [Estimating Consumption plan costs](functions-consumption-costs.md).
-
-## Next steps
-
-You have used Visual Studio Code to create a function app with a simple HTTP-triggered function. In the next article, you expand that function by adding an output <abbr title="A declarative connection between a function and other resources. An input binding provides data to the function; an output binding provides data from the function to other resources.">binding</abbr>. This binding writes the string from the HTTP request to a message in an Azure Queue Storage queue.
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-vs-code.md?pivots=programming-language-csharp)
-
-[Azure Functions Core Tools]: functions-run-local.md
-[Azure Functions extension for Visual Studio Code]: https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions
azure-functions Create First Function Vs Code Java Uiex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-vs-code-java-uiex.md
- Title: Create a Java function using Visual Studio Code - Azure Functions
-description: Learn how to create a Java function, then publish the local project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code.
- Previously updated : 11/03/2020----
-# Quickstart: Create a Java function in Azure using Visual Studio Code
--
-Use Visual Studio Code to create a Java function that responds to HTTP requests. Test the code locally, then deploy it to the serverless environment of Azure Functions.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your <abbr title="The profile that maintains billing information for Azure usage.">Azure account</abbr>.
-
-If Visual Studio Code isn't your preferred development tool, check out our similar tutorials for Java developers using [Maven](create-first-function-cli-java.md), [Gradle](./functions-create-first-java-gradle.md) and [IntelliJ IDEA](/azure/developer/java/toolkit-for-intellij/quickstart-functions).
-
-## 1. Prepare your environment
-
-Before you get started, make sure you have the following requirements in place:
-
-+ An Azure account with an active <abbr title="The basic organizational structure in which you manage resources in Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ The [Java Developer Kit](/azure/developer/java/fundamentals/java-support-on-azure), version 8 or 11.
-
-+ [Apache Maven](https://maven.apache.org), version 3.0 or above.
-
-+ [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
-
-+ The [Java extension pack](https://marketplace.visualstudio.com/items?itemName=vscjava.vscode-java-pack)
-
-+ The [Azure Functions extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) for Visual Studio Code.
-
-<br/>
-<hr/>
-
-## 2. <a name="create-an-azure-functions-project"></a>Create your local Functions project
-
-1. Choose the Azure icon in the **Activity bar**, then in the **Azure: Functions** area, select the **Create new project...** icon.
-
- ![Choose Create a new project](./media/functions-create-first-function-vs-code/create-new-project.png)
-
-1. **Choose a directory location** for your project workspace then choose **Select**.
-
-1. Provide the following information at the prompts:
-
- + **Select a language for your function project**: Choose `Java`.
-
- + **Select a version of Java**: Choose `Java 8` or `Java 11`, the Java version on which your functions run in Azure. Choose a Java version that you've verified locally.
-
- + **Provide a group ID**: Choose `com.function`.
-
- + **Provide an artifact ID**: Choose `myFunction`.
-
- + **Provide a version**: Choose `1.0-SNAPSHOT`.
-
- + **Provide a package name**: Choose `com.function`.
-
- + **Provide an app name**: Choose `myFunction-12345`.
-
- + **Authorization level**: Choose `Anonymous`, which enables anyone to call your function endpoint.
-
- + **Select how you would like to open your project**: Choose `Add to workspace`.
-
-<br/>
-
-<details>
-<summary><strong>Can't create a function project?</strong></summary>
-
-The most common issues to resolve when creating a local Functions project are:
-* You do not have the Azure Functions extension installed.
-</details>
-
-<br/>
-<hr/>
-
-## 3. Run the function locally
-
-1. Press <kbd>F5</kbd> to start the function app project.
-
-1. In the **Terminal**, see the URL endpoint of your function running locally.
-
- ![Local function VS Code output](media/functions-create-first-function-vs-code/functions-vscode-f5.png)
-
-1. With Core Tools running, go to the **Azure: Functions** area. Under **Functions**, expand **Local Project** > **Functions**. Right-click (Windows) or <kbd>Ctrl -</kbd> click (macOS) the `HttpExample` function and choose **Execute Function Now...**.
-
- :::image type="content" source="../../includes/media/functions-run-function-test-local-vs-code/execute-function-now.png" alt-text="Execute function now from Visual Studio Code":::
-
-1. In **Enter request body** you see the request message body value of `{ "name": "Azure" }`. Press <kbd>Enter</kbd> to send this request message to your function.
-
-1. When the function executes locally and returns a response, a notification is raised in Visual Studio Code. Information about the function execution is shown in **Terminal** panel.
-
-1. Press <kbd>Ctrl + C</kbd> to stop Core Tools and disconnect the debugger.
-
-<br/>
-
-<details>
-<summary><strong>Can't run the function locally?</strong></summary>
-
-The most common issues to resolve when running a local Functions project are:
-* You do not have the Core Tools installed.
-* If you have trouble running on Windows, make sure that the default terminal shell for Visual Studio Code isn't set to WSL Bash.
-</details>
-
-<br/>
-<hr/>
-
-## 4. Sign in to Azure
-
-To publish your app, sign in to Azure. If you're already signed in, go to the next section.
-
-1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, choose **Sign in to Azure...**.
-
- ![Sign in to Azure within VS Code](../../includes/media/functions-sign-in-vs-code/functions-sign-into-azure.png)
-
-1. When prompted in the browser, **choose your Azure account** and **sign in** using your Azure account credentials.
-
-1. After you've successfully signed in, close the new browser window and go back to Visual Studio Code.
-
-<br/>
-<hr/>
-
-## 5. Publish the project to Azure
-
-Your first deployment of your code includes creating a Function resource in your Azure subscription.
-
-1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, choose the **Deploy to function app...** button.
-
- ![Publish your project to Azure](../../includes/media/functions-publish-project-vscode/function-app-publish-project.png)
-
-1. Provide the following information at the prompts:
-
- + **Select folder**: Choose the folder that contains your function app.
-
- + **Select subscription**: Choose the subscription to use. You won't see this if you only have one subscription.
-
- + **Select Function App in Azure**: Choose `Create new Function App`.
-
- + **Enter a globally unique name for the function app**: Type a name that is valid in a URL path and unique across Azure. The name you type is validated to ensure global uniqueness.
-
- - **Select a location for new resources**: For better performance, choose a [region](https://azure.microsoft.com/regions/) near you.
-
-1. A notification is displayed after your function app is created and the deployment package is applied. Select **View Output** to see the creation and deployment results.
-
- ![Create complete notification](../../includes/media/functions-publish-project-vscode/function-create-notifications.png)
-
-<br/>
-
-<details>
-<summary><strong>Can't publish the function?</strong></summary>
-
-This section created the Azure resources and deployed your local code to the Function app. If that didn't succeed:
-
-* Review the Output for error information. The bell icon in the lower right corner is another way to view the output.
-* Did you publish to an existing function app? That action overwrites the content of that app in Azure.
-</details>
-
-<br/>
-
-<details>
-<summary><strong>What resources were created?</strong></summary>
-
-When completed, the following Azure resources are created in your subscription, using names based on your function app name:
-
-* **Resource group**: A resource group is a logical container for related resources in the same region.
-* **Azure Storage account**: A Storage resource maintains state and other information about your project.
-* **Consumption plan**: A consumption plan defines the underlying host for your serverless function app.
-* **Function app**: A function app provides the environment for executing your function code and groups functions as a logical unit.
-* **Application Insights**: Application Insights tracks usage of your serverless function.
-
-</details>
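If you want to double-check what the deployment created, you can list the contents of the new resource group from Azure PowerShell. This is a small sketch; the resource group name is a placeholder for the one shown in the deployment output.

```powershell
# Placeholder: replace <RESOURCE_GROUP_NAME> with the resource group created for your function app.
Get-AzResource -ResourceGroupName "<RESOURCE_GROUP_NAME>" |
    Select-Object Name, ResourceType, Location |
    Format-Table
```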
-
-<br/>
-<hr/>
-
-## 6. Run the function in Azure
-
-1. Back in the **Azure: Functions** area in the side bar, expand your subscription, your new function app, and **Functions**. Right-click (Windows) or <kbd>Ctrl -</kbd> click (macOS) the `HttpExample` function and choose **Execute Function Now...**.
-
- :::image type="content" source="../../includes/media/functions-vs-code-run-remote/execute-function-now.png" alt-text="Execute function now in Azure from Visual Studio Code":::
-
-1. In **Enter request body** you see the request message body value of `{ "name": "Azure" }`. Press Enter to send this request message to your function.
-
-1. When the function executes in Azure and returns a response, a notification is raised in Visual Studio Code.
-
-<br/>
-<hr/>
-
-## 7. Clean up resources
-
-If you don't plan to continue to the [next step](#next-steps), delete the function app and its resources to avoid incurring any further costs.
-
-1. In Visual Studio Code, select the Azure icon in the Activity bar, then select the Functions area in the side bar.
-1. Select the function app, then right-click and select **Delete Function app...**.
-
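If you'd rather clean up from a terminal than from Visual Studio Code, deleting the resource group removes everything the deployment created. A minimal Azure PowerShell sketch follows; the resource group name is a placeholder.

```powershell
# Placeholder: use the resource group that was created for your function app.
Remove-AzResourceGroup -Name "<RESOURCE_GROUP_NAME>" -Force
```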
-<br/>
-<hr/>
-
-## Next steps
-
-Expand the function by adding an <abbr title="In Azure Storage, a means to associate a function with a storage queue, so that it can create messages on the queue.">output binding</abbr>. This binding writes the string from the HTTP request to a message in an Azure Queue Storage queue.
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-vs-code.md?pivots=programming-language-java)
azure-functions Create First Function Vs Code Python Uiex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/create-first-function-vs-code-python-uiex.md
- Title: Create a Python function using Visual Studio Code - Azure Functions
-description: Learn how to create a Python function, then publish the local project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code.
- Previously updated : 11/04/2020----
-# Quickstart: Create a function in Azure with Python using Visual Studio Code
-
-> [!div class="op_single_selector" title1="Select your function language: "]
-> - [Python](create-first-function-vs-code-python.md)
-> - [C#](create-first-function-vs-code-csharp.md)
-> - [Java](create-first-function-vs-code-java.md)
-> - [JavaScript](create-first-function-vs-code-node.md)
-> - [PowerShell](create-first-function-vs-code-powershell.md)
-> - [TypeScript](create-first-function-vs-code-typescript.md)
-> - [Other (Go/Rust)](create-first-function-vs-code-other.md)
-
-In this article, you use Visual Studio Code to create a Python function that responds to HTTP requests. After testing the code locally, you deploy it to the <abbr title="A runtime computing environment in which all the details of the server are transparent to application developers, which simplifies the process of deploying and managing code.">serverless</abbr> environment of <abbr title="An Azure service that provides a low-cost serverless computing environment for applications.">Azure Functions</abbr>.
-
-Completing this quickstart incurs a small cost of a few USD cents or less in your Azure account.
-
-There's also a [CLI-based version](create-first-function-cli-python.md) of this article.
-
-## 1. Prepare your environment
-
-Before you get started, make sure you have the following requirements in place:
-
-+ An Azure <abbr title="The profile that maintains billing information for Azure usage.">account</abbr> with an active <abbr title="The basic organizational structure in which you manage resources in Azure, typically associated with an individual or department within an organization.">subscription</abbr>. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
-
-+ The [Azure Functions Core Tools](functions-run-local.md#install-the-azure-functions-core-tools) version 3.x.
-
-+ [Python 3.8](https://www.python.org/downloads/release/python-381/), [Python 3.7](https://www.python.org/downloads/release/python-375/), [Python 3.6](https://www.python.org/downloads/release/python-368/) are supported by Azure Functions (x64).
-
-+ [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
-
-+ The [Python extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for Visual Studio Code.
-
-+ The [Azure Functions extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) for Visual Studio Code.
-
-<hr/>
-<br/>
-
-## 2. <a name="create-an-azure-functions-project"></a>Create your local project
-
-1. Choose the Azure icon in the <abbr title="The vertical group of icons on the left side of the Visual Studio Code window.">Activity bar</abbr>, then in the **Azure: Functions** area, select the **Create new project...** icon.
-
- ![Choose Create a new project](./media/functions-create-first-function-vs-code/create-new-project.png)
-
-1. Choose a directory location for your project workspace and choose **Select**. It is recommended that you create a new folder or choose an empty folder as the project workspace.
-
- > [!NOTE]
- > These steps were designed to be completed outside of a workspace. In this case, do not select a project folder that is part of a workspace.
-
-1. Provide the following information at the prompts:
-
- + **Select a language for your function project**: Choose `Python`.
-
- + **Select a Python alias to create a virtual environment**: Choose the location of your Python interpreter. If the location isn't shown, type in the full path to your Python binary.
-
- + **Select a template for your project's first function**: Choose `HTTP trigger`.
-
- + **Provide a function name**: Type `HttpExample`.
-
- + **Authorization level**: Choose `Anonymous`, which enables anyone to call your function endpoint. To learn about authorization levels, see [Authorization keys](functions-bindings-http-webhook-trigger.md#authorization-keys).
-
- + **Select how you would like to open your project**: Choose `Add to workspace`.
-
-<br/>
-<details>
-<summary><strong>Can't create a function project?</strong></summary>
-
-The most common issues to resolve when creating a local Functions project are:
-* You do not have the Azure Functions extension installed.
-</details>
-
-<hr/>
-<br/>
-
-## 3. Run the function locally
-
-1. Press <kbd>F5</kbd> to start the function app project.
-
-1. In the **Terminal** panel, see the URL endpoint of your function running locally.
-
- ![Local function VS Code output](../../includes/media/functions-run-function-test-local-vs-code/functions-vscode-f5.png)
--
-1. With Core Tools running, go to the **Azure: Functions** area. Under **Functions**, expand **Local Project** > **Functions**. Right-click (Windows) or <kbd>Ctrl -</kbd> click (macOS) the `HttpExample` function and choose **Execute Function Now...**.
-
- :::image type="content" source="../../includes/media/functions-run-function-test-local-vs-code/execute-function-now.png" alt-text="Execute function now from Visual Studio Code":::
-
-1. In **Enter request body** you see the request message body value of `{ "name": "Azure" }`. Press Enter to send this request message to your function.
-
-1. When the function executes locally and returns a response, a notification is raised in Visual Studio Code. Information about the function execution is shown in **Terminal** panel.
-
-1. Press <kbd>Ctrl + C</kbd> to stop Core Tools and disconnect the debugger.
-
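Before you stop Core Tools, you can also exercise the local endpoint from a second terminal instead of using **Execute Function Now...**. The following is a minimal PowerShell sketch that assumes Core Tools is listening on its default local address (`http://localhost:7071`).

```powershell
# Assumes Core Tools is still running and serving the HttpExample function on the default port 7071.
$body = @{ name = "Azure" } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri "http://localhost:7071/api/HttpExample" `
    -Body $body -ContentType "application/json"
```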
-<br/>
-<details>
-<summary><strong>Can't run the function locally?</strong></summary>
-
-The most common issues to resolve when running a local Functions project are:
-* You do not have the Core Tools installed.
-* If you have trouble running on Windows, make sure that the default terminal shell for Visual Studio Code isn't set to **WSL Bash**.
-</details>
-
-<hr/>
-<br/>
-
-## 4. Sign in to Azure
-
-To publish your app, sign in to Azure. If you're already signed in, go to the next section.
-
-1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, choose **Sign in to Azure...**.
-
- ![Sign in to Azure within VS Code](../../includes/media/functions-sign-in-vs-code/functions-sign-into-azure.png)
-
-1. When prompted in the browser, **choose your Azure account** and **sign in** using your Azure account credentials.
-
-1. After you've successfully signed in, close the new browser window and go back to Visual Studio Code.
-
-<hr/>
-<br/>
-
-## 5. Publish the project to Azure
-
-Your first deployment of your code includes creating a Function resource in your Azure subscription.
-
-1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, choose the **Deploy to function app...** button.
-
- ![Publish your project to Azure](../../includes/media/functions-publish-project-vscode/function-app-publish-project.png)
-
-1. Provide the following information at the prompts:
-
- + **Select folder**: Choose the folder that contains your function app.
-
- + **Select subscription**: Choose the subscription to use. You won't see this if you only have one subscription.
-
- + **Select Function App in Azure**: Choose `+ Create new Function App`.
-
- + **Enter a globally unique name for the function app**: Type a name that is valid in a URL path. The name you type is validated to make sure that it's <abbr title="The name must be unique across all Azure customers globally. For example, you can use a combination of your personal or organization name, application name, and a numeric identifier, as in contoso-bizapp-func-20.">unique across Azure</abbr>.
-
- + **Select a runtime**: Choose the version of Python you've been running locally. You can use the `python --version` command to check your version.
-
- + **Select a location for new resources**: For better performance, choose a [region](https://azure.microsoft.com/regions/) near you.
-
- The extension shows the status of individual resources as they are being created in Azure in the notification area.
-
- :::image type="content" source="../../includes/media/functions-publish-project-vscode/resource-notification.png" alt-text="Notification of Azure resource creation":::
-
-1. A notification is displayed after your function app is created and the deployment package is applied. Select **View Output** to view the creation and deployment results.
-
- ![Create complete notification](./media/functions-create-first-function-vs-code/function-create-notifications.png)
-
-<br/>
-<details>
-<summary><strong>Can't publish the function?</strong></summary>
-
-This section created the Azure resources and deployed your local code to the Function app. If that didn't succeed:
-
-* Review the Output for error information. The bell icon in the lower right corner is another way to view the output.
-* Did you publish to an existing function app? That action overwrites the content of that app in Azure.
-</details>
--
-<br/>
-<details>
-<summary><strong>What resources were created?</strong></summary>
-
-When completed, the following Azure resources are created in your subscription, using names based on your function app name:
-* **Resource group**: A resource group is a logical container for related resources in the same region.
-* **Azure Storage account**: A Storage resource maintains state and other information about your project.
-* **Consumption plan**: A consumption plan defines the underlying host for your serverless function app.
-* **Function app**: A function app provides the environment for executing your function code and groups functions as a logical unit.
-* **Application Insights**: Application Insights tracks usage of your serverless function.
-
-</details>
-
-<hr/>
-<br/>
-
-## 6. Run the function in Azure
-
-1. Back in the **Azure: Functions** side bar, expand the new function app.
-1. Expand **Functions**, then right-click (Windows) or <kbd>Ctrl -</kbd> click (macOS) the `HttpExample` function and choose **Execute Function Now...**.
-
- :::image type="content" source="../../includes/media/functions-vs-code-run-remote/execute-function-now.png" alt-text="Execute function now in Azure from Visual Studio Code":::
-
-1. In **Enter request body** you see the request message body value of `{ "name": "Azure" }`.
-
- Press Enter to send this request message to your function.
-
-1. When the function executes in Azure and returns a response, a notification is raised in Visual Studio Code.
-
-## 7. Clean up resources
-
-When you continue to the [next step](#next-steps) and add an <abbr title="A means to associate a function with a storage queue, so that it can create messages on the queue.">Azure Storage queue output binding</abbr> to your function, you'll need to keep all your resources in place to build on what you've already done.
-
-Otherwise, you can use the following steps to delete the function app and its related resources to avoid incurring any further costs.
--
-To learn more about Functions costs, see [Estimating Consumption plan costs](functions-consumption-costs.md).
-
-## Next steps
-
-Expand that function by adding an output <abbr title="A declarative connection between a function and other resources. An input binding provides data to the function; an output binding provides data from the function to other resources.">binding</abbr>. This binding writes the string from the HTTP request to a message in an Azure Queue Storage queue.
-
-> [!div class="nextstepaction"]
-> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-vs-code.md?pivots=programming-language-python)
-
-[Having issues? Let us know.](https://aka.ms/python-functions-qs-survey)
-
-[Azure Functions Core Tools]: functions-run-local.md
-[Azure Functions extension for Visual Studio Code]: https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions
azure-functions Functions Create First Function Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-create-first-function-resource-manager.md
Title: Create your first function using Azure Resource Manager templates description: Create and deploy to Azure a simple HTTP triggered serverless function by using an Azure Resource Manager template (ARM template). Previously updated : 06/22/2022 Last updated : 07/19/2022
The following four Azure resources are created by this template:
## Deploy the template
+The following scripts are designed for and tested in [Azure Cloud Shell](../cloud-shell/overview.md). Choose **Try It** to open a Cloud Shell instance right in your browser.
+ # [Azure CLI](#tab/azure-cli) ```azurecli-interactive read -p "Enter a resource group name that is used for generating resource names:" resourceGroupName &&
az deployment group create --resource-group $resourceGroupName --template-uri $
echo "Press [ENTER] to continue ..." && read ```
-# [PowerShell](#tab/powershell)
+# [Azure PowerShell](#tab/azure-powershell)
```powershell-interactive $resourceGroupName = Read-Host -Prompt "Enter a resource group name that is used for generating resource names"
If you continue to the next step and add an Azure Storage queue output binding,
Otherwise, use the following command to delete the resource group and all its contained resources to avoid incurring further costs.
-# [CLI](#tab/CLI)
+# [Azure CLI](#tab/azure-cli)
```azurecli-interactive az group delete --name <RESOURCE_GROUP_NAME> ```
-# [PowerShell](#tab/PowerShell)
+# [Azure PowerShell](#tab/azure-powershell)
```azurepowershell-interactive Remove-AzResourceGroup -Name <RESOURCE_GROUP_NAME>
azure-monitor Azure Monitor Agent Data Collection Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-data-collection-endpoint.md
description: Use data collection endpoints to uniquely configure ingestion setti
Previously updated : 3/16/2022 Last updated : 06/06/2022
-# Using data collection endpoints with Azure Monitor agent
-[Data Collection Endpoints (DCEs)](../essentials/data-collection-endpoint-overview.md) allow you to uniquely configure ingestion settings for your machines, giving you greater control over your networking requirements.
+# Enable network isolation for the Azure Monitor Agent
+By default, the Azure Monitor agent connects to a public endpoint to send data to your Azure Monitor environment. You can enable network isolation for your agents by creating [data collection endpoints](../essentials/data-collection-endpoint-overview.md) and adding them to your [Azure Monitor Private Link Scopes (AMPLS)](../logs/private-link-configure.md#connect-azure-monitor-resources).
+ ## Create data collection endpoint
-See [Data collection endpoints in Azure Monitor](../essentials/data-collection-endpoint-overview.md) for details on data collection endpoints and how to create them.
+To use network isolation, you must create a data collection endpoint in each of your regions for agents to connect to instead of the public endpoint. See [Create a data collection endpoint](../essentials/data-collection-endpoint-overview.md#create-data-collection-endpoint) for details on creating a DCE. An agent can only connect to a DCE in the same region, so if you have agents in multiple regions, you must create a DCE in each one.
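If you prefer scripting to the portal, a DCE can also be created with a single call to the data collection endpoint REST API. The sketch below uses `Invoke-AzRestMethod` from Azure PowerShell; the subscription ID, resource group, endpoint name, region, and API version are placeholders or assumptions to adjust for your environment.

```powershell
# Placeholders: subscription ID, resource group, region, and endpoint name are examples only.
$subscriptionId = "<SUBSCRIPTION_ID>"
$resourceGroup  = "<RESOURCE_GROUP>"
$dceName        = "myCollectionEndpoint"

$body = @{
    location   = "eastus"
    properties = @{
        networkAcls = @{
            # Disable public access so the agent communicates only over private links.
            publicNetworkAccess = "Disabled"
        }
    }
} | ConvertTo-Json -Depth 5

# Assumes the 2021-04-01 API version of Microsoft.Insights/dataCollectionEndpoints.
Invoke-AzRestMethod -Method PUT `
    -Path "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/${dceName}?api-version=2021-04-01" `
    -Payload $body
```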
-## Create endpoint association in Azure portal
-Use **Data collection rules** in the portal to associate endpoints with a resource (e.g. a virtual machine) or a set of resources. Create a new rule or open an existing rule. In the **Resources** tab, click on the **Data collection endpoint** drop-down to associate an existing endpoint for your resource in the same region (or select multiple resources in the same region to bulk-assign an endpoint for them). Doing this creates an association per resource which links the endpoint to the resource. The Azure Monitor agent running on these resources will now start using the endpoint instead for uploading data to Azure Monitor.
-[![Data Collection Rule virtual machines](media/data-collection-rule-azure-monitor-agent/data-collection-rule-virtual-machines-with-endpoint.png)](../agents/media/data-collection-rule-azure-monitor-agent/data-collection-rule-virtual-machines-with-endpoint.png#lightbox)
+## Create private link
+With [Azure Private Link](../../private-link/private-link-overview.md), you can securely link Azure platform as a service (PaaS) resources to your virtual network by using private endpoints. An Azure Monitor Private Link connects a private endpoint to a set of Azure Monitor resources, defining the boundaries of your monitoring network. That set is called an Azure Monitor Private Link Scope (AMPLS). See [Configure your Private Link](../logs/private-link-configure.md) for details on creating and configuring your AMPLS.
+## Add DCE to AMPLS
+Add the data collection endpoints to a new or existing [Azure Monitor Private Link Scopes (AMPLS)](../logs/private-link-configure.md#connect-azure-monitor-resources) resource. This adds the DCE endpoints to your private DNS zone (see [how to validate](../logs/private-link-configure.md#review-and-validate-your-private-link-setup)) and allows communication via private links. You can do this from either the AMPLS resource or from within an existing DCE resource's 'Network Isolation' tab.
> [!NOTE]
-> The data collection endpoint should be created in the **same region** where your virtual machines exist.
+> Other Azure Monitor resources, like the Log Analytics workspace(s) configured in your data collection rules that you wish to send data to, must be part of this same AMPLS resource.
++
+For your data collection endpoint(s), ensure the **Accept access from public networks not connected through a Private Link Scope** option is set to **No** under the 'Network Isolation' tab of your endpoint resource in the Azure portal, as shown below. This ensures that public internet access is disabled and that network communication happens only via private links.
++++
+ Associate the data collection endpoints to the target resources by editing the data collection rule in Azure portal. From the **Resources** tab, select **Enable Data Collection Endpoints** and select a DCE for each virtual machine. See [Configure data collection for the Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md).
++
-## Create endpoint and association using REST API
-> [!NOTE]
-> The data collection endpoint should be created in the **same region** where your virtual machines exist.
-
-1. Create data collection endpoint(s) using these [DCE REST APIs](/cli/azure/monitor/data-collection/endpoint).
-2. Create association(s) to link the endpoint(s) to your target machines or resources, using these [DCRA REST APIs](/rest/api/monitor/datacollectionruleassociations/create#examples).
--
-## Sample data collection endpoint
-The sample data collection endpoint below is for virtual machines with Azure Monitor agent, with public network access disabled so that agent only uses private links to communicate and send data to Azure Monitor/Log Analytics.
-
-```json
-{
- "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/myCollectionEndpoint",
- "name": "myCollectionEndpoint",
- "type": "Microsoft.Insights/dataCollectionEndpoints",
- "location": "eastus",
- "tags": {
- "tag1": "A",
- "tag2": "B"
- },
- "properties": {
- "configurationAccess": {
- "endpoint": "https://mycollectionendpoint-abcd.eastus-1.control.monitor.azure.com"
- },
- "logsIngestion": {
- "endpoint": "https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com"
- },
- "networkAcls": {
- "publicNetworkAccess": "Disabled"
- }
- },
- "systemData": {
- "createdBy": "user1",
- "createdByType": "User",
- "createdAt": "yyyy-mm-ddThh:mm:ss.sssssssZ",
- "lastModifiedBy": "user2",
- "lastModifiedByType": "User",
- "lastModifiedAt": "yyyy-mm-ddThh:mm:ss.sssssssZ"
- },
- "etag": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
-}
-```
-
-## Enable network isolation for the Azure Monitor Agent
-You can use data collection endpoints to enable the Azure Monitor agent to communicate to the internet via private links. To do so, you must:
-1. Create data collection endpoint(s), at least one per region, as shown above
-2. Add the data collection endpoints to a new or existing [Azure Monitor Private Link Scopes (AMPLS)](../logs/private-link-configure.md#connect-azure-monitor-resources) resource. This adds the DCE endpoints to your private DNS zone (see [how to validate](../logs/private-link-configure.md#review-and-validate-your-private-link-setup)) and allows communication via private links. You can do this from either the AMPLS resource or from within an existing DCE resource's 'Network Isolation' tab.
- > [!NOTE]
- > Other Azure Monitor resources like the Log Analytics workspace(s) configured in your data collection rules that you wish to send data to, must be part of this same AMPLS resource.
-3. For your data collection endpoint(s), ensure **Accept access from public networks not connected through a Private Link Scope** option is set to **No** under the 'Network Isolation' tab of your endpoint resource in Azure portal, as shown below. This ensures that public internet access is disabled, and network communication only happen via private links.
-4. Associate the data collection endpoints to the target resources, using the data collection rules experience in Azure portal. This results in the agent using the configured the data collection endpoint(s) for network communications. See [Configure data collection for the Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md).
-
- ![Data collection endpoint network isolation](media/azure-monitor-agent-dce/data-collection-endpoint-network-isolation.png)
## Next steps - [Associate endpoint to machines](../agents/data-collection-rule-azure-monitor-agent.md#create-data-collection-rule-and-association)
azure-monitor Data Collection Rule Azure Monitor Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-collection-rule-azure-monitor-agent.md
To collect data from virtual machines using the Azure Monitor agent, you'll:
1. Create [data collection rules (DCR)](../essentials/data-collection-rule-overview.md) that define which data Azure Monitor agent sends to which destinations. 1. Associate the data collection rule to specific virtual machines.
- You can associate virtual machines to multiple data collection rules. This allows you to define each data collection rule to address a particular requirement, and associate the data collection rules to virtual machines based on the specific data you want to collect from each machine.
+You can associate virtual machines to multiple data collection rules. This allows you to define each data collection rule to address a particular requirement, and associate the data collection rules to virtual machines based on the specific data you want to collect from each machine.
## Create data collection rule and association
To send data to Log Analytics, create the data collection rule in the **same reg
### [Portal](#tab/portal)
-1. From the **Monitor** menu, select **Data Collection Rules**.
-1. Select **Create** to create a new Data Collection Rule and associations.
+In the **Monitor** menu in the Azure portal, select **Data Collection Rules** from the **Settings** section. Click **Create** to create a new Data Collection Rule and assignment.
- [![Screenshot showing the Create button on the Data Collection Rules screen.](media/data-collection-rule-azure-monitor-agent/data-collection-rules-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rules-updated.png#lightbox)
-
-1. Provide a **Rule name** and specify a **Subscription**, **Resource Group**, **Region**, and **Platform Type**.
+[![Screenshot of viewing data collection rules in Azure portal.](media/data-collection-rule-azure-monitor-agent/data-collection-rules-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rules-updated.png#lightbox)
- **Region** specifies where the DCR will be created. The virtual machines and their associations can be in any subscription or resource group in the tenant.
+Click **Create** to create a new rule and set of associations. Provide a **Rule name** and specify a **Subscription**, **Resource Group**, and **Region**; the **Region** specifies where the DCR will be created. The virtual machines and their associations can be in any subscription or resource group in the tenant.
+Additionally, choose the appropriate **Platform Type**, which specifies the type of resources this rule can apply to; **Custom** allows both Windows and Linux types. The platform type scopes the creation experience to options relevant to the selected platform.
- **Platform Type** specifies the type of resources this rule can apply to. Custom allows for both Windows and Linux types.
+[![Screenshot of Azure portal form to create new data collection rule.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-basics-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-basics-updated.png#lightbox)
- [![Screenshot showing the Basics tab of the Data Collection Rules screen.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-basics-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-basics-updated.png#lightbox)
+In the **Resources** tab, add the resources (virtual machines, virtual machine scale sets, Arc for servers) that should have the data collection rule applied. The portal installs the Azure Monitor agent on resources that don't already have it and also enables a managed identity on them.
-1. On the **Resources** tab, add the resources (virtual machines, virtual machine scale sets, Arc for servers) to which to associate the data collection rule. The portal will install Azure Monitor Agent on resources that don't already have it installed, and will also enable Azure Managed Identity.
+> [!IMPORTANT]
+> If you need network isolation using private links for collecting data using agents from your resources, then select **Enable Data Collection Endpoints** and select a DCE for each virtual machine. See [Enable network isolation for the Azure Monitor Agent](azure-monitor-agent-data-collection-endpoint.md) for details.
- > [!IMPORTANT]
- > The portal enables System-Assigned managed identity on the target resources, in addition to existing User-Assigned Identities (if any). For existing applications, unless you specify the User-Assigned identity in the request, the machine will default to using System-Assigned Identity instead.
- If you need network isolation using private links, select existing endpoints from the same region for the respective resources, or [create a new endpoint](../essentials/data-collection-endpoint-overview.md).
- [![Data Collection Rule virtual machines](media/data-collection-rule-azure-monitor-agent/data-collection-rule-virtual-machines-with-endpoint.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-virtual-machines-with-endpoint.png#lightbox)
-1. On the **Collect and deliver** tab, select **Add data source** to add a data source and set a destination.
-1. Select a **Data source type**.
-1. Select which data you want to collect. For performance counters, you can select from a predefined set of objects and their sampling rate. For events, you can select from a set of logs and severity levels.
+On the **Collect and deliver** tab, click **Add data source** to add a data source and destination set. Select a **Data source type**, and the corresponding options are displayed. For performance counters, you can select from a predefined set of objects and their sampling rate. For events, you can select from a set of logs or facilities and the severity level.
- [![Data source basic](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-basic-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-basic-updated.png#lightbox)
+[![Screenshot of Azure portal form to select basic performance counters in a data collection rule.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-basic-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-basic-updated.png#lightbox)
-1. Select **Custom** to collect logs and performance counters that are not [currently supported data sources](azure-monitor-agent-overview.md#data-sources-and-destinations) or to [filter events using XPath queries](#filter-events-using-xpath-queries). You can then specify an [XPath](https://www.w3schools.com/xml/xpath_syntax.asp) to collect any specific values. See [Sample DCR](data-collection-rule-sample-agent.md) for an example.
- [![Data source custom](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-custom-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-custom-updated.png#lightbox)
+To specify other logs and performance counters from the [currently supported data sources](azure-monitor-agent-overview.md#data-sources-and-destinations) or to filter events using XPath queries, select **Custom**. You can then specify an [XPath](https://www.w3schools.com/xml/xpath_syntax.asp) for any specific values to collect. See [Sample DCR](data-collection-rule-sample-agent.md) for an example.
-1. On the **Destination** tab, add one or more destinations for the data source. You can select multiple destinations of the same or different types - for instance multiple Log Analytics workspaces (known as "multi-homing").
+[![Screenshot of Azure portal form to select custom performance counters in a data collection rule.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-custom-updated.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-data-source-custom-updated.png#lightbox)
- You can send Windows event and Syslog data sources to Azure Monitor Logs only. You can send performance counters to both Azure Monitor Metrics and Azure Monitor Logs.
+On the **Destination** tab, add one or more destinations for the data source. You can select multiple destinations of the same or different types, for instance multiple Log Analytics workspaces (known as "multi-homing"). Windows event and Syslog data sources can send only to Azure Monitor Logs. Performance counters can send to both Azure Monitor Metrics and Azure Monitor Logs.
- [![Destination](media/data-collection-rule-azure-monitor-agent/data-collection-rule-destination.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-destination.png#lightbox)
+[![Screenshot of Azure portal form to add a data source in a data collection rule.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-destination.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-destination.png#lightbox)
-1. Select **Add Data Source** and then **Review + create** to review the details of the data collection rule and association with the set of virtual machines.
-1. Select **Create** to create the data collection rule.
+Click **Add Data Source** and then **Review + create** to review the details of the data collection rule and association with the set of VMs. Click **Create** to create it.
> [!NOTE]
-> It might take up to 5 minutes for data to be sent to the destinations after you create the data collection rule and associations.
+> After the data collection rule and associations have been created, it might take up to 5 minutes for data to be sent to the destinations.
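Once data starts flowing, a quick way to confirm that an associated agent is sending data is to query the destination Log Analytics workspace. The following is a hedged sketch using Azure PowerShell; it assumes the `Az.OperationalInsights` module is installed and uses your workspace (customer) ID as a placeholder.

```powershell
# Placeholder: the workspace ID (a GUID) is shown on the Log Analytics workspace overview page.
$workspaceId = "<WORKSPACE_ID>"

# Heartbeat records are sent by the Azure Monitor agent, so recent rows here indicate connectivity.
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId `
    -Query "Heartbeat | where TimeGenerated > ago(30m) | summarize count() by Computer"

$result.Results
```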
+## Create rule and association in Azure portal
+
+You can use the Azure portal to create a data collection rule and associate virtual machines in your subscription to that rule. The Azure Monitor agent will be automatically installed and a managed identity created for any virtual machines that don't already have it installed.
+
+> [!IMPORTANT]
+> Creating a data collection rule using the portal also enables System-Assigned managed identity on the target resources, in addition to existing User-Assigned identities (if any). For existing applications, unless they specify the User-Assigned identity in the request, the machine will default to using the System-Assigned identity instead. [Learn More](../../active-directory/managed-identities-azure-resources/managed-identities-faq.md#what-identity-will-imds-default-to-if-dont-specify-the-identity-in-the-request)
+++
+> [!NOTE]
+> If you wish to send data to Log Analytics, you must create the data collection rule in the **same region** where your Log Analytics workspace resides. The rule can be associated to machines in other supported region(s).
++
+## Limit data collection with custom XPath queries
+Since you're charged for any data collected in a Log Analytics workspace, you should collect only the data that you require. Using basic configuration in the Azure portal, you only have limited ability to filter events to collect. For Application and System logs, this is all logs with a particular severity. For Security logs, this is all audit success or all audit failure logs.
+
+To specify additional filters, you must use Custom configuration and specify an XPath that filters out the events you don't need. XPath entries are written in the form `LogName!XPathQuery`. For example, you may want to return only events from the Application event log with an event ID of 1035. The XPathQuery for these events would be `*[System[EventID=1035]]`. Since you want to retrieve the events from the Application event log, the XPath would be `Application!*[System[EventID=1035]]`.
+
+### Extracting XPath queries from Windows Event Viewer
+One way to create XPath queries is to use Windows Event Viewer to extract them, as shown below.
+
+* In step 5 when pasting over the 'Select Path' parameter value, you must append the log type category followed by '!' and then paste the copied value.
+
+[![Screenshot of steps in Azure portal showing the steps to create an XPath query in the Windows Event Viewer.](media/data-collection-rule-azure-monitor-agent/data-collection-rule-extract-xpath.png)](media/data-collection-rule-azure-monitor-agent/data-collection-rule-extract-xpath.png#lightbox)
+
+See [XPath 1.0 limitations](/windows/win32/wes/consuming-events#xpath-10-limitations) for a list of limitations in the XPath supported by Windows event log.
+
+> [!TIP]
+> You can use the PowerShell cmdlet `Get-WinEvent` with the `FilterXPath` parameter to test the validity of an XPathQuery locally on your machine first. The following script shows an example.
+>
+> ```powershell
+> $XPath = '*[System[EventID=1035]]'
+> Get-WinEvent -LogName 'Application' -FilterXPath $XPath
+> ```
+>
+> - **In the cmdlet above, the value for the '-LogName' parameter is the initial part of the XPath query up to the '!'; only the rest of the XPath query goes into the $XPath parameter.**
+> - If events are returned, the query is valid.
+> - If you receive the message *No events were found that match the specified selection criteria.*, the query may be valid, but there are no matching events on the local machine.
+> - If you receive the message *The specified query is invalid* , the query syntax is invalid.
+
+The following table shows examples for filtering events using a custom XPath.
+
+| Description | XPath |
+|:|:|
+| Collect only System events with Event ID = 4648 | `System!*[System[EventID=4648]]` |
+| Collect Security Log events with Event ID = 4648 and a process name of consent.exe | `Security!*[System[(EventID=4648)]] and *[EventData[Data[@Name='ProcessName']='C:\Windows\System32\consent.exe']]` |
+| Collect all Critical, Error, Warning, and Information events from the System event log except for Event ID = 6 (Driver loaded) | `System!*[System[(Level=1 or Level=2 or Level=3) and (EventID != 6)]]` |
+| Collect all success and failure Security events except for Event ID 4624 (Successful logon) | `Security!*[System[(band(Keywords,13510798882111488)) and (EventID != 4624)]]` |
++
+## Create rule and association using REST API
+
+Follow the steps below to create a data collection rule and associations using the REST API.
+
+> [!NOTE]
+> If you wish to send data to Log Analytics, you must create the data collection rule in the **same region** where your Log Analytics workspace resides. The rule can be associated to machines in other supported region(s).
### [API](#tab/api) 1. Create a DCR file using the JSON format shown in [Sample DCR](data-collection-rule-sample-agent.md).
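The association step on the API path can also be scripted. The following is a rough sketch, not the article's exact steps: it associates an existing DCR with a virtual machine using `Invoke-AzRestMethod`, and the resource IDs, association name, and API version are placeholders or assumptions.

```powershell
# Placeholders: supply the full Azure resource IDs of your virtual machine and data collection rule.
$vmResourceId    = "<VM_RESOURCE_ID>"
$dcrResourceId   = "<DCR_RESOURCE_ID>"
$associationName = "myDcrAssociation"

$body = @{
    properties = @{
        dataCollectionRuleId = $dcrResourceId
    }
} | ConvertTo-Json -Depth 5

# Assumes the 2021-04-01 API version of data collection rule associations.
Invoke-AzRestMethod -Method PUT `
    -Path "$vmResourceId/providers/Microsoft.Insights/dataCollectionRuleAssociations/${associationName}?api-version=2021-04-01" `
    -Payload $body
```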
azure-monitor Data Collection Text Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-collection-text-log.md
# Collect text and IIS logs with Azure Monitor agent (preview) This article describes how to configure the collection of file-based text logs, including logs generated by IIS on Windows computers, with the [Azure Monitor agent](azure-monitor-agent-overview.md). Many applications log information to text files instead of standard logging services such as Windows Event log or Syslog.
->[!IMPORTANT]
-> This feature is currently in preview. You must submit a request for it to be enabled in your subscriptions at [Azure Monitor Logs: DCR-based Custom Logs Preview Signup](https://aka.ms/CustomLogsOnboard).
## Prerequisites To complete this procedure, you need the following:
Use the **Tables - Update** API to create the table with the PowerShell code bel
1. Click the **Cloud Shell** button in the Azure portal and ensure the environment is set to **PowerShell**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/open-cloud-shell.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening Cloud Shell in the Azure portal.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/open-cloud-shell.png" lightbox="../logs/media/tutorial-workspace-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening Cloud Shell in the Azure portal.":::
2. Copy the following PowerShell code and replace the **Path** parameter with the appropriate values for your workspace in the `Invoke-AzRestMethod` command. Paste it into the Cloud Shell prompt to run it.
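The referenced script isn't reproduced in this change log, but the general shape of such a call is sketched below as an illustration only, not the article's exact code; the table name, column set, and API version are assumptions to replace with the values from the article.

```powershell
# Illustration only: a custom log table name must end in _CL; the columns shown are assumptions.
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime" },
                { "name": "RawData", "type": "string" }
            ]
        }
    }
}
'@

# Assumes the 2021-12-01-preview API version of the Tables - Update API.
Invoke-AzRestMethod -Method PUT `
    -Path "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.OperationalInsights/workspaces/<WORKSPACE_NAME>/tables/MyTable_CL?api-version=2021-12-01-preview" `
    -Payload $tableParams
```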
A [data collection endpoint (DCE)](../essentials/data-collection-endpoint-overvi
1. In the Azure portal's search box, type in *template* and then select **Deploy a custom template**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/deploy-custom-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows portal blade to deploy custom template.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows portal blade to deploy custom template.":::
2. Click **Build your own template in the editor**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/build-custom-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/build-custom-template.png" alt-text="Screenshot that shows portal blade to build template in the editor.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot that shows portal blade to build template in the editor.":::
3. Paste the Resource Manager template below into the editor and then click **Save**. You don't need to modify this template since you will provide values for its parameters.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/edit-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/edit-template.png" alt-text="Screenshot that shows portal blade to edit Resource Manager template.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/edit-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows portal blade to edit Resource Manager template.":::
```json {
A [data collection endpoint (DCE)](../essentials/data-collection-endpoint-overvi
4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the data collection rule and then provide a **Name** for the data collection endpoint. The **Location** should be the same location as the workspace. The **Region** will already be populated and is used for the location of the data collection endpoint.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/custom-deployment-values.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/custom-deployment-values.png" alt-text="Screenshot that shows portal blade to edit custom deployment values for data collection endpoint.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/custom-deployment-values.png" lightbox="../logs/media/tutorial-workspace-transformations-api/custom-deployment-values.png" alt-text="Screenshot that shows portal blade to edit custom deployment values for data collection endpoint.":::
5. Click **Review + create** and then **Create** when you review the details. 6. Once the DCE is created, select it so you can view its properties. Note the **Logs ingestion URI** since you'll need this in a later step.
- :::image type="content" source="../logs/media/tutorial-custom-logs-api/data-collection-endpoint-overview.png" lightbox="../logs/media/tutorial-custom-logs-api/data-collection-endpoint-overview.png" alt-text="Screenshot that shows portal blade with details of data collection endpoint uri.":::
+ :::image type="content" source="../logs/media/tutorial-logs-ingestion-api/data-collection-endpoint-overview.png" lightbox="../logs/media/tutorial-logs-ingestion-api/data-collection-endpoint-overview.png" alt-text="Screenshot that shows portal blade with details of data collection endpoint uri.":::
7. Click **JSON View** to view other details for the DCE. Copy the **Resource ID** since you'll need this in a later step.
- :::image type="content" source="../logs/media/tutorial-custom-logs-api/data-collection-endpoint-json.png" lightbox="../logs/media/tutorial-custom-logs-api/data-collection-endpoint-json.png" alt-text="Screenshot that shows JSON view for data collection endpoint with the resource ID.":::
+ :::image type="content" source="../logs/media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" lightbox="../logs/media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" alt-text="Screenshot that shows JSON view for data collection endpoint with the resource ID.":::
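If you'd rather script the deployment than use the portal steps above, the saved template can also be deployed with PowerShell. This is only a sketch; the resource group, template file name, and parameter names (`dataCollectionEndpointName`, `location`) are assumptions and must match the parameters declared in the template you saved.

```powershell
# Sketch: deploy the saved DCE template from PowerShell instead of the portal.
# Parameter names must match those declared in the template file.
New-AzResourceGroupDeployment `
    -ResourceGroupName "my-resource-group" `
    -TemplateFile ".\dce-template.json" `
    -TemplateParameterObject @{
        dataCollectionEndpointName = "my-dce"
        location                   = "eastus"
    }
```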
## Create data collection rule
The [data collection rule (DCR)](../essentials/data-collection-rule-overview.md)
1. The data collection rule requires the resource ID of your workspace. Navigate to your workspace in the **Log Analytics workspaces** menu in the Azure portal. From the **Properties** page, copy the **Resource ID** and save it for later use.
- :::image type="content" source="../logs/media/tutorial-custom-logs-api/workspace-resource-id.png" lightbox="../logs/media/tutorial-custom-logs-api/workspace-resource-id.png" alt-text="Screenshot showing workspace resource ID.":::
+ :::image type="content" source="../logs/media/tutorial-logs-ingestion-api/workspace-resource-id.png" lightbox="../logs/media/tutorial-logs-ingestion-api/workspace-resource-id.png" alt-text="Screenshot showing workspace resource ID.":::
1. In the Azure portal's search box, type in *template* and then select **Deploy a custom template**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/deploy-custom-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows portal blade to deploy custom template.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot that shows portal blade to deploy custom template.":::
2. Click **Build your own template in the editor**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/build-custom-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/build-custom-template.png" alt-text="Screenshot that shows portal blade to build template in the editor.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot that shows portal blade to build template in the editor.":::
3. Paste one of the Resource Manager templates below into the editor and then change the following values: - `streamDeclarations`: Defines the columns of the incoming data. This must match the structure of the log file. - `filePatterns`: Specifies the location and file pattern of the log files to collect. This defines a separate pattern for Windows and Linux agents.
- - `transformKql`: Specifies a [transformation](../logs/../essentials/data-collection-rule-transformations.md) to apply to the incoming data before it's sent to the workspace. Data collection rules for Azure Monitor agent don't yet support transformations, so this value should currently be `source`.
+ - `transformKql`: Specifies a [transformation](../logs/../essentials/data-collection-transformations.md) to apply to the incoming data before it's sent to the workspace. Data collection rules for Azure Monitor agent don't yet support transformations, so this value should currently be `source`.
4. Click **Save**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/edit-template.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/edit-template.png" alt-text="Screenshot that shows portal blade to edit Resource Manager template.":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/edit-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot that shows portal blade to edit Resource Manager template.":::
**Data collection rule for text log**
Open IIS log on the agent machine to verify logs are in W3C format.
### Share logs with Microsoft If everything is configured properly, but you're still not collecting log data, use the following procedure to collect diagnostics logs for Azure Monitor agent to share with the Azure Monitor group.
-1. Open an elevated powershell window.
+1. Open an elevated PowerShell window.
2. Change to directory `C:\Packages\Plugins\Microsoft.Azure.Monitor.AzureMonitorWindowsAgent\[version]\`. 3. Execute the script: `.\CollectAMALogs.ps1`. 4. Share the `AMAFiles.zip` file generated on the desktop.
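Run as a single elevated PowerShell session, those steps look roughly like the following sketch. The version folder differs per machine, so it's resolved dynamically here; adjust if more than one agent version is installed.

```powershell
# Sketch: collect Azure Monitor agent diagnostic logs in one elevated session.
$pluginRoot = 'C:\Packages\Plugins\Microsoft.Azure.Monitor.AzureMonitorWindowsAgent'

# Pick the highest version folder (lexical sort; adjust if needed).
$versionDir = Get-ChildItem -Path $pluginRoot -Directory |
    Sort-Object Name -Descending |
    Select-Object -First 1

Set-Location -Path $versionDir.FullName
.\CollectAMALogs.ps1   # writes AMAFiles.zip to the desktop
```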
azure-monitor Data Sources Custom Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-sources-custom-logs.md
# Collect text logs with Log Analytics agent in Azure Monitor
-> [!IMPORTANT]
-> This article describes collecting file based text logs using the Log Analytics agent. It should not be confused with the [custom logs API](../logs/custom-logs-overview.md) which allows you to send data to Azure Monitor Logs using a REST API.
- The Custom Logs data source for the Log Analytics agent in Azure Monitor allows you to collect events from text files on both Windows and Linux computers. Many applications log information to text files instead of standard logging services such as Windows Event log or Syslog. Once collected, you can either parse the data into individual fields in your queries or extract the data during collection to individual fields. [!INCLUDE [Log Analytics agent deprecation](../../../includes/log-analytics-agent-deprecation.md)]
azure-monitor Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/data-sources.md
- Title: Sources of data in Azure Monitor | Microsoft Docs
-description: Describes the data available to monitor the health and performance of your Azure resources and the applications running on them.
--- Previously updated : 02/07/2022----
-# Sources of monitoring data for Azure Monitor
-Azure Monitor is based on a [common monitoring data platform](../data-platform.md) that includes [Logs](../logs/data-platform-logs.md) and [Metrics](../essentials/data-platform-metrics.md). Collecting data into this platform allows data from multiple resources to be analyzed together using a common set of tools in Azure Monitor. Monitoring data may also be sent to other locations to support certain scenarios, and some resources may write to other locations before they can be collected into Logs or Metrics.
-
-This article describes the different sources of monitoring data collected by Azure Monitor in addition to the monitoring data created by Azure resources. Links are provided to detailed information on configuration required to collect this data to different locations.
-
-## Application tiers
-
-Sources of monitoring data from Azure applications can be organized into tiers, the highest tiers being your application itself and the lower tiers being components of Azure platform. The method of accessing data from each tier varies. The application tiers are summarized in the table below, and the sources of monitoring data in each tier are presented in the following sections. See [Monitoring data locations in Azure](../monitor-reference.md) for a description of each data location and how you can access its data.
--
-![Monitoring tiers](../media/overview/overview.png)
--
-### Azure
-The following table briefly describes the application tiers that are specific to Azure. Following the link for further details on each in the sections below.
-
-| Tier | Description | Collection method |
-|:|:|:|
-| [Azure Tenant](#azure-tenant) | Data about the operation of tenant-level Azure services, such as Azure Active Directory. | View AAD data in portal or configure collection to Azure Monitor using a tenant diagnostic setting. |
-| [Azure subscription](#azure-subscription) | Data related to the health and management of cross-resource services in your Azure subscription such as Resource Manager and Service Health. | View in portal or configure collection to Azure Monitor using a log profile. |
-| [Azure resources](#azure-resources) | Data about the operation and performance of each Azure resource. | Metrics collected automatically, view in Metrics Explorer.<br>Configure diagnostic settings to collect logs in Azure Monitor.<br>Monitoring solutions and Insights available for more detailed monitoring for specific resource types. |
-
-### Azure, other cloud, or on-premises
-The following table briefly describes the application tiers that may be in Azure, another cloud, or on-premises. Following the link for further details on each in the sections below.
-
-| Tier | Description | Collection method |
-|:|:|:|
-| [Operating system (guest)](#operating-system-guest) | Data about the operating system on compute resources. | Install Azure Monitor agent on virtual machines, scale sets and Arc-enabled servers to collect logs and metrics into Azure Monitor. |
-| [Application Code](#application-code) | Data about the performance and functionality of the actual application and code, including performance traces, application logs, and user telemetry. | Instrument your code to collect data into Application Insights. |
-| [Custom sources](#custom-sources) | Data from external services or other components or devices. | Collect log or metrics data into Azure Monitor from any REST client. |
-
-## Azure tenant
-Telemetry related to your Azure tenant is collected from tenant-wide services such as Azure Active Directory.
-
-![Azure tenant collection](media/data-sources/tenant.png)
-
-### Azure Active Directory Audit Logs
-[Azure Active Directory reporting](../../active-directory/reports-monitoring/overview-reports.md) contains the history of sign-in activity and audit trail of changes made within a particular tenant.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | Configure Azure AD logs to be collected in Azure Monitor to analyze them with other monitoring data. | [Integrate Azure AD logs with Azure Monitor logs](../../active-directory/reports-monitoring/howto-integrate-activity-logs-with-log-analytics.md) |
-| Azure Storage | Export Azure AD logs to Azure Storage for archiving. | [Tutorial: Archive Azure AD logs to an Azure storage account](../../active-directory/reports-monitoring/quickstart-azure-monitor-route-logs-to-storage-account.md) |
-| Event Hub | Stream Azure AD logs to other locations using Event Hubs. | [Tutorial: Stream Azure Active Directory logs to an Azure event hub](../../active-directory/reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md). |
---
-## Azure subscription
-Telemetry related to the health and operation of your Azure subscription.
-
-![Azure subscription](media/data-sources/azure-subscription.png)
-
-### Azure Activity log
-The [Azure Activity log](../essentials/platform-logs-overview.md) includes service health records along with records on any configuration changes made to the resources in your Azure subscription. The Activity log is available to all Azure resources and represents their _external_ view.
-
-| Destination | Description | Reference |
-|:|:|
-| Activity log | The Activity log is collected into its own data store that you can view from the Azure Monitor menu or use to create Activity log alerts. | [Query the Activity log in the Azure portal](../essentials/activity-log.md#view-the-activity-log) |
-| Azure Monitor Logs | Configure Azure Monitor Logs to collect the Activity log to analyze it with other monitoring data. | [Collect and analyze Azure activity logs in Log Analytics workspace in Azure Monitor](../essentials/activity-log.md) |
-| Azure Storage | Export the Activity log to Azure Storage for archiving. | [Archive Activity log](../essentials/resource-logs.md#send-to-azure-storage) |
-| Event Hubs | Stream the Activity log to other locations using Event Hubs | [Stream Activity log to Event Hub](../essentials/resource-logs.md#send-to-azure-event-hubs). |
-
-### Azure Service Health
-[Azure Service Health](../../service-health/service-health-overview.md) provides information about the health of the Azure services in your subscription that your application and resources rely on.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Activity log<br>Azure Monitor Logs | Service Health records are stored in the Azure Activity log, so you can view them in the Azure portal or perform any other activities you can perform with the Activity log. | [View service health notifications by using the Azure portal](../../service-health/service-notifications.md) |
--
-## Azure resources
-Metrics and resource logs provide information about the _internal_ operation of Azure resources. These are available for most Azure services, and monitoring solutions and insights collect additional data for particular services.
-
-![Azure resource collection](media/data-sources/data-source-azure-resources.svg)
--
-### Platform metrics
-Most Azure services will send [platform metrics](../essentials/data-platform-metrics.md) that reflect their performance and operation directly to the metrics database. The specific [metrics will vary for each type of resource](../essentials/metrics-supported.md).
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Metrics | Platform metrics will write to the Azure Monitor metrics database with no configuration. Access platform metrics from Metrics Explorer. | [Getting started with Azure Metrics Explorer](../essentials/metrics-getting-started.md)<br>[Supported metrics with Azure Monitor](../essentials/metrics-supported.md) |
-| Azure Monitor Logs | Copy platform metrics to Logs for trending and other analysis using Log Analytics. | [Azure diagnostics direct to Log Analytics](../essentials/resource-logs.md#send-to-log-analytics-workspace) |
-| Event Hubs | Stream metrics to other locations using Event Hubs. |[Stream Azure monitoring data to an event hub for consumption by an external tool](../essentials/stream-monitoring-data-event-hubs.md) |
-
-### Resource logs
-[Resource logs](../essentials/platform-logs-overview.md) provide insights into the _internal_ operation of an Azure resource. Resource logs are created automatically, but you must create a diagnostic setting to specify a destination for them to collected for each resource.
-
-The configuration requirements and content of resource logs vary by resource type, and not all services yet create them. See [Supported services, schemas, and categories for Azure resource logs](../essentials/resource-logs-schema.md) for details on each service and links to detailed configuration procedures. If the service isn't listed in this article, then that service doesn't currently create resource logs.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | Send resource logs to Azure Monitor Logs for analysis with other collected log data. | [Collect Azure resource logs in Log Analytics workspace in Azure Monitor](../essentials/resource-logs.md#send-to-azure-storage) |
-| Storage | Send resource logs to Azure Storage for archiving. | [Archive Azure resource logs](../essentials/resource-logs.md#send-to-log-analytics-workspace) |
-| Event Hubs | Stream resource logs to other locations using Event Hubs. |[Stream Azure resource logs to an event hub](../essentials/resource-logs.md#send-to-azure-event-hubs) |
-
-## Operating system (guest)
-Compute resources in Azure, in other clouds, and on-premises have a guest operating system to monitor. With the installation of the Azure Monitor agent, you can gather telemetry from the guest into Azure Monitor to analyze it with the same monitoring tools as the Azure services themselves.
-
-![Azure compute resource collection](media/data-sources/compute-resources-updated.png)
-
-### Azure Monitor agent
-[Install the Azure Monitor agent](./azure-monitor-agent-manage.md) for comprehensive monitoring and management of your Windows or Linux virtual machines, scale sets and Arc-enabled servers (resources in other clouds or on-premises with Azure Arc installed, at no additional cost).
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | The Azure Monitor agent allows you to collect logs from data sources that you configure using [data collection rules](./data-collection-rule-azure-monitor-agent.md) or from monitoring solutions that provide additional insights into applications running on the machine. These can be sent to one or more Log Analytics workspaces. | [Data sources and destinations](./azure-monitor-agent-overview.md#data-sources-and-destinations) |
-| Azure Monitor Metrics (preview) | The Azure Monitor agent allows you to collect performance counters and send them to Azure Monitor metrics database | [Data sources and destinations](./azure-monitor-agent-overview.md#data-sources-and-destinations) |
-
-### Azure Diagnostic extension
-Enabling the Azure Diagnostics extension for Azure Virtual machines allows you to collect logs and metrics from the guest operating system of Azure compute resources including Azure Cloud Service (classic) Web and Worker Roles, Virtual Machines, virtual machine scale sets, and Service Fabric.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Storage | Azure diagnostics extension always writes to an Azure Storage account. | [Install and configure Windows Azure diagnostics extension (WAD)](./diagnostics-extension-windows-install.md)<br>[Use Linux Diagnostic Extension to monitor metrics and logs](../../virtual-machines/extensions/diagnostics-linux.md) |
-| Azure Monitor Metrics (preview) | When you configure the Diagnostics Extension to collect performance counters, they are written to the Azure Monitor metrics database. | [Send Guest OS metrics to the Azure Monitor metric store using a Resource Manager template for a Windows virtual machine](../essentials/collect-custom-metrics-guestos-resource-manager-vm.md) |
-| Event Hubs | Configure the Diagnostics Extension to stream the data to other locations using Event Hubs. | [Streaming Azure Diagnostics data by using Event Hubs](./diagnostics-extension-stream-event-hubs.md)<br>[Use Linux Diagnostic Extension to monitor metrics and logs](../../virtual-machines/extensions/diagnostics-linux.md) |
-| Application Insights Logs | Collect logs and performance counters from the compute resource supporting your application to be analyzed with other application data. | [Send Cloud Service, Virtual Machine, or Service Fabric diagnostic data to Application Insights](./diagnostics-extension-to-application-insights.md) |
--
-### VM insights
-[VM insights](../vm/vminsights-overview.md) provides a customized monitoring experience for virtual machines providing features beyond core Azure Monitor functionality. It requires a Dependency Agent on Windows and Linux virtual machines that integrates with the Log Analytics agent to collect discovered data about processes running on the virtual machine and external process dependencies.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | Stores data about processes and dependencies on the agent. | [Using VM insights Map to understand application components](../vm/vminsights-maps.md) |
---
-## Application Code
-Detailed application monitoring in Azure Monitor is done with [Application Insights](/azure/application-insights/) which collects data from applications running on a variety of platforms. The application can be running in Azure, another cloud, or on-premises.
-
-![Application data collection](media/data-sources/applications.png)
--
-### Application data
-When you enable Application Insights for an application by installing an instrumentation package, it collects metrics and logs related to the performance and operation of the application. Application Insights stores the data it collects in the same Azure Monitor data platform used by other data sources. It includes extensive tools for analyzing this data, but you can also analyze it with data from other sources using tools such as Metrics Explorer and Log Analytics.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | Operational data about your application including page views, application requests, exceptions, and traces. | [Analyze log data in Azure Monitor](../logs/log-query-overview.md) |
-| | Dependency information between application components to support Application Map and telemetry correlation. | [Telemetry correlation in Application Insights](../app/correlation.md) <br> [Application Map](../app/app-map.md) |
-| | Results of availability tests that test the availability and responsiveness of your application from different locations on the public Internet. | [Monitor availability and responsiveness of any web site](../app/monitor-web-app-availability.md) |
-| Azure Monitor Metrics | Application Insights collects metrics describing the performance and operation of the application in addition to custom metrics that you define in your application into the Azure Monitor metrics database. | [Log-based and pre-aggregated metrics in Application Insights](../app/pre-aggregated-metrics-log-metrics.md)<br>[Application Insights API for custom events and metrics](../app/api-custom-events-metrics.md) |
-| Azure Storage | Send application data to Azure Storage for archiving. | [Export telemetry from Application Insights](../app/export-telemetry.md) |
-| | Details of availability tests are stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. Results of availability tests are stored in Azure Monitor Logs. | [Monitor availability and responsiveness of any web site](../app/monitor-web-app-availability.md) |
-| | Profiler trace data is stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. | [Profile production applications in Azure with Application Insights](../app/profiler-overview.md)
-| | Debug snapshot data that is captured for a subset of exceptions is stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. | [How snapshots work](../app/snapshot-debugger.md#how-snapshots-work) |
-
-## Monitoring Solutions and Insights
-[Monitoring solutions](../insights/solutions.md) and [Insights](../monitor-reference.md) collect data to provide additional insights into the operation of a particular service or application. They may address resources in different application tiers and even multiple tiers.
-
-### Monitoring solutions
-
-| Destination | Description | Reference
-|:|:|:|
-| Azure Monitor Logs | Monitoring solutions collect data into Azure Monitor logs where it may be analyzed using the query language or [views](../visualize/view-designer.md) that are typically included in the solution. | [Data collection details for monitoring solutions in Azure](../monitor-reference.md) |
--
-### Container insights
-[Container insights](../containers/container-insights-overview.md) provides a customized monitoring experience for [Azure Kubernetes Service (AKS)](../../aks/index.yml). It collects additional data about these resources described in the following table.
-
-| Destination | Description | Reference |
-|:|:|:|
-| Azure Monitor Logs | Stores monitoring data for AKS including inventory, logs, and events. Metric data is also stored in Logs in order to leverage its analysis functionality in the portal. | [Understand AKS cluster performance with Container insights](../containers/container-insights-analyze.md) |
-| Azure Monitor Metrics | Metric data is stored in the metric database to drive visualization and alerts. | [View container metrics in metrics explorer](../containers/container-insights-analyze.md#view-container-metrics-in-metrics-explorer) |
-| Azure Kubernetes Service | Provides direct access to your Azure Kubernetes Service (AKS) container logs (stdout/stderror), events, and pod metrics in the portal. | [How to view Kubernetes logs, events, and pod metrics in real-time](../containers/container-insights-livedata-overview.md) |
-
-### VM insights
-[VM insights](../vm/vminsights-overview.md) provides a customized experience for monitoring virtual machines. A description of the data collected by VM insights is included in the [Operating System (guest)](#operating-system-guest) section above.
-
-## Custom sources
-In addition to the standard tiers of an application, you may need to monitor other resources that have telemetry that can't be collected with the other data sources. For these resources, write this data to either Metrics or Logs using an Azure Monitor API.
-
-![Custom collection](media/data-sources/custom.png)
-
-| Destination | Method | Description | Reference |
-|:|:|:|:|
-| Azure Monitor Logs | Data Collector API | Collect log data from any REST client and store in Log Analytics workspace. | [Send log data to Azure Monitor with the HTTP Data Collector API (public preview)](../logs/data-collector-api.md) |
-| Azure Monitor Metrics | Custom Metrics API | Collect metric data from any REST client and store in Azure Monitor metrics database. | [Send custom metrics for an Azure resource to the Azure Monitor metric store by using a REST API](../essentials/metrics-store-custom-rest-api.md) |
--
-## Other services
-Other services in Azure write data to the Azure Monitor data platform. This allows you to analyze data collected by these services with data collected by Azure Monitor and leverage the same analysis and visualization tools.
-
-| Service | Destination | Description | Reference |
-|:|:|:|:|
-| [Microsoft Defender for Cloud](../../security-center/index.yml) | Azure Monitor Logs | Microsoft Defender for Cloud stores the security data it collects in a Log Analytics workspace which allows it to be analyzed with other log data collected by Azure Monitor. | [Data collection in Microsoft Defender for Cloud](../../security-center/security-center-enable-data-collection.md) |
-| [Microsoft Sentinel](../../sentinel/index.yml) | Azure Monitor Logs | Microsoft Sentinel stores the data it collects from different data sources in a Log Analytics workspace which allows it to be analyzed with other log data collected by Azure Monitor. | [Connect data sources](../../sentinel/quickstart-onboard.md) |
--
-## Next steps
--- Learn more about the [types of monitoring data collected by Azure Monitor](../data-platform.md) and how to view and analyze this data.-- List the [different locations where Azure resources store data](../monitor-reference.md) and how you can access it.
azure-monitor Asp Net Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/asp-net-core.md
If you want to disable telemetry conditionally and dynamically, you can resolve
The preceding code sample prevents the sending of telemetry to Application Insights. It doesn't prevent any automatic collection modules from collecting telemetry. If you want to remove a particular auto collection module, see [remove the telemetry module](#configuring-or-removing-default-telemetrymodules).
+## Frequently asked questions
+
+### Does Application Insights support ASP.NET Core 3.X?
+
+Yes. Update to [Application Insights SDK for ASP.NET Core](https://nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore) version 2.8.0 or later. Earlier versions of the SDK don't support ASP.NET Core 3.X.
+
+Also, if you're [enabling server-side telemetry based on Visual Studio](#enable-application-insights-server-side-telemetry-visual-studio), update to the latest version of Visual Studio 2019 (16.3.0) to onboard. Earlier versions of Visual Studio don't support automatic onboarding for ASP.NET Core 3.X apps.
+
+### How can I track telemetry that's not automatically collected?
+
+Get an instance of `TelemetryClient` by using constructor injection, and call the required `TrackXXX()` method on it. We don't recommend creating new `TelemetryClient` or `TelemetryConfiguration` instances in an ASP.NET Core application. A singleton instance of `TelemetryClient` is already registered in the `DependencyInjection` container, which shares `TelemetryConfiguration` with the rest of the telemetry. Creating a new `TelemetryClient` instance is recommended only if it needs a configuration that's separate from the rest of the telemetry.
+
+The following example shows how to track more telemetry from a controller.
+
+```csharp
+using Microsoft.ApplicationInsights;
+
+public class HomeController : Controller
+{
+ private TelemetryClient telemetry;
+
+ // Use constructor injection to get a TelemetryClient instance.
+ public HomeController(TelemetryClient telemetry)
+ {
+ this.telemetry = telemetry;
+ }
+
+ public IActionResult Index()
+ {
+ // Call the required TrackXXX method.
+ this.telemetry.TrackEvent("HomePageRequested");
+ return View();
+ }
+}
+```
+
+For more information about custom data reporting in Application Insights, see [Application Insights custom metrics API reference](./api-custom-events-metrics.md). A similar approach can be used for sending custom metrics to Application Insights using the [GetMetric API](./get-metric.md).
+
+### How do I customize ILogger logs collection?
+
+By default, only `Warning` logs and more severe logs are automatically captured. To change this behavior, explicitly override the logging configuration for the provider `ApplicationInsights` as shown below.
+The following configuration allows ApplicationInsights to capture all `Information` logs and more severe logs.
+
+```json
+{
+ "Logging": {
+ "LogLevel": {
+ "Default": "Warning"
+ },
+ "ApplicationInsights": {
+ "LogLevel": {
+ "Default": "Information"
+ }
+ }
+ }
+}
+```
+
+It's important to note that the following example doesn't cause the ApplicationInsights provider to capture `Information` logs. The provider doesn't capture them because the SDK adds a default logging filter that instructs `ApplicationInsights` to capture only `Warning` logs and more severe logs. ApplicationInsights requires an explicit override.
+
+```json
+{
+ "Logging": {
+ "LogLevel": {
+ "Default": "Information"
+ }
+ }
+}
+```
+
+For more information, see [ILogger configuration](ilogger.md#logging-level).
+
+### Some Visual Studio templates used the UseApplicationInsights() extension method on IWebHostBuilder to enable Application Insights. Is this usage still valid?
+
+The extension method `UseApplicationInsights()` is still supported, but it's marked as obsolete in Application Insights SDK version 2.8.0 and later. It will be removed in the next major version of the SDK. To enable Application Insights telemetry, we recommend using `AddApplicationInsightsTelemetry()` because it provides overloads to control some configuration. Also, in ASP.NET Core 3.X apps, `services.AddApplicationInsightsTelemetry()` is the only way to enable Application Insights.
+
+### I'm deploying my ASP.NET Core application to Web Apps. Should I still enable the Application Insights extension from Web Apps?
+
+If the SDK is installed at build time as shown in this article, you don't need to enable the [Application Insights extension](./azure-web-apps.md) from the App Service portal. If the extension is installed, it will back off when it detects the SDK is already added. If you enable Application Insights from the extension, you don't have to install and update the SDK. But if you enable Application Insights by following instructions in this article, you have more flexibility because:
+
+ * Application Insights telemetry will continue to work in:
+ * All operating systems, including Windows, Linux, and Mac.
+ * All publish modes, including self-contained or framework dependent.
+ * All target frameworks, including the full .NET Framework.
+ * All hosting options, including Web Apps, VMs, Linux, containers, Azure Kubernetes Service, and non-Azure hosting.
+ * All .NET Core versions including preview versions.
+ * You can see telemetry locally when you're debugging from Visual Studio.
+ * You can track more custom telemetry by using the `TrackXXX()` API.
+ * You have full control over the configuration.
+
+### Can I enable Application Insights monitoring by using tools like Azure Monitor Application Insights Agent (formerly Status Monitor v2)?
+
+ Yes. In [Application Insights Agent 2.0.0-beta1](https://www.powershellgallery.com/packages/Az.ApplicationMonitor/2.0.0-beta1) and later, ASP.NET Core applications hosted in IIS are supported.
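+
+As a rough sketch of that onboarding path, the agent is installed and enabled from an elevated PowerShell prompt on the web server. The instrumentation key below is a placeholder, and `-AllowPrerelease` is only needed while you're using a beta version of the module.
+
+```powershell
+# Sketch: install Application Insights Agent (formerly Status Monitor v2) and
+# enable monitoring for apps hosted in IIS. Run from an elevated prompt.
+Install-Module -Name Az.ApplicationMonitor -AllowPrerelease
+Enable-ApplicationInsightsMonitoring -InstrumentationKey "00000000-0000-0000-0000-000000000000"
+```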
+
+### Are all features supported if I run my application in Linux?
+
+Yes. Feature support for the SDK is the same in all platforms, with the following exceptions:
+
+* The SDK collects [Event Counters](./eventcounters.md) on Linux because [Performance Counters](./performance-counters.md) are only supported in Windows. Most metrics are the same.
+* Although `ServerTelemetryChannel` is enabled by default, if the application is running in Linux or macOS, the channel doesn't automatically create a local storage folder to keep telemetry temporarily if there are network issues. Because of this limitation, telemetry is lost when there are temporary network or server issues. To work around this issue, configure a local folder for the channel:
+
+```csharp
+using Microsoft.ApplicationInsights.Channel;
+using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;
+
+ public void ConfigureServices(IServiceCollection services)
+ {
+ // The following will configure the channel to use the given folder to temporarily
+ // store telemetry items during network or Application Insights server issues.
+ // User should ensure that the given folder already exists
+ // and that the application has read/write permissions.
+ services.AddSingleton(typeof(ITelemetryChannel),
+ new ServerTelemetryChannel () {StorageFolder = "/tmp/myfolder"});
+ services.AddApplicationInsightsTelemetry();
+ }
+```
+
+This limitation isn't applicable from version [2.15.0](https://www.nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore/2.15.0) and later.
+
+### Is this SDK supported for the new .NET Core 3.X Worker Service template applications?
+
+This SDK requires `HttpContext`; therefore, it doesn't work in any non-HTTP applications, including the .NET Core 3.X Worker Service applications. To enable Application Insights in such applications using the newly released Microsoft.ApplicationInsights.WorkerService SDK, see [Application Insights for Worker Service applications (non-HTTP applications)](worker-service.md).
+ ## Open-source SDK * [Read and contribute to the code](https://github.com/microsoft/ApplicationInsights-dotnet).
azure-monitor Usage Heart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/usage-heart.md
Happiness is a user-reported dimension that measures how users feel about the pr
A common approach to measure happiness is to ask users a Customer Satisfaction (CSAT) question like *How satisfied are you with this product?*. Users' responses on a three or a five-point scale (for example, *no, maybe,* and *yes*) are aggregated to create a product-level score ranging from 1-5. Since user-initiated feedback tends to be negatively biased, HEART tracks happiness from surveys displayed to users at pre-defined intervals.
-Common happiness metrics include values such as *Average Star Rating* and *Customer Satisfaction Score*. Send these values to Azure Monitor using one of the custom ingestion methods described in [Custom sources](../agents/data-sources.md#custom-sources).
+Common happiness metrics include values such as *Average Star Rating* and *Customer Satisfaction Score*. Send these values to Azure Monitor using one of the custom ingestion methods described in [Custom sources](../data-sources.md#custom-sources).
For more on editing workbook templates, refer to the [Azure Workbook templates](
## Next steps - Set up the [Click Analytics Auto Collection Plugin](javascript-click-analytics-plugin.md) via npm.-- Check out the [GitHub Repository](https://github.com/microsoft/ApplicationInsights-JS/tree/master/extensions/applicationinsights-clickanalytics-js) and [NPM Package](https://www.npmjs.com/package/@microsoft/applicationinsights-clickanalytics-js) for the Click Analytics Auto Collection Plugin.
+- Check out the [GitHub Repository](https://github.com/microsoft/ApplicationInsights-JS/tree/master/extensions/applicationinsights-clickanalytics-js) and [npm Package](https://www.npmjs.com/package/@microsoft/applicationinsights-clickanalytics-js) for the Click Analytics Auto Collection Plugin.
- Use [Events Analysis in Usage Experience](usage-segmentation.md) to analyze top clicks and slice by available dimensions. - Find click data under content field within customDimensions attribute in CustomEvents table in [Log Analytics](../logs/log-analytics-tutorial.md#write-a-query). See [Sample App](https://go.microsoft.com/fwlink/?linkid=2152871) for more guidance. - Learn more about [Google's HEART framework](https://storage.googleapis.com/pub-tools-public-publication-data/pdf/36299.pdf).
azure-monitor Autoscale Predictive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/autoscale/autoscale-predictive.md
Title: Use predictive autoscale to scale out before load demands in virtual machine scale sets (Preview) description: Details on the new predictive autoscale feature in Azure Monitor. Previously updated : 01/24/2022++ Last updated : 07/18/2022
azure-monitor Best Practices Cost https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/best-practices-cost.md
You can save on data ingestion costs by configuring [certain tables](logs/basic-
The decision whether to configure a table for Basic Logs is based on the following criteria: -- The table currently support Basic Logs.
+- The table currently supports Basic Logs.
- You don't require more than eight days of data retention for the table. - You only require basic queries of the data using a limited version of the query language.-- The cost savings for data ingestion over a month exceeds the expected cost for any expected queries
+- The cost savings for data ingestion over a month exceed the expected cost for any expected queries
See [Query Basic Logs in Azure Monitor (Preview)](.//logs/basic-logs-query.md) for details on query limitations and [Configure Basic Logs in Azure Monitor (Preview)](logs/basic-logs-configure.md) for more details about them.
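For reference, switching an eligible table to Basic Logs is a single **Tables - Update** call. The following is a sketch only; the path placeholders, the example table (`ContainerLogV2`), and the api-version are assumptions to adapt to your workspace.

```powershell
# Sketch: move an eligible table to the Basic Logs plan.
# Path segments and the api-version are placeholders; adjust for your workspace.
Invoke-AzRestMethod `
    -Path "/subscriptions/{subscription}/resourceGroups/{resourcegroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}/tables/ContainerLogV2?api-version=2021-12-01-preview" `
    -Method PATCH `
    -Payload '{ "properties": { "plan": "Basic" } }'
```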
Virtual machines can vary significantly in the amount of data they collect, depe
### Use transformations to filter events
-The bulk of data collection from virtual machines will be from Windows or Syslog events. While you can provide more filtering with the Azure Monitor agent, you still may be collecting records that provide little value. Use [transformations](essentials/data-collection-rule-transformations.md) to implement more granular filtering and also to filter data from columns that provide little value. For example, you might have a Windows event that's valuable for alerting, but it includes columns with redundant or excessive data. You can create a transformation that allows the event to be collected but removes this excessive data.
+The bulk of data collection from virtual machines will be from Windows or Syslog events. While you can provide more filtering with the Azure Monitor agent, you may still be collecting records that provide little value. Use [transformations](essentials/data-collection-transformations.md) to implement more granular filtering and also to filter data from columns that provide little value. For example, you might have a Windows event that's valuable for alerting, but it includes columns with redundant or excessive data. You can create a transformation that allows the event to be collected but removes this excessive data.
See the section below on filtering data with transformations for a summary on where to implement filtering and transformations for different data sources. ### Multi-homing agents You should be cautious with any configuration using multi-homed agents where a single virtual machine sends data to multiple workspaces since you may be incurring charges for the same data multiple times. If you do multi-home agents, ensure that you're sending unique data to each workspace.
-You can also collect duplicate data with a single virtual machine running both the Azure Monitor agent and Log Analytics agent, even if they're both sending data to the same workspace. While the agents can coexist, each works independently without any knowledge of the other. You should continue to use the Log Analytics agent until you [migrate to the Azure Monitor agent](./agents/azure-monitor-agent-migration.md) rather than using both together unless you can ensure that each are collecting unique data.
+You can also collect duplicate data with a single virtual machine running both the Azure Monitor agent and Log Analytics agent, even if they're both sending data to the same workspace. While the agents can coexist, each works independently without any knowledge of the other. You should continue to use the Log Analytics agent until you [migrate to the Azure Monitor agent](./agents/azure-monitor-agent-migration.md) rather than using both together unless you can ensure that each is collecting unique data.
See [Analyze usage in Log Analytics workspace](logs/analyze-usage.md) for guidance on analyzing your collected data to ensure that you aren't collecting duplicate data for the same machine.
There are multiple methods that you can use to limit the amount of data collecte
## Resource logs The data volume for [resource logs](essentials/resource-logs.md) varies significantly between services, so you should only collect the categories that are required. You may also not want to collect platform metrics from Azure resources since this data is already being collected in Metrics. Only configure your diagnostic data to collect metrics if you need metric data in the workspace for more complex analysis with log queries.
-Diagnostic settings do not allow granular filtering of resource logs. You may require certain logs in a particular category but not others. In this case, use [ingestion-time transformations](logs/ingestion-time-transformations.md) on the workspace to filter logs that you don't require. You can also filter out the value of certain columns that you don't require to save additional cost.
+Diagnostic settings do not allow granular filtering of resource logs. You may require certain logs in a particular category but not others. In this case, use [transformations](essentials/data-collection-transformations.md) on the workspace to filter logs that you don't require. You can also filter out the value of certain columns that you don't require to save additional cost.
## Other insights and services See the documentation for other services that store their data in a Log Analytics workspace for recommendations on optimizing their data usage. Following
See the documentation for other services that store their data in a Log Analytic
## Filter data with transformations (preview)
-[Data collection rule transformations in Azure Monitor](essentials/data-collection-rule-transformations.md) allow you to filter incoming data to reduce costs for data ingestion and retention. In addition to filtering records from the incoming data, you can filter out columns in the data, reducing its billable size as described in [Data size calculation](logs/cost-logs.md#data-size-calculation).
+[Data collection rule transformations in Azure Monitor](essentials/data-collection-transformations.md) allow you to filter incoming data to reduce costs for data ingestion and retention. In addition to filtering records from the incoming data, you can filter out columns in the data, reducing its billable size as described in [Data size calculation](logs/cost-logs.md#data-size-calculation).
Use ingestion-time transformations on the workspace to further filter data for workflows where you don't have granular control. For example, you can select categories in a [diagnostic setting](essentials/diagnostic-settings.md) to collect resource logs for a particular service, but that category might send a variety of records that you don't need. Create a transformation for the table that service uses to filter out records you don't want.
The following table describes methods to apply transformations to different workflows.
| Azure Monitor agent | Azure tables | Collect data from standard sources such as Windows events, syslog, and performance data and send to Azure tables in Log Analytics workspace. | Use XPath in DCR to collect specific data from client machine. Ingestion-time transformations in agent DCR are not yet supported. | | Azure Monitor agent | Custom tables | Collecting data outside of standard data sources is not yet supported. | | | Log Analytics agent | Azure tables | Collect data from standard sources such as Windows events, syslog, and performance data and send to Azure tables in Log Analytics workspace. | Configure data collection on the workspace. Optionally, create ingestion-time transformation in the workspace DCR to filter records and columns. |
-| Log Analytics agent | Custom tables | Configure [custom logs](agents/data-sources-custom-logs.md) on the workspace to collect file based text logs. | Configure ingestion-time transformation in the workspace DCR to filter or transform incoming data. You must first migrate the custom table to the new custom logs API. |
-| Data Collector API | Custom tables | Use [Data Collector API](logs/data-collector-api.md) to send data to custom tables in the workspace using REST API. | Configure ingestion-time transformation in the workspace DCR to filter or transform incoming data. You must first migrate the custom table to the new custom logs API. |
-| Custom Logs API | Custom tables<br>Azure tables | Use [Custom Logs API](logs/custom-logs-overview.md) to send data to custom tables in the workspace using REST API. | Configure ingestion-time transformation in the DCR for the custom log. |
+| Log Analytics agent | Custom tables | Configure [custom logs](agents/data-sources-custom-logs.md) on the workspace to collect file based text logs. | Configure ingestion-time transformation in the workspace DCR to filter or transform incoming data. You must first migrate the custom table to the new logs ingestion API. |
+| Data Collector API | Custom tables | Use [Data Collector API](logs/data-collector-api.md) to send data to custom tables in the workspace using REST API. | Configure ingestion-time transformation in the workspace DCR to filter or transform incoming data. You must first migrate the custom table to the new logs ingestion API. |
+| Logs ingestion API | Custom tables<br>Azure tables | Use [Logs ingestion API](logs/logs-ingestion-api-overview.md) to send data to the workspace using REST API. | Configure ingestion-time transformation in the DCR for the custom log. |
| Other data sources | Azure tables | Includes resource logs from diagnostic settings and other Azure Monitor features such as Application insights, Container insights and VM insights. | Configure ingestion-time transformation in the workspace DCR to filter or transform incoming data. |
Once you've configured your environment and data collection for cost optimizatio
### Set a daily cap A [daily cap](logs/daily-cap.md) disables data collection in a Log Analytics workspace for the rest of the day once your configured limit is reached. This should not be used as a method to reduce costs, but rather as a preventative measure to ensure that you don't exceed a particular budget. Daily caps are typically used by organizations that are particularly cost conscious.
-When data collection stops, you effectively have no monitoring of features and resources relying on that workspace. Rather than just relying on the daily cap alone, you can configure an alert rule to notify you when data collection reaches some level before the daily cap. This allows you address any increases before data collection shuts down, or even to temporarily disable collection for less critical resources.
+When data collection stops, you effectively have no monitoring of features and resources relying on that workspace. Rather than relying on the daily cap alone, you can configure an alert rule to notify you when data collection reaches some level before the daily cap. This allows you to address any increases before data collection shuts down, or even to temporarily disable collection for less critical resources.
See [Set daily cap on Log Analytics workspace](logs/daily-cap.md) for details on how the daily cap works and how to configure one. ### Send alert when data collection is high
-In order to avoid unexpected bills, you should be proactively notified any time you experience excessive usage. This allows you to address any potential anomalies before the end of your billing period.
+In order to avoid unexpected bills, you should be proactively notified anytime you experience excessive usage. This allows you to address any potential anomalies before the end of your billing period.
The following example is a [log alert rule](alerts/alerts-unified-log.md) that sends an alert if the billable data volume ingested in the last 24 hours was greater than 50 GB. Modify the **Alert Logic** to use a different threshold based on expected usage in your environment. You can also increase the frequency to check usage multiple times every day, but this will result in a higher charge for the alert rule.
azure-monitor Container Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-overview.md
Container insights is a feature designed to monitor the performance of container
- Self-managed Kubernetes clusters hosted on [Azure Stack](/azure-stack/user/azure-stack-kubernetes-aks-engine-overview) or on-premises - [Azure Arc-enabled Kubernetes](../../azure-arc/kubernetes/overview.md)
-Container insights supports clusters running the Linux and Windows Server 2019 operating system. The container runtimes it supports are Docker, Moby, and any CRI compatible runtime such as CRI-O and ContainerD.
+Container insights supports clusters running the Linux and Windows Server 2019 operating systems. The container runtimes it supports are Moby and any CRI-compatible runtime such as CRI-O and ContainerD. Docker is no longer supported as a container runtime as of September 2022. For more information about this deprecation, see the [AKS release notes][aks-release-notes].
>[!NOTE] > Container insights support for the Windows Server 2022 operating system is in public preview.
The main differences in monitoring a Windows Server cluster compared to a Linux
## Next steps To begin monitoring your Kubernetes cluster, review [How to enable Container insights](container-insights-onboard.md) to understand the requirements and available methods to enable monitoring.+
+<!-- LINKS - external -->
+[aks-release-notes]: https://github.com/Azure/AKS/releases
azure-monitor Continuous Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/continuous-monitoring.md
In order to gain observability across your entire environment, you need to enabl
## Enable monitoring for your entire infrastructure Applications are only as reliable as their underlying infrastructure. Having monitoring enabled across your entire infrastructure will help you achieve full observability and make it easier to discover a potential root cause when something fails. Azure Monitor helps you track the health and performance of your entire hybrid infrastructure including resources such as VMs, containers, storage, and network. -- You automatically get [platform metrics, activity logs and diagnostics logs](agents/data-sources.md) from most of your Azure resources with no configuration.
+- You automatically get [platform metrics, activity logs and diagnostics logs](data-sources.md) from most of your Azure resources with no configuration.
- Enable deeper monitoring for VMs with [VM insights](vm/vminsights-overview.md). - Enable deeper monitoring for AKS clusters with [Container insights](containers/container-insights-overview.md). - Add [monitoring solutions](./monitor-reference.md) for different applications and services in your environment.
azure-monitor Data Platform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/data-platform.md
Title: Azure Monitor data platform | Microsoft Docs
-description: Monitoring data collected by Azure Monitor is separated into metrics that are lightweight and capable of supporting near-real-time scenarios and logs that are used for advanced analysis.
+ Title: Azure Monitor data platform
+description: Overview of the Azure Monitor data platform and collection of observability data.
- na Last updated 04/05/2022-
Today's complex computing environments run distributed applications that rely on
[Azure Monitor](overview.md) collects and aggregates data from various sources into a common data platform where it can be used for analysis, visualization, and alerting. It provides a consistent experience on top of data from multiple sources. You can gain deep insights across all your monitored resources and even with data from other services that store their data in Azure Monitor.
-![Screenshot that shows Azure Monitor overview.](media/data-platform/overview.png)
+![Diagram that shows an overview of Azure Monitor with data sources on the left sending data to a central data platform and features of Azure Monitor on the right that use the collected data.](media/overview/azure-monitor-overview-optm.svg)
## Observability data in Azure Monitor
+Metrics, logs, and distributed traces are commonly referred to as the three pillars of observability. A monitoring tool must collect and analyze these three different kinds of data to provide sufficient observability of a monitored system. Observability can be achieved by correlating data from multiple pillars and aggregating data across the entire set of resources being monitored. Because Azure Monitor stores data from multiple sources together, the data can be correlated and analyzed by using a common set of tools. It also correlates data across multiple Azure subscriptions and tenants, in addition to hosting data for other services.
-Metrics, logs, and distributed traces are commonly referred to as the three pillars of observability. A monitoring tool must collect and analyze these three different kinds of data to provide sufficient observability of a monitored system.
-Observability can be achieved by correlating data from multiple pillars and aggregating data across the entire set of resources being monitored. Because Azure Monitor stores data from multiple sources together, the data can be correlated and analyzed by using a common set of tools. It also correlates data across multiple Azure subscriptions and tenants, in addition to hosting data for other services.
+Azure resources generate a significant amount of monitoring data. Azure Monitor consolidates this data along with monitoring data from other sources into either a Metrics or Logs platform. Each is optimized for particular monitoring scenarios, and each supports different features in Azure Monitor. Features such as data analysis, visualizations, or alerting require you to understand the differences so that you can implement your required scenario in the most efficient and cost-effective manner. Insights in Azure Monitor such as [Application Insights](app/app-insights-overview.md) or [Container insights](containers/container-insights-overview.md) have analysis tools that allow you to focus on the particular monitoring scenario without having to understand the differences between the two types of data.
-Azure resources generate a significant amount of monitoring data. Azure Monitor consolidates this data along with monitoring data from other sources into either a Metrics or Logs platform. Each platform is optimized for particular monitoring scenarios, and each one supports different features in Azure Monitor.
-
-Features such as data analysis, visualizations, or alerting require you to understand the differences so that you can implement your required scenario in the most efficient and cost-effective manner. Insights in Azure Monitor such as [Application Insights](app/app-insights-overview.md) or [VM insights](vm/vminsights-overview.md) have analysis tools that allow you to focus on the particular monitoring scenario without having to understand the differences between the two types of data.
### Metrics
The following table compares metrics and logs in Azure Monitor.
| Attribute | Metrics | Logs | |:|:|:|
-| Benefits | Lightweight and capable of near-real-time scenarios such as alerting. Ideal for fast detection of issues. | Analyzed with rich query language. Ideal for deep analysis and identifying root cause. |
-| Data | Numerical values only. | Text or numeric data. |
-| Structure | Standard set of properties including sample time, resource being monitored, and numeric value. Some metrics include multiple dimensions for further definition. | Unique set of properties depending on the log type. |
-| Collection | Collected at regular intervals. | Might be collected sporadically as events trigger a record to be created. |
-| View in the Azure portal | Metrics Explorer. | Log Analytics. |
-| Data sources include | Platform metrics collected from Azure resources.<br>Applications monitored by Application Insights.<br>Custom defined by application or API. | Application and resource logs.<br>Monitoring solutions.<br>Agents and VM extensions.<br>Application requests and exceptions.<br>Microsoft Defender for Cloud.<br>Data Collector API. |
+| Benefits | Lightweight and capable of near-real-time scenarios such as alerting. Ideal for fast detection of issues. | Analyzed with rich query language. Ideal for deep analysis and identifying root cause. |
+| Data | Numerical values only | Text or numeric data |
+| Structure | Standard set of properties including sample time, resource being monitored, and a numeric value. Some metrics include multiple dimensions for further definition. | Unique set of properties depending on the log type. |
+| Collection | Collected at regular intervals. | May be collected sporadically as events trigger a record to be created. |
+| Analyze in Azure portal | Metrics Explorer | Log Analytics |
+| Data sources include | Platform metrics collected from Azure resources<br>Applications monitored by Application Insights<br>Azure Monitor agent<br>Custom defined by application or API | Application and resource logs<br>Azure Monitor agent<br>Application requests and exceptions<br>Logs ingestion API<br>Microsoft Sentinel<br>Microsoft Defender for Cloud |
## Collect monitoring data-
-Different [sources of data for Azure Monitor](agents/data-sources.md) will write to either a Log Analytics workspace (Logs) or the Azure Monitor metrics database (Metrics) or both. Some sources will write directly to these data stores. Others might write to another location, such as Azure Storage, and require some configuration to populate logs or metrics.
+Different [sources of data for Azure Monitor](data-sources.md) will write to either a Log Analytics workspace (Logs) or the Azure Monitor metrics database (Metrics) or both. Some sources will write directly to these data stores, while others may write to another location such as Azure storage and require some configuration to populate logs or metrics.
For a listing of different data sources that populate each type, see [Metrics in Azure Monitor](essentials/data-platform-metrics.md) and [Logs in Azure Monitor](logs/data-platform-logs.md).
In addition to using the tools in Azure to analyze monitoring data, you might ha
Some sources can be configured to send data directly to an event hub while you can use another process, such as a logic app, to retrieve the required data. For more information, see [Stream Azure monitoring data to an event hub for consumption by an external tool](essentials/stream-monitoring-data-event-hubs.md).
-## Next steps
-- Read more about [metrics in Azure Monitor](essentials/data-platform-metrics.md).-- Read more about [logs in Azure Monitor](logs/data-platform-logs.md).-- Learn about the [monitoring data available](agents/data-sources.md) for different resources in Azure.+++
+## Next steps
+- Read more about [Metrics in Azure Monitor](essentials/data-platform-metrics.md).
+- Read more about [Logs in Azure Monitor](logs/data-platform-logs.md).
+- Learn about the [monitoring data available](data-sources.md) for different resources in Azure.
azure-monitor Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/data-sources.md
+
+ Title: Sources of data in Azure Monitor
+description: Describes the data available to monitor the health and performance of your Azure resources and the applications running on them.
+++ Last updated : 07/09/2022++++
+# Sources of monitoring data for Azure Monitor
+Azure Monitor is based on a [common monitoring data platform](data-platform.md) that includes [Logs](logs/data-platform-logs.md) and [Metrics](essentials/data-platform-metrics.md). This platform allows data from multiple resources to be analyzed together using a common set of tools in Azure Monitor. Monitoring data may also be sent to other locations to support certain scenarios, and some resources may write to other locations before they can be collected into Logs or Metrics.
+
+This article describes common sources of monitoring data collected by Azure Monitor in addition to the monitoring data created by Azure resources. Links are provided to detailed information on configuration required to collect this data to different locations.
+
+Some of these data sources use the [new data ingestion pipeline](essentials/data-collection.md) in Azure Monitor. This article will be updated as other data sources transition to this new data collection method.
+
+## Application tiers
+
+Sources of monitoring data from Azure applications can be organized into tiers, the highest tiers being your application itself and the lower tiers being components of the Azure platform. The method of accessing data from each tier varies. The application tiers are summarized in the table below, and the sources of monitoring data in each tier are presented in the following sections. See [Monitoring data locations in Azure](monitor-reference.md) for a description of each data location and how you can access its data.
++++
+### Azure
+The following table briefly describes the application tiers that are specific to Azure. Follow the links for further details on each tier in the sections below.
+
+| Tier | Description | Collection method |
+|:|:|:|
+| [Azure Tenant](#azure-tenant) | Data about the operation of tenant-level Azure services, such as Azure Active Directory. | View Azure Active Directory data in portal or configure collection to Azure Monitor using a tenant diagnostic setting. |
+| [Azure subscription](#azure-subscription) | Data related to the health and management of cross-resource services in your Azure subscription such as Resource Manager and Service Health. | View in portal or configure collection to Azure Monitor using a log profile. |
+| [Azure resources](#azure-resources) | Data about the operation and performance of each Azure resource. | Metrics collected automatically, view in Metrics Explorer.<br>Configure diagnostic settings to collect logs in Azure Monitor.<br>Monitoring solutions and Insights available for more detailed monitoring for specific resource types. |
+
+### Azure, other cloud, or on-premises
+The following table briefly describes the application tiers that may be in Azure, another cloud, or on-premises. Follow the links for further details on each tier in the sections below.
+
+| Tier | Description | Collection method |
+|:|:|:|
+| [Operating system (guest)](#operating-system-guest) | Data about the operating system on compute resources. | Install Azure Monitor agent on virtual machines, scale sets and Arc-enabled servers to collect logs and metrics into Azure Monitor. |
+| [Application Code](#application-code) | Data about the performance and functionality of the actual application and code, including performance traces, application logs, and user telemetry. | Instrument your code to collect data into Application Insights. |
+| [Custom sources](#custom-sources) | Data from external services or other components or devices. | Collect log or metrics data into Azure Monitor from any REST client. |
+
+## Azure tenant
+Telemetry related to your Azure tenant is collected from tenant-wide services such as Azure Active Directory.
+++
+### Azure Active Directory Audit Logs
+[Azure Active Directory reporting](../active-directory/reports-monitoring/overview-reports.md) contains the history of sign-in activity and audit trail of changes made within a particular tenant.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | Configure Azure AD logs to be collected in Azure Monitor to analyze them with other monitoring data. | [Integrate Azure AD logs with Azure Monitor logs](../active-directory/reports-monitoring/howto-integrate-activity-logs-with-log-analytics.md) |
+| Azure Storage | Export Azure AD logs to Azure Storage for archiving. | [Tutorial: Archive Azure AD logs to an Azure storage account](../active-directory/reports-monitoring/quickstart-azure-monitor-route-logs-to-storage-account.md) |
+| Event Hubs | Stream Azure AD logs to other locations using Event Hubs. | [Tutorial: Stream Azure Active Directory logs to an Azure event hub](../active-directory/reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md). |
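
Once Azure AD logs are collected into a Log Analytics workspace, you can query them alongside other monitoring data. The following is a minimal sketch, assuming the standard `SigninLogs` table and column names:

```kusto
// Failed sign-ins over the last day, broken down by application.
// ResultType "0" indicates a successful sign-in.
SigninLogs
| where TimeGenerated > ago(1d)
| where ResultType != "0"
| summarize FailedSignIns = count() by AppDisplayName, ResultType
| order by FailedSignIns desc
```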
+++
+## Azure subscription
+Telemetry related to the health and operation of your Azure subscription.
++
+### Azure Activity log
+The [Azure Activity log](essentials/platform-logs-overview.md) includes service health records along with records on any configuration changes made to the resources in your Azure subscription. The Activity log is available to all Azure resources and represents their _external_ view.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Activity log | The Activity log is collected into its own data store that you can view from the Azure Monitor menu or use to create Activity log alerts. | [Query the Activity log in the Azure portal](essentials/activity-log.md#view-the-activity-log) |
+| Azure Monitor Logs | Configure Azure Monitor Logs to collect the Activity log to analyze it with other monitoring data. | [Collect and analyze Azure activity logs in Log Analytics workspace in Azure Monitor](essentials/activity-log.md) |
+| Azure Storage | Export the Activity log to Azure Storage for archiving. | [Archive Activity log](essentials/resource-logs.md#send-to-azure-storage) |
+| Event Hubs | Stream the Activity log to other locations using Event Hubs | [Stream Activity log to Event Hubs](essentials/resource-logs.md#send-to-azure-event-hubs). |
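
Once the Activity log is collected into a Log Analytics workspace, you can analyze it with Log Analytics. A minimal sketch, assuming the standard `AzureActivity` table schema:

```kusto
// Summarize Activity log records from the last day by operation and status.
AzureActivity
| where TimeGenerated > ago(1d)
| summarize Operations = count() by OperationNameValue, ActivityStatusValue
| order by Operations desc
```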
+
+### Azure Service Health
+[Azure Service Health](../service-health/service-health-overview.md) provides information about the health of the Azure services in your subscription that your application and resources rely on.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Activity log<br>Azure Monitor Logs | Service Health records are stored in the Azure Activity log, so you can view them in the Azure portal or perform any other activities you can perform with the Activity log. | [View service health notifications by using the Azure portal](../service-health/service-notifications.md) |
++
+## Azure resources
+Metrics and resource logs provide information about the _internal_ operation of Azure resources. These are available for most Azure services, and monitoring solutions and insights collect additional data for particular services.
+++
+### Platform metrics
+Most Azure services will send [platform metrics](essentials/data-platform-metrics.md) that reflect their performance and operation directly to the metrics database. The specific [metrics will vary for each type of resource](essentials/metrics-supported.md).
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Metrics | Platform metrics will write to the Azure Monitor metrics database with no configuration. Access platform metrics from Metrics Explorer. | [Getting started with Azure Metrics Explorer](essentials/metrics-getting-started.md)<br>[Supported metrics with Azure Monitor](essentials/metrics-supported.md) |
+| Azure Monitor Logs | Copy platform metrics to Logs for trending and other analysis using Log Analytics. | [Azure diagnostics direct to Log Analytics](essentials/resource-logs.md#send-to-log-analytics-workspace) |
+| Event Hubs | Stream metrics to other locations using Event Hubs. |[Stream Azure monitoring data to an event hub for consumption by an external tool](essentials/stream-monitoring-data-event-hubs.md) |
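
If you copy platform metrics to Logs, you can trend them over time with Log Analytics. A minimal sketch, assuming the standard `AzureMetrics` table and a virtual machine metric named `Percentage CPU`:

```kusto
// Average CPU per resource over the last hour, in 5-minute bins.
AzureMetrics
| where TimeGenerated > ago(1h)
| where MetricName == "Percentage CPU"
| summarize AvgCpu = avg(Average) by bin(TimeGenerated, 5m), Resource
| render timechart
```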
+
+### Resource logs
+[Resource logs](essentials/platform-logs-overview.md) provide insights into the _internal_ operation of an Azure resource. Resource logs are created automatically, but you must create a diagnostic setting to specify a destination for them to be collected for each resource.
+
+The configuration requirements and content of resource logs vary by resource type, and not all services yet create them. See [Supported services, schemas, and categories for Azure resource logs](essentials/resource-logs-schema.md) for details on each service and links to detailed configuration procedures. If the service isn't listed in this article, then that service doesn't currently create resource logs.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | Send resource logs to Azure Monitor Logs for analysis with other collected log data. | [Collect Azure resource logs in Log Analytics workspace in Azure Monitor](essentials/resource-logs.md#send-to-log-analytics-workspace) |
+| Storage | Send resource logs to Azure Storage for archiving. | [Archive Azure resource logs](essentials/resource-logs.md#send-to-azure-storage) |
+| Event Hubs | Stream resource logs to other locations using Event Hubs. |[Stream Azure resource logs to an event hub](essentials/resource-logs.md#send-to-azure-event-hubs) |
+
+## Operating system (guest)
+Compute resources in Azure, in other clouds, and on-premises have a guest operating system to monitor. With the installation of an agent, you can gather telemetry from the guest into Azure Monitor to analyze it with the same monitoring tools as the Azure services themselves.
+++
+### Azure Monitor agent
+[Install the Azure Monitor agent](agents/azure-monitor-agent-manage.md) for comprehensive monitoring and management of your Windows or Linux virtual machines, scale sets and Arc-enabled servers. The Azure Monitor agent replaces the Log Analytics agent and Azure diagnostic extension.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | The Azure Monitor agent allows you to collect logs from data sources that you configure using [data collection rules](agents/data-collection-rule-azure-monitor-agent.md) or from monitoring solutions that provide additional insights into applications running on the machine. These can be sent to one or more Log Analytics workspaces. | [Data sources and destinations](agents/azure-monitor-agent-overview.md#data-sources-and-destinations) |
+| Azure Monitor Metrics (preview) | The Azure Monitor agent allows you to collect performance counters and send them to Azure Monitor metrics database | [Data sources and destinations](agents/azure-monitor-agent-overview.md#data-sources-and-destinations) |
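
The data that the agent collects is defined in a data collection rule. The following fragment is a minimal sketch of the DCR sections involved; the data source name, workspace resource ID, and counter specifier are hypothetical examples:

```json
{
  "properties": {
    "dataSources": {
      "performanceCounters": [
        {
          "name": "cpuPerfCounter",
          "streams": [ "Microsoft-Perf" ],
          "samplingFrequencyInSeconds": 60,
          "counterSpecifiers": [ "\\Processor(_Total)\\% Processor Time" ]
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "name": "myWorkspace",
          "workspaceResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Microsoft-Perf" ],
        "destinations": [ "myWorkspace" ]
      }
    ]
  }
}
```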
++
+### Log Analytics agent
+[Install the Log Analytics agent](agents/log-analytics-agent.md) for comprehensive monitoring and management of your Windows or Linux virtual machines. The virtual machine can be running in Azure, another cloud, or on-premises. The Log Analytics agent is still supported but has been replaced by the Azure Monitor agent.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | The Log Analytics agent connects to Azure Monitor either directly or through System Center Operations Manager and allows you to collect data from data sources that you configure or from monitoring solutions that provide additional insights into applications running on the virtual machine. | [Agent data sources in Azure Monitor](agents/agent-data-sources.md)<br>[Connect Operations Manager to Azure Monitor](agents/om-agents.md) |
+
+### Azure diagnostic extension
+Enabling the Azure diagnostics extension for Azure Virtual machines allows you to collect logs and metrics from the guest operating system of Azure compute resources including Azure Cloud Service (classic) Web and Worker Roles, Virtual Machines, virtual machine scale sets, and Service Fabric.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Storage | Azure diagnostics extension always writes to an Azure Storage account. | [Install and configure Azure diagnostics extension (WAD)](agents/diagnostics-extension-windows-install.md)<br>[Use Linux Diagnostic Extension to monitor metrics and logs](../virtual-machines/extensions/diagnostics-linux.md) |
+| Azure Monitor Metrics (preview) | When you configure the Diagnostics Extension to collect performance counters, they are written to the Azure Monitor metrics database. | [Send Guest OS metrics to the Azure Monitor metric store using a Resource Manager template for a Windows virtual machine](essentials/collect-custom-metrics-guestos-resource-manager-vm.md) |
+| Event Hubs | Configure the Diagnostics Extension to stream the data to other locations using Event Hubs. | [Streaming Azure Diagnostics data by using Event Hubs](agents/diagnostics-extension-stream-event-hubs.md)<br>[Use Linux Diagnostic Extension to monitor metrics and logs](../virtual-machines/extensions/diagnostics-linux.md) |
+| Application Insights Logs | Collect logs and performance counters from the compute resource supporting your application to be analyzed with other application data. | [Send Cloud Service, Virtual Machine, or Service Fabric diagnostic data to Application Insights](agents/diagnostics-extension-to-application-insights.md) |
++
+### VM insights
+[VM insights](vm/vminsights-overview.md) provides a customized monitoring experience for virtual machines providing features beyond core Azure Monitor functionality. It requires a Dependency Agent on Windows and Linux virtual machines that integrates with the Log Analytics agent to collect discovered data about processes running on the virtual machine and external process dependencies.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | Stores data about processes and dependencies on the agent. | [Using VM insights Map to understand application components](vm/vminsights-maps.md) |
+++
+## Application Code
+Detailed application monitoring in Azure Monitor is done with [Application Insights](/azure/application-insights/) which collects data from applications running on a variety of platforms. The application can be running in Azure, another cloud, or on-premises.
++++
+### Application data
+When you enable Application Insights for an application by installing an instrumentation package, it collects metrics and logs related to the performance and operation of the application. Application Insights stores the data it collects in the same Azure Monitor data platform used by other data sources. It includes extensive tools for analyzing this data, but you can also analyze it with data from other sources using tools such as Metrics Explorer and Log Analytics.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | Operational data about your application including page views, application requests, exceptions, and traces. | [Analyze log data in Azure Monitor](logs/log-query-overview.md) |
+| | Dependency information between application components to support Application Map and telemetry correlation. | [Telemetry correlation in Application Insights](app/correlation.md) <br> [Application Map](app/app-map.md) |
+| | Results of availability tests that test the availability and responsiveness of your application from different locations on the public Internet. | [Monitor availability and responsiveness of any web site](app/monitor-web-app-availability.md) |
+| Azure Monitor Metrics | Application Insights collects metrics describing the performance and operation of the application in addition to custom metrics that you define in your application into the Azure Monitor metrics database. | [Log-based and pre-aggregated metrics in Application Insights](app/pre-aggregated-metrics-log-metrics.md)<br>[Application Insights API for custom events and metrics](app/api-custom-events-metrics.md) |
+| Azure Storage | Send application data to Azure Storage for archiving. | [Export telemetry from Application Insights](app/export-telemetry.md) |
+| | Details of availability tests are stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. Results of availability tests are stored in Azure Monitor Logs. | [Monitor availability and responsiveness of any web site](app/monitor-web-app-availability.md) |
+| | Profiler trace data is stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. | [Profile production applications in Azure with Application Insights](app/profiler-overview.md)
+| | Debug snapshot data that is captured for a subset of exceptions is stored in Azure Storage. Use Application Insights in the Azure portal to download for local analysis. | [How snapshots work](app/snapshot-debugger.md#how-snapshots-work) |
+
+## Insights
+[Insights](monitor-reference.md) collect data to provide additional insights into the operation of a particular service or application. They may address resources in different application tiers and even multiple tiers.
++
+### Container insights
+[Container insights](containers/container-insights-overview.md) provides a customized monitoring experience for [Azure Kubernetes Service (AKS)](../aks/index.yml). It collects additional data about these resources described in the following table.
+
+| Destination | Description | Reference |
+|:|:|:|
+| Azure Monitor Logs | Stores monitoring data for AKS including inventory, logs, and events. Metric data is also stored in Logs in order to leverage its analysis functionality in the portal. | [Understand AKS cluster performance with Container insights](containers/container-insights-analyze.md) |
+| Azure Monitor Metrics | Metric data is stored in the metric database to drive visualization and alerts. | [View container metrics in metrics explorer](containers/container-insights-analyze.md#view-container-metrics-in-metrics-explorer) |
+| Azure Kubernetes Service | Provides direct access to your Azure Kubernetes Service (AKS) container logs (stdout/stderror), events, and pod metrics in the portal. | [How to view Kubernetes logs, events, and pod metrics in real-time](containers/container-insights-livedata-overview.md) |
+
+### VM insights
+[VM insights](vm/vminsights-overview.md) provides a customized experience for monitoring virtual machines. A description of the data collected by VM insights is included in the [Operating System (guest)](#operating-system-guest) section above.
+
+## Custom sources
+In addition to the standard tiers of an application, you may need to monitor other resources that have telemetry that can't be collected with the other data sources. For these resources, write this data to either Metrics or Logs using an Azure Monitor API.
++++
+| Destination | Method | Description | Reference |
+|:|:|:|:|
+| Azure Monitor Logs | Logs ingestion API | Collect log data from any REST client and store in Log Analytics workspace using a data collection rule. | [Logs ingestion API in Azure Monitor (preview)](logs/logs-ingestion-api-overview.md) |
+| | Data Collector API | Collect log data from any REST client and store in Log Analytics workspace. | [Send log data to Azure Monitor with the HTTP Data Collector API (preview)](logs/data-collector-api.md) |
+| Azure Monitor Metrics | Custom Metrics API | Collect metric data from any REST client and store in Azure Monitor metrics database. | [Send custom metrics for an Azure resource to the Azure Monitor metric store by using a REST API](essentials/metrics-store-custom-rest-api.md) |
++
+## Other services
+Other services in Azure write data to the Azure Monitor data platform. This allows you to analyze data collected by these services with data collected by Azure Monitor and leverage the same analysis and visualization tools.
+
+| Service | Destination | Description | Reference |
+|:|:|:|:|
+| [Microsoft Defender for Cloud](../security-center/index.yml) | Azure Monitor Logs | Microsoft Defender for Cloud stores the security data it collects in a Log Analytics workspace which allows it to be analyzed with other log data collected by Azure Monitor. | [Data collection in Microsoft Defender for Cloud](../security-center/security-center-enable-data-collection.md) |
+| [Microsoft Sentinel](../sentinel/index.yml) | Azure Monitor Logs | Microsoft Sentinel stores the data it collects from different data sources in a Log Analytics workspace which allows it to be analyzed with other log data collected by Azure Monitor. | [Connect data sources](../sentinel/quickstart-onboard.md) |
++
+## Next steps
+
+- Learn more about the [types of monitoring data collected by Azure Monitor](data-platform.md) and how to view and analyze this data.
+- List the [different locations where Azure resources store data](monitor-reference.md) and how you can access it.
azure-monitor Data Collection Endpoint Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-endpoint-overview.md
ms.reviwer: nikeist
# Data collection endpoints in Azure Monitor
-Data Collection Endpoints (DCEs) allow you to uniquely configure ingestion settings for Azure Monitor. This article provides an overview of data collection endpoints including their contents and structure and how you can create and work with them.
+Data Collection Endpoints (DCEs) provide a connection for certain data sources of Azure Monitor. This article provides an overview of data collection endpoints including their contents and structure and how you can create and work with them.
-## Workflows that use DCEs
-The following workflows currently use DCEs:
+## Data sources that use DCEs
+The following data sources currently use DCEs:
-- [Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md)-- [Custom logs](../logs/custom-logs-overview.md)
+- [Azure Monitor agent when network isolation is required](../agents/azure-monitor-agent-data-collection-endpoint.md)
+- [Custom logs](../logs/logs-ingestion-api-overview.md)
## Components of a data collection endpoint A data collection endpoint includes the following components. | Component | Description | |:|:|
-| Configuration access endpoint | The endpoint used to access the configuration service to fetch associated data collection rules (DCR). Example: `<unique-dce-identifier>.<regionname>.handler.control` |
-| Logs ingestion endpoint | The endpoint used to ingest logs to Log Analytics workspace(s). Example: `<unique-dce-identifier>.<regionname>.ingest` |
+| Configuration access endpoint | The endpoint used to access the configuration service to fetch associated data collection rules (DCR) for Azure Monitor agent.<br>Example: `<unique-dce-identifier>.<regionname>.handler.control` |
+| Logs ingestion endpoint | The endpoint used to ingest logs to Log Analytics workspace(s).<br>Example: `<unique-dce-identifier>.<regionname>.ingest` |
| Network Access Control Lists (ACLs) | Network access control rules for the endpoints
A data collection endpoint includes the following components.
Data collection endpoints are ARM resources created within specific regions. An endpoint in a given region can only be **associated with machines in the same region**, although you can have more than one endpoint within the same region as per your needs. ## Limitations
-Data collection endpoints only support Log Analytics as a destination for collected data. [Custom Metrics (preview)](../essentials/metrics-custom-overview.md) collected and uploaded via the Azure Monitor Agent are not currently controlled by DCEs nor can they be configured over private links.
+Data collection endpoints only support Log Analytics workspaces as a destination for collected data. [Custom Metrics (preview)](../essentials/metrics-custom-overview.md) collected and uploaded via the Azure Monitor Agent are not currently controlled by DCEs nor can they be configured over private links.
+
+## Create data collection endpoint
+
+> [!IMPORTANT]
+> If agents will connect to your DCE, the DCE must be created in the same region as those agents. If you have agents in different regions, you'll need multiple DCEs.
+
+# [Azure portal](#tab/portal)
-## Create endpoint in Azure portal
1. In the **Azure Monitor** menu in the Azure portal, select **Data Collection Endpoint** from the **Settings** section. Click **Create** to create a new data collection endpoint.
Data collection endpoints only support Log Analytics as a destination for collec
3. Click **Review + create** to review the details of the data collection endpoint. Click **Create** to create it.
-## Create endpoint and association using REST API
+# [REST API](#tab/restapi)
-> [!NOTE]
-> The data collection endpoint should be created in the **same region** where your virtual machines exist.
-1. Create data collection endpoint(s) using these [DCE REST APIs](/cli/azure/monitor/data-collection/endpoint).
-2. Create association(s) to link the endpoint(s) to your target machines or resources, using these [DCRA REST APIs](/rest/api/monitor/datacollectionruleassociations/create#examples).
+Create data collection endpoint(s) using the [DCE REST APIs](/cli/azure/monitor/data-collection/endpoint).
+Create associations between endpoints to your target machines or resources, using the [DCRA REST APIs](/rest/api/monitor/datacollectionruleassociations/create#examples).
++ ## Sample data collection endpoint
-The sample data collection endpoint below is for virtual machines with Azure Monitor agent, with public network access disabled so that agent only uses private links to communicate and send data to Azure Monitor/Log Analytics.
-
-```json
-{
- "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/myCollectionEndpoint",
- "name": "myCollectionEndpoint",
- "type": "Microsoft.Insights/dataCollectionEndpoints",
- "location": "eastus",
- "tags": {
- "tag1": "A",
- "tag2": "B"
- },
- "properties": {
- "configurationAccess": {
- "endpoint": "https://mycollectionendpoint-abcd.eastus-1.control.monitor.azure.com"
- },
- "logsIngestion": {
- "endpoint": "https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com"
- },
- "networkAcls": {
- "publicNetworkAccess": "Disabled"
- }
- },
- "systemData": {
- "createdBy": "user1",
- "createdByType": "User",
- "createdAt": "yyyy-mm-ddThh:mm:ss.sssssssZ",
- "lastModifiedBy": "user2",
- "lastModifiedByType": "User",
- "lastModifiedAt": "yyyy-mm-ddThh:mm:ss.sssssssZ"
- },
- "etag": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
-}
-```
+See [Sample data collection endpoint](data-collection-endpoint-sample.md) for a sample data collection endpoint.
## Next steps - [Associate endpoint to machines](../agents/data-collection-rule-azure-monitor-agent.md#create-data-collection-rule-and-association)
azure-monitor Data Collection Endpoint Sample https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-endpoint-sample.md
+
+ Title: Sample data collection endpoint
+description: A sample data collection endpoint for virtual machines with the Azure Monitor agent.
+ Last updated : 03/16/2022+++
+# Sample data collection endpoint
+The sample data collection endpoint (DCE) below is for virtual machines with the Azure Monitor agent, with public network access disabled so that the agent only uses private links to communicate and send data to Azure Monitor/Log Analytics.
+
+## Sample DCE
+
+```json
+{
+ "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/myCollectionEndpoint",
+ "name": "myCollectionEndpoint",
+ "type": "Microsoft.Insights/dataCollectionEndpoints",
+ "location": "eastus",
+ "tags": {
+ "tag1": "A",
+ "tag2": "B"
+ },
+ "properties": {
+ "configurationAccess": {
+ "endpoint": "https://mycollectionendpoint-abcd.eastus-1.control.monitor.azure.com"
+ },
+ "logsIngestion": {
+ "endpoint": "https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com"
+ },
+ "networkAcls": {
+ "publicNetworkAccess": "Disabled"
+ }
+ },
+ "systemData": {
+ "createdBy": "user1",
+ "createdByType": "User",
+ "createdAt": "yyyy-mm-ddThh:mm:ss.sssssssZ",
+ "lastModifiedBy": "user2",
+ "lastModifiedByType": "User",
+ "lastModifiedAt": "yyyy-mm-ddThh:mm:ss.sssssssZ"
+ },
+ "etag": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
+}
+```
+
+## Next steps
+- [Read more about data collection endpoints](data-collection-endpoint-overview.md)
azure-monitor Data Collection Rule Edit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-rule-edit.md
While going through the wizard on the portal is the simplest way to set up the i
In this tutorial, you will first set up ingestion of a custom log. Then you will modify the KQL transformation for your custom log to include additional filtering and apply the changes to your DCR. Finally, you will combine all editing operations into a single PowerShell script, which can be used to edit any DCR for any of the reasons mentioned above. ## Set up new custom log
-Start by setting up a new custom log. Follow [Tutorial: Send custom logs to Azure Monitor Logs using the Azure portal (preview)]( ../logs/tutorial-custom-logs.md). Note the resource ID of the DCR created.
+Start by setting up a new custom log. Follow [Tutorial: Send custom logs to Azure Monitor Logs using the Azure portal (preview)](../logs/tutorial-logs-ingestion-portal.md). Note the resource ID of the DCR created.
## Retrieve DCR content In order to update the DCR, we are going to retrieve its content and save it as a file, which can be further edited. 1. Click the **Cloud Shell** button in the Azure portal and ensure the environment is set to **PowerShell**.
- :::image type="content" source="../logs/media/tutorial-ingestion-time-transformations-api/open-cloud-shell.png" lightbox="../logs/media/tutorial-ingestion-time-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening cloud shell":::
+ :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/open-cloud-shell.png" lightbox="../logs/media/tutorial-workspace-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening cloud shell":::
2. Execute the following commands to retrieve DCR content and save it to a file. Replace `<ResourceId>` with the DCR resource ID and `<FilePath>` with the name of the file to store the DCR.
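
For example, a minimal PowerShell sketch of this step, assuming the `2021-09-01-preview` API version of the data collection rules API:

```powershell
$ResourceId = "<ResourceId>"   # Resource ID of the DCR to edit
$FilePath   = "<FilePath>"     # Local file that will hold the DCR content

# Retrieve the DCR and save its JSON payload, pretty-printed, to the file.
$DCR = Invoke-AzRestMethod -Path ("$ResourceId" + "?api-version=2021-09-01-preview") -Method GET
$DCR.Content | ConvertFrom-Json | ConvertTo-Json -Depth 20 | Out-File -FilePath $FilePath
```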
azure-monitor Data Collection Rule Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-rule-overview.md
Title: Data Collection Rules in Azure Monitor description: Overview of data collection rules (DCRs) in Azure Monitor including their contents and structure and how you can create and work with them. Previously updated : 04/26/2022 Last updated : 07/15/2022 -+ # Data collection rules in Azure Monitor
-[Data Collection Rules (DCRs)](../essentials/data-collection-rule-overview.md) provide an [ETL](/azure/architecture/data-guide/relational-data/etl)-like pipeline in Azure Monitor, allowing you to define the way that data coming into Azure Monitor should be handled. Depending on the type of workflow, DCRs may specify where data should be sent and may filter or transform data before it's stored in Azure Monitor Logs. Some data collection rules will be created and managed by Azure Monitor, while you may create others to customize data collection for your particular requirements. This article describes DCRs including their contents and structure and how you can create and work with them.
+Data Collection Rules (DCRs) define the [data collection process in Azure Monitor](../essentials/data-collection.md). DCRs specify what data should be collected, how to transform that data, and where to send that data. Some DCRs will be created and managed by Azure Monitor to collect a specific set of data to enable insights and visualizations. You may also create your own DCRs to define the set of data required for other scenarios.
++
+## View data collection rules
+To view your data collection rules in the Azure portal, select **Data Collection Rules** from the **Monitor** menu.
+
+> [!NOTE]
+> While this view shows all data collection rules in the specified subscriptions, clicking the **Create** button will create a data collection rule for the Azure Monitor agent. Similarly, this page will only allow you to modify data collection rules for the Azure Monitor agent. See [Creating a data collection rule](#create-a-data-collection-rule) below for guidance on creating and updating data collection rules for other workflows.
++
+## Create a data collection rule
+The following resources describe different scenarios for creating data collection rules. In some cases, the data collection rule may be created for you, while in others you may need to create and edit it yourself.
+
+| Scenario | Resources | Description |
+|:|:|:|
+| Azure Monitor agent | [Configure data collection for the Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md) | Use the Azure portal to create a data collection rule that specifies events and performance counters to collect from a machine with the Azure Monitor agent and then apply that rule to one or more virtual machines. The Azure Monitor agent will be installed on any machines that don't currently have it. |
+| | [Use Azure Policy to install Azure Monitor agent and associate with DCR](../agents/azure-monitor-agent-manage.md#using-azure-policy) | Use Azure Policy to install the Azure Monitor agent and associate one or more data collection rules with any virtual machines or virtual machine scale sets as they're created in your subscription.
+| Custom logs | [Configure custom logs using the Azure portal](../logs/tutorial-logs-ingestion-portal.md)<br>[Configure custom logs using Resource Manager templates and REST API](../logs/tutorial-logs-ingestion-api.md) | Send custom data using a REST API. The API call connects to a DCE and specifies a DCR to use. The DCR specifies the target table and potentially includes a transformation that filters and modifies the data before it's stored in a Log Analytics workspace. |
+| Workspace transformation | [Configure ingestion-time transformations using the Azure portal](../logs/tutorial-workspace-transformations-portal.md)<br>[Configure ingestion-time transformations using Resource Manager templates and REST API](../logs/tutorial-workspace-transformations-api.md) | Create a transformation for any supported table in a Log Analytics workspace. The transformation is defined in a DCR that's then associated with the workspace and applied to any data sent to that table from a legacy workload that doesn't use a DCR. |
+
+## Work with data collection rules
+See the following resources for working with data collection rules outside of the Azure portal.
+
+| Method | Resources |
+|:|:|
+| API | Directly edit the data collection rule in any JSON editor and then [install using the REST API](/rest/api/monitor/datacollectionrules). |
+| CLI | Create DCR and associations with [Azure CLI](https://github.com/Azure/azure-cli-extensions/blob/master/src/monitor-control-service/README.md). |
+| PowerShell | Work with DCR and associations with the following Azure PowerShell cmdlets.<br>[Get-AzDataCollectionRule](/powershell/module/az.monitor/get-azdatacollectionrule)<br>[New-AzDataCollectionRule](/powershell/module/az.monitor/new-azdatacollectionrule)<br>[Set-AzDataCollectionRule](/powershell/module/az.monitor/set-azdatacollectionrule)<br>[Update-AzDataCollectionRule](/powershell/module/az.monitor/update-azdatacollectionrule)<br>[Remove-AzDataCollectionRule](/powershell/module/az.monitor/remove-azdatacollectionrule)<br>[Get-AzDataCollectionRuleAssociation](/powershell/module/az.monitor/get-azdatacollectionruleassociation)<br>[New-AzDataCollectionRuleAssociation](/powershell/module/az.monitor/new-azdatacollectionruleassociation)<br>[Remove-AzDataCollectionRuleAssociation](/powershell/module/az.monitor/remove-azdatacollectionruleassociation)
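
For example, a minimal PowerShell sketch that retrieves a DCR and associates it with a virtual machine (the resource group, rule, association, and VM names are hypothetical):

```powershell
# Get an existing data collection rule.
$dcr = Get-AzDataCollectionRule -ResourceGroupName "my-resource-group" -RuleName "my-dcr"

# Associate the rule with a virtual machine so the agent on that machine uses it.
New-AzDataCollectionRuleAssociation -TargetResourceId "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.Compute/virtualMachines/my-vm" -AssociationName "my-dcr-association" -RuleId $dcr.Id
```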
-## Types of data collection rules
-There are currently two types of data collection rule in Azure Monitor:
-- **Standard DCR**. Used with different workflows that send data to Azure Monitor. Workflows currently supported are [Azure Monitor agent](../agents/azure-monitor-agent-overview.md) and [custom logs (preview)](../logs/custom-logs-overview.md). -- **Workspace transformation DCR**. Used with a Log Analytics workspace to apply [ingestion-time transformations (preview)](../logs/ingestion-time-transformations.md) to workflows that don't currently support DCRs. ## Structure of a data collection rule
-Data collection rules are formatted in JSON. While you may not need to interact with them directly, there are scenarios where you may need to directly edit a data collection rule. See [Data collection rule structure](data-collection-rule-structure.md) for a description of this structure and different elements.
+Data collection rules are formatted in JSON. While you may not need to interact with them directly, there are scenarios where you may need to directly edit a data collection rule. See [Data collection rule structure](data-collection-rule-structure.md) for a description of this structure and the different elements used for different workflows.
## Permissions When using programmatic methods to create data collection rules and associations, you require the following permissions:
When using programmatic methods to create data collection rules and associations
## Limits For limits that apply to each data collection rule, see [Azure Monitor service limits](../service-limits.md#data-collection-rules).
-## Creating a data collection rule
-The following articles describe different scenarios for creating data collection rules. In some cases, the data collection rule may be created for you, while in others you may need to create and edit it yourself.
-
-| Workflow | Resources |
-|:|:|
-| Azure Monitor agent | [Configure data collection for the Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md)<br>[Use Azure Policy to install Azure Monitor agent and associate with DCR](../agents/azure-monitor-agent-manage.md#using-azure-policy) |
-| Custom logs | [Configure custom logs using the Azure portal](../logs/tutorial-custom-logs.md)<br>[Configure custom logs using Resource Manager templates and REST API](../logs/tutorial-custom-logs-api.md) |
-| Workspace transformation | [Configure ingestion-time transformations using the Azure portal](../logs/tutorial-ingestion-time-transformations.md)<br>[Configure ingestion-time transformations using Resource Manager templates and REST API](../logs/tutorial-ingestion-time-transformations-api.md) |
--
-## Programmatically work with DCRs
-See the following resources for programmatically working with DCRs.
-- Directly edit the data collection rule in JSON and [submit using the REST API](/rest/api/monitor/datacollectionrules).-- Create DCR and associations with [Azure CLI](https://github.com/Azure/azure-cli-extensions/blob/master/src/monitor-control-service/README.md).-- Create DCR and associations with Azure PowerShell.
- - [Get-AzDataCollectionRule](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Get-AzDataCollectionRule.md)
- - [New-AzDataCollectionRule](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/New-AzDataCollectionRule.md)
- - [Set-AzDataCollectionRule](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Set-AzDataCollectionRule.md)
- - [Update-AzDataCollectionRule](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Update-AzDataCollectionRule.md)
- - [Remove-AzDataCollectionRule](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Remove-AzDataCollectionRule.md)
- - [Get-AzDataCollectionRuleAssociation](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Get-AzDataCollectionRuleAssociation.md)
- - [New-AzDataCollectionRuleAssociation](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/New-AzDataCollectionRuleAssociation.md)
- - [Remove-AzDataCollectionRuleAssociation](https://github.com/Azure/azure-powershell/blob/master/src/Monitor/Monitor/help/Remove-AzDataCollectionRuleAssociation.md)
+## Supported regions
+Data collection rules are available in all public regions where Log Analytics workspaces are supported, as well as the Azure Government and China clouds. Air-gapped clouds are not yet supported.
+**Single region data residency** is a preview feature to enable storing customer data in a single region and is currently only available in the Southeast Asia Region (Singapore) of the Asia Pacific Geo and Brazil South (Sao Paulo State) Region of Brazil Geo. Single region residency is enabled by default in these regions.
## Data resiliency and high availability
-A rule gets created and stored in the region you specify, and is backed up to the [paired-region](../../availability-zones/cross-region-replication-azure.md#azure-cross-region-replication-pairings-for-all-geographies) within the same geography. The service is deployed to all three [availability zones](../../availability-zones/az-overview.md#availability-zones) within the region, making it a **zone-redundant service** which further adds to high availability.
-
-## Supported regions
-Data collection rules are stored regionally, and are available in all public regions where Log Analytics is supported, as well as the Azure Government and China clouds. Air-gapped clouds are not yet supported.
-
-### Single region data residency
-This is a preview feature to enable storing customer data in a single region is currently only available in the Southeast Asia Region (Singapore) of the Asia Pacific Geo and Brazil South (Sao Paulo State) Region of Brazil Geo. Single region residency is enabled by default in these regions.
+A rule gets created and stored in a particular region and is backed up to the [paired-region](../../availability-zones/cross-region-replication-azure.md#azure-cross-region-replication-pairings-for-all-geographies) within the same geography. The service is deployed to all three [availability zones](../../availability-zones/az-overview.md#availability-zones) within the region, making it a **zone-redundant service** which further increases availability.
## Next steps - [Read about the detailed structure of a data collection rule.](data-collection-rule-structure.md)-- [Get details on transformations in a data collection rule.](data-collection-rule-transformations.md)
+- [Get details on transformations in a data collection rule.](data-collection-transformations.md)
azure-monitor Data Collection Rule Structure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-rule-structure.md
description: Details on the structure of different kinds of data collection rule
Previously updated : 02/22/2022 Last updated : 07/10/2022 ms.reviwer: nikeist
ms.reviwer: nikeist
# Structure of a data collection rule in Azure Monitor (preview)
-[Data Collection Rules (DCRs)](data-collection-rule-overview.md) in Azure Monitor define the way that data coming into Azure Monitor should be handled. Some data collection rules will be created and managed by Azure Monitor, while you may create others to customize data collection for your particular requirements. This article describes the structure of DCRs for creating and editing data collection rules in those cases where you need to work with them directly.
+[Data Collection Rules (DCRs)](data-collection-rule-overview.md) determine how to collect and process telemetry sent to Azure. Some data collection rules will be created and managed by Azure Monitor, while you may create others to customize data collection for your particular requirements. This article describes the structure of DCRs for creating and editing data collection rules in those cases where you need to work with them directly.
## Custom logs
-A DCR for [custom logs](../logs/custom-logs-overview.md) contains the following sections:
+A DCR for [custom logs](../logs/logs-ingestion-api-overview.md) contains the sections below. For a sample, see [Sample data collection rule - custom logs](../logs/data-collection-rule-sample-custom-logs.md).
+ ### streamDeclarations This section contains the declaration of all the different types of data that will be sent via the HTTP endpoint directly into Log Analytics. Each stream is an object whose key represents the stream name (must begin with *Custom-*) and whose value is the full list of top-level properties that the JSON data sent to the endpoint will contain. Note that the shape of the data you send to the endpoint doesn't need to match that of the destination table. Rather, the output of the transform that is applied on top of the input data needs to match the destination shape. The possible data types that can be assigned to the properties are `string`, `int`, `long`, `real`, `boolean`, `dynamic`, and `datetime`.
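
For example, a minimal sketch of a stream declaration (the stream name and columns are hypothetical):

```json
"streamDeclarations": {
    "Custom-MyTableRawData": {
        "columns": [
            { "name": "Time", "type": "datetime" },
            { "name": "Computer", "type": "string" },
            { "name": "AdditionalContext", "type": "string" }
        ]
    }
}
```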
This section contains a declaration of all the destinations where the data will
This section ties the other sections together. Defines the following for each stream declared in the `streamDeclarations` section: - `destination` from the `destinations` section where the data will be sent. -- `transformKql` which is the [transformation](data-collection-rule-transformations.md) applied to the data that was sent in the input shape described in the `streamDeclarations` section to the shape of the target table.
+- `transformKql` which is the [transformation](data-collection-transformations.md) applied to the data that was sent in the input shape described in the `streamDeclarations` section to the shape of the target table.
- `outputStream` section, which describes which table in the workspace specified under the `destination` property the data will be ingested into. The value of the outputStream will have the `Microsoft-[tableName]` shape when data is being ingested into a standard Log Analytics table, or `Custom-[tableName]` when ingesting data into a custom-created table. Only one destination is allowed per stream. ## Azure Monitor agent
- A DCR for [Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md) contains the following sections:
+ A DCR for [Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md) contains the sections below. For a sample, see [Sample data collection rule - agent](../agents/data-collection-rule-sample-agent.md).
-### Data sources
+### dataSources
Unique source of monitoring data with its own format and method of exposing its data. Examples of a data source include Windows event log, performance counters, and syslog. Each data source matches a particular data source type as described below. Each data source has a data source type. Each type defines a unique set of properties that must be specified for each data source. The data source types currently available are shown in the following table.
Each data source has a data source type. Each type defines a unique set of prope
### Streams Unique handle that describes a set of data sources that will be transformed and schematized as one type. Each data source requires one or more streams, and one stream may be used by multiple data sources. All data sources in a stream share a common schema. Use multiple streams for example, when you want to send a particular data source to multiple tables in the same Log Analytics workspace.
-### Destinations
+### destinations
Set of destinations where the data should be sent. Examples include Log Analytics workspace and Azure Monitor Metrics. Multiple destinations are allowed for multi-homing scenario.
-### Data flows
+### dataFlows
Definition of which streams should be sent to which destinations.
azure-monitor Data Collection Transformations Structure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-transformations-structure.md
+
+ Title: KQL limitations in data collection transformations
+description: Structure of transformation in Azure Monitor including limitations of KQL allowed in a transformation.
+ Last updated : 06/29/2022
+ms.reviwer: nikeist
+++
+# Structure of transformation in Azure Monitor (preview)
+[Transformations in Azure Monitor](data-collection-transformations.md) allow you to filter or modify incoming data before it's stored in a Log Analytics workspace. They are implemented as a Kusto Query Language (KQL) statement in a [data collection rule (DCR)](data-collection-rule-overview.md). This article provides details on how this query is structured and limitations on the KQL language allowed.
++
+## Transformation structure
+The KQL statement is applied individually to each entry in the data source. It must understand the format of the incoming data and create output in the structure of the target table. The input stream is represented by a virtual table named `source` with columns matching the input data stream definition. Following is a typical example of a transformation. This example includes the following functionality:
+
+- Filters the incoming data with a [where](/azure/data-explorer/kusto/query/whereoperator) statement
+- Adds a new column using the [extend](/azure/data-explorer/kusto/query/extendoperator) operator
+- Formats the output to match the columns of the target table using the [project](/azure/data-explorer/kusto/query/projectoperator) operator
+
+```kusto
+source
+| where severity == "Critical"
+| extend Properties = parse_json(properties)
+| project
+ TimeGenerated = todatetime(["time"]),
+ Category = category,
+ StatusDescription = StatusDescription,
+ EventName = name,
+ EventId = tostring(Properties.EventId)
+```
+
+## KQL limitations
+Since the transformation is applied to each record individually, it can't use any KQL operators that act on multiple records. Only operators that take a single row as input and return no more than one row are supported. For example, [summarize](/azure/data-explorer/kusto/query/summarizeoperator) isn't supported since it summarizes multiple records. See [Supported KQL features](#supported-kql-features) for a complete list of supported features.
+++
++++
+## Inline reference table
+The [datatable](/azure/data-explorer/kusto/query/datatableoperator?pivots=azuremonitor) operator isn't supported in the subset of KQL available to use in transformations. This operator would normally be used in KQL to define an inline query-time table. Use dynamic literals instead to work around this limitation.
+
+For example, the following statement isn't supported in a transformation:
+
+```kusto
+let galaxy = datatable (country:string,entity:string)['ES','Spain','US','United States'];
+source
+| join kind=inner (galaxy) on $left.Location == $right.country
+| extend Galaxy_CF = ['entity']
+```
+
+You can instead use the following statement, which is supported and performs the same functionality:
+
+```kusto
+let galaxyDictionary = parse_json('{"ES": "Spain","US": "United States"}');
+source
+| extend Galaxy_CF = galaxyDictionary[Location]
+```
+
+### has operator
+Transformations don't currently support [has](/azure/data-explorer/kusto/query/has-operator). Use [contains](/azure/data-explorer/kusto/query/contains-operator), which is supported and provides similar functionality.
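+
+For example, a filter that would normally use `has` can be approximated with `contains` (note that `contains` matches any substring while `has` matches whole terms). The `Message` column below is a hypothetical column assumed to exist in the input stream:
+
+```kusto
+// Instead of: source | where Message has "error"
+source
+| where Message contains "error"
+```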
++
+### Handling dynamic data
+Consider the following input with [dynamic data](/azure/data-explorer/kusto/query/scalar-data-types/dynamic):
+
+```json
+{
+ "TimeGenerated" : "2021-11-07T09:13:06.570354Z",
+ "Message": "Houston, we have a problem",
+ "AdditionalContext": {
+ "Level": 2,
+ "DeviceID": "apollo13"
+ }
+}
+```
+
+In order to access the properties in *AdditionalContext*, define it as a dynamic-typed column in the input stream:
+
+```json
+"columns": [
+ {
+ "name": "TimeGenerated",
+ "type": "datetime"
+ },
+ {
+ "name": "Message",
+ "type": "string"
+ },
+ {
+ "name": "AdditionalContext",
+ "type": "dynamic"
+ }
+]
+```
+
+The content of the *AdditionalContext* column can now be parsed and used in the KQL transformation:
+
+```kusto
+source
+| extend parsedAdditionalContext = parse_json(AdditionalContext)
+| extend Level = toint(parsedAdditionalContext.Level)
+| extend DeviceId = tostring(parsedAdditionalContext.DeviceID)
+```
+
+### Dynamic literals
+Use the [parse_json function](/azure/data-explorer/kusto/query/parsejsonfunction) to handle [dynamic literals](/azure/data-explorer/kusto/query/scalar-data-types/dynamic#dynamic-literals).
+
+For example, the following queries provide the same functionality:
+
+```kql
+print d=dynamic({"a":123, "b":"hello", "c":[1,2,3], "d":{}})
+```
+
+```kql
+print d=parse_json('{"a":123, "b":"hello", "c":[1,2,3], "d":{}}')
+```
+
+## Supported KQL features
+
+### Supported statements
+
+#### let statement
+The right-hand side of [let](/azure/data-explorer/kusto/query/letstatement) can be a scalar expression, a tabular expression, or a user-defined function. Only user-defined functions with scalar arguments are supported.
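+
+For example, the following sketch binds a scalar value and a user-defined function with a scalar argument, both of which are supported. The `Level` and `Severity` columns are hypothetical columns assumed to exist in the input stream:
+
+```kusto
+// A scalar value bound with let
+let minLevel = 2;
+// A user-defined function that takes only scalar arguments
+let normalizeSeverity = (sev: string) { toupper(sev) };
+source
+| where Level >= minLevel
+| extend Severity = normalizeSeverity(Severity)
+```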
+
+#### tabular expression statements
+The only supported data sources for the KQL statement are as follows:
+
+- **source**, which represents the source data. For example:
+
+```kql
+source
+| where ActivityId == "383112e4-a7a8-4b94-a701-4266dfc18e41"
+| project PreciseTimeStamp, Message
+```
+
+- [print](/azure/data-explorer/kusto/query/printoperator) operator, which always produces a single row. For example:
+
+```kusto
+print x = 2 + 2, y = 5 | extend z = exp2(x) + exp2(y)
+```
++
+### Tabular operators
+- [extend](/azure/data-explorer/kusto/query/extendoperator)
+- [project](/azure/data-explorer/kusto/query/projectoperator)
+- [print](/azure/data-explorer/kusto/query/printoperator)
+- [where](/azure/data-explorer/kusto/query/whereoperator)
+- [parse](/azure/data-explorer/kusto/query/parseoperator)
+- [project-away](/azure/data-explorer/kusto/query/projectawayoperator)
+- [project-rename](/azure/data-explorer/kusto/query/projectrenameoperator)
+- `columnifexists` (use `columnifexists` instead of `column_ifexists`)
+
+### Scalar operators
+
+#### Numerical operators
+All [Numerical operators](/azure/data-explorer/kusto/query/numoperators) are supported.
+
+#### Datetime and Timespan arithmetic operators
+All [Datetime and Timespan arithmetic operators](/azure/data-explorer/kusto/query/datetime-timespan-arithmetic) are supported.
+
+#### String operators
+The following [String operators](/azure/data-explorer/kusto/query/datatypes-string-operators) are supported.
+
+- ==
+- !=
+- =~
+- !~
+- contains
+- !contains
+- contains_cs
+- !contains_cs
+- startswith
+- !startswith
+- startswith_cs
+- !startswith_cs
+- endswith
+- !endswith
+- endswith_cs
+- !endswith_cs
+- matches regex
+- in
+- !in
+
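+As an illustration, the following sketch combines several of these operators. The `Computer`, `Message`, and `Level` columns are hypothetical columns assumed to exist in the input stream:
+
+```kusto
+source
+| where Computer startswith "srv-" and Message !contains "heartbeat"
+| where Level in ("Error", "Critical")
+```
+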
+#### Bitwise operators
+
+The following [Bitwise operators](/azure/data-explorer/kusto/query/binoperators) are supported.
+
+- binary_and()
+- binary_or()
+- binary_xor()
+- binary_not()
+- binary_shift_left()
+- binary_shift_right()
+
+### Scalar functions
+
+#### Bitwise functions
+
+- [binary_and](/azure/data-explorer/kusto/query/binary-andfunction)
+- [binary_or](/azure/data-explorer/kusto/query/binary-orfunction)
+- [binary_not](/azure/data-explorer/kusto/query/binary-notfunction)
+- [binary_shift_left](/azure/data-explorer/kusto/query/binary-shift-leftfunction)
+- [binary_shift_right](/azure/data-explorer/kusto/query/binary-shift-rightfunction)
+- [binary_xor](/azure/data-explorer/kusto/query/binary-xorfunction)
+
+#### Conversion functions
+
+- [tobool](/azure/data-explorer/kusto/query/toboolfunction)
+- [todatetime](/azure/data-explorer/kusto/query/todatetimefunction)
+- [todouble/toreal](/azure/data-explorer/kusto/query/todoublefunction)
+- [toguid](/azure/data-explorer/kusto/query/toguidfunction)
+- [toint](/azure/data-explorer/kusto/query/tointfunction)
+- [tolong](/azure/data-explorer/kusto/query/tolongfunction)
+- [tostring](/azure/data-explorer/kusto/query/tostringfunction)
+- [totimespan](/azure/data-explorer/kusto/query/totimespanfunction)
+
+#### DateTime and TimeSpan functions
+
+- [ago](/azure/data-explorer/kusto/query/agofunction)
+- [datetime_add](/azure/data-explorer/kusto/query/datetime-addfunction)
+- [datetime_diff](/azure/data-explorer/kusto/query/datetime-difffunction)
+- [datetime_part](/azure/data-explorer/kusto/query/datetime-partfunction)
+- [dayofmonth](/azure/data-explorer/kusto/query/dayofmonthfunction)
+- [dayofweek](/azure/data-explorer/kusto/query/dayofweekfunction)
+- [dayofyear](/azure/data-explorer/kusto/query/dayofyearfunction)
+- [endofday](/azure/data-explorer/kusto/query/endofdayfunction)
+- [endofmonth](/azure/data-explorer/kusto/query/endofmonthfunction)
+- [endofweek](/azure/data-explorer/kusto/query/endofweekfunction)
+- [endofyear](/azure/data-explorer/kusto/query/endofyearfunction)
+- [getmonth](/azure/data-explorer/kusto/query/getmonthfunction)
+- [getyear](/azure/data-explorer/kusto/query/getyearfunction)
+- [hourofday](/azure/data-explorer/kusto/query/hourofdayfunction)
+- [make_datetime](/azure/data-explorer/kusto/query/make-datetimefunction)
+- [make_timespan](/azure/data-explorer/kusto/query/make-timespanfunction)
+- [now](/azure/data-explorer/kusto/query/nowfunction)
+- [startofday](/azure/data-explorer/kusto/query/startofdayfunction)
+- [startofmonth](/azure/data-explorer/kusto/query/startofmonthfunction)
+- [startofweek](/azure/data-explorer/kusto/query/startofweekfunction)
+- [startofyear](/azure/data-explorer/kusto/query/startofyearfunction)
+- [todatetime](/azure/data-explorer/kusto/query/todatetimefunction)
+- [totimespan](/azure/data-explorer/kusto/query/totimespanfunction)
+- [weekofyear](/azure/data-explorer/kusto/query/weekofyearfunction)
+
+#### Dynamic and array functions
+
+- [array_concat](/azure/data-explorer/kusto/query/arrayconcatfunction)
+- [array_length](/azure/data-explorer/kusto/query/arraylengthfunction)
+- [pack_array](/azure/data-explorer/kusto/query/packarrayfunction)
+- [pack](/azure/data-explorer/kusto/query/packfunction)
+- [parse_json](/azure/data-explorer/kusto/query/parsejsonfunction)
+- [parse_xml](/azure/data-explorer/kusto/query/parse-xmlfunction)
+- [zip](/azure/data-explorer/kusto/query/zipfunction)
+
+#### Mathematical functions
+
+- [abs](/azure/data-explorer/kusto/query/abs-function)
+- [bin/floor](/azure/data-explorer/kusto/query/binfunction)
+- [ceiling](/azure/data-explorer/kusto/query/ceilingfunction)
+- [exp](/azure/data-explorer/kusto/query/exp-function)
+- [exp10](/azure/data-explorer/kusto/query/exp10-function)
+- [exp2](/azure/data-explorer/kusto/query/exp2-function)
+- [isfinite](/azure/data-explorer/kusto/query/isfinitefunction)
+- [isinf](/azure/data-explorer/kusto/query/isinffunction)
+- [isnan](/azure/data-explorer/kusto/query/isnanfunction)
+- [log](/azure/data-explorer/kusto/query/log-function)
+- [log10](/azure/data-explorer/kusto/query/log10-function)
+- [log2](/azure/data-explorer/kusto/query/log2-function)
+- [pow](/azure/data-explorer/kusto/query/powfunction)
+- [round](/azure/data-explorer/kusto/query/roundfunction)
+- [sign](/azure/data-explorer/kusto/query/signfunction)
+
+#### Conditional functions
+
+- [case](/azure/data-explorer/kusto/query/casefunction)
+- [iif](/azure/data-explorer/kusto/query/iiffunction)
+- [max_of](/azure/data-explorer/kusto/query/max-offunction)
+- [min_of](/azure/data-explorer/kusto/query/min-offunction)
+
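+For example, a transformation might use `iif` to guard against missing values and `case` to map a numeric level to a label. The `Level` column is a hypothetical integer column assumed to exist in the input stream:
+
+```kusto
+source
+| extend Level = iif(isnull(Level), 0, Level)
+| extend SeverityLabel = case(Level >= 4, "Critical", Level >= 2, "Warning", "Informational")
+```
+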
+#### String functions
+
+- [base64_encodestring](/azure/data-explorer/kusto/query/base64_encode_tostringfunction) (use base64_encodestring instead of base64_encode_tostring)
+- [base64_decodestring](/azure/data-explorer/kusto/query/base64_decode_tostringfunction) (use base64_decodestring instead of base64_decode_tostring)
+- [countof](/azure/data-explorer/kusto/query/countoffunction)
+- [extract](/azure/data-explorer/kusto/query/extractfunction)
+- [extract_all](/azure/data-explorer/kusto/query/extractallfunction)
+- [indexof](/azure/data-explorer/kusto/query/indexoffunction)
+- [isempty](/azure/data-explorer/kusto/query/isemptyfunction)
+- [isnotempty](/azure/data-explorer/kusto/query/isnotemptyfunction)
+- [parse_json](/azure/data-explorer/kusto/query/parsejsonfunction)
+- [split](/azure/data-explorer/kusto/query/splitfunction)
+- [strcat](/azure/data-explorer/kusto/query/strcatfunction)
+- [strcat_delim](/azure/data-explorer/kusto/query/strcat-delimfunction)
+- [strlen](/azure/data-explorer/kusto/query/strlenfunction)
+- [substring](/azure/data-explorer/kusto/query/substringfunction)
+- [tolower](/azure/data-explorer/kusto/query/tolowerfunction)
+- [toupper](/azure/data-explorer/kusto/query/toupperfunction)
+- [hash_sha256](/azure/data-explorer/kusto/query/sha256hashfunction)
+
+#### Type functions
+
+- [gettype](/azure/data-explorer/kusto/query/gettypefunction)
+- [isnotnull](/azure/data-explorer/kusto/query/isnotnullfunction)
+- [isnull](/azure/data-explorer/kusto/query/isnullfunction)
+
+### Identifier quoting
+Use [Identifier quoting](/azure/data-explorer/kusto/query/schema-entities/entity-names?q=identifier#identifier-quoting) as required.
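+
+For example, a column whose name contains a special character, such as a hypothetical `client-ip` column in the input stream, can be referenced with bracket notation:
+
+```kusto
+source
+| extend ClientIp = tostring(["client-ip"])
+```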
++++
+## Next steps
+
+- [Create a data collection rule](../agents/data-collection-rule-azure-monitor-agent.md) and an association to it from a virtual machine using the Azure Monitor agent.
azure-monitor Data Collection Transformations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection-transformations.md
+
+ Title: Data collection transformations
+description: Use transformations in a data collection rule in Azure Monitor to filter and modify incoming data.
+ Last updated : 06/29/2022
+ms.reviewer: nikeist
+++
+# Data collection transformations in Azure Monitor (preview)
+Transformations in Azure Monitor allow you to filter or modify incoming data before it's sent to a Log Analytics workspace. This article provides a basic description of transformations and how they're implemented, with links to other content for creating a transformation.
+
+## When to use transformations
+Transformations are useful for a variety of scenarios, including those described below.
+
+### Reduce data costs
+Since you're charged an ingestion cost for any data sent to a Log Analytics workspace, filter out any data you don't require to reduce your costs.
+
+- **Remove entire rows.** For example, you might have a diagnostic setting to collect resource logs from a particular resource but not require all of the log entries that it generates. Create a transformation that filters out records that match certain criteria.
+
+- **Remove a column from each row.** For example, your data may include columns with data that's redundant or has minimal value. Create a transformation that filters out columns that aren't required.
+
+- **Parse important data from a column.** You may have a table with valuable data buried in a particular column. Use a transformation to parse the valuable data into a new column and remove the original, as shown in the sketch after this list.
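+
+The following sketch combines these three approaches. The column names (`Level`, `RawPayload`, `DebugInfo`) are hypothetical and stand in for whatever your data source actually sends:
+
+```kusto
+// Drop low-severity rows, pull one useful value out of a verbose column, then drop the verbose columns
+source
+| where Level in ("Warning", "Error", "Critical")
+| extend ClientIp = tostring(parse_json(RawPayload).clientIp)
+| project-away RawPayload, DebugInfo
+```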
++
+### Remove sensitive data
+You may have a data source that sends information you don't want stored for privacy or compliance reasons.
+
+- **Filter sensitive information.** Filter out entire rows or just particular columns that contain sensitive information.
+
+- **Obfuscate sensitive information.** For example, you might replace digits with a common character in an IP address or telephone number, as in the sketch after this list.
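+
+As a sketch, the following transformation keeps only the first octet of a hypothetical `CallerIpAddress` column and masks the rest, using only functions supported in transformations:
+
+```kusto
+source
+| extend CallerIpAddress = strcat(extract(@"^(\d+)\.", 1, CallerIpAddress), ".x.x.x")
+```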
++
+### Enrich data with additional or calculated information
+Use a transformation to add information to data that provides business context or simplifies querying the data later.
+
+- **Add a column with additional information.** For example, you might add a column identifying whether an IP address in another column is internal or external.
+
+- **Add business-specific information.** For example, you might add a column indicating a company division based on location information in other columns, as in the sketch after this list.
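+
+The following sketch shows both kinds of enrichment against hypothetical `ClientIp` and `Region` columns:
+
+```kusto
+source
+| extend IsInternal = iif(ClientIp startswith "10.", true, false)
+| extend Division = case(Region == "EMEA", "International", Region == "AMER", "Domestic", "Unknown")
+```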
+
+## Supported tables
+Transformations may be applied to the following tables in a Log Analytics workspace.
+
+- Any Azure table listed in [Tables that support transformations in Azure Monitor Logs (preview)](../logs/tables-feature-support.md)
+- Any custom table
++
+## How transformations work
+Transformations are performed in Azure Monitor in the [data ingestion pipeline](../essentials/data-collection.md) after the data source delivers the data and before it's sent to the destination. The data source may perform its own filtering before sending data but then rely on the transformation for further manipulation before it's sent to the destination.
+
+Transformations are defined in a [data collection rule (DCR)](data-collection-rule-overview.md) and use a [Kusto Query Language (KQL) statement](data-collection-transformations-structure.md) that is applied individually to each entry in the incoming data. It must understand the format of the incoming data and create output in the structure expected by the destination.
+
+For example, a DCR that collects data from a virtual machine using Azure Monitor agent would specify particular data to collect from the client operating system. It could also include a transformation, applied after the data reaches the ingestion pipeline, that further filters the data or adds a calculated column. This workflow is shown in the following diagram.
++
+Another example is data sent from a custom application using the [logs ingestion API](../logs/logs-ingestion-api-overview.md). In this case, the application sends the data to a [data collection endpoint](data-collection-endpoint-overview.md) and specifies a data collection rule in the REST API call. The DCR includes the transformation and the destination workspace and table.
++
+## Workspace transformation DCR
+The workspace transformation DCR is a special DCR that's applied directly to a Log Analytics workspace. It includes default transformations for one or more [supported tables](../logs/tables-feature-support.md). These transformations are applied to any data sent to these tables unless that data came from another DCR.
+
+For example, if you create a transformation in the workspace transformation DCR for the `Event` table, it would be applied to events collected by virtual machines running the [Log Analytics agent](../agents/log-analytics-agent.md) since this agent doesn't use a DCR. The transformation wouldn't be applied to data sent from the [Azure Monitor agent](../agents/azure-monitor-agent-overview.md) though, since that agent uses its own DCR, which is expected to provide the transformation.
+
+A common use of the workspace transformation DCR is collection of [resource logs](resource-logs.md), which are configured with a [diagnostic setting](diagnostic-settings.md). This is shown in the example below.
++
+## Creating a transformation
+The method you use to create a transformation depends on the data collection method. The following table lists guidance for each.
+
+| Type | Reference |
+|:|:|
+| Logs ingestion API with transformation | [Send data to Azure Monitor Logs using REST API (Azure portal)](../logs/tutorial-logs-ingestion-portal.md)<br>[Send data to Azure Monitor Logs using REST API (Resource Manager templates)](../logs/tutorial-logs-ingestion-api.md) |
+| Transformation in workspace DCR | [Add workspace transformation to Azure Monitor Logs using the Azure portal (preview)](../logs/tutorial-workspace-transformations-portal.md)<br>[Add workspace transformation to Azure Monitor Logs using resource manager templates (preview)](../logs/tutorial-workspace-transformations-api.md) |
++
+## Next steps
+
+- [Create a data collection rule](../agents/data-collection-rule-azure-monitor-agent.md) and an association to it from a virtual machine using the Azure Monitor agent.
azure-monitor Data Collection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-collection.md
+
+ Title: Data collection in Azure Monitor
+description: Monitoring data collected by Azure Monitor is separated into metrics that are lightweight and capable of supporting near real-time scenarios and logs that are used for advanced analysis.
+ Last updated : 07/10/2022++
+# Data collection in Azure Monitor
+Azure Monitor has a [common data platform](../data-platform.md) that consolidates data from a variety of sources. Currently, different sources of data for Azure Monitor use different methods to deliver their data, and each typically requires different types of configuration. Get a description of the most common data sources at [Sources of monitoring data for Azure Monitor](../data-sources.md).
+
+Azure Monitor is implementing a new [ETL](/azure/architecture/data-guide/relational-data/etl)-like data collection pipeline that improves on legacy data collection methods. This process uses a common data ingestion pipeline for all data sources and provides a standard method of configuration that's more manageable and scalable than current methods. Specific advantages of the new data collection include the following:
+
+- Common set of destinations for different data sources.
+- Ability to apply a transformation to filter or modify incoming data before it's stored.
+- Consistent method for configuration of different data sources.
+- Scalable configuration options supporting infrastructure as code and DevOps processes.
+
+When implementation is complete, all data collected by Azure Monitor will use the new data collection process and be managed by data collection rules. Currently, only certain data collection methods support the ingestion pipeline, and they may have limited configuration options. There's no difference between data collected with the new ingestion pipeline and data collected using other methods. The data is all stored together as [Logs](../logs/data-platform-logs.md) and [Metrics](data-platform-metrics.md), supporting Azure Monitor features such as log queries, alerts, and workbooks. The only difference is in the method of collection.
+## Data collection rules
+Azure Monitor data collection is configured using a [data collection rule (DCR)](data-collection-rule-overview.md). A DCR defines the details of a particular data collection scenario including what data should be collected, how to potentially transform that data, and where to send that data. A single DCR can be used with multiple monitored resources, giving you a consistent method to configure a variety of monitoring scenarios. In some cases, Azure Monitor will create and configure a DCR for you using options in the Azure portal. You may also directly edit DCRs to configure particular scenarios.
+
+See [Data collection rules in Azure Monitor](data-collection-rule-overview.md) for details on data collection rules including how to view and create them.
+
+## Transformations
+One of the most valuable features of the new data collection process is [data transformations](data-collection-transformations.md), which allow you to apply a KQL query to incoming data to modify it before sending it to its destination. You might filter out unwanted data or modify existing data to improve your query or reporting capabilities.
+
+See [Data collection transformations in Azure Monitor (preview)](data-collection-transformations.md) for complete details on transformations, including how to write transformation queries.
++
+## Data collection scenarios
+The following sections describe the data collection scenarios that are currently supported using DCR and the new data ingestion pipeline.
+
+### Azure Monitor agent
+The diagram below shows data collection for the [Azure Monitor agent](../agents/azure-monitor-agent-overview.md) running on a virtual machine. In this scenario, the DCR specifies events and performance data to collect from the agent machine, a transformation to filter and modify the data after it's collected, and a Log Analytics workspace to send the transformed data to. To implement this scenario, you create an association between the DCR and the agent. One agent can be associated with multiple DCRs, and one DCR can be associated with multiple agents.
++
+See [Collect data from virtual machines with the Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md) for details on creating a DCR for the Azure Monitor agent.
+
+### Log ingestion API
+The diagram below shows data collection for the [Logs ingestion API](../logs/logs-ingestion-api-overview.md), which allows you to send data to a Log Analytics workspace from any REST client. In this scenario, the API call connects to a [data collection endpoint (DCE)](data-collection-endpoint-overview.md) and specifies a DCR to accept its incoming data. The DCR understands the structure of the incoming data, includes a transformation that ensures that the data is in the format of the target table, and specifies a workspace and table to send the transformed data.
++
+See [Logs ingestion API in Azure Monitor (Preview)](../logs/logs-ingestion-api-overview.md) for details on the Logs ingestion API.
+
+### Workspace transformation DCR
+The diagram below shows data collection for [resource logs](resource-logs.md) using a [workspace transformation DCR](data-collection-transformations.md#workspace-transformation-dcr). This is a special DCR that's associated with a workspace and provides a default transformation for [supported tables](../logs/tables-feature-support.md). This transformation is applied to any data sent to the table that doesn't use another DCR. The example here shows resource logs using a diagnostic setting, but this same transformation could be applied to other data collection methods such as Log Analytics agent or Container insights.
++
+See [Workspace transformation DCR](data-collection-transformations.md#workspace-transformation-dcr) for details about workspace transformation DCRs and links to walkthroughs for creating them.
+
+## Next steps
+
+- Read more about [data collection rules](data-collection-rule-overview.md).
+- Read more about [transformations](data-collection-transformations.md).
+
azure-monitor Data Platform Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/data-platform-metrics.md
If you see a blank chart or your chart displays only part of metric data, verify
- Learn more about the [Azure Monitor data platform](../data-platform.md).
- Learn about [log data in Azure Monitor](../logs/data-platform-logs.md).
-- Learn about the [monitoring data available](../agents/data-sources.md) for various resources in Azure.
+- Learn about the [monitoring data available](../data-sources.md) for various resources in Azure.
azure-monitor Stream Monitoring Data Event Hubs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/stream-monitoring-data-event-hubs.md
Before you configure streaming for any data source, you need to [create an Event
* Outbound ports 5671 and 5672 must typically be opened on the computer or VNET consuming data from the event hub.
## Monitoring data available
-[Sources of monitoring data for Azure Monitor](../agents/data-sources.md) describes the data tiers for Azure applications and the kinds of data available for each. The following table lists each of these tiers and a description of how that data can be streamed to an event hub. Follow the links provided for further detail.
+[Sources of monitoring data for Azure Monitor](../data-sources.md) describes the data tiers for Azure applications and the kinds of data available for each. The following table lists each of these tiers and a description of how that data can be streamed to an event hub. Follow the links provided for further detail.
| Tier | Data | Method |
|:|:|:|
-| [Azure tenant](../agents/data-sources.md#azure-tenant) | Azure Active Directory audit logs | Configure a tenant diagnostic setting on your Azure AD tenant. See [Tutorial: Stream Azure Active Directory logs to an Azure event hub](../../active-directory/reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md) for details. |
-| [Azure subscription](../agents/data-sources.md#azure-subscription) | Azure Activity Log | Create a log profile to export Activity Log events to Event Hubs. See [Stream Azure platform logs to Azure Event Hubs](../essentials/resource-logs.md#send-to-azure-event-hubs) for details. |
-| [Azure resources](../agents/data-sources.md#azure-resources) | Platform metrics<br> Resource logs |Both types of data are sent to an event hub using a resource diagnostic setting. See [Stream Azure resource logs to an event hub](../essentials/resource-logs.md#send-to-azure-event-hubs) for details. |
-| [Operating system (guest)](../agents/data-sources.md#operating-system-guest) | Azure virtual machines | Install the [Azure Diagnostics Extension](../agents/diagnostics-extension-overview.md) on Windows and Linux virtual machines in Azure. See [Streaming Azure Diagnostics data in the hot path by using Event Hubs](../agents/diagnostics-extension-stream-event-hubs.md) for details on Windows VMs and [Use Linux Diagnostic Extension to monitor metrics and logs](../../virtual-machines/extensions/diagnostics-linux.md#protected-settings) for details on Linux VMs. |
-| [Application code](../agents/data-sources.md#application-code) | Application Insights | Use diagnostic settings to stream to event hubs. This is only available with workspace-based Application Insights resources. For help setting up workspace-based Application Insights resources, see [Workspace-based Application Insights resources](../app/create-workspace-resource.md#workspace-based-application-insights-resources) and [Migrate to workspace-based Application Insights resources](../app/convert-classic-resource.md#migrate-to-workspace-based-application-insights-resources).|
+| [Azure tenant](../data-sources.md#azure-tenant) | Azure Active Directory audit logs | Configure a tenant diagnostic setting on your Azure AD tenant. See [Tutorial: Stream Azure Active Directory logs to an Azure event hub](../../active-directory/reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md) for details. |
+| [Azure subscription](../data-sources.md#azure-subscription) | Azure Activity Log | Create a log profile to export Activity Log events to Event Hubs. See [Stream Azure platform logs to Azure Event Hubs](../essentials/resource-logs.md#send-to-azure-event-hubs) for details. |
+| [Azure resources](../data-sources.md#azure-resources) | Platform metrics<br> Resource logs |Both types of data are sent to an event hub using a resource diagnostic setting. See [Stream Azure resource logs to an event hub](../essentials/resource-logs.md#send-to-azure-event-hubs) for details. |
+| [Operating system (guest)](../data-sources.md#operating-system-guest) | Azure virtual machines | Install the [Azure Diagnostics Extension](../agents/diagnostics-extension-overview.md) on Windows and Linux virtual machines in Azure. See [Streaming Azure Diagnostics data in the hot path by using Event Hubs](../agents/diagnostics-extension-stream-event-hubs.md) for details on Windows VMs and [Use Linux Diagnostic Extension to monitor metrics and logs](../../virtual-machines/extensions/diagnostics-linux.md#protected-settings) for details on Linux VMs. |
+| [Application code](../data-sources.md#application-code) | Application Insights | Use diagnostic settings to stream to event hubs. This is only available with workspace-based Application Insights resources. For help with setting up workspace-based Application Insights resources, see [Workspace-based Application Insights resources](../app/create-workspace-resource.md#workspace-based-application-insights-resources) and [Migrate to workspace-based Application Insights resources](../app/convert-classic-resource.md#migrate-to-workspace-based-application-insights-resources).|
## Manual streaming with Logic App
For data that you can't directly stream to an event hub, you can write to Azure storage and then use a time-triggered Logic App that [pulls data from blob storage](../../connectors/connectors-create-api-azureblobstorage.md#add-action) and [pushes it as a message to the event hub](../../connectors/connectors-create-api-azure-event-hubs.md#add-action).
Routing your monitoring data to an event hub with Azure Monitor enables you to e
| Tool | Hosted in Azure | Description |
|:|:|:|
-| IBM QRadar | No | The Microsoft Azure DSM and Microsoft Azure Event Hub Protocol are available for download from [the IBM support website](https://www.ibm.com/support). |
+| IBM QRadar | No | The Microsoft Azure DSM and Microsoft Azure Event Hubs Protocol are available for download from [the IBM support website](https://www.ibm.com/support). |
| Splunk | No | [Splunk Add-on for Microsoft Cloud Services](https://splunkbase.splunk.com/app/3110/) is an open source project available in Splunkbase. <br><br> If you can't install an add-on in your Splunk instance, if for example you're using a proxy or running on Splunk Cloud, you can forward these events to the Splunk HTTP Event Collector using [Azure Function For Splunk](https://github.com/Microsoft/AzureFunctionforSplunkVS), which is triggered by new messages in the event hub. |
-| SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hub](https://help.sumologic.com/Send-Data/Applications-and-Other-Data-Sources/Azure-Audit/02Collect-Logs-for-Azure-Audit-from-Event-Hub). |
-| ArcSight | No | The ArcSight Azure Event Hub smart connector is available as part of [the ArcSight smart connector collection](https://community.microfocus.com/cyberres/arcsight/f/arcsight-product-announcements/163662/announcing-general-availability-of-arcsight-smart-connectors-7-10-0-8114-0). |
+| SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hubs](https://help.sumologic.com/Send-Data/Applications-and-Other-Data-Sources/Azure-Audit/02Collect-Logs-for-Azure-Audit-from-Event-Hub). |
+| ArcSight | No | The ArcSight Azure Event Hubs smart connector is available as part of [the ArcSight smart connector collection](https://community.microfocus.com/cyberres/arcsight/f/arcsight-product-announcements/163662/announcing-general-availability-of-arcsight-smart-connectors-7-10-0-8114-0). |
| Syslog server | No | If you want to stream Azure Monitor data directly to a syslog server, you can use a [solution based on an Azure function](https://github.com/miguelangelopereira/azuremonitor2syslog/). |
| LogRhythm | No | Instructions to set up LogRhythm to collect logs from an event hub are available [here](https://logrhythm.com/six-tips-for-securing-your-azure-cloud-environment/). |
| Logz.io | Yes | For more information, see [Getting started with monitoring and logging using Logz.io for Java apps running on Azure](/azure/developer/java/fundamentals/java-get-started-with-logzio) |
azure-monitor Analyze Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/analyze-usage.md
W3CIISLog
- See [Azure Monitor Logs pricing details](cost-logs.md) for details on how charges are calculated for data in a Log Analytics workspace and different configuration options to reduce your charges.
- See [Azure Monitor cost and usage](../usage-estimated-costs.md) for a description of the different types of Azure Monitor charges and how to analyze them on your Azure bill.
- See [Azure Monitor best practices - Cost management](../best-practices-cost.md) for best practices on configuring and managing Azure Monitor to minimize your charges.
-- See [Ingestion-time transformations in Azure Monitor Logs (preview)](ingestion-time-transformations.md) for details on using ingestion-time transformations to reduce the amount of data you collected in a Log Analytics workspace by filtering unwanted records and columns.
+- See [Data collection transformations in Azure Monitor (preview)](../essentials/data-collection-transformations.md) for details on using transformations to reduce the amount of data you collected in a Log Analytics workspace by filtering unwanted records and columns.
azure-monitor Azure Cli Log Analytics Workspace Sample https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/azure-cli-log-analytics-workspace-sample.md
For more information about tables, see [Data structure](./log-analytics-workspac
## Delete a table
-You can delete [Custom Log](custom-logs-overview.md), [Search Results](search-jobs.md) and [Restored Logs](restore.md) tables.
+You can delete [Custom Log](logs-ingestion-api-overview.md), [Search Results](search-jobs.md) and [Restored Logs](restore.md) tables.
To delete a table, run the [az monitor log-analytics workspace table delete](/cli/azure/monitor/log-analytics/workspace/table#az-monitor-log-analytics-workspace-table-delete) command:
azure-monitor Basic Logs Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/basic-logs-configure.md
Setting a table's [log data plan](log-analytics-workspace-overview.md#log-data-p
By default, all tables in your Log Analytics workspace are Analytics tables, and available for query and alerts. You can currently configure the following tables for Basic Logs:
-- All tables created with the [Data Collection Rule (DCR)-based custom logs API.](custom-logs-overview.md)
+- All tables created with the [Data Collection Rule (DCR)-based logs ingestion API.](logs-ingestion-api-overview.md)
- [ContainerLogV2](/azure/azure-monitor/reference/tables/containerlogv2), which [Container Insights](../containers/container-insights-overview.md) uses and which include verbose text-based log records. - [AppTraces](/azure/azure-monitor/reference/tables/apptraces), which contains freeform log records for application traces in Application Insights.
azure-monitor Cost Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/cost-logs.md
The default pricing for Log Analytics is a Pay-As-You-Go model that's based on i
- The types of data collected from each monitored resource
## Data size calculation
-Data volume is measured as the size of the data that will be stored in GB (10^9 bytes). The data size of a single record is calculated from a string representation of the columns that are stored in the Log Analytics workspace for that record, regardless of whether the data is sent from an agent or added during the ingestion process. This includes any custom columns added by the [custom logs API](custom-logs-overview.md), [ingestion-time transformations](ingestion-time-transformations.md), or [custom fields](custom-fields.md) that are added as data is collected and then stored in the workspace.
+Data volume is measured as the size of the data that will be stored in GB (10^9 bytes). The data size of a single record is calculated from a string representation of the columns that are stored in the Log Analytics workspace for that record, regardless of whether the data is sent from an agent or added during the ingestion process. This includes any custom columns added by the [logs ingestion API](logs-ingestion-api-overview.md), [transformations](../essentials/data-collection-transformations.md), or [custom fields](custom-fields.md) that are added as data is collected and then stored in the workspace.
>[!NOTE] >The billable data volume calculation is substantially smaller than the size of the entire incoming JSON-packaged event. On average across all event types, the billed size is about 25% less than the incoming data size. This can be up to 50% for small events. It is essential to understand this calculation of billed data size when estimating costs and comparing to other pricing models.
See [Configure data retention and archive policies in Azure Monitor Logs](data-r
Searching against Archived Logs uses [search jobs](search-jobs.md). Search jobs are asynchronous queries that fetch records into a new search table within your workspace for further analytics. Search jobs are billed by the number of GB of data scanned on each day that is accessed to perform the search. ## Log data restore
-For situations in which older or archived logs need to be intensively queried with the full analyitics query capabilities, the [data restore](restore.md) feature is a powerful tool. The restore operation makes a specific time range of data in a table available in the hot cache for high-performance queries. You can later dismiss the data when you're done. Log data restore is billed by the amount of data restored, and by the time the restore is kept active. The minimal values billed for any data restore is 2 TB and 12 hours. Data restored of more than 2 TB and/or more than 12 hours in duration are billed on a pro-rated basis.
+For situations in which older or archived logs need to be intensively queried with the full analytic query capabilities, the [data restore](restore.md) feature is a powerful tool. The restore operation makes a specific time range of data in a table available in the hot cache for high-performance queries. You can later dismiss the data when you're done. Log data restore is billed by the amount of data restored, and by the time the restore is kept active. The minimal values billed for any data restore are 2 TB and 12 hours. Data restored of more than 2 TB and/or more than 12 hours in duration are billed on a pro-rated basis.
## Log data export [Data export](logs-data-export.md) in Log Analytics workspace lets you continuously export data per selected tables in your workspace, to an Azure Storage Account or Azure Event Hubs as it arrives to Azure Monitor pipeline. Charges for the use of data export are based on the amount of data exported. The size of data exported is the number of bytes in the exported JSON formatted data.
Telemetry from ping tests and multi-step tests is charged the same as data usage
See [Application Insights legacy enterprise (per node) pricing tier](../app/legacy-pricing.md) for details about legacy tiers that are available to early adopters of Application Insights.
## Workspaces with Microsoft Sentinel
-When Microsoft Sentinel is enabled in a Log Analytics workspace, all data collected in that workspace is subject to Sentinel charges in addition to Log Analytics charges. For this reason, you will often separate your security and operational data in different workspaces so that you don't incur [Sentinel charges](../../sentinel/billing.md) for operational data. There may be particular situations though where combining this data can actually result in a cost savings. This is typically when you aren't collecting enough security and operational data to each reach a commitment tier on their own, but the combined data is enough to reach a commitment tier. See **Combining your SOC and non-SOC data** in [Design your Microsoft Sentinel workspace architecture](../../sentinel/design-your-workspace-architecture.md#decision-tree) for details and a sample cost calculation.
+When Microsoft Sentinel is enabled in a Log Analytics workspace, all data collected in that workspace is subject to Sentinel charges in addition to Log Analytics charges. For this reason, you will often separate your security and operational data in different workspaces so that you don't incur [Sentinel charges](../../sentinel/billing.md) for operational data. For some particular situations though, combining this data can actually result in a cost savings. This is typically when you aren't collecting enough security and operational data to each reach a commitment tier on their own, but the combined data is enough to reach a commitment tier. See **Combining your SOC and non-SOC data** in [Design your Microsoft Sentinel workspace architecture](../../sentinel/design-your-workspace-architecture.md#decision-tree) for details and a sample cost calculation.
## Workspaces with Microsoft Defender for Cloud
[Microsoft Defender for Servers (part of Defender for Cloud)](../../security-center/index.yml) [bills by the number of monitored services](https://azure.microsoft.com/pricing/details/azure-defender/) and provides 500 MB/server/day data allocation that is applied to the following subset of [security data types](/azure/azure-monitor/reference/tables/tables-category#security):
When Microsoft Sentinel is enabled in a Log Analytics workspace, all data collec
The count of monitored servers is calculated on an hourly granularity. The daily data allocation contributions from each monitored server are aggregated at the workspace level. If the workspace is in the legacy Per Node pricing tier, the Microsoft Defender for Cloud and Log Analytics allocations are combined and applied jointly to all billable ingested data.
## Legacy pricing tiers
-Subscriptions that contained a Log Analytics workspace or Application Insights resource on April 2, 2018, or are linked to an Enterprise Agreement that started before February 1, 2019 and is still active, will continue to have access to use the the following legacy pricing tiers:
+Subscriptions that contained a Log Analytics workspace or Application Insights resource on April 2, 2018, or are linked to an Enterprise Agreement that started before February 1, 2019 and is still active, will continue to have access to use the following legacy pricing tiers:
- Standalone (Per GB)
- Per Node (OMS)
azure-monitor Custom Logs Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/custom-logs-migrate.md
Title: Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom logs
-description: Steps that you must perform when migrating from Data Collector API and custom fields-enabled tables to DCR-based custom logs.
+ Title: Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom log collection
+description: Steps that you must perform when migrating from Data Collector API and custom fields-enabled tables to DCR-based custom log collection.
Last updated 01/06/2022
-# Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom logs
-This article describes how to migrate from [Data Collector API](data-collector-api.md) or [custom fields](custom-fields.md) in Azure Monitor to [DCR-based custom logs](custom-logs-overview.md). It includes configuration required for tables in your Log Analytics workspace and applies to both [direct ingestion](custom-logs-overview.md) and [ingestion-time transformations](ingestion-time-transformations.md).
+# Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom log collection
+This article describes how to migrate from [Data Collector API](data-collector-api.md) or [custom fields](custom-fields.md) in Azure Monitor to [DCR-based custom log collection](../essentials/data-collection-rule-overview.md). It includes configuration required for custom tables created in your Log Analytics workspace so that they can be used by [Logs ingestion API](logs-ingestion-api-overview.md) and [workspace transformations](../essentials/data-collection-transformations.md#workspace-transformation-dcr).
> [!IMPORTANT]
-> You do not need to follow this article if you are defining your DCR-based custom logs using the Azure Portal. This article only applies if you are using Resource Manager templates and the custom logs API.
+> You do not need to follow this article if you are configuring your DCR-based custom logs [using the Azure portal](tutorial-workspace-transformations-portal.md), since the configuration will be performed for you. This article only applies if you're configuring using Resource Manager templates and APIs.
## Background
-To use a table with the [direct ingestion](custom-logs-overview.md), and [ingestion-time transformations](ingestion-time-transformations.md), it must be configured to support these new features. When you complete the process described in this article, the following actions are taken:
+To use a table with the [Logs ingestion API](logs-ingestion-api-overview.md) or with a [workspace transformation](../essentials/data-collection-transformations.md#workspace-transformation-dcr), it must be configured to support new features. When you complete the process described in this article, the following actions are taken:
-- The table will be reconfigured to enable all DCR-based custom logs features. This includes DCR and DCE support and management with the new Tables control plane.
+- The table is reconfigured to enable all DCR-based custom logs features. This includes DCR and DCE support and management with the new **Tables** control plane.
- Any previously defined custom fields will stop populating.
- The Data Collector API will continue to work but won't create any new columns. Data will only populate into columns that were created prior to migration.
- The schema and historic data are preserved and can be accessed the same way they were previously.
To use a table with the [direct ingestion](custom-logs-overview.md), and [ingest
## Applicable scenarios This article is only applicable if all of the following criteria apply: -- You need to use the DCR-based custom logs functionality to send data to an existing table, preserving both schema and historical data in that table.-- The table in question was either created using the Data Collector API, or has custom fields defined in it -- You want to migrate using the custom logs API instead of the Azure portal.
+- You're going to send data to the table using the [Logs ingestion API](logs-ingestion-api-overview.md) or configure a transformation for the table in the [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr), preserving both schema and historical data in that table.
+- The table was either created using the Data Collector API, or has custom fields defined in it.
+- You want to migrate using the APIs instead of the Azure portal as described in [Send custom logs to Azure Monitor Logs using the Azure portal (preview)](tutorial-logs-ingestion-portal.md) or [Add transformation in workspace data collection rule using the Azure portal (preview)](tutorial-workspace-transformations-portal.md).
-If all of these conditions aren't true, then you can use DCR-based custom logs without following the procedure described here.
+If all of these conditions aren't true, then you can use DCR-based log collection without following the procedure described here.
## Migration procedure
-If the table that you're targeting with DCR-based custom logs does indeed falls under the criteria described above, the following strategy is required for a graceful migration:
+If the table that you're targeting with DCR-based log collection fits the criteria above, then you must perform the following steps:
-1. Configure your data collection rule (DCR) following procedures at [Send custom logs to Azure Monitor Logs using Resource Manager templates (preview)](tutorial-custom-logs-api.md) or [Add ingestion-time transformation to Azure Monitor Logs using Resource Manager templates (preview)](tutorial-ingestion-time-transformations-api.md).
+1. Configure your data collection rule (DCR) following procedures at [Send custom logs to Azure Monitor Logs using Resource Manager templates (preview)](tutorial-logs-ingestion-api.md) or [Add transformation in workspace data collection rule to Azure Monitor using resource manager templates (preview)](tutorial-workspace-transformations-api.md).
-1. If using the DCR-based API, also [configure the data collection endpoint (DCE)](tutorial-custom-logs-api.md#create-data-collection-endpoint) and the agent or component that will be sending data to the API.
+1. If using the Logs ingestion API, also [configure the data collection endpoint (DCE)](tutorial-logs-ingestion-api.md#create-data-collection-endpoint) and the agent or component that will be sending data to the API.
1. Issue the following API call against your table. This call is idempotent, so there will be no effect if the table has already been migrated.
If the table that you're targeting with DCR-based custom logs does indeed falls
```
POST /subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/microsoft.operationalinsights/workspaces/{workspaceName}/tables/{tableName}/migrate?api-version=2021-03-01-privatepreview
```
-1. Discontinue use of the Data Collector API and start using the new custom logs API.
+1. Discontinue use of the Data Collector API and start using the new Logs ingestion API.
## Next steps
-- [Walk through a tutorial sending custom logs using the Azure portal.](tutorial-custom-logs.md)
-- [Walk through a tutorial sending custom logs using Resource Manager templates and REST API.](tutorial-custom-logs-api.md)
+- [Walk through a tutorial sending custom logs using the Azure portal.](tutorial-logs-ingestion-portal.md)
+- [Walk through a tutorial sending custom logs using Resource Manager templates and REST API.](tutorial-logs-ingestion-api.md)
azure-monitor Data Collection Rule Sample Custom Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/data-collection-rule-sample-custom-logs.md
Last updated 02/15/2022
# Sample data collection rule - custom logs
-The sample [data collection rule](../essentials/data-collection-rule-overview.md) below is for use with [custom logs](../logs/custom-logs-overview.md). It has the following details:
+The sample [data collection rule](../essentials/data-collection-rule-overview.md) below is for use with [custom logs](../logs/logs-ingestion-api-overview.md). It has the following details:
- Sends data to a table called MyTable_CL in a workspace called my-workspace.
-- Applies a [transformation](../essentials/data-collection-rule-transformations.md) to the incoming data.
+- Applies a [transformation](../essentials/data-collection-transformations.md) to the incoming data.
## Sample DCR
The sample [data collection rule](../essentials/data-collection-rule-overview.md
## Next steps
-- [Walk through a tutorial on configuring custom logs using resource manager templates.](tutorial-custom-logs-api.md)
+- [Walk through a tutorial on configuring custom logs using resource manager templates.](tutorial-logs-ingestion-api.md)
- [Get details on the structure of data collection rules.](../essentials/data-collection-rule-structure.md)
-- [Get an overview on custom logs](custom-logs-overview.md).
+- [Get an overview on custom logs](logs-ingestion-api-overview.md).
azure-monitor Data Platform Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/data-platform-logs.md
This configuration will be different depending on the data source. For example:
- [Create diagnostic settings](../essentials/diagnostic-settings.md) to send resource logs from Azure resources to the workspace.
- [Enable VM insights](../vm/vminsights-enable-overview.md) to collect data from virtual machines.
-- [Configure data sources on the workspace](../agents/data-sources.md) to collect more events and performance data.
+- [Configure data sources on the workspace](../data-sources.md) to collect more events and performance data.
> [!IMPORTANT] > Most data collection in Logs will incur ingestion and retention costs, so refer to [Azure Monitor pricing](https://azure.microsoft.com/pricing/details/monitor/) before enabling any data collection.
The experience of using Log Analytics to work with Azure Monitor queries in the
- Learn about [log queries](./log-query-overview.md) to retrieve and analyze data from a Log Analytics workspace.
- Learn about [metrics in Azure Monitor](../essentials/data-platform-metrics.md).
-- Learn about the [monitoring data available](../agents/data-sources.md) for various resources in Azure.
+- Learn about the [monitoring data available](../data-sources.md) for various resources in Azure.
azure-monitor Ingestion Time Transformations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/ingestion-time-transformations.md
- Title: Overview of ingestion-time transformations in Azure Monitor Logs
-description: This article describes ingestion-time transformations which allow you to filter and transform data before it's stored in a Log Analytics workspace in Azure Monitor.
- Previously updated : 01/19/2022--
-# Ingestion-time transformations in Azure Monitor Logs (preview)
-[Ingestion-time transformations](ingestion-time-transformations.md) allow you to manipulate incoming data before it's stored in a Log Analytics workspace. You can add data filtering, parsing and extraction, and control the structure of the data that gets ingested.
--
-## Basic operation
-The transformation is a [KQL query](../essentials/data-collection-rule-transformations.md) that runs against the incoming data and modifies it before it's stored in the workspace. Transformations are defined separately for each table in the workspace. This article provides an overview of this feature and guidance for further details and samples. Configuration for ingestion-time transformation is stored in a workspace transformation DCR. You can either [create this DCR directly](tutorial-ingestion-time-transformations-api.md) or configure transformation [through the Azure portal](tutorial-ingestion-time-transformations.md).
-
-## When to use ingestion-time transformations
-Use ingestion-time transformation for the following scenarios:
-
-**Reduce data ingestion cost.** You can create a transformation to filter data that you don't require from a particular workflow. You may also remove data that you don't require from specific columns, resulting in a lower amount of the data that you need to ingest and store. For example, you might have a diagnostic setting to collect resource logs from a particular resource but not require all of the log entries that it generates. Create a transformation that filters out records that match a certain criteria.
-
-**Simplify query requirements.** You may have a table with valuable data buried in a particular column or data that needs some type of conversion each time it's queried. Create a transformation that parses this data into a custom column so that queries don't need to parse it. Remove extra data from the column that isn't required to decrease ingestion and retention costs.
-
-## Supported workflows
-Ingestion-time transformation is applied to any workflow that doesn't currently use a [data collection rule](../essentials/data-collection-rule-overview.md) to send data to a [supported table](tables-feature-support.md). Any transformation on a workspace will be ignored for these workflows.
-
-The workflows that currently use data collection rules are as follows:
--- [Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md)-- [Custom logs](../logs/custom-logs-overview.md)-
-## Supported tables
-See [Supported tables for ingestion-time transformations](tables-feature-support.md) for a complete list of tables that support ingestion-time transformations.
-
-## Configure ingestion-time transformation
-See the following tutorials for a complete walkthrough of configuring ingestion-time transformation.
--- [Azure portal](../logs/tutorial-ingestion-time-transformations.md)-- [Resource Manager templates and REST API](../logs/tutorial-ingestion-time-transformations-api.md)--
-## Limits
--- Transformation queries use a subset of KQL. See [Supported KQL features](../essentials/data-collection-rule-transformations.md#supported-kql-features) for details.-
-## Next steps
--- [Get details on transformation queries](../essentials/data-collection-rule-transformations.md)-- [Walk through configuration of ingestion-time transformation using the Azure portal](tutorial-ingestion-time-transformations.md)-- [Walk through configuration of ingestion-time transformation using Resource Manager templates and REST API](tutorial-ingestion-time-transformations.md)
azure-monitor Log Analytics Workspace Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/log-analytics-workspace-overview.md
The following table summarizes the two plans. For more information on Basic Logs
| Retention | Configure retention from 30 days to 730 days. | Retention fixed at 8 days. | | Alerts | Supported. | Not supported. |
-## Ingestion-time transformations
+## Workspace transformation DCR
-[Data collection rules (DCRs)](../essentials/data-collection-rule-overview.md) that define data coming into Azure Monitor can include transformations that allow you to filter and transform data before it's ingested into the workspace. Since all workflows don't yet support DCRs, each workspace can define ingestion-time transformations. For this reason, you can filter or transform data before it's stored.
+[Data collection rules (DCRs)](../essentials/data-collection-rule-overview.md) that define data coming into Azure Monitor can include transformations that allow you to filter and transform data before it's ingested into the workspace. Because not all data sources support DCRs yet, each workspace can have a [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr).
-[Ingestion-time transformations](ingestion-time-transformations.md) are defined for each table in a workspace and apply to all data sent to that table, even if sent from multiple sources. Ingestion-time transformations only apply to workflows that don't already use a DCR. For example, [Azure Monitor agent](../agents/azure-monitor-agent-overview.md) uses a DCR to define data collected from virtual machines. This data won't be subject to any ingestion-time transformations defined in the workspace.
+[Transformations](../essentials/data-collection-transformations.md) in the workspace transformation DCR are defined for each table in a workspace and apply to all data sent to that table, even if sent from multiple sources. These transformations only apply to workflows that don't already use a DCR. For example, [Azure Monitor agent](../agents/azure-monitor-agent-overview.md) uses a DCR to define data collected from virtual machines. This data isn't subject to the transformations defined in the workspace transformation DCR.
For example, you might have [diagnostic settings](../essentials/diagnostic-settings.md) that send [resource logs](../essentials/resource-logs.md) for different Azure resources to your workspace. You can create a transformation for the table that collects the resource logs to filter the data down to only the records that you want. This method saves you the ingestion cost for records you don't need. You might also want to extract important data from certain columns and store it in other columns in the workspace to support simpler queries.
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/logs-data-export.md
Data export in Log Analytics workspace lets you continuously export data per sel
## Overview Data in Log Analytics is available for the retention period defined in your workspace, and is used in various experiences provided in Azure Monitor and Azure services. There are cases where you need to use other tools: * Tamper-protected store compliance – data can't be altered in Log Analytics once ingested, but it can be purged. Export to a Storage Account set with [immutability policies](../../storage/blobs/immutable-policy-configure-version-scope.md) to keep data tamper protected.
-* Integration with Azure services and other tools ΓÇô export to Event Hubs as it arrives and processed in Azure Monitor.
+* Integration with Azure services and other tools – export to Event Hubs as data arrives and is processed in Azure Monitor.
* Keep audit and security data for a very long time – export to a Storage Account in the workspace's region, or replicate data to other regions using any of the [Azure Storage redundancy options](../../storage/common/storage-redundancy.md#redundancy-in-a-secondary-region) including "GRS" and "GZRS".
-After configuring data export rules in Log Analytics workspace, new data for tables in rules is exported from Azure Monitor pipeline to your Storage Account or Event Hubs as it arrives.
+Once you've configured data export rules in a Log Analytics workspace, new data for the tables in those rules is exported from the Azure Monitor pipeline to your Storage Account or Event Hubs as it arrives.
[![Data export overview](media/logs-data-export/data-export-overview.png "Screenshot of data export flow diagram.")](media/logs-data-export/data-export-overview.png#lightbox)
Data is exported without a filter. For example, when you configure a data export
## Other export options Log Analytics workspace data export continuously exports data that is sent to your Log Analytics workspace. There are other options to export data for particular scenarios: -- Configure Diagnostic Settings in Azure resources. Logs is sent to destination directly and has lower latency compared to data export in Log Analytics.
+- Configure Diagnostic Settings in Azure resources. Logs are sent to the destination directly and have lower latency compared to data export in Log Analytics.
- Schedule export of data based on a log query you define with the [Log Analytics query API](/rest/api/loganalytics/dataaccess/query/execute). Use services such as Azure Data Factory, Azure Functions, or Azure Logic App to orchestrate queries in your workspace and export data to a destination. This is similar to the data export feature, but allows you to export historical data from your workspace, using filters and aggregation. This method is subject to [log query limits](../service-limits.md#log-analytics-workspaces) and not intended for scale. See [Archive data from Log Analytics workspace to Azure Storage Account using Logic App](logs-export-logic-app.md). - One time export to local machine using PowerShell script. See [Invoke-AzOperationalInsightsQueryExport](https://www.powershellgallery.com/packages/Invoke-AzOperationalInsightsQueryExport). ## Limitations - All tables will be supported in export, but currently limited to those specified in the [supported tables](#supported-tables) section.-- Legacy custom log using the [HTTP Data Collector API](./data-collector-api.md) wonΓÇÖt be supported in export, while data for [DCR based custom logs](./custom-logs-overview.md) can be exported.
+- Legacy custom logs created with the [HTTP Data Collector API](./data-collector-api.md) won't be supported in export, while data for [DCR-based custom logs](./logs-ingestion-api-overview.md) can be exported.
- You can define up to 10 enabled rules in your workspace. More rules are allowed when disabled. - Destinations must be in the same region as the Log Analytics workspace. - Storage Account must be unique across rules in workspace.
The Storage Account must be StorageV1 or above and in the same region as your wo
Data is sent to Storage Accounts as it reaches Azure Monitor and is exported to destinations located in the workspace region. A container is created for each table in the Storage Account, with the name *am-* followed by the name of the table. For example, the table *SecurityEvent* is sent to a container named *am-SecurityEvent*.
-Blobs are stored in 5-minute folders in path structure: *WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups/\<resource-group\>/providers/microsoft.operationalinsights/workspaces/\<workspace\>/y=\<four-digit numeric year\>/m=\<two-digit numeric month\>/d=\<two-digit numeric day\>/h=\<two-digit 24-hour clock hour\>/m=\<two-digit 60-minute clock minute\>/PT05M.json*. Append blobs is limited to 50-K writes and could be reached, and more blobs will be added in folder as: PT05M_#.json*, where # is incremental blob count.
+Blobs are stored in 5-minute folders in the path structure: *WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups/\<resource-group\>/providers/microsoft.operationalinsights/workspaces/\<workspace\>/y=\<four-digit numeric year\>/m=\<two-digit numeric month\>/d=\<two-digit numeric day\>/h=\<two-digit 24-hour clock hour\>/m=\<two-digit 60-minute clock minute\>/PT05M.json*. Appends to a blob are limited to 50-K writes; when that limit is reached, more blobs are added in the folder as *PT05M_#.json*, where # is an incremental blob count.
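+For example, with a hypothetical subscription ID, resource group, and workspace name, a blob for the *SecurityEvent* table written at 15:05 UTC on June 1, 2022 might be stored at a path similar to the following:
+
+```
+WorkspaceResourceId=/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/my-resource-group/providers/microsoft.operationalinsights/workspaces/my-workspace/y=2022/m=06/d=01/h=15/m=05/PT05M.json
+```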
The format of blobs in the Storage Account is [JSON lines](../essentials/resource-logs-blob-format.md), where each record is delimited by a newline, with no outer records array and no commas between JSON records.
You need to have 'write' permissions to both workspace and destination to config
Don't use an existing Event Hubs namespace that has non-monitoring data, to prevent reaching the Event Hubs namespace ingress rate limit, failures, and latency.
-Data is sent to your Event Hubs as it reaches Azure Monitor and exported to destinations located in workspace region. You can create multiple export rules to the same Event Hubs namespace by providing different `event hub name` in rule. When `event hub name` isn't provided, default Event Hubs are created for tables that you export with name: *am-* followed by the name of the table. For example, the table *SecurityEvent* would sent to an Event Hub named: *am-SecurityEvent*. The [number of supported Event Hubs in 'Basic' and 'Standard' namespaces tiers is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). When exporting more than 10 tables to these tiers, either split the tables between several export rules, to different Event Hubs namespaces, or provide an Event Hub name to export all tables to it.
+Data is sent to your Event Hubs as it reaches Azure Monitor and is exported to destinations located in the workspace region. You can create multiple export rules to the same Event Hubs namespace by providing a different `event hub name` in each rule. When `event hub name` isn't provided, default Event Hubs are created for the tables that you export, with the name *am-* followed by the name of the table. For example, the table *SecurityEvent* would be sent to an Event Hubs named *am-SecurityEvent*. The [number of supported Event Hubs in 'Basic' and 'Standard' namespaces tiers is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). When exporting more than 10 tables to these tiers, either split the tables between several export rules to different Event Hubs namespaces, or provide an Event Hubs name in the rule to export all tables to it.
> [!NOTE] > - The 'Basic' Event Hubs namespace tier is limited – it supports a [lower event size](../../event-hubs/event-hubs-quotas.md#basic-vs-standard-vs-premium-vs-dedicated-tiers) and no [Auto-inflate](../../event-hubs/event-hubs-auto-inflate.md) option to automatically scale up and increase the number of throughput units. Since the data volume to your workspace increases over time and Event Hubs scaling is consequently required, use the 'Standard', 'Premium' or 'Dedicated' Event Hubs tiers with the **Auto-inflate** feature enabled. See [Automatically scale up Azure Event Hubs throughput units](../../event-hubs/event-hubs-auto-inflate.md).
If you have configured your Storage Account to allow access from selected networ
| Scope | Metric Namespace | Metric | Aggregation | Threshold | |:|:|:|:|:|
- | namespaces-name | Event Hub standard metrics | Incoming bytes | Sum | 80% of max ingress per alert evaluation period. For example, limit is 1 MB/s per unit ("TU" or "PU") and five units used. Threshold is 1200 MB per 5-minutes evaluation period |
- | namespaces-name | Event Hub standard metrics | Incoming requests | Count | 80% of max events per alert evaluation period. For example, limit is 1000/s per unit ("TU" or ""PU") and five units used. Threshold is 1200000 per 5-minutes evaluation period |
- | namespaces-name | Event Hub standard metrics | Quota Exceeded Errors | Count | Between 1% of request. For example, requests per 5-minute is 600000. Threshold is 6000 per 5-minute evaluation period |
+ | namespaces-name | Event Hubs standard metrics | Incoming bytes | Sum | 80% of max ingress per alert evaluation period. For example, the limit is 1 MB/s per unit ("TU" or "PU") and five units are used. The threshold is 1200 MB per 5-minute evaluation period |
+ | namespaces-name | Event Hubs standard metrics | Incoming requests | Count | 80% of max events per alert evaluation period. For example, the limit is 1000/s per unit ("TU" or "PU") and five units are used. The threshold is 1200000 per 5-minute evaluation period |
+ | namespaces-name | Event Hubs standard metrics | Quota Exceeded Errors | Count | 1% of requests. For example, requests per 5 minutes are 600000. The threshold is 6000 per 5-minute evaluation period |
2. Alert remediation actions - Use a separate Event Hubs namespace for export that isn't shared with non-monitoring data.
Data export rule defines the destination and tables for which data is exported.
> - You can include tables that aren't yet supported in export, and no data will be exported for these until the tables are supported. > - The legacy custom log won't be supported in export. The next generation of custom logs, available in preview in early 2022, can be exported. > - Export to Storage Account - a separate container is created in the Storage Account for each table.
-> - Export to Event Hubs - if Event Hub name isn't provided, a separate Event Hub is created for each table. The [number of supported Event Hubs in 'Basic' and 'Standard' namespaces tiers is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). When exporting more than 10 tables to these tiers, either split the tables between several export rules to different Event Hubs namespaces, or provide an Event Hub name in the rule to export all tables to it.
+> - Export to Event Hubs - if Event Hubs name isn't provided, a separate Event Hubs is created for each table. The [number of supported Event Hubs in 'Basic' and 'Standard' namespaces tiers is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). When exporting more than 10 tables to these tiers, either split the tables between several export rules to different Event Hubs namespaces, or provide an Event Hubs name in the rule to export all tables to it.
# [Azure portal](#tab/portal)
$storageAccountResourceId = '/subscriptions/subscription-id/resourceGroups/resou
New-AzOperationalInsightsDataExport -ResourceGroupName resourceGroupName -WorkspaceName workspaceName -DataExportName 'ruleName' -TableName 'SecurityEvent,Heartbeat' -ResourceId $storageAccountResourceId ```
-Use the following command to create a data export rule to a specific Event Hub using PowerShell. All tables are exported to the provided Event Hub name and can be filtered by "Type" field to separate tables.
+Use the following command to create a data export rule to a specific Event Hubs using PowerShell. All tables are exported to the provided Event Hubs name and can be filtered by the "Type" field to separate tables.
```powershell $eventHubResourceId = '/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.EventHub/namespaces/namespaces-name/eventhubs/eventhub-name' New-AzOperationalInsightsDataExport -ResourceGroupName resourceGroupName -WorkspaceName workspaceName -DataExportName 'ruleName' -TableName 'SecurityEvent,Heartbeat' -ResourceId $eventHubResourceId -EventHubName EventhubName ```
-Use the following command to create a data export rule to an Event Hub using PowerShell. When specific Event Hub name isn't provided, a separate container is created for each table, up to the [number of Event Hubs supported in Event Hubs tier](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). To export more tables, provide an Event Hub name in rule, or set another rule and export the remaining tables to another Event Hubs namespace.
+Use the following command to create a data export rule to Event Hubs using PowerShell. When a specific Event Hubs name isn't provided, a separate Event Hubs is created for each table, up to the [number of Event Hubs supported in the Event Hubs tier](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). To export more tables, provide an Event Hubs name in the rule, or set another rule and export the remaining tables to another Event Hubs namespace.
```powershell $eventHubResourceId = '/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.EventHub/namespaces/namespaces-name'
$storageAccountResourceId = '/subscriptions/subscription-id/resourceGroups/resou
az monitor log-analytics workspace data-export create --resource-group resourceGroupName --workspace-name workspaceName --name ruleName --tables SecurityEvent Heartbeat --destination $storageAccountResourceId ```
-Use the following command to create a data export rule to a specific Event Hub using CLI. All tables are exported to the provided Event Hub name and can be filtered by "Type" field to separate tables.
+Use the following command to create a data export rule to a specific Event Hubs using CLI. All tables are exported to the provided Event Hubs name and can be filtered by the "Type" field to separate tables.
```azurecli $eventHubResourceId = '/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.EventHub/namespaces/namespaces-name/eventhubs/eventhub-name' az monitor log-analytics workspace data-export create --resource-group resourceGroupName --workspace-name workspaceName --name ruleName --tables SecurityEvent Heartbeat --destination $eventHubResourceId ```
-Use the following command to create a data export rule to an Event Hubs using CLI. When specific Event Hub name isn't provided, a separate container is created for each table up to the [number of supported Event Hubs for your Event Hubs tier](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). If you have more tables to export, provide Event Hub name to export any number of tables, or set another rule to export the remaining tables to another Event Hubs namespace.
+Use the following command to create a data export rule to Event Hubs using CLI. When a specific Event Hubs name isn't provided, a separate Event Hubs is created for each table, up to the [number of supported Event Hubs for your Event Hubs tier](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). If you have more tables to export, provide an Event Hubs name to export any number of tables, or set another rule to export the remaining tables to another Event Hubs namespace.
```azurecli $eventHubsNamespacesResourceId = '/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.EventHub/namespaces/namespaces-name'
Following is a sample body for the REST request for an Event Hubs.
} ```
-Following is a sample body for the REST request for an Event Hubs where Event Hub name is provided. In this case, all exported data is sent to it.
+Following is a sample body for the REST request for an Event Hubs where an Event Hubs name is provided. In this case, all exported data is sent to it.
```json {
Use the following command to create a data export rule to a Storage Account usin
} ```
-Use the following command to create a data export rule to an Event Hubs using Resource Manager template. A separate Event Hub is created for each table.
+Use the following command to create a data export rule to an Event Hubs using a Resource Manager template. A separate Event Hubs is created for each table.
``` {
Use the following command to create a data export rule to an Event Hubs using Re
} ```
-Use the following command to create a data export rule to a specific Event Hub using Resource Manager template. All tables are exported to it.
+Use the following command to create a data export rule to a specific Event Hubs using a Resource Manager template. All tables are exported to it.
``` {
azure-monitor Logs Ingestion Api Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/logs-ingestion-api-overview.md
+
+ Title: Logs ingestion API in Azure Monitor (Preview)
+description: Send data to Log Analytics workspace using REST API.
+ Last updated : 06/27/2022+++
+# Logs ingestion API in Azure Monitor (Preview)
+The Logs ingestion API in Azure Monitor lets you send data to a Log Analytics workspace from any REST API client. This allows you to send data from virtually any source to [supported built-in tables](#supported-tables) or to custom tables that you create. You can even extend the schema of built-in tables with custom columns.
+
+> [!NOTE]
+> The Logs ingestion API was previously referred to as the custom logs API.
++
+## Basic operation
+Your application sends data to a [data collection endpoint](../essentials/data-collection-endpoint-overview.md), which is a unique connection point for your subscription. The payload of your API call includes the source data formatted in JSON. The call specifies a [data collection rule](../essentials/data-collection-rule-overview.md) that understands the format of the source data, potentially filters and transforms it for the target table, and then directs it to a specific table in a specific workspace. You can modify the target table and workspace by modifying the data collection rule without any change to the REST API call or source data.
+++
+> [!NOTE]
+> See [Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom logs](custom-logs-migrate.md) to migrate solutions from the [Data Collector API](data-collector-api.md).
+
+## Supported tables
+
+### Custom tables
+Logs ingestion API can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it.
+
+### Built-in tables
+Logs ingestion API can send data to the following built-in tables. Other tables may be added to this list as support for them is implemented.
+
+- [CommonSecurityLog](/azure/azure-monitor/reference/tables/commonsecuritylog)
+- [SecurityEvents](/azure/azure-monitor/reference/tables/securityevent)
+- [Syslog](/azure/azure-monitor/reference/tables/syslog)
+- [WindowsEvents](/azure/azure-monitor/reference/tables/windowsevent)
+
+### Table limits
+
+* Custom tables must have the `_CL` suffix.
+* Column names can consist of alphanumeric characters as well as the characters `_` and `-`. They must start with a letter.
+* Columns extended on top of built-in tables must have the suffix `_CF`. Columns in a custom table do not need this suffix.
++
+## Authentication
+Authentication for the Logs ingestion API is performed at the data collection endpoint, which uses standard Azure Resource Manager authentication. A common strategy is to use an application ID and application key, as described in [Tutorial: Add ingestion-time transformation to Azure Monitor Logs (preview)](tutorial-logs-ingestion-portal.md).
+
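+As an illustration only, the following PowerShell sketch obtains such a token with the client credentials flow. The tenant ID, application (client) ID, and client secret are placeholders for the values from your own Azure AD app registration.
+
+```powershell
+# Placeholder values - replace with your Azure AD tenant, app registration, and client secret.
+$tenantId  = "00000000-0000-0000-0000-000000000000"
+$appId     = "00000000-0000-0000-0000-000000000000"
+$appSecret = "your-client-secret"
+
+# Request a bearer token for the Azure Monitor scope using the client credentials flow.
+$body = @{
+    client_id     = $appId
+    client_secret = $appSecret
+    scope         = "https://monitor.azure.com//.default"
+    grant_type    = "client_credentials"
+}
+$bearerToken = (Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Method Post -Body $body).access_token
+```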
+## Source data
+The source data sent by your application is formatted in JSON and must match the structure expected by the data collection rule. It doesn't necessarily need to match the structure of the target table, since the DCR can include a [transformation](../essentials/data-collection-transformations.md) to convert the data to match the table's structure.
+
+## Data collection rule
+[Data collection rules](../essentials/data-collection-rule-overview.md) define data collected by Azure Monitor and specify how and where that data should be sent or stored. The REST API call must specify a DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.
+
+The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a [transformation](../essentials/data-collection-transformations.md) to convert the source data to match the target table. You may also use the transformation to filter source data and perform any other calculations or conversions.
+
+## Sending data
+To send data to Azure Monitor with the logs ingestion API, make a POST call to the data collection endpoint over HTTP. Details of the call are as follows:
+
+### Endpoint URI
+The endpoint URI uses the following format, where the `Data Collection Endpoint` and `DCR Immutable ID` identify the DCE and DCR. `Stream Name` refers to the [stream](../essentials/data-collection-rule-structure.md#custom-logs) in the DCR that should handle the custom data.
+
+```
+{Data Collection Endpoint URI}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2021-11-01-preview
+```
+
+> [!NOTE]
+> You can retrieve the immutable ID from the JSON view of the DCR. See [Collect information from DCR](tutorial-logs-ingestion-portal.md#collect-information-from-dcr).
+
+### Headers
+
+| Header | Required? | Value | Description |
+|:|:|:|:|
+| Authorization | Yes | Bearer {Bearer token obtained through the Client Credentials Flow} | |
+| Content-Type | Yes | `application/json` | |
+| Content-Encoding | No | `gzip` | Use the GZip compression scheme for performance optimization. |
+| x-ms-client-request-id | No | String-formatted GUID | Request ID that can be used by Microsoft for any troubleshooting purposes. |
+
+### Body
+The body of the call includes the custom data to be sent to Azure Monitor. The shape of the data must be a JSON object or array with a structure that matches the format expected by the stream in the DCR.
+
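+For illustration only, a minimal PowerShell sketch of such a call might look like the following. The DCE URI, DCR immutable ID, stream name, and bearer token are placeholders, and the body assumes a hypothetical stream declaration with `Time`, `Computer`, and `AdditionalContext` columns.
+
+```powershell
+# Placeholders - use your own DCE URI, DCR immutable ID, stream name, and a valid bearer token.
+$dceUri         = "https://my-dce.westus2-1.ingest.monitor.azure.com"
+$dcrImmutableId = "dcr-00000000000000000000000000000000"
+$streamName     = "Custom-MyTableRawData"
+$uri = "$dceUri/dataCollectionRules/$dcrImmutableId/streams/$($streamName)?api-version=2021-11-01-preview"
+
+# JSON array whose properties must match the stream declaration in the DCR.
+$body = @"
+[
+  {
+    "Time": "$(Get-Date ([datetime]::UtcNow) -Format O)",
+    "Computer": "Computer1",
+    "AdditionalContext": "sample data"
+  }
+]
+"@
+
+$headers = @{ "Authorization" = "Bearer $bearerToken"; "Content-Type" = "application/json" }
+Invoke-RestMethod -Uri $uri -Method Post -Body $body -Headers $headers
+```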
+## Sample call
+For sample data and an API call using the Logs ingestion API, see either [Send custom logs to Azure Monitor Logs using the Azure portal (preview)](tutorial-logs-ingestion-portal.md) or [Send custom logs to Azure Monitor Logs using Resource Manager templates](tutorial-logs-ingestion-api.md).
+
+## Limits and restrictions
+For limits related to Logs ingestion API, see [Azure Monitor service limits](../service-limits.md#logs-ingestion-api).
+
+
+
+## Next steps
+
+- [Walk through a tutorial sending custom logs using the Azure portal.](tutorial-logs-ingestion-portal.md)
+- [Walk through a tutorial sending custom logs using Resource Manager templates and REST API.](tutorial-logs-ingestion-api.md)
azure-monitor Manage Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/manage-access.md
Grant access to all tables except the _SecurityAlert_ table:
Custom logs are tables created from data sources such as [text logs](../agents/data-sources-custom-logs.md) and the [HTTP Data Collector API](data-collector-api.md). The easiest way to identify the type of log is by checking the tables listed under [Custom Logs in the log schema](./log-analytics-tutorial.md#view-table-information). > [!NOTE]
-> Tables created by the [Custom Logs API](../essentials/../logs/custom-logs-overview.md) don't yet support table-level RBAC.
+> Tables created by the [Logs ingestion API](../essentials/../logs/logs-ingestion-api-overview.md) don't yet support table-level RBAC.
You can't grant access to individual custom log tables, but you can grant access to all custom logs. To create a role with access to all custom log tables, create a custom role by using the following actions:
azure-monitor Tables Feature Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tables-feature-support.md
Title: Tables that support ingestion-time transformations in Azure Monitor Logs
description: Reference for tables that support ingestion-time transformations in Azure Monitor Logs (preview). na Previously updated : 02/22/2022 Last updated : 07/10/2022
-# Tables that support ingestion-time transformations in Azure Monitor Logs (preview)
+# Tables that support transformations in Azure Monitor Logs (preview)
-The following list identifies the tables in a [Log Analytics workspace](log-analytics-workspace-overview.md) that support [Ingest-time transformations](ingestion-time-transformations.md).
+The following list identifies the tables in a [Log Analytics workspace](log-analytics-workspace-overview.md) that support [transformations](../essentials/data-collection-transformations.md).
| Table | Limitations |
azure-monitor Tutorial Logs Ingestion Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-logs-ingestion-api.md
+
+ Title: Tutorial - Send data to Azure Monitor Logs using REST API (Resource Manager templates)
+description: Tutorial on how to send data to a Log Analytics workspace in Azure Monitor using REST API. Resource Manager template version.
+ Last updated : 07/15/2022++
+# Tutorial: Send data to Azure Monitor Logs using REST API (Resource Manager templates)
+The [Logs ingestion API (preview)](logs-ingestion-api-overview.md) in Azure Monitor allows you to send external data to a Log Analytics workspace with a REST API. This tutorial uses Resource Manager templates to walk through the configuration of a new table and of a sample application that sends log data to Azure Monitor.
+
+> [!NOTE]
+> This tutorial uses Resource Manager templates and REST API to configure custom logs. See [Tutorial: Send data to Azure Monitor Logs using REST API (Azure portal)](tutorial-logs-ingestion-portal.md) for a similar tutorial using the Azure portal.
+
+In this tutorial, you learn to:
+
+> [!div class="checklist"]
+> * Create a custom table in a Log Analytics workspace
+> * Create a data collection endpoint to receive data over HTTP
+> * Create a data collection rule that transforms incoming data to match the schema of the target table
+> * Create a sample application to send custom data to Azure Monitor
++
+> [!NOTE]
+> This tutorial uses PowerShell from Azure Cloud Shell to make REST API calls using the Azure Monitor **Tables** API and the Azure portal to install Resource Manager templates. You can use any other method to make these calls.
+
+## Prerequisites
+To complete this tutorial, you need the following:
+
+- A Log Analytics workspace where you have at least [contributor rights](manage-access.md#azure-rbac).
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
+
+## Collect workspace details
+Start by gathering information that you'll need from your workspace.
+
+1. Navigate to your workspace in the **Log Analytics workspaces** menu in the Azure portal. From the **Properties** page, copy the **Resource ID** and save it for later use.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-api/workspace-resource-id.png" lightbox="media/tutorial-logs-ingestion-api/workspace-resource-id.png" alt-text="Screenshot showing workspace resource ID.":::
+
+## Configure application
+Start by registering an Azure Active Directory application to authenticate against the API. Any ARM authentication scheme is supported, but this tutorial follows the [Client Credential Grant Flow scheme](../../active-directory/develop/v2-oauth2-client-creds-grant-flow.md).
+
+1. From the **Azure Active Directory** menu in the Azure portal, select **App registrations** and then **New registration**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-registration.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-registration.png" alt-text="Screenshot showing app registration screen.":::
+
+2. Give the application a name and change the tenancy scope if the default is not appropriate for your environment. A **Redirect URI** isn't required.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-name.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-name.png" alt-text="Screenshot showing app details.":::
+
+3. Once registered, you can view the details of the application. Note the **Application (client) ID** and the **Directory (tenant) ID**. You'll need these values later in the process.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-id.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-id.png" alt-text="Screenshot showing app ID.":::
+
+4. You now need to generate an application client secret, which is similar to creating a password to use with a username. Select **Certificates & secrets** and then **New client secret**. Give the secret a name to identify its purpose and select an **Expires** duration. *1 year* is selected here, although for a production implementation you would follow best practices for a secret rotation procedure or use a more secure authentication mode such as a certificate.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret.png" alt-text="Screenshot showing secret for new app.":::
+
+5. Click **Add** to save the secret and then note the **Value**. Ensure that you record this value since you can't recover it once you navigate away from this page. Use the same security measures as you would for safekeeping a password as it's the functional equivalent.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" alt-text="Screenshot show secret value for new app.":::
+
+## Create new table in Log Analytics workspace
+The custom table must be created before you can send data to it. The table for this tutorial will include three columns, as described in the schema below. The `name`, `type`, and `description` properties are mandatory for each column. The properties `isHidden` and `isDefaultDisplay` both default to `false` if not explicitly specified. Possible data types are `string`, `int`, `long`, `real`, `boolean`, `dateTime`, `guid`, and `dynamic`.
+
+Use the **Tables - Update** API to create the table with the PowerShell code below.
+
+> [!IMPORTANT]
+> Custom tables must use a suffix of *_CL*.
+
+1. Click the **Cloud Shell** button in the Azure portal and ensure the environment is set to **PowerShell**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/open-cloud-shell.png" lightbox="media/tutorial-workspace-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening Cloud Shell":::
+
+2. Copy the following PowerShell code and replace the **Path** parameter with the appropriate values for your workspace in the `Invoke-AzRestMethod` command. Paste it into the Cloud Shell prompt to run it.
+
+ ```PowerShell
+ $tableParams = @'
+ {
+ "properties": {
+ "schema": {
+ "name": "MyTable_CL",
+ "columns": [
+ {
+ "name": "TimeGenerated",
+ "type": "datetime",
+ "description": "The time at which the data was generated"
+ },
+ {
+ "name": "AdditionalContext",
+ "type": "dynamic",
+ "description": "Additional message properties"
+ },
+ {
+ "name": "ExtendedColumn",
+ "type": "string",
+ "description": "An additional column extended at ingestion time"
+ }
+ ]
+ }
+ }
+ }
+ '@
+
+ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2021-12-01-preview" -Method PUT -payload $tableParams
+ ```
++
+## Create data collection endpoint
+A [data collection endpoint (DCE)](../essentials/data-collection-endpoint-overview.md) is required to accept the data being sent to Azure Monitor. Once you configure the DCE and link it to a data collection rule, you can send data over HTTP from your application. The DCE must be located in the same region as the Log Analytics Workspace where the data will be sent.
+
+1. In the Azure portal's search box, type in *template* and then select **Deploy a custom template**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot to deploy custom template.":::
+
+2. Click **Build your own template in the editor**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot to build template in the editor.":::
+
+3. Paste the Resource Manager template below into the editor and then click **Save**. You don't need to modify this template since you will provide values for its parameters.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot to edit Resource Manager template.":::
++
+ ```json
+ {
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "dataCollectionEndpointName": {
+ "type": "string",
+ "metadata": {
+ "description": "Specifies the name of the Data Collection Endpoint to create."
+ }
+ },
+ "location": {
+ "type": "string",
+ "defaultValue": "westus2",
+ "allowedValues": [
+ "westus2",
+ "eastus2",
+ "eastus2euap"
+ ],
+ "metadata": {
+ "description": "Specifies the location in which to create the Data Collection Endpoint."
+ }
+ }
+ },
+ "resources": [
+ {
+ "type": "Microsoft.Insights/dataCollectionEndpoints",
+ "name": "[parameters('dataCollectionEndpointName')]",
+ "location": "[parameters('location')]",
+ "apiVersion": "2021-04-01",
+ "properties": {
+ "networkAcls": {
+ "publicNetworkAccess": "Enabled"
+ }
+ }
+ }
+ ],
+ "outputs": {
+ "dataCollectionEndpointId": {
+ "type": "string",
+ "value": "[resourceId('Microsoft.Insights/dataCollectionEndpoints', parameters('dataCollectionEndpointName'))]"
+ }
+ }
+ }
+ ```
+
+4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the data collection endpoint and then provide a **Name** for the data collection endpoint. The **Location** should be the same location as the workspace. The **Region** will already be populated and is used for the location of the data collection endpoint.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/custom-deployment-values.png" lightbox="media/tutorial-workspace-transformations-api/custom-deployment-values.png" alt-text="Screenshot to edit custom deployment values.":::
+
+5. Click **Review + create** and then **Create** when you review the details.
+
+6. Once the DCE is created, select it so you can view its properties. Note the **Logs ingestion URI** since you'll need this in a later step.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-overview.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-overview.png" alt-text="Screenshot for data collection endpoint uri.":::
+
+7. Click **JSON View** to view other details for the DCE. Copy the **Resource ID** since you'll need this in a later step.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" lightbox="media/tutorial-logs-ingestion-api/data-collection-endpoint-json.png" alt-text="Screenshot for data collection endpoint resource ID.":::
++
+## Create data collection rule
+The [data collection rule (DCR)](../essentials/data-collection-rule-overview.md) defines the schema of the data being sent to the HTTP endpoint, the transformation that will be applied to it, and the destination workspace and table the transformed data will be sent to.
+
+1. In the Azure portal's search box, type in *template* and then select **Deploy a custom template**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot to deploy custom template.":::
+
+2. Click **Build your own template in the editor**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot to build template in the editor.":::
+
+3. Paste the Resource Manager template below into the editor and then click **Save**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot to edit Resource Manager template.":::
+
+ Notice the following details in the DCR defined in this template:
+
+ - `dataCollectionEndpointId`: Resource ID of the data collection endpoint.
+ - `streamDeclarations`: Defines the columns of the incoming data.
+ - `destinations`: Specifies the destination workspace.
+ - `dataFlows`: Matches the stream with the destination workspace and specifies the transformation query and the destination table.
+
+ ```json
+ {
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "dataCollectionRuleName": {
+ "type": "string",
+ "metadata": {
+ "description": "Specifies the name of the Data Collection Rule to create."
+ }
+ },
+ "location": {
+ "type": "string",
+ "defaultValue": "westus2",
+ "allowedValues": [
+ "westus2",
+ "eastus2",
+ "eastus2euap"
+ ],
+ "metadata": {
+ "description": "Specifies the location in which to create the Data Collection Rule."
+ }
+ },
+ "workspaceResourceId": {
+ "type": "string",
+ "metadata": {
+ "description": "Specifies the Azure resource ID of the Log Analytics workspace to use."
+ }
+ },
+ "endpointResourceId": {
+ "type": "string",
+ "metadata": {
+ "description": "Specifies the Azure resource ID of the Data Collection Endpoint to use."
+ }
+ }
+ },
+ "resources": [
+ {
+ "type": "Microsoft.Insights/dataCollectionRules",
+ "name": "[parameters('dataCollectionRuleName')]",
+ "location": "[parameters('location')]",
+ "apiVersion": "2021-09-01-preview",
+ "properties": {
+ "dataCollectionEndpointId": "[parameters('endpointResourceId')]",
+ "streamDeclarations": {
+ "Custom-MyTableRawData": {
+ "columns": [
+ {
+ "name": "Time",
+ "type": "datetime"
+ },
+ {
+ "name": "Computer",
+ "type": "string"
+ },
+ {
+ "name": "AdditionalContext",
+ "type": "string"
+ }
+ ]
+ }
+ },
+ "destinations": {
+ "logAnalytics": [
+ {
+ "workspaceResourceId": "[parameters('workspaceResourceId')]",
+ "name": "clv2ws1"
+ }
+ ]
+ },
+ "dataFlows": [
+ {
+ "streams": [
+ "Custom-MyTableRawData"
+ ],
+ "destinations": [
+ "clv2ws1"
+ ],
+ "transformKql": "source | extend jsonContext = parse_json(AdditionalContext) | project TimeGenerated = Time, Computer, AdditionalContext = jsonContext, ExtendedColumn=tostring(jsonContext.CounterName)",
+ "outputStream": "Custom-MyTable_CL"
+ }
+ ]
+ }
+ }
+ ],
+ "outputs": {
+ "dataCollectionRuleId": {
+ "type": "string",
+ "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
+ }
+ }
+ }
+ ```
+
+4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the data collection rule and then provide values for the parameters defined in the template. These include a **Name** for the data collection rule and the **Workspace Resource ID** that you collected in a previous step. The **Location** should be the same location as the workspace. The **Region** will already be populated and is used for the location of the data collection rule.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/custom-deployment-values.png" lightbox="media/tutorial-workspace-transformations-api/custom-deployment-values.png" alt-text="Screenshot to edit custom deployment values.":::
+
+5. Click **Review + create** and then **Create** when you review the details.
+
+6. When the deployment is complete, expand the **Deployment details** box and click on your data collection rule to view its details. Click **JSON View**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/data-collection-rule-details.png" lightbox="media/tutorial-workspace-transformations-api/data-collection-rule-details.png" alt-text="Screenshot for data collection rule details.":::
+
+7. Copy the **Resource ID** for the data collection rule. You'll use this in the next step.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/data-collection-rule-json-view.png" lightbox="media/tutorial-workspace-transformations-api/data-collection-rule-json-view.png" alt-text="Screenshot for data collection rule JSON view.":::
+
+ > [!NOTE]
+ > Some properties of the DCR, such as the transformation, might not be displayed in the Azure portal even though the DCR was successfully created with those properties.
++
+## Assign permissions to DCR
+Once the data collection rule has been created, the application needs to be given permission to it. This permission allows any application using the correct application ID and application key to send data to the new DCE and DCR. If you prefer to script this step, see the sketch after the portal steps below.
+
+1. From the DCR in the Azure portal, select **Access Control (IAM)** and then **Add role assignment**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-create.png" alt-text="Screenshot for adding custom role assignment to DCR.":::
+
+2. Select **Monitoring Metrics Publisher** and click **Next**. You could instead create a custom action with the `Microsoft.Insights/Telemetry/Write` data action.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-select-role.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-select-role.png" alt-text="Screenshot for selecting role for DCR role assignment.":::
+
+3. Select **User, group, or service principal** for **Assign access to** and click **Select members**. Select the application that you created and click **Select**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-select-member.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-select-member.png" alt-text="Screenshot for selecting members for DCR role assignment.":::
++
+4. Click **Review + assign** and verify the details before saving your role assignment.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-save.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-save.png" alt-text="Screenshot for saving DCR role assignment.":::
++
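+If you prefer to script this step instead of using the portal, the same role assignment can be created with Azure PowerShell. This is a sketch only; the application (client) ID and DCR resource ID are placeholders for the values from your environment.
+
+```powershell
+# Placeholders - replace with your app registration's client ID and the DCR resource ID collected earlier.
+$appId         = "00000000-0000-0000-0000-000000000000"
+$dcrResourceId = "/subscriptions/{subscription}/resourceGroups/{resource-group}/providers/Microsoft.Insights/dataCollectionRules/{dcr-name}"
+
+# Grant the application's service principal the Monitoring Metrics Publisher role on the DCR.
+New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Monitoring Metrics Publisher" -Scope $dcrResourceId
+```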
+## Send sample data
+The following PowerShell code sends data to the endpoint using HTTP REST fundamentals.
+
+> [!NOTE]
+> This tutorial uses commands that require PowerShell v7.0 or later. Make sure your local installation of PowerShell is up to date, or execute this script by using Azure Cloud Shell.
+
+1. Run the following PowerShell command which adds a required assembly for the script.
+
+ ```powershell
+ Add-Type -AssemblyName System.Web
+ ```
+
+1. Replace the parameters in the *step 0* section with values from the resources that you just created. You may also want to replace the sample data in the *step 2* section with your own.
+
+ ```powershell
+ ##################
+ ### Step 0: set parameters required for the rest of the script
+ ##################
+ #information needed to authenticate to AAD and obtain a bearer token
+ $tenantId = "00000000-0000-0000-0000-000000000000"; #Tenant ID the data collection endpoint resides in
+ $appId = "00000000-0000-0000-0000-000000000000"; #Application ID created and granted permissions
+ $appSecret = "00000000000000000000000"; #Secret created for the application
+
+ #information needed to send data to the DCR endpoint
+ $dcrImmutableId = "dcr-000000000000000"; #the immutableId property of the DCR object
+ $dceEndpoint = "https://my-dcr-name.westus2-1.ingest.monitor.azure.com"; #the endpoint property of the Data Collection Endpoint object
+
+ ##################
+ ### Step 1: obtain a bearer token used later to authenticate against the DCE
+ ##################
+ $scope= [System.Web.HttpUtility]::UrlEncode("https://monitor.azure.com//.default")
+ $body = "client_id=$appId&scope=$scope&client_secret=$appSecret&grant_type=client_credentials";
+ $headers = @{"Content-Type"="application/x-www-form-urlencoded"};
+ $uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
+
+ $bearerToken = (Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers).access_token
+
+ ##################
+ ### Step 2: Load up some sample data.
+ ##################
+ $currentTime = Get-Date ([datetime]::UtcNow) -Format O
+ $staticData = @"
+ [
+ {
+ "Time": "$currentTime",
+ "Computer": "Computer1",
+ "AdditionalContext": {
+ "InstanceName": "user1",
+ "TimeZone": "Pacific Time",
+ "Level": 4,
+ "CounterName": "AppMetric1",
+ "CounterValue": 15.3
+ }
+ },
+ {
+ "Time": "$currentTime",
+ "Computer": "Computer2",
+ "AdditionalContext": {
+ "InstanceName": "user2",
+ "TimeZone": "Central Time",
+ "Level": 3,
+ "CounterName": "AppMetric1",
+ "CounterValue": 23.5
+ }
+ }
+ ]
+ "@;
+
+ ##################
+ ### Step 3: send the data to Log Analytics via the DCE.
+ ##################
+ $body = $staticData;
+ $headers = @{"Authorization"="Bearer $bearerToken";"Content-Type"="application/json"};
+ $uri = "$dceEndpoint/dataCollectionRules/$dcrImmutableId/streams/Custom-MyTableRawData?api-version=2021-11-01-preview"
+
+ $uploadResponse = Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers
+ ```
+
+ > [!NOTE]
+ > If you receive an `Unable to find type [System.Web.HttpUtility]` error, run the `Add-Type -AssemblyName System.Web` command from step 1 on its own and then execute the script again. Running it as part of the script won't resolve the issue - the command must be executed separately.
+
+2. After executing this script, you should see an `HTTP 204` response, and in just a few minutes the data arrives in your Log Analytics workspace. You can verify ingestion with the query sketch below.
+
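+To confirm that records arrived, query the new table. The following sketch uses the `Invoke-AzOperationalInsightsQuery` cmdlet from the Az.OperationalInsights module; the workspace ID is a placeholder for your workspace's customer ID.
+
+```powershell
+# Sketch: query the custom table created in this tutorial to verify that records were ingested.
+$workspaceId = "00000000-0000-0000-0000-000000000000"   # workspace (customer) ID, not the resource ID
+(Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query "MyTable_CL | take 10").Results
+```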
+## Troubleshooting
+This section describes different error conditions you may receive and how to correct them.
+
+### Script returns error code 403
+Ensure that you have the correct permissions for your application to the DCR. You may also need to wait up to 30 minutes for permissions to propagate.
+
+### Script returns error code 413 or warning of `TimeoutExpired` with the message `ReadyBody_ClientConnectionAbort` in the response
+The message is too large. The maximum message size is currently 1MB per call.
+
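+A common workaround, sketched below, is to split your records into smaller batches and send each batch in its own call. The `$records` array is hypothetical, and `$uri` and `$headers` are assumed to be set as in the sample script above.
+
+```powershell
+# Sketch: send records in batches so each serialized payload stays well under the 1-MB limit.
+$batchSize = 500   # tune this for the size of your records
+for ($i = 0; $i -lt $records.Count; $i += $batchSize) {
+    $end  = [Math]::Min($i + $batchSize, $records.Count) - 1
+    $body = $records[$i..$end] | ConvertTo-Json -AsArray -Depth 10
+    Invoke-RestMethod -Uri $uri -Method Post -Body $body -Headers $headers
+}
+```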
+### Script returns error code 429
+API limits have been exceeded. Refer to [service limits for Logs ingestion API](../service-limits.md#logs-ingestion-api) for details on the current limits.
+
+### Script returns error code 503
+Ensure that you have the correct permissions for your application to the DCR. You may also need to wait up to 30 minutes for permissions to propagate.
+
+### You don't receive an error, but data doesn't appear in the workspace
+The data may take some time to be ingested, especially if this is the first time data is being sent to a particular table. It shouldn't take longer than 15 minutes.
+
+### IntelliSense in Log Analytics not recognizing new table
+The cache that drives IntelliSense may take up to 24 hours to update.
+## Next steps
+
+- [Complete a similar tutorial using the Azure portal.](tutorial-logs-ingestion-portal.md)
+- [Read more about custom logs.](logs-ingestion-api-overview.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-transformations.md)
azure-monitor Tutorial Logs Ingestion Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-logs-ingestion-portal.md
+
+ Title: Tutorial - Send data to Azure Monitor Logs using REST API (Azure portal)
+description: Tutorial on how to send data to a Log Analytics workspace in Azure Monitor using REST API. Azure portal version.
+ Last updated : 07/15/2022++
+# Tutorial: Send data to Azure Monitor Logs using REST API (Azure portal)
+The [Logs ingestion API (preview)](logs-ingestion-api-overview.md) in Azure Monitor allows you to send external data to a Log Analytics workspace with a REST API. This tutorial uses the Azure portal to walk through the configuration of a new table and of a sample application that sends log data to Azure Monitor.
+
+> [!NOTE]
+> This tutorial uses the Azure portal. See [Tutorial: Send data to Azure Monitor Logs using REST API (Resource Manager templates)](tutorial-logs-ingestion-api.md) for a similar tutorial using Resource Manager templates.
+
+In this tutorial, you learn to:
+
+> [!div class="checklist"]
+> * Create a custom table in a Log Analytics workspace
+> * Create a data collection endpoint to receive data over HTTP
+> * Create a data collection rule that transforms incoming data to match the schema of the target table
+> * Create a sample application to send custom data to Azure Monitor
++
+## Prerequisites
+To complete this tutorial, you need the following:
+
+- A Log Analytics workspace where you have at least [contributor rights](manage-access.md#azure-rbac).
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
++
+## Overview of tutorial
+In this tutorial, you'll use a PowerShell script to send sample Apache access logs over HTTP to the API endpoint. This requires a script to convert the data to the JSON format that's required by the Azure Monitor Logs ingestion API. The data is further converted with a transformation in a data collection rule (DCR) that filters out records that shouldn't be ingested and creates the columns required for the table that the data will be sent to. Once the configuration is complete, you'll send sample data from the command line and then inspect the results in Log Analytics.
++
+## Configure application
+Start by registering an Azure Active Directory application to authenticate against the API. Any ARM authentication scheme is supported, but this tutorial follows the [Client Credential Grant Flow scheme](../../active-directory/develop/v2-oauth2-client-creds-grant-flow.md).
+
+1. From the **Azure Active Directory** menu in the Azure portal, select **App registrations** and then **New registration**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-registration.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-registration.png" alt-text="Screenshot showing app registration screen.":::
+
+2. Give the application a name and change the tenancy scope if the default is not appropriate for your environment. A **Redirect URI** isn't required.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-name.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-name.png" alt-text="Screenshot showing app details.":::
+
+3. Once registered, you can view the details of the application. Note the **Application (client) ID** and the **Directory (tenant) ID**. You'll need these values later in the process.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-id.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-id.png" alt-text="Screenshot showing app id.":::
+
+4. You now need to generate an application client secret, which is similar to creating a password to use with a username. Select **Certificates & secrets** and then **New client secret**. Give the secret a name to identify its purpose and select an **Expires** duration. *1 year* is selected here, although for a production implementation you would follow best practices for a secret rotation procedure or use a more secure authentication mode such as a certificate.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret.png" alt-text="Screenshot showing secret for new app.":::
+
+5. Click **Add** to save the secret and then note the **Value**. Ensure that you record this value since you can't recover it once you navigate away from this page. Use the same security measures as you would for safekeeping a password since it's the functional equivalent.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" lightbox="media/tutorial-logs-ingestion-portal/new-app-secret-value.png" alt-text="Screenshot show secret value for new app.":::
+
+## Create data collection endpoint
+A [data collection endpoint (DCE)](../essentials/data-collection-endpoint-overview.md) is required to accept the data from the script. Once you configure the DCE and link it to a data collection rule, you can send data over HTTP from your application. The DCE must be located in the same region as the Log Analytics workspace where the data will be sent.
+
+1. To create a new DCE, go to the **Monitor** menu in the Azure portal. Select **Data Collection Endpoints** and then **Create**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-data-collection-endpoint.png" lightbox="media/tutorial-logs-ingestion-portal/new-data-collection-endpoint.png" alt-text="Screenshot showing new data collection endpoint.":::
+
+2. Provide a name for the DCE and ensure that it's in the same region as your workspace. Click **Create** to create the DCE.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-endpoint-details.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-endpoint-details.png" alt-text="Screenshot showing data collection endpoint details.":::
+
+3. Once the DCE is created, select it so you can view its properties. Note the **Logs ingestion** URI since you'll need this in a later step.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-endpoint-uri.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-endpoint-uri.png" alt-text="Screenshot showing data collection endpoint uri.":::
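+
+If you prefer to retrieve the logs ingestion URI from the command line rather than the portal, the following sketch should also work. It's an illustration only: it assumes the `2021-09-01-preview` API version for data collection endpoints and that the URI is exposed as `properties.logsIngestion.endpoint`; replace the placeholder subscription, resource group, and DCE names with your own.
+
+```powershell
+# Minimal sketch: read the logs ingestion endpoint of the DCE with a direct ARM call.
+$dcePath = "/subscriptions/{subscription}/resourceGroups/{resourcegroup}/providers/Microsoft.Insights/dataCollectionEndpoints/{dce}?api-version=2021-09-01-preview"
+$dce = Invoke-AzRestMethod -Path $dcePath -Method GET
+($dce.Content | ConvertFrom-Json).properties.logsIngestion.endpoint
+```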
++
+## Generate sample data
+The following PowerShell script both generates sample data to configure the custom table and sends sample data to the logs ingestion API to test the configuration.
+
+1. Run the following PowerShell command, which adds a required assembly for the script.
+
+ ```powershell
+ Add-Type -AssemblyName System.Web
+ ```
+
+2. Update the values of `$tenantId`, `$appId`, and `$appSecret` with the values you noted for **Directory (tenant) ID**, **Application (client) ID**, and secret **Value** and then save with the file name *LogGenerator.ps1*.
+
+ ``` PowerShell
+ param ([Parameter(Mandatory=$true)] $Log, $Type="file", $Output, $DcrImmutableId, $DceURI, $Table)
+ ################
+ ##### Usage
+ ################
+ # LogGenerator.ps1
+ # -Log <String> - log file to be forwarded
+ # [-Type "file|API"] - whether the script should generate sample JSON file or send data via
+ # API call. Data will be written to a file by default
+ # [-Output <String>] - path to resulting JSON sample
+ # [-DcrImmutableId <string>] - DCR immutable ID
+ # [-DceURI] - Data collection endpoint URI
+ # [-Table] - The name of the custom log table, including "_CL" suffix
++
+ ##### >>>> PUT YOUR VALUES HERE <<<<<
+ # information needed to authenticate to AAD and obtain a bearer token
+ $tenantId = "<put tenant ID here>"; #the tenant ID in which the Data Collection Endpoint resides
+ $appId = "<put application ID here>"; #the app ID created and granted permissions
+ $appSecret = "<put secret value here>"; #the secret created for the above app - never store your secrets in the source code
+ ##### >>>> END <<<<<
++
+ $file_data = Get-Content $Log
+ if ("file" -eq $Type) {
+ ############
+ ## Convert plain log to JSON format and output to .json file
+ ############
+ # If not provided, get output file name
+ if ($null -eq $Output) {
+ $Output = Read-Host "Enter output file name"
+ };
+
+ # Form file payload
+ $payload = @();
+ $records_to_generate = [math]::min($file_data.count, 500)
+ for ($i=0; $i -lt $records_to_generate; $i++) {
+ $log_entry = @{
+ # Define the structure of log entry, as it will be sent
+ Time = Get-Date ([datetime]::UtcNow) -Format O
+ Application = "LogGenerator"
+ RawData = $file_data[$i]
+ }
+ $payload += $log_entry
+ }
+ # Write resulting payload to file
+ New-Item -Path $Output -ItemType "file" -Value ($payload | ConvertTo-Json) -Force
+
+ } else {
+ ############
+ ## Send the content to the data collection endpoint
+ ############
+ if ($null -eq $DcrImmutableId) {
+ $DcrImmutableId = Read-Host "Enter DCR Immutable ID"
+ };
+
+ if ($null -eq $DceURI) {
+ $DceURI = Read-Host "Enter data collection endpoint URI"
+ }
+
+ if ($null -eq $Table) {
+ $Table = Read-Host "Enter the name of custom log table"
+ }
+
+ ## Obtain a bearer token used to authenticate against the data collection endpoint
+ $scope = [System.Web.HttpUtility]::UrlEncode("https://monitor.azure.com//.default")
+ $body = "client_id=$appId&scope=$scope&client_secret=$appSecret&grant_type=client_credentials";
+ $headers = @{"Content-Type" = "application/x-www-form-urlencoded" };
+ $uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
+ $bearerToken = (Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers).access_token
+
+ ## Generate and send some data
+ foreach ($line in $file_data) {
+ # We are going to send log entries one by one with a small delay
+ $log_entry = @{
+ # Define the structure of log entry, as it will be sent
+ Time = Get-Date ([datetime]::UtcNow) -Format O
+ Application = "LogGenerator"
+ RawData = $line
+ }
+ # Sending the data to Log Analytics via the DCR!
+ $body = $log_entry | ConvertTo-Json -AsArray;
+ $headers = @{"Authorization" = "Bearer $bearerToken"; "Content-Type" = "application/json" };
+ $uri = "$DceURI/dataCollectionRules/$DcrImmutableId/streams/Custom-$Table"+"?api-version=2021-11-01-preview";
+ $uploadResponse = Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers;
+
+ # Let's see how the response looks like
+ Write-Output $uploadResponse
+ Write-Output ""
+
+ # Pausing for 1 second before processing the next entry
+ Start-Sleep -Seconds 1
+ }
+ }
+ ```
+
+3. Copy the sample log data from [sample data](#sample-data) or copy your own Apache log data into a file called `sample_access.log`.
+
+4. To read the data in the file and create a JSON file called `data_sample.json` that you can send to the logs ingestion API, run:
+
+ ```PowerShell
+ .\LogGenerator.ps1 -Log "sample_access.log" -Type "file" -Output "data_sample.json"
+ ```
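+
+Each record in the generated file contains the `Time`, `Application`, and `RawData` properties defined in the script. If you want to spot-check the output before you configure the table, the following sketch reads the file back and displays the first record:
+
+```powershell
+# Read the generated sample and show the structure of the first record.
+$sample = Get-Content .\data_sample.json -Raw | ConvertFrom-Json
+$sample[0] | Format-List
+```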
+
+## Add custom log table
+Before you can send data to the workspace, you need to create the custom table that the data will be sent to.
+
+1. Go to the **Log Analytics workspaces** menu in the Azure portal and select **Tables (preview)**. The tables in the workspace will be displayed. Select **Create** and then **New custom log (DCR based)**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-custom-log.png" lightbox="media/tutorial-logs-ingestion-portal/new-custom-log.png" alt-text="Screenshot showing new DCR-based custom log.":::
+
+2. Specify a name for the table. You don't need to add the *_CL* suffix required for a custom table since this will be automatically added to the name you specify.
+
+3. Click **Create a new data collection rule** to create the DCR that will be used to send data to this table. If you have an existing data collection rule, you can choose to use it instead. Specify the **Subscription**, **Resource group**, and **Name** for the data collection rule that will contain the custom log configuration.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/new-data-collection-rule.png" lightbox="media/tutorial-logs-ingestion-portal/new-data-collection-rule.png" alt-text="Screenshot showing new data collection rule.":::
+
+4. Select the data collection endpoint that you created and click **Next**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-table-name.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-table-name.png" alt-text="Screenshot showing custom log table name.":::
++
+## Parse and filter sample data
+Instead of directly configuring the schema of the table, the portal allows you to upload sample data so that Azure Monitor can determine the schema. The sample is expected to be a JSON file containing one or more log records, structured in the same way they'll be sent in the body of the HTTP request of the logs ingestion API call.
+
+1. Click **Browse for files** and locate *data_sample.json* that you previously created.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-browse-files.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-browse-files.png" alt-text="Screenshot showing custom log browse for files.":::
+
+2. Data from the sample file is displayed with a warning that a `TimeGenerated` column is not in the data. All log tables in Azure Monitor Logs are required to have a `TimeGenerated` column populated with the timestamp of the logged event. In this sample, the timestamp of the event is stored in a field called `Time`, so you'll add a transformation that renames this column in the output.
+
+3. Click **Transformation editor** to add this column. The transformation editor lets you create a transformation for the incoming data stream: a KQL query that runs against each incoming record, with the results stored in the destination table. See [Data collection rule transformations in Azure Monitor](../essentials/data-collection-transformations.md) for details on transformation queries.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-data-preview.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-data-preview.png" alt-text="Screenshot showing custom log data preview.":::
+
+4. Add the following query to the transformation editor to add the `TimeGenerated` column to the output.
+
+ ```kusto
+ source
+ | extend TimeGenerated = todatetime(Time)
+ ```
+
+5. Click **Run** to view the results. You can see that the `TimeGenerated` column is now added alongside the other columns, although most of the interesting data is still contained in the `RawData` column.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-query-01.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-query-01.png" alt-text="Screenshot showing initial custom log data query.":::
+
+6. Modify the query to the following, which extracts the client IP address, the HTTP method, the address of the page being accessed, and the response code from each log entry.
+
+ ```kusto
+ source
+ | extend TimeGenerated = todatetime(Time)
+ | parse RawData with
+ ClientIP:string
+ ' ' *
+ ' ' *
+ ' [' * '] "' RequestType:string
+ " " Resource:string
+ " " *
+ '" ' ResponseCode:int
+ " " *
+ ```
+
+7. Click **Run** to view the results. This extracts the contents of `RawData` into separate columns `ClientIP`, `RequestType`, `Resource`, and `ResponseCode`.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-query-02.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-query-02.png" alt-text="Screenshot showing custom log data query with parse command.":::
+
+8. The query can be optimized further by removing the `RawData` and `Time` columns since they aren't needed anymore. You can also filter out any records with a `ResponseCode` of 200, since you're only interested in collecting data for requests that weren't successful. This reduces the volume of data being ingested, which reduces the overall cost.
++
+ ```kusto
+ source
+ | extend TimeGenerated = todatetime(Time)
+ | parse RawData with
+ ClientIP:string
+ ' ' *
+ ' ' *
+ ' [' * '] "' RequestType:string
+ " " Resource:string
+ " " *
+ '" ' ResponseCode:int
+ " " *
+ | where ResponseCode != 200
+ | project-away Time, RawData
+ ```
+
+9. Click **Run** to view the results.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-query-03.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-query-03.png" alt-text="Screenshot showing custom log data query with filter.":::
+
+10. Click **Apply** to save the transformation and view the schema of the table that's about to be created. Click **Next** to proceed.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-final-schema.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-final-schema.png" alt-text="Screenshot showing custom log final schema.":::
+
+11. Verify the final details and click **Create** to save the custom log.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-create.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-create.png" alt-text="Screenshot showing custom log create.":::
+
+## Collect information from DCR
+With the data collection rule created, you need to collect its immutable ID, which is required in the API call.
+
+1. From the **Monitor** menu in the Azure portal, select **Data collection rules** and select the DCR you just created. From **Overview** for the data collection rule, select the **JSON View**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-rule-json-view.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-rule-json-view.png" alt-text="Screenshot showing data collection rule JSON view.":::
+
+2. Copy the **immutableId** value.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/data-collection-rule-immutable-id.png" lightbox="media/tutorial-logs-ingestion-portal/data-collection-rule-immutable-id.png" alt-text="Screenshot showing collecting immutable ID from JSON view.":::
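+
+If you prefer to collect the value from the command line, the following sketch returns the same property by reading the DCR directly. It's an illustration only; it assumes the `2021-09-01-preview` API version for data collection rules, and the placeholder subscription, resource group, and DCR names must be replaced with your own.
+
+```powershell
+# Minimal sketch: read the DCR's immutableId with a direct ARM call.
+$dcrPath = "/subscriptions/{subscription}/resourceGroups/{resourcegroup}/providers/Microsoft.Insights/dataCollectionRules/{dcr}?api-version=2021-09-01-preview"
+$dcr = Invoke-AzRestMethod -Path $dcrPath -Method GET
+($dcr.Content | ConvertFrom-Json).properties.immutableId
+```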
+++
+## Assign permissions to DCR
+The final step is to give the application permission to use the DCR. This will allow any application using the correct application ID and application key to send data to the new DCE and DCR.
+
+1. Select **Access Control (IAM)** for the DCR and then **Add role assignment**.
+
+    :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment.png" alt-text="Screenshot showing adding custom role assignment to DCR.":::
+
+2. Select **Monitoring Metrics Publisher** and click **Next**. You could instead create a custom action with the `Microsoft.Insights/Telemetry/Write` data action.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-select-role.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-select-role.png" alt-text="Screenshot showing selecting role for DCR role assignment.":::
+
+3. Select **User, group, or service principal** for **Assign access to** and click **Select members**. Select the application that you created and click **Select**.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-select-member.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-select-member.png" alt-text="Screenshot showing selecting members for DCR role assignment.":::
++
+4. Click **Review + assign** and verify the details before saving your role assignment.
+
+ :::image type="content" source="media/tutorial-logs-ingestion-portal/add-role-assignment-save.png" lightbox="media/tutorial-logs-ingestion-portal/add-role-assignment-save.png" alt-text="Screenshot showing saving DCR role assignment.":::
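+
+If you script your deployments, the same assignment can be made with the Az PowerShell module. The following is a sketch only, assuming the `New-AzRoleAssignment` cmdlet is available; it uses the application (client) ID you recorded earlier and the resource ID of the DCR as the scope.
+
+```powershell
+# Minimal sketch: grant the app's service principal the Monitoring Metrics Publisher role on the DCR.
+$dcrResourceId = "/subscriptions/{subscription}/resourceGroups/{resourcegroup}/providers/Microsoft.Insights/dataCollectionRules/{dcr}"
+New-AzRoleAssignment -ApplicationId "<Application (client) ID>" `
+    -RoleDefinitionName "Monitoring Metrics Publisher" `
+    -Scope $dcrResourceId
+```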
+++
+## Send sample data
+Allow at least 30 minutes for the configuration to take effect. You may also experience increased latency for the first few entries, but this should normalize.
+
+1. Run the following command, providing the values that you collected for your data collection rule and data collection endpoint. The script will start ingesting data by placing calls to the API at a pace of approximately one record per second.
+
+```PowerShell
+.\LogGenerator.ps1 -Log "sample_access.log" -Type "API" -Table "ApacheAccess_CL" -DcrImmutableId <immutable ID> -DceUri <data collection endpoint URL>
+```
+
+2. From Log Analytics, query your newly created table to verify that data arrived and that it was transformed properly, either in the portal or with the query sketch that follows.
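+
+The following is a minimal sketch of that check using the `Az.OperationalInsights` module rather than the portal; it assumes the table name `ApacheAccess_CL` used in the earlier command and requires the workspace ID (the workspace GUID).
+
+```powershell
+# Minimal sketch: query the new custom table and display the returned rows.
+$results = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace ID (GUID)>" `
+    -Query "ApacheAccess_CL | take 10"
+$results.Results | Format-Table
+```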
+
+## Troubleshooting
+This section describes different error conditions you may receive and how to correct them.
+
+### Script returns error code 403
+Ensure that you have the correct permissions for your application to the DCR. You may also need to wait up to 30 minutes for permissions to propagate.
+
+### Script returns error code 413 or warning of `TimeoutExpired` with the message `ReadyBody_ClientConnectionAbort` in the response
+The message is too large. The maximum message size is currently 1MB per call.
+
+### Script returns error code 429
+API limits have been exceeded. The limits are currently set to 500 MB of data per minute for both compressed and uncompressed data, and 300,000 requests per minute. Retry after the duration listed in the `Retry-After` header in the response.
+
+### Script returns error code 503
+Ensure that you have the correct permissions for your application to the DCR. You may also need to wait up to 30 minutes for permissions to propagate.
+
+### You don't receive an error, but data doesn't appear in the workspace
+The data may take some time to be ingested, especially if this is the first time data is being sent to a particular table. It shouldn't take longer than 15 minutes.
+
+### IntelliSense in Log Analytics not recognizing new table
+The cache that drives IntelliSense may take up to 24 hours to update.
+
+## Sample data
+Following is sample data that you can use for the tutorial. Alternatively, you can use your own data if you have your own Apache access logs.
+
+```
+0.0.139.0
+0.0.153.185
+0.0.153.185
+0.0.66.230
+0.0.148.92
+0.0.35.224
+0.0.162.225
+0.0.162.225
+0.0.148.108
+0.0.148.1
+0.0.203.24
+0.0.4.214
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.117
+0.0.10.114
+0.0.10.114
+0.0.10.125
+0.0.10.117
+0.0.10.117
+0.0.10.114
+0.0.10.114
+0.0.10.125
+0.0.10.114
+0.0.10.114
+0.0.10.125
+0.0.10.117
+0.0.10.117
+0.0.10.114
+0.0.10.125
+0.0.10.117
+0.0.10.114
+0.0.10.117
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.125
+0.0.10.114
+0.0.10.117
+0.0.167.138
+0.0.149.55
+0.0.229.86
+0.0.117.249
+0.0.117.249
+0.0.117.249
+0.0.64.41
+0.0.208.79
+0.0.208.79
+0.0.208.79
+0.0.208.79
+0.0.196.129
+0.0.196.129
+0.0.66.158
+0.0.161.12
+0.0.161.12
+0.0.51.36
+0.0.51.36
+0.0.145.131
+0.0.145.131
+0.0.0.179
+0.0.0.179
+0.0.145.131
+0.0.145.131
+0.0.95.52
+0.0.95.52
+0.0.51.36
+0.0.51.36
+0.0.227.31
+0.0.227.31
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.4.22
+0.0.4.22
+0.0.143.24
+0.0.143.24
+0.0.0.98
+0.0.0.98
+0.0.51.62
+0.0.51.62
+0.0.51.36
+0.0.51.36
+0.0.0.98
+0.0.0.98
+0.0.58.254
+0.0.58.254
+0.0.51.62
+0.0.51.62
+0.0.227.31
+0.0.227.31
+0.0.0.179
+0.0.0.179
+0.0.58.254
+0.0.58.254
+0.0.95.52
+0.0.95.52
+0.0.0.98
+0.0.0.98
+0.0.58.90
+0.0.58.90
+0.0.51.36
+0.0.51.36
+0.0.207.154
+0.0.207.154
+0.0.95.52
+0.0.95.52
+0.0.51.62
+0.0.51.62
+0.0.145.131
+0.0.145.131
+0.0.58.90
+0.0.58.90
+0.0.227.55
+0.0.227.55
+0.0.95.52
+0.0.95.52
+0.0.161.12
+0.0.161.12
+0.0.227.55
+0.0.227.55
+0.0.143.30
+0.0.143.30
+0.0.227.31
+0.0.227.31
+0.0.161.6
+0.0.161.6
+0.0.161.6
+0.0.227.31
+0.0.227.31
+0.0.51.62
+0.0.51.62
+0.0.227.31
+0.0.227.31
+0.0.95.20
+0.0.95.20
+0.0.207.154
+0.0.207.154
+0.0.0.98
+0.0.0.98
+0.0.51.36
+0.0.51.36
+0.0.227.55
+0.0.227.55
+0.0.207.154
+0.0.207.154
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.207.221
+0.0.207.221
+0.0.0.179
+0.0.0.179
+0.0.161.12
+0.0.161.12
+0.0.58.90
+0.0.58.90
+0.0.145.106
+0.0.145.106
+0.0.145.106
+0.0.145.106
+0.0.0.179
+0.0.0.179
+0.0.149.8
+0.0.207.154
+0.0.207.154
+0.0.227.31
+0.0.227.31
+0.0.51.62
+0.0.51.62
+0.0.227.55
+0.0.227.55
+0.0.143.30
+0.0.143.30
+0.0.95.52
+0.0.95.52
+0.0.145.131
+0.0.145.131
+0.0.51.62
+0.0.51.62
+0.0.0.98
+0.0.0.98
+0.0.207.221
+0.0.145.131
+0.0.207.221
+0.0.145.131
+0.0.51.62
+0.0.51.62
+0.0.51.36
+0.0.51.36
+0.0.145.131
+0.0.145.131
+0.0.58.254
+0.0.58.254
+0.0.145.106
+0.0.145.106
+0.0.207.221
+0.0.207.221
+0.0.227.31
+0.0.227.31
+0.0.145.106
+0.0.145.106
+0.0.145.131
+0.0.145.131
+0.0.0.179
+0.0.0.179
+0.0.227.31
+0.0.227.31
+0.0.227.55
+0.0.227.55
+0.0.95.52
+0.0.95.52
+0.0.0.98
+0.0.0.98
+0.0.4.35
+0.0.4.35
+0.0.4.22
+0.0.4.22
+0.0.58.90
+0.0.58.90
+0.0.145.106
+0.0.145.106
+0.0.143.24
+0.0.143.24
+0.0.227.55
+0.0.227.55
+0.0.207.154
+0.0.207.154
+0.0.143.30
+0.0.143.30
+0.0.227.31
+0.0.227.31
+0.0.0.179
+0.0.0.179
+0.0.0.98
+0.0.0.98
+0.0.207.221
+0.0.207.221
+0.0.0.179
+0.0.0.179
+0.0.0.98
+0.0.0.98
+0.0.207.221
+0.0.207.221
+0.0.207.154
+0.0.207.154
+0.0.58.254
+0.0.58.254
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.51.36
+0.0.207.154
+0.0.207.154
+0.0.161.6
+0.0.145.131
+0.0.145.131
+0.0.207.221
+0.0.207.221
+0.0.95.20
+0.0.95.20
+0.0.183.233
+0.0.183.233
+0.0.51.36
+0.0.51.36
+0.0.95.52
+0.0.95.52
+0.0.227.31
+0.0.227.31
+0.0.51.62
+0.0.51.62
+0.0.95.52
+0.0.95.52
+0.0.207.154
+0.0.207.154
+0.0.51.36
+0.0.51.36
+0.0.58.90
+0.0.58.90
+0.0.4.35
+0.0.4.35
+0.0.95.52
+0.0.95.52
+0.0.167.138
+0.0.51.36
+0.0.51.36
+0.0.161.6
+0.0.161.6
+0.0.58.254
+0.0.58.254
+0.0.207.154
+0.0.207.154
+0.0.58.90
+0.0.58.90
+0.0.51.62
+0.0.51.62
+0.0.58.90
+0.0.58.90
+0.0.81.164
+0.0.81.164
+0.0.207.221
+0.0.207.221
+0.0.227.55
+0.0.227.55
+0.0.227.55
+0.0.227.55
+0.0.207.221
+0.0.207.154
+0.0.207.154
+0.0.207.221
+0.0.143.30
+0.0.143.30
+0.0.0.179
+0.0.0.179
+0.0.51.62
+0.0.51.62
+0.0.4.35
+0.0.4.35
+0.0.207.221
+0.0.207.221
+0.0.51.62
+0.0.51.62
+0.0.51.62
+0.0.51.62
+0.0.95.20
+0.0.4.35
+0.0.4.35
+0.0.58.254
+0.0.58.254
+0.0.145.106
+0.0.145.106
+0.0.0.98
+0.0.0.98
+0.0.95.52
+0.0.95.52
+0.0.51.62
+0.0.51.62
+0.0.207.221
+0.0.207.221
+0.0.143.30
+0.0.143.30
+0.0.207.154
+0.0.207.154
+0.0.143.30
+0.0.95.20
+0.0.95.20
+0.0.0.98
+0.0.0.98
+0.0.145.131
+0.0.145.131
+0.0.161.12
+0.0.161.12
+0.0.95.52
+0.0.95.52
+0.0.161.12
+0.0.161.12
+0.0.0.179
+0.0.0.179
+0.0.4.35
+0.0.4.35
+0.0.164.246
+0.0.161.12
+0.0.161.12
+0.0.161.12
+0.0.161.12
+0.0.207.221
+0.0.207.221
+0.0.4.35
+0.0.4.35
+0.0.207.221
+0.0.207.221
+0.0.145.106
+0.0.145.106
+0.0.4.22
+0.0.4.22
+0.0.161.12
+0.0.161.12
+0.0.58.254
+0.0.58.254
+0.0.161.12
+0.0.161.12
+0.0.66.216
+0.0.0.179
+0.0.0.179
+0.0.145.131
+0.0.145.131
+0.0.4.35
+0.0.4.35
+0.0.58.254
+0.0.58.254
+0.0.143.24
+0.0.143.24
+0.0.143.24
+0.0.143.24
+0.0.207.221
+0.0.207.221
+0.0.58.254
+0.0.58.254
+0.0.145.131
+0.0.145.131
+0.0.51.36
+0.0.51.36
+0.0.227.31
+0.0.161.12
+0.0.227.31
+0.0.161.6
+0.0.161.6
+0.0.207.221
+0.0.207.221
+0.0.161.12
+0.0.145.106
+0.0.145.106
+0.0.161.6
+0.0.161.6
+0.0.95.20
+0.0.95.20
+0.0.4.35
+0.0.4.35
+0.0.95.52
+0.0.95.52
+0.0.128.50
+0.0.227.31
+0.0.227.31
+0.0.227.31
+0.0.227.31
+0.0.227.55
+0.0.227.55
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+0.0.29.211
+```
++
+## Next steps
+
+- [Complete a similar tutorial using resource manager templates.](tutorial-logs-ingestion-api.md)
+- [Read more about custom logs.](logs-ingestion-api-overview.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-transformations.md)
azure-monitor Tutorial Workspace Transformations Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-workspace-transformations-api.md
+
+ Title: Tutorial - Add ingestion-time transformation to Azure Monitor Logs using resource manager templates
+description: Describes how to add a custom transformation to data flowing through Azure Monitor Logs using resource manager templates.
+ Last updated : 07/01/2022++
+# Tutorial: Add transformation in workspace data collection rule to Azure Monitor using resource manager templates (preview)
+This tutorial walks you through configuration of a sample [transformation in a workspace data collection rule](../essentials/data-collection-transformations.md) using resource manager templates. [Transformations](../essentials/data-collection-transformations.md) in Azure Monitor allow you to filter or modify incoming data before it's sent to its destination. Workspace transformations provide support for [ingestion-time transformations](../essentials/data-collection-transformations.md) for workflows that don't yet use the [Azure Monitor data ingestion pipeline](../essentials/data-collection.md).
+
+Workspace transformations are stored together in a single [data collection rule (DCR)](../essentials/data-collection-rule-overview.md) for the workspace, called the workspace DCR. Each transformation is associated with a particular table. The transformation will be applied to all data sent to this table from any workflow not using a DCR.
+
+> [!NOTE]
+> This tutorial uses resource manager templates and REST API to configure a workspace transformation. See [Tutorial: Add transformation in workspace data collection rule to Azure Monitor using the Azure portal (preview)](tutorial-workspace-transformations-portal.md) for the same tutorial using the Azure portal.
+
+In this tutorial, you learn to:
+
+> [!div class="checklist"]
+> * Configure [workspace transformation](../essentials/data-collection-transformations.md#workspace-transformation-dcr) for a table in a Log Analytics workspace.
+> * Write a log query for an ingestion-time transform.
++
+> [!NOTE]
+> This tutorial uses PowerShell from Azure Cloud Shell to make REST API calls using the Azure Monitor **Tables** API, and the Azure portal to deploy resource manager templates. You can use any other method to make these calls.
+
+## Prerequisites
+To complete this tutorial, you need the following:
+
+- Log Analytics workspace where you have at least [contributor rights](manage-access.md#azure-rbac).
+- [Permissions to create Data Collection Rule objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
+- The table must already have some data.
+- The table can't already be linked to the [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr).
++
+## Overview of tutorial
+In this tutorial, you'll reduce the storage requirement for the `LAQueryLogs` table by filtering out certain records. You'll also remove the contents of a column while parsing the column data to store a piece of data in a custom column. The [LAQueryLogs table](query-audit.md#audit-data) is created when you enable [log query auditing](query-audit.md) in a workspace, but this is only used as a sample for the tutorial. You can use this same basic process to create a transformation for any [supported table](tables-feature-support.md) in a Log Analytics workspace.
++
+## Enable query audit logs
+You need to enable [query auditing](query-audit.md) for your workspace to create the `LAQueryLogs` table that you'll be working with. This is not required for all ingestion time transformations. It's just to generate the sample data that this sample transformation will use.
+
+1. From the **Log Analytics workspaces** menu in the Azure portal, select **Diagnostic settings** and then **Add diagnostic setting**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/diagnostic-settings.png" lightbox="media/tutorial-workspace-transformations-portal/diagnostic-settings.png" alt-text="Screenshot of diagnostic settings.":::
+
+2. Provide a name for the diagnostic setting and select the workspace so that the auditing data is stored in the same workspace. Select the **Audit** category and then click **Save** to save the diagnostic setting and close the diagnostic setting page.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/new-diagnostic-setting.png" lightbox="media/tutorial-workspace-transformations-portal/new-diagnostic-setting.png" alt-text="Screenshot of new diagnostic setting.":::
+
+3. Select **Logs** and then run some queries to populate `LAQueryLogs` with some data. These queries don't need to actually return any data.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/sample-queries.png" lightbox="media/tutorial-workspace-transformations-portal/sample-queries.png" alt-text="Screenshot of sample log queries.":::
+
+## Update table schema
+Before you can create the transformation, the following two changes must be made to the table:
+
+- The table must be enabled for workspace transformation. This is required for any table that will have a transformation, even if the transformation doesn't modify the table's schema.
+- Any additional columns populated by the transformation must be added to the table.
+
+Use the **Tables - Update** API to configure the table with the PowerShell code below. Calling the API enables the table for workspace transformations, whether or not custom columns are defined. In this sample, the table includes a custom column called *Resources_CF* that will be populated by the transformation query.
+
+> [!IMPORTANT]
+> Any custom columns added to a built-in table must end in *_CF*. Columns added to a custom table (a table with a name that ends in *_CL*) don't need to have this suffix.
+
+1. Click the **Cloud Shell** button in the Azure portal and ensure the environment is set to **PowerShell**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/open-cloud-shell.png" lightbox="media/tutorial-workspace-transformations-api/open-cloud-shell.png" alt-text="Screenshot of opening cloud shell.":::
+
+2. Copy the following PowerShell code and replace the **Path** parameter with the details for your workspace.
+
+ ```PowerShell
+ $tableParams = @'
+ {
+ "properties": {
+ "schema": {
+ "name": "LAQueryLogs",
+ "columns": [
+ {
+ "name": "Resources_CF",
+ "description": "The list of resources, this query ran against",
+ "type": "string",
+ "isDefaultDisplay": true,
+ "isHidden": false
+ }
+ ]
+ }
+ }
+ }
+ '@
+
+ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/LAQueryLogs?api-version=2021-12-01-preview" -Method PUT -payload $tableParams
+ ```
+
+3. Paste the code into the cloud shell prompt to run it.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/cloud-shell-script.png" lightbox="media/tutorial-workspace-transformations-api/cloud-shell-script.png" alt-text="Screenshot of script in cloud shell.":::
+
+4. You can verify that the column was added by going to the **Log Analytics workspace** menu in the Azure portal. Select **Logs** to open Log Analytics and then expand the `LAQueryLogs` table to view its columns.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/verify-table.png" lightbox="media/tutorial-workspace-transformations-portal/verify-table.png" alt-text="Screenshot of Log Analytics with new column.":::
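+
+You can also confirm the change from the same Cloud Shell session by reading the table definition back with a GET call against the path used above. This is a sketch only; the exact shape of the returned schema may vary, but the new column should appear in the response.
+
+```powershell
+# Minimal sketch: read the table definition back and list the custom column names.
+$table = Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/LAQueryLogs?api-version=2021-12-01-preview" -Method GET
+(($table.Content | ConvertFrom-Json).properties.schema.columns).name
+```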
+
+## Define transformation query
+Use Log Analytics to test the transformation query before adding it to a data collection rule.
+
+1. Open your workspace in the **Log Analytics workspaces** menu in the Azure portal and select **Logs** to open Log Analytics.
+
+2. Run the following query to view the contents of the `LAQueryLogs` table. Notice the contents of the `RequestContext` column. The transformation will retrieve the workspace name from this column and remove the rest of the data in it.
+
+ ```kusto
+ LAQueryLogs
+ | take 10
+ ```
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/initial-query.png" lightbox="media/tutorial-workspace-transformations-portal/initial-query.png" alt-text="Screenshot of initial query in Log Analytics.":::
+
+3. Modify the query to the following:
+
+ ``` kusto
+ LAQueryLogs
+ | where QueryText !contains 'LAQueryLogs'
+ | extend Context = parse_json(RequestContext)
+ | extend Workspace_CF = tostring(Context['workspaces'][0])
+ | project-away RequestContext, Context
+ ```
+ This makes the following changes:
+
+ - Drop rows related to querying the `LAQueryLogs` table itself to save space since these log entries aren't useful.
+ - Add a column for the name of the workspace that was queried.
+ - Remove data from the `RequestContext` column to save space.
++
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/modified-query.png" lightbox="media/tutorial-workspace-transformations-portal/modified-query.png" alt-text="Screenshot of modified query in Log Analytics.":::
++
+4. Make the following changes to the query to use it in the transformation:
+
+ - Instead of specifying a table name (`LAQueryLogs` in this case) as the source of data for this query, use the `source` keyword. This is a virtual table that always represents the incoming data in a transformation query.
+    - Remove any operators that aren't supported by transformation queries. See [Supported tables for ingestion-time transformations](tables-feature-support.md) for a detailed list of operators that are supported.
+ - Flatten the query to a single line so that it can fit into the DCR JSON.
+
+ Following is the query that you will use in the transformation after these modifications:
+
+ ```kusto
+    source | where QueryText !contains 'LAQueryLogs' | extend Context = parse_json(RequestContext) | extend Resources_CF = tostring(Context['workspaces']) | extend RequestContext = ''
+ ```
+
+## Create data collection rule (DCR)
+Since this is the first transformation in the workspace, you need to create a [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr). If you create workspace transformations for other tables in the same workspace, they must be stored in this same DCR.
+
+1. In the Azure portal's search box, type in *template* and then select **Deploy a custom template**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot to deploy custom template.":::
+
+2. Click **Build your own template in the editor**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot to build template in the editor.":::
+
+3. Paste the resource manager template below into the editor and then click **Save**. This template defines the DCR and contains the transformation query. You don't need to modify this template since it will collect values for its parameters.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/edit-template.png" lightbox="media/tutorial-workspace-transformations-api/edit-template.png" alt-text="Screenshot to edit resource manager template.":::
++
+ ```json
+    {
+        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+        "contentVersion": "1.0.0.0",
+        "parameters": {
+            "dataCollectionRuleName": {
+                "type": "string",
+                "metadata": {
+                    "description": "Specifies the name of the Data Collection Rule to create."
+                }
+            },
+            "location": {
+                "type": "string",
+                "defaultValue": "westus2",
+                "allowedValues": [
+                    "westus2",
+                    "eastus2",
+                    "eastus2euap"
+                ],
+                "metadata": {
+                    "description": "Specifies the location in which to create the Data Collection Rule."
+                }
+            },
+            "workspaceResourceId": {
+                "type": "string",
+                "metadata": {
+                    "description": "Specifies the Azure resource ID of the Log Analytics workspace to use."
+                }
+            }
+        },
+        "resources": [
+            {
+                "type": "Microsoft.Insights/dataCollectionRules",
+                "name": "[parameters('dataCollectionRuleName')]",
+                "location": "[parameters('location')]",
+                "apiVersion": "2021-09-01-preview",
+                "kind": "WorkspaceTransforms",
+                "properties": {
+                    "destinations": {
+                        "logAnalytics": [
+                            {
+                                "workspaceResourceId": "[parameters('workspaceResourceId')]",
+                                "name": "clv2ws1"
+                            }
+                        ]
+                    },
+                    "dataFlows": [
+                        {
+                            "streams": [
+                                "Microsoft-Table-LAQueryLogs"
+                            ],
+                            "destinations": [
+                                "clv2ws1"
+                            ],
+                            "transformKql": "source | where QueryText !contains 'LAQueryLogs' | extend Context = parse_json(RequestContext) | extend Resources_CF = tostring(Context['workspaces']) | extend RequestContext = ''"
+                        }
+                    ]
+                }
+            }
+        ],
+        "outputs": {
+            "dataCollectionRuleId": {
+                "type": "string",
+                "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
+            }
+        }
+    }
+ ```
+
+4. On the **Custom deployment** screen, specify a **Subscription** and **Resource group** to store the data collection rule and then provide values defined in the template. This includes a **Name** for the data collection rule and the **Workspace Resource ID** that you collected in a previous step. The **Location** should be the same location as the workspace. The **Region** will already be populated and is used for the location of the data collection rule.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/custom-deployment-values.png" lightbox="media/tutorial-workspace-transformations-api/custom-deployment-values.png" alt-text="Screenshot to edit custom deployment values.":::
+
+5. Click **Review + create** and then **Create** when you review the details.
+
+6. When the deployment is complete, expand the **Deployment details** box and click on your data collection rule to view its details. Click **JSON View**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/data-collection-rule-details.png" lightbox="media/tutorial-workspace-transformations-api/data-collection-rule-details.png" alt-text="Screenshot for data collection rule details.":::
+
+7. Copy the **Resource ID** for the data collection rule. You'll use this in the next step.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/data-collection-rule-json-view.png" lightbox="media/tutorial-workspace-transformations-api/data-collection-rule-json-view.png" alt-text="Screenshot for data collection rule JSON view.":::
+
+## Link workspace to DCR
+The final step to enable the transformation is to link the DCR to the workspace.
+
+> [!IMPORTANT]
+> A workspace can only be connected to a single DCR, and the linked DCR must contain this workspace as a destination.
+
+Use the **Workspaces - Update** API to configure this link with the PowerShell code below.
+
+1. Click the **Cloud shell** button to open cloud shell again. Copy the following PowerShell code and replace the parameters with values for your workspace and DCR.
+
+ ```PowerShell
+ $defaultDcrParams = @'
+ {
+ "properties": {
+ "defaultDataCollectionRuleResourceId": "/subscriptions/{subscription}/resourceGroups/{resourcegroup}/providers/Microsoft.Insights/dataCollectionRules/{DCR}"
+ }
+ }
+ '@
+
+ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}?api-version=2021-12-01-preview" -Method PATCH -payload $defaultDcrParams
+ ```
+
+2. Paste the code into the cloud shell prompt to run it.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-api/cloud-shell-script-link-workspace.png" lightbox="media/tutorial-workspace-transformations-api/cloud-shell-script-link-workspace.png" alt-text="Screenshot of script to link workspace to DCR.":::
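+
+To confirm that the link was applied, you can read the workspace back with a GET call against the same path; in a quick sketch like the following, the `defaultDataCollectionRuleResourceId` property should contain the resource ID of your DCR.
+
+```powershell
+# Minimal sketch: read the workspace and display the linked default DCR.
+$ws = Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}?api-version=2021-12-01-preview" -Method GET
+($ws.Content | ConvertFrom-Json).properties.defaultDataCollectionRuleResourceId
+```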
+
+## Test transformation
+Allow about 30 minutes for the transformation to take effect, and you can then test it by running a query against the table. Only data sent to the table after the transformation was applied will be affected.
+
+For this tutorial, run some sample queries to send data to the `LAQueryLogs` table. Include some queries against `LAQueryLogs` so you can verify that the transformation filters these records. Notice that the output has the new `Workspace_CF` column, and there are no records for `LAQueryLogs`.
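+
+If you'd rather check from PowerShell than the portal, the following sketch (assuming the `Az.OperationalInsights` module and your workspace GUID) returns the most recent rows so you can confirm the filtered records and the new column:
+
+```powershell
+# Minimal sketch: confirm that recent LAQueryLogs rows reflect the transformation.
+$results = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace ID (GUID)>" `
+    -Query "LAQueryLogs | take 10"
+$results.Results | Format-Table
+```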
++
+## Troubleshooting
+This section describes different error conditions you may receive and how to correct them.
+
+### IntelliSense in Log Analytics not recognizing new columns in the table
+The cache that drives IntelliSense may take up to 24 hours to update.
+
+### Transformation on a dynamic column isn't working
+There is currently a known issue affecting dynamic columns. A temporary workaround is to explicitly parse dynamic column data using `parse_json()` prior to performing any operations against them.
+
+## Next steps
+
+- [Read more about transformations](../essentials/data-collection-transformations.md)
+- [See which tables support workspace transformations](tables-feature-support.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-transformations-structure.md)
azure-monitor Tutorial Workspace Transformations Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/tutorial-workspace-transformations-portal.md
+
+ Title: Tutorial - Add workspace transformation to Azure Monitor Logs using Azure portal
+description: Describes how to add a custom transformation to data flowing through Azure Monitor Logs using the Azure portal.
+ Last updated : 07/01/2022++
+# Tutorial: Add transformation in workspace data collection rule using the Azure portal (preview)
+This tutorial walks you through configuration of a sample [transformation in a workspace data collection rule](../essentials/data-collection-transformations.md) using the Azure portal. [Transformations](../essentials/data-collection-transformations.md) in Azure Monitor allow you to filter or modify incoming data before it's sent to its destination. Workspace transformations provide support for [ingestion-time transformations](../essentials/data-collection-transformations.md) for workflows that don't yet use the [Azure Monitor data ingestion pipeline](../essentials/data-collection.md).
+
+Workspace transformations are stored together in a single [data collection rule (DCR)](../essentials/data-collection-rule-overview.md) for the workspace, called the workspace DCR. Each transformation is associated with a particular table. The transformation will be applied to all data sent to this table from any workflow not using a DCR.
+
+> [!NOTE]
+> This tutorial uses the Azure portal to configure a workspace transformation. See [Tutorial: Add transformation in workspace data collection rule to Azure Monitor using resource manager templates (preview)](tutorial-workspace-transformations-api.md) for the same tutorial using resource manager templates and REST API.
+
+In this tutorial, you learn to:
+
+> [!div class="checklist"]
+> * Configure [workspace transformation](../essentials/data-collection-transformations.md#workspace-transformation-dcr) for a table in a Log Analytics workspace.
+> * Write a log query for a workspace transformation.
++
+## Prerequisites
+To complete this tutorial, you need the following:
+
+- Log Analytics workspace where you have at least [contributor rights](manage-access.md#azure-rbac).
+- [Permissions to create data collection rule (DCR) objects](../essentials/data-collection-rule-overview.md#permissions) in the workspace.
+- The table must already have some data.
+- The table can't be linked to the [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr).
++
+## Overview of tutorial
+In this tutorial, you'll reduce the storage requirement for the `LAQueryLogs` table by filtering out certain records. You'll also remove the contents of a column while parsing the column data to store a piece of data in a custom column. The [LAQueryLogs table](query-audit.md#audit-data) is created when you enable [log query auditing](query-audit.md) in a workspace. You can use this same basic process to create a transformation for any [supported table](tables-feature-support.md) in a Log Analytics workspace.
+
+This tutorial will use the Azure portal which provides a wizard to walk you through the process of creating an ingestion-time transformation. The following actions are performed for you when you complete this wizard:
+
+- Updates the table schema with any additional columns from the query.
+- Creates a `WorkspaceTransforms` data collection rule (DCR) and links it to the workspace if a default DCR isn't already linked to the workspace.
+- Creates an ingestion-time transformation and adds it to the DCR.
++
+## Enable query audit logs
+You need to enable [query auditing](query-audit.md) for your workspace to create the `LAQueryLogs` table that you'll be working with. This is not required for all ingestion time transformations. It's just to generate the sample data that we'll be working with.
+
+1. From the **Log Analytics workspaces** menu in the Azure portal, select **Diagnostic settings** and then **Add diagnostic setting**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/diagnostic-settings.png" lightbox="media/tutorial-workspace-transformations-portal/diagnostic-settings.png" alt-text="Screenshot of diagnostic settings.":::
+
+2. Provide a name for the diagnostic setting and select the workspace so that the auditing data is stored in the same workspace. Select the **Audit** category and then click **Save** to save the diagnostic setting and close the diagnostic setting page.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/new-diagnostic-setting.png" lightbox="media/tutorial-workspace-transformations-portal/new-diagnostic-setting.png" alt-text="Screenshot of new diagnostic setting.":::
+
+3. Select **Logs** and then run some queries to populate `LAQueryLogs` with some data. These queries don't need to return data to be added to the audit log.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/sample-queries.png" lightbox="media/tutorial-workspace-transformations-portal/sample-queries.png" alt-text="Screenshot of sample log queries.":::
+
+## Add transformation to the table
+Now that the table's created, you can create the transformation for it.
+
+1. From the **Log Analytics workspaces** menu in the Azure portal, select **Tables (preview)**. Locate the `LAQueryLogs` table and select **Create transformation**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/create-transformation.png" lightbox="media/tutorial-workspace-transformations-portal/create-transformation.png" alt-text="Screenshot of creating a new transformation.":::
++
+2. Since this is the first transformation in the workspace, you need to create a [workspace transformation DCR](../essentials/data-collection-transformations.md#workspace-transformation-dcr). If you create transformations for other tables in the same workspace, they will be stored in this same DCR. Click **Create a new data collection rule**. The **Subscription** and **Resource group** will already be populated for the workspace. Provide a name for the DCR and click **Done**.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/new-data-collection-rule.png" lightbox="media/tutorial-workspace-transformations-portal/new-data-collection-rule.png" alt-text="Screenshot of creating a new data collection rule.":::
+
+3. Click **Next** to view sample data from the table. As you define the transformation, the result will be applied to the sample data allowing you to evaluate the results before applying it to actual data. Click **Transformation editor** to define the transformation.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/sample-data.png" lightbox="media/tutorial-workspace-transformations-portal/sample-data.png" alt-text="Screenshot of sample data from the log table.":::
+
+4. In the transformation editor, you can see the transformation that will be applied to the data prior to its ingestion into the table. The incoming data is represented by a virtual table named `source`, which has the same set of columns as the destination table itself. The transformation initially contains a simple query returning the `source` table with no changes.
+
+5. Modify the query to the following:
+
+ ``` kusto
+ source
+ | where QueryText !contains 'LAQueryLogs'
+ | extend Context = parse_json(RequestContext)
+ | extend Workspace_CF = tostring(Context['workspaces'][0])
+ | project-away RequestContext, Context
+ ```
+
+ This makes the following changes:
+
+ - Drop rows related to querying the `LAQueryLogs` table itself to save space since these log entries aren't useful.
+ - Add a column for the name of the workspace that was queried.
+ - Remove data from the `RequestContext` column to save space.
+++
+ > [!Note]
+ > Using the Azure portal, the output of the transformation will initiate changes to the table schema if required. Columns will be added to match the transformation output if they don't already exist. Make sure that your output doesn't contain any additional columns that you don't want added to the table. If the output does not include columns that are already in the table, those columns will not be removed, but data will not be added.
+ >
+   > Any custom columns added to a built-in table must end in *_CF*. Columns added to a custom table (a table with a name that ends in *_CL*) don't need to have this suffix.
+
+6. Copy the query into the transformation editor and click **Run** to view results from the sample data. You can verify that the new `Workspace_CF` column is in the query.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/transformation-editor.png" lightbox="media/tutorial-workspace-transformations-portal/transformation-editor.png" alt-text="Screenshot of transformation editor.":::
+
+7. Click **Apply** to save the transformation and then **Next** to review the configuration. Click **Create** to update the data collection rule with the new transformation.
+
+ :::image type="content" source="media/tutorial-workspace-transformations-portal/save-transformation.png" lightbox="media/tutorial-workspace-transformations-portal/save-transformation.png" alt-text="Screenshot of saving transformation.":::
+
+## Test transformation
+Allow about 30 minutes for the transformation to take effect and then test it by running a query against the table. Only data sent to the table after the transformation was applied will be affected.
+
+For this tutorial, run some sample queries to send data to the `LAQueryLogs` table. Include some queries against `LAQueryLogs` so you can verify that the transformation filters these records. Notice that the output has the new `Workspace_CF` column, and there are no records for `LAQueryLogs`.
+
+## Troubleshooting
+This section describes different error conditions you may receive and how to correct them.
+
+### IntelliSense in Log Analytics not recognizing new columns in the table
+The cache that drives IntelliSense may take up to 24 hours to update.
+
+### Transformation on a dynamic column isn't working
+There is currently a known issue affecting dynamic columns. A temporary workaround is to explicitly parse dynamic column data using `parse_json()` prior to performing any operations against them.
+
+## Next steps
+
+- [Read more about transformations](../essentials/data-collection-transformations.md)
+- [See which tables support workspace transformations](tables-feature-support.md)
+- [Learn more about writing transformation queries](../essentials/data-collection-transformations-structure.md)
azure-monitor Workspace Design https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/workspace-design.md
In a hybrid model, each tenant has its own workspace, and some mechanism is used
There are two options to implement logs in a central location: -- Central workspace. The service provider creates a workspace in its tenant and use a script that utilizes the [Query API](api/overview.md) with the [custom logs API](custom-logs-overview.md) to bring the data from the tenant workspaces to this central location. Another option is to use [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) to copy data to the central workspace.
+- Central workspace. The service provider creates a workspace in its tenant and uses a script that utilizes the [Query API](api/overview.md) with the [logs ingestion API](logs-ingestion-api-overview.md) to bring the data from the tenant workspaces to this central location. Another option is to use [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) to copy data to the central workspace.
- Power BI. The tenant workspaces export data to Power BI using the integration between the [Log Analytics workspace and Power BI](log-powerbi.md).
azure-monitor Observability Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/observability-data.md
+
+ Title: Observability data in Azure Monitor
+description: Describes the types of observability data collected by Azure Monitor, including metrics, logs, and distributed traces.
+documentationcenter: ''
+
+ na
+ Last updated : 04/05/2022++
+# Observability data in Azure Monitor
+Enabling observability across today's complex computing environments, which run distributed applications that rely on both cloud and on-premises services, requires collecting operational data from every layer and every component of the distributed system. You need to be able to gain deep insights from this data and consolidate it into a single pane of glass with different perspectives to support the multitude of stakeholders in your organization.
+
+[Azure Monitor](overview.md) collects and aggregates data from a variety of sources into a common data platform where it can be used for analysis, visualization, and alerting. It provides a consistent experience on top of data from multiple sources, which gives you deep insights across all your monitored resources and even with data from other services that store their data in Azure Monitor.
+++
+## Pillars of observability
+
+Metrics, logs, and distributed traces are commonly referred to as the three pillars of observability. These are the different kinds of data that a monitoring tool must collect and analyze to provide sufficient observability of a monitored system. Observability can be achieved by correlating data from multiple pillars and aggregating data across the entire set of resources being monitored. Because Azure Monitor stores data from multiple sources together, the data can be correlated and analyzed using a common set of tools. It also correlates data across multiple Azure subscriptions and tenants, in addition to hosting data for other services.
+
+Azure resources generate a significant amount of monitoring data. Azure Monitor consolidates this data along with monitoring data from other sources into either a Metrics or Logs platform. Each is optimized for particular monitoring scenarios, and each supports different features in Azure Monitor. Features such as data analysis, visualizations, or alerting require you to understand the differences so that you can implement your required scenario in the most efficient and cost effective manner. Insights in Azure Monitor such as [Application Insights](app/app-insights-overview.md) or [VM insights](vm/vminsights-overview.md) have analysis tools that allow you to focus on the particular monitoring scenario without having to understand the differences between the two types of data.
++
+## Metrics
+[Metrics](essentials/data-platform-metrics.md) are numerical values that describe some aspect of a system at a particular point in time. They are collected at regular intervals and are identified with a timestamp, a name, a value, and one or more defining labels. Metrics can be aggregated using a variety of algorithms, compared to other metrics, and analyzed for trends over time.
+
+Metrics in Azure Monitor are stored in a time-series database which is optimized for analyzing time-stamped data. This makes metrics particularly suited for alerting and fast detection of issues. They can tell you how your system is performing but typically need to be combined with logs to identify the root cause of issues.
+
+Metrics are available for interactive analysis in the Azure portal with [Azure Metrics Explorer](essentials/metrics-getting-started.md). They can be added to an [Azure dashboard](app/tutorial-app-dashboards.md) for visualization in combination with other data and used for near-real time [alerting](alerts/alerts-metric.md).
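As an editor-added illustration (not from the original article), the same metric values can also be retrieved programmatically, assuming a recent version of the `Azure.Monitor.Query` client library; the resource ID and metric name below are placeholders.

```csharp
// Hedged sketch: query the "Transactions" metric for a storage account over the
// last hour at five-minute granularity. The resource ID is a placeholder.
using System;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

var client = new MetricsQueryClient(new DefaultAzureCredential());

var response = await client.QueryResourceAsync(
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
    new[] { "Transactions" },
    new MetricsQueryOptions
    {
        TimeRange = new QueryTimeRange(TimeSpan.FromHours(1)),
        Granularity = TimeSpan.FromMinutes(5),
        Aggregations = { MetricAggregationType.Total }
    });

foreach (MetricResult metric in response.Value.Metrics)
    foreach (MetricTimeSeriesElement series in metric.TimeSeries)
        foreach (MetricValue point in series.Values)
            Console.WriteLine($"{point.TimeStamp:o}  total={point.Total}");
```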
+
+Read more about Azure Monitor Metrics including their sources of data in [Metrics in Azure Monitor](essentials/data-platform-metrics.md).
+
+## Logs
+[Logs](logs/data-platform-logs.md) are events that occurred within the system. They can contain different kinds of data and may be structured or free form text with a timestamp. They may be created sporadically as events in the environment generate log entries, and a system under heavy load will typically generate more log volume.
+
+Logs in Azure Monitor are stored in a Log Analytics workspace that's based on [Azure Data Explorer](/azure/data-explorer/), which provides a powerful analysis engine and a [rich query language](/azure/kusto/query/). Logs typically provide enough information to give complete context for the issue being identified and are valuable for identifying the root cause of issues.
+
+> [!NOTE]
+> It's important to distinguish between Azure Monitor Logs and sources of log data in Azure. For example, subscription level events in Azure are written to an [activity log](essentials/platform-logs-overview.md) that you can view from the Azure Monitor menu. Most resources will write operational information to a [resource log](essentials/platform-logs-overview.md) that you can forward to different locations. Azure Monitor Logs is a log data platform that collects activity logs and resource logs along with other monitoring data to provide deep analysis across your entire set of resources.
++
 You can work with [log queries](logs/log-query-overview.md) interactively with [Log Analytics](logs/log-query-overview.md) in the Azure portal or add the results to an [Azure dashboard](app/tutorial-app-dashboards.md) for visualization in combination with other data. You can also create [log alerts](alerts/alerts-log.md), which trigger an alert based on the results of a scheduled query.
+
+Read more about Azure Monitor Logs including their sources of data in [Logs in Azure Monitor](logs/data-platform-logs.md).
+
+## Distributed traces
+Traces are a series of related events that follow a user request through a distributed system. They can be used to determine the behavior of application code and the performance of different transactions. While logs are often created by individual components of a distributed system, a trace measures the operation and performance of your application across the entire set of components.
+
+Distributed tracing in Azure Monitor is enabled with the [Application Insights SDK](app/distributed-tracing.md), and trace data is stored with other application log data collected by Application Insights. This makes it available to the same analysis tools as other log data including log queries, dashboards, and alerts.
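As an editor-added illustration (not part of the original article), the following sketch shows how the Application Insights SDK correlates telemetry into a single distributed operation; the operation name and connection string are placeholders.

```csharp
// Hedged sketch: wrap a unit of work in an operation so that the telemetry emitted
// inside it is correlated into one end-to-end transaction. Values are placeholders.
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

var configuration = TelemetryConfiguration.CreateDefault();
configuration.ConnectionString = "<your-connection-string>";
var telemetryClient = new TelemetryClient(configuration);

using (var operation = telemetryClient.StartOperation<RequestTelemetry>("ProcessOrder"))
{
    // Dependencies, traces, and exceptions tracked here share the same operation ID,
    // so they show up as one transaction in the end-to-end view.
    telemetryClient.TrackTrace("Order received");
    // ... call downstream services ...
}

telemetryClient.Flush();
```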
+
+Read more about distributed tracing at [What is Distributed Tracing?](app/distributed-tracing.md).
++
+## Next steps
+
+- Read more about [Metrics in Azure Monitor](essentials/data-platform-metrics.md).
+- Read more about [Logs in Azure Monitor](logs/data-platform-logs.md).
+- Learn about the [monitoring data available](data-sources.md) for different resources in Azure.
azure-monitor Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/overview.md
Title: Azure Monitor overview | Microsoft Docs
+ Title: Azure Monitor overview
description: Overview of Microsoft services and functionalities that contribute to a complete monitoring strategy for your Azure services and applications.
# Azure Monitor overview- Azure Monitor helps you maximize the availability and performance of your applications and services. It delivers a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments. This information helps you understand how your applications are performing and proactively identify issues that affect them and the resources they depend on. + A few examples of what you can do with Azure Monitor include: - Detect and diagnose issues across applications and dependencies with [Application Insights](app/app-insights-overview.md).
A few examples of what you can do with Azure Monitor include:
[!INCLUDE [azure-lighthouse-supported-service](../../includes/azure-lighthouse-supported-service.md)] ## Overview
+The following diagram gives a high-level view of Azure Monitor. At the center of the diagram are the data stores for metrics and logs, which are the two fundamental types of data used by Azure Monitor. On the left are the [sources of monitoring data](data-sources.md) that populate these [data stores](data-platform.md). On the right are the different functions that Azure Monitor performs with this collected data. This includes such actions as analysis, alerting, and streaming to external systems.
-The following diagram gives a high-level view of Azure Monitor. At the center of the diagram are the data stores for metrics and logs, which are the two fundamental types of data used by Azure Monitor. On the left are the [sources of monitoring data](agents/data-sources.md) that populate these [data stores](data-platform.md). On the right are the different functions that Azure Monitor performs with this collected data. Actions include analysis, alerting, and streaming to external systems.
:::image type="content" source="media/overview/azure-monitor-overview-optm.svg" alt-text="Diagram that shows an overview of Azure Monitor." border="false" lightbox="media/overview/azure-monitor-overview-optm.svg":::
The following video uses an earlier version of the preceding diagram, but its ex
> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4qXeL] ## Monitor data platform- All data collected by Azure Monitor fits into one of two fundamental types, [metrics and logs](data-platform.md). [Metrics](essentials/data-platform-metrics.md) are numerical values that describe some aspect of a system at a particular point in time. They're lightweight and capable of supporting near-real-time scenarios. [Logs](logs/data-platform-logs.md) contain different kinds of data organized into records with different sets of properties for each type. Telemetry such as events and traces is stored as logs in addition to performance data so that it can all be combined for analysis. For many Azure resources, you'll see data collected by Azure Monitor right in their overview page in the Azure portal. Look at any virtual machine (VM), for example, and you'll see several charts that display performance metrics. Select any of the graphs to open the data in [Metrics Explorer](essentials/metrics-charts.md) in the Azure portal. With Metrics Explorer, you can chart the values of multiple metrics over time. You can view the charts interactively or pin them to a dashboard to view them with other visualizations.
You'll often have the requirement to integrate Azure Monitor with other systems
Multiple APIs are available to read and write metrics and logs to and from Azure Monitor in addition to accessing generated alerts. You can also configure and retrieve alerts. With APIs, you have essentially unlimited possibilities to build custom solutions that integrate with Azure Monitor. +
+## Observability data in Azure Monitor
+Metrics, logs, and distributed traces are commonly referred to as the three pillars of observability. These are the different kinds of data that a monitoring tool must collect and analyze to provide sufficient observability of a monitored system. Observability can be achieved by correlating data from multiple pillars and aggregating data across the entire set of resources being monitored. Because Azure Monitor stores data from multiple sources together, the data can be correlated and analyzed using a common set of tools. It also correlates data across multiple Azure subscriptions and tenants, in addition to hosting data for other services.
+
+Azure resources generate a significant amount of monitoring data. Azure Monitor consolidates this data along with monitoring data from other sources into either a Metrics or Logs platform. Each is optimized for particular monitoring scenarios, and each supports different features in Azure Monitor. Features such as data analysis, visualizations, or alerting require you to understand the differences so that you can implement your required scenario in the most efficient and cost effective manner. Insights in Azure Monitor such as [Application Insights](app/app-insights-overview.md) or [Container insights](containers/container-insights-overview.md) have analysis tools that allow you to focus on the particular monitoring scenario without having to understand the differences between the two types of data.
+
+| Pillar | Description |
+|:|:|
+| Metrics | Metrics are numerical values that describe some aspect of a system at a particular point in time. They are collected at regular intervals and are identified with a timestamp, a name, a value, and one or more defining labels. Metrics can be aggregated using a variety of algorithms, compared to other metrics, and analyzed for trends over time.<br><br>Metrics in Azure Monitor are stored in a time-series database which is optimized for analyzing time-stamped data. For more information, see [Azure Monitor Metrics](essentials/data-platform-metrics.md). |
+| Logs | [Logs](logs/data-platform-logs.md) are events that occurred within the system. They can contain different kinds of data and may be structured or free form text with a timestamp. They may be created sporadically as events in the environment generate log entries, and a system under heavy load will typically generate more log volume.<br><br>Logs in Azure Monitor are stored in a Log Analytics workspace that's based on [Azure Data Explorer](/azure/data-explorer/) which provides a powerful analysis engine and [rich query language](/azure/kusto/query/). For more information, see [Azure Monitor Logs](logs/data-platform-logs.md). |
+| Distributed traces | Traces are series of related events that follow a user request through a distributed system. They can be used to determine behavior of application code and the performance of different transactions. While logs will often be created by individual components of a distributed system, a trace measures the operation and performance of your application across the entire set of components.<br><br>Distributed tracing in Azure Monitor is enabled with the [Application Insights SDK](app/distributed-tracing.md), and trace data is stored with other application log data collected by Application Insights and stored in Azure Monitor Logs. For more information, see [What is Distributed Tracing?](app/distributed-tracing.md). |
++
+> [!NOTE]
+> It's important to distinguish between Azure Monitor Logs and sources of log data in Azure. For example, subscription level events in Azure are written to an [activity log](essentials/platform-logs-overview.md) that you can view from the Azure Monitor menu. Most resources will write operational information to a [resource log](essentials/platform-logs-overview.md) that you can forward to different locations. Azure Monitor Logs is a log data platform that collects activity logs and resource logs along with other monitoring data to provide deep analysis across your entire set of resources.
+++++ ## Next steps Learn more about: * [Metrics and logs](./data-platform.md#metrics) for the data collected by Azure Monitor.
-* [Data sources](agents/data-sources.md) for how the different components of your application send telemetry.
+* [Data sources](data-sources.md) for how the different components of your application send telemetry.
* [Log queries](logs/log-query-overview.md) for analyzing collected data. * [Best practices](/azure/architecture/best-practices/monitoring) for monitoring cloud applications and services.
azure-monitor Profiler Aspnetcore Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/profiler/profiler-aspnetcore-linux.md
Title: Profile ASP.NET Core Azure Linux web apps with Application Insights Profiler | Microsoft Docs
-description: A conceptual overview and step-by-step tutorial on how to use Application Insights Profiler.
+ Title: Enable Profiler for ASP.NET Core web applications hosted in Linux on App Services | Microsoft Docs
+description: Learn how to enable Profiler on your ASP.NET Core web application hosted in Linux on App Services.
ms.devlang: csharp Previously updated : 06/16/2022- Last updated : 07/18/2022+
-# Profile ASP.NET Core Azure Linux web apps with Application Insights Profiler
+# Enable Profiler for ASP.NET Core web applications hosted in Linux on App Services
-Find out how much time is spent in each method of your live web application when using [Application Insights](../app/app-insights-overview.md). Application Insights Profiler is now available for ASP.NET Core web apps that are hosted in Linux on Azure App Service. This guide provides step-by-step instructions on how the Profiler traces can be collected for ASP.NET Core Linux web apps.
+Using Profiler, you can track how much time is spent in each method of your live ASP.NET Core web apps that are hosted in Linux on Azure App Service. While this guide focuses on web apps hosted in Linux, you can experiment using Linux, Windows, and Mac development environments.
-After you complete this walkthrough, your app can collect Profiler traces like the traces that are shown in the image. In this example, the Profiler trace indicates that a particular web request is slow because of time spent waiting. The *hot path* in the code that's slowing the app is marked by a flame icon. The **About** method in the **HomeController** section is slowing the web app because the method is calling the **Thread.Sleep** function.
-
-![Profiler traces](./media/profiler-aspnetcore-linux/profiler-traces.png)
+In this guide, you'll:
+> [!div class="checklist"]
+> - Set up and deploy an ASP.NET Core web application hosted on Linux.
+> - Add Application Insights Profiler to the ASP.NET Core web application.
+
## Prerequisites
-The following instructions apply to all Windows, Linux, and Mac development environments:
-* Install the [.NET Core SDK 3.1 or later](https://dotnet.microsoft.com/download/dotnet).
-* Install Git by following the instructions at [Getting Started - Installing Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git).
+- Install the [latest and greatest .NET Core SDK](https://dotnet.microsoft.com/download/dotnet).
+- Install Git by following the instructions at [Getting Started - Installing Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git).
## Set up the project locally
-1. Open a Command Prompt window on your machine. The following instructions work for all Windows, Linux, and Mac development environments.
+1. Open a Command Prompt window on your machine.
1. Create an ASP.NET Core MVC web application:
The following instructions apply to all Windows, Linux, and Mac development envi
dotnet add package Microsoft.ApplicationInsights.Profiler.AspNetCore ```
-1. Enable Application Insights and Profiler in Startup.cs:
+1. In your preferred code editor, enable Application Insights and Profiler in `Program.cs`:
```csharp public void ConfigureServices(IServiceCollection services)
The following instructions apply to all Windows, Linux, and Mac development envi
## Create the Linux web app to host your project
-1. Create the web app environment by using App Service on Linux:
+1. In the Azure portal, create a web app environment by using App Service on Linux:
+
+ :::image type="content" source="./media/profiler-aspnetcore-linux/create-web-app.png" alt-text="Screenshot of creating the Linux web app.":::
- :::image type="content" source="./media/profiler-aspnetcore-linux/create-linux-app-service.png" alt-text="Create the Linux web app":::
+1. Go to your new web app resource and select **Deployment Center** > **FTPS credentials** to create the deployment credentials. Make note of your credentials to use later.
-2. Create the deployment credentials:
+ :::image type="content" source="./media/profiler-aspnetcore-linux/credentials.png" alt-text="Screenshot of creating the deployment credentials.":::
- > [!NOTE]
- > Record your password to use later when deploying your web app.
+1. Click **Save**.
+1. Select the **Settings** tab.
+1. In the drop-down, select **Local Git** to set up a local Git repository in the web app.
- ![Create the deployment credentials](./media/profiler-aspnetcore-linux/create-deployment-credentials.png)
+ :::image type="content" source="./media/profiler-aspnetcore-linux/deployment-options.png" alt-text="Screenshot of view deployment options in a drop-down.":::
-3. Choose the deployment options. Set up a local Git repository in the web app by following the instructions on the Azure portal. A Git repository is automatically created.
+1. Click **Save** to create a Git repository with a Git Clone Uri.
- ![Set up the Git repository](./media/profiler-aspnetcore-linux/setup-git-repo.png)
+ :::image type="content" source="./media/profiler-aspnetcore-linux/local-git-repo.png" alt-text="Screenshot of setting up the local Git repository.":::
-For more deployment options, see [App Service documentation](../../app-service/index.yml).
+ For more deployment options, see [App Service documentation](../../app-service/deploy-best-practices.md).
## Deploy your project
For more deployment options, see [App Service documentation](../../app-service/i
... ```
-## Add Application Insights to monitor your web apps
+## Add Application Insights to monitor your web app
-1. [Create an Application Insights resource](../app/create-new-resource.md).
+You can add Application Insights to your web app in one of the following ways:
-2. Copy the **iKey** value of the Application Insights resource and set the following settings in your web apps:
+- The Enablement blade in the Azure portal,
+- The Configuration blade in the Azure portal, or
+- Manually adding it to your web app settings.
- `APPINSIGHTS_INSTRUMENTATIONKEY: [YOUR_APPINSIGHTS_KEY]`
+# [Enablement blade](#tab/enablement)
- When the app settings are changed, the site automatically restarts. After the new settings are applied, the Profiler immediately runs for two minutes. The Profiler then runs for two minutes every hour.
+1. In your web app on the Azure portal, select **Application Insights** in the left side menu.
+1. Click **Turn on Application Insights**.
-3. Generate some traffic to your website. You can generate traffic by refreshing the site **About** page a few times.
+ :::image type="content" source="./media/profiler-aspnetcore-linux/turn-on-app-insights.png" alt-text="Screenshot of turning on Application Insights.":::
-4. Wait two to five minutes for the events to aggregate to Application Insights.
+1. Under **Application Insights**, select **Enable**.
-5. Browse to the Application Insights **Performance** pane in the Azure portal. You can view the Profiler traces at the bottom right of the pane.
+ :::image type="content" source="./media/profiler-aspnetcore-linux/enable-app-insights.png" alt-text="Screenshot of enabling Application Insights.":::
- ![View Profiler traces](./media/profiler-aspnetcore-linux/view-traces.png)
+1. Under **Link to an Application Insights resource**, either create a new resource or select an existing resource. For this example, we'll create a new resource.
+ :::image type="content" source="./media/profiler-aspnetcore-linux/link-app-insights.png" alt-text="Screenshot of linking your Application Insights to a new or existing resource.":::
+1. Click **Apply** > **Yes** to apply and confirm.
-## Next steps
-If you use custom containers that are hosted by Azure App Service, follow the instructions in [
-Enable Service Profiler for a containerized ASP.NET Core application](https://github.com/Microsoft/ApplicationInsights-Profiler-AspNetCore/tree/master/examples/EnableServiceProfilerForContainerApp) to enable Application Insights Profiler.
+# [Configuration blade](#tab/config)
+
+1. [Create an Application Insights resource](../app/create-workspace-resource.md) in the same Azure subscription as your App Service.
+1. Navigate to the Application Insights resource.
+1. Copy the **Instrumentation Key** (iKey).
+1. In your web app on the Azure portal, select **Configuration** in the left side menu.
+1. Click **New application setting**.
+
+ :::image type="content" source="./media/profiler-aspnetcore-linux/new-setting-configuration.png" alt-text="Screenshot of adding new application setting in the configuration blade.":::
+
+1. Add the following settings in the **Add/Edit application setting** pane, using your saved iKey:
+
+ | Name | Value |
+ | - | -- |
+ | APPINSIGHTS_INSTRUMENTATIONKEY | [YOUR_APPINSIGHTS_KEY] |
+
+ :::image type="content" source="./media/profiler-aspnetcore-linux/add-ikey-settings.png" alt-text="Screenshot of adding iKey to the settings pane.":::
+
+1. Click **OK**.
+
+ :::image type="content" source="./media/profiler-aspnetcore-linux/save-app-insights-key.png" alt-text="Screenshot of saving the application insights key settings.":::
-Report any issues or suggestions to the Application Insights GitHub repository:
-[ApplicationInsights-Profiler-AspNetCore: Issues](https://github.com/Microsoft/ApplicationInsights-Profiler-AspNetCore/issues).
+1. Click **Save**.
+
+# [Web app settings](#tab/appsettings)
+
+1. [Create an Application Insights resource](../app/create-workspace-resource.md) in the same Azure subscription as your App Service.
+1. Navigate to the Application Insights resource.
+1. Copy the **Instrumentation Key** (iKey).
+1. In your preferred code editor, navigate to your ASP.NET Core project's `appsettings.json` file.
+1. Add the following and insert your copied iKey:
+
+ ```json
+ "ApplicationInsights":
+ {
+ "InstrumentationKey": "<your-instrumentation-key>"
+ }
+ ```
+
+1. Save `appsettings.json` to apply the settings change.
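As an editor's hedged note: when the ASP.NET Core SDK is registered in `Program.cs`, it should pick up the key from this configuration section. If you prefer, you can bind it explicitly; the following fragment is illustrative only.

```csharp
// Hedged sketch: explicitly pass the key from configuration when registering
// Application Insights (goes in Program.cs, minimal hosting model assumed).
// Newer SDK versions prefer a connection string over an instrumentation key.
builder.Services.AddApplicationInsightsTelemetry(options =>
{
    options.InstrumentationKey = builder.Configuration["ApplicationInsights:InstrumentationKey"];
});
```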
+++
+## Next steps
+Learn how to...
+> [!div class="nextstepaction"]
+> [Generate load and view Profiler traces](./profiler-data.md)
azure-monitor Profiler Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/profiler/profiler-settings.md
Title: Configure Application Insights Profiler | Microsoft Docs
description: Use the Azure Application Insights Profiler settings pane to see Profiler status and start profiling sessions ms.contributor: Charles.Weininger Previously updated : 04/26/2022- Last updated : 07/18/2022 # Configure Application Insights Profiler To open the Azure Application Insights Profiler settings pane, select **Performance** from the left menu within your Application Insights page. View profiler traces across your Azure resources via two methods:
View profiler traces across your Azure resources via two methods:
Select the **Profiler** button from the top menu. **By operation** 1. Select an operation from the **Operation name** list ("Overall" is highlighted by default). 1. Select the **Profiler traces** button.
- :::image type="content" source="./media/profiler-settings/operation-entry-inline.png" alt-text="Select operation and Profiler traces to view all profiler traces" lightbox="media/profiler-settings/operation-entry.png":::
+ :::image type="content" source="./media/profiler-settings/operation-entry-inline.png" alt-text="Screenshot of selecting operation and Profiler traces to view all profiler traces." lightbox="media/profiler-settings/operation-entry.png":::
1. Select one of the requests from the list to the left. 1. Select **Configure Profiler**.
- :::image type="content" source="./media/profiler-settings/configure-profiler-inline.png" alt-text="Overall selection and clicking Profiler traces to view all profiler traces" lightbox="media/profiler-settings/configure-profiler.png":::
+ :::image type="content" source="./media/profiler-settings/configure-profiler-inline.png" alt-text="Screenshot of the overall selection and clicking Profiler traces to view all profiler traces." lightbox="media/profiler-settings/configure-profiler.png":::
Once within the Profiler, you can configure and view the Profiler. The **Application Insights Profiler** page has these features: | Feature | Description | |-|-|
Select the Triggers button on the menu bar to open the CPU, Memory, and Sampling
You can set up a trigger to start profiling when the percentage of CPU or Memory use hits the level you set. | Setting | Description | |-|-|
Unlike CPU or memory triggers, the Sampling trigger isn't triggered by an event.
- Turn this trigger off to disable random sampling. - Set how often profiling will occur and the duration of the profiling session. | Setting | Description | |-|-|
Triggered by | How the session was started, either by a trigger, Profile Now, or
App Name | Name of the application that was profiled. Machine Instance | Name of the machine the profiler agent ran on. Timestamp | Time when the profile was captured.
-Tracee | Number of traces that were attached to individual requests.
CPU % | Percentage of CPU that was being used while the profiler was running. Memory % | Percentage of memory that was being used while the profiler was running.
azure-monitor Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/service-limits.md
This article lists limits in different areas of Azure Monitor.
[!INCLUDE [monitoring-limits](../../includes/azure-monitor-limits-autoscale.md)]
-## Custom logs
+## Logs ingestion API
[!INCLUDE [custom-logs](../../includes/azure-monitor-limits-custom-logs.md)]
azure-monitor Workbooks Commonly Used Components https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-commonly-used-components.md
You may want to summarize status using a simple visual indication instead of pre
The example below shows how to set up a traffic light icon per computer based on the CPU utilization metric. 1. [Create a new empty workbook](workbooks-create-workbook.md).
-1. [Add a parameters](workbooks-create-workbook.md#add-a-parameter-to-an-azure-workbook), make it a [time range parameter](workbooks-time.md), and name it **TimeRange**.
+1. [Add a parameter](workbooks-create-workbook.md#add-a-parameter-to-a-workbook), make it a [time range parameter](workbooks-time.md), and name it **TimeRange**.
1. Select **Add query** to add a log query control to the workbook. 1. Select the `log` query type, a `Log Analytics` resource type, and a Log Analytics workspace in your subscription that has VM performance data as a resource. 1. In the Query editor, enter:
The following example shows how to enable this scenario: Let's say you want the
### Setup parameters
-1. [Create a new empty workbook](workbooks-create-workbook.md) and [add a parameter component](workbooks-create-workbook.md#add-a-parameter-to-an-azure-workbook).
+1. [Create a new empty workbook](workbooks-create-workbook.md) and [add a parameter component](workbooks-create-workbook.md#add-a-parameter-to-a-workbook).
1. Select **Add parameter** to create a new parameter. Use the following settings: - Parameter name: `OsFilter` - Display name: `Operating system`
azure-monitor Workbooks Create Workbook https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-create-workbook.md
Title: Creating an Azure Workbook
-description: Learn how to create an Azure Workbook.
+ Title: Create an Azure workbook
+description: Learn how to create a workbook in Azure Workbooks.
Last updated 05/30/2022
-# Create an Azure Workbook
-This article describes how to create a new workbook and how to add elements to your Azure Workbook.
+# Create an Azure workbook
+
+This article describes how to create a new workbook and how to add elements to your Azure workbook.
This video walks you through creating workbooks. > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4B4Ap]
-## Create a new Azure Workbook
+## Create a new workbook
+
+To create a new workbook:
-To create a new Azure workbook:
-1. From the Azure Workbooks page, select an empty template or select **New** in the top toolbar.
+1. On the **Azure Workbooks** page, select an empty template or select **New**.
1. Combine any of these elements to add to your workbook:
- - [Text](#adding-text)
- - [Parameters](#adding-parameters)
- - [Queries](#adding-queries)
- - [Metric charts](#adding-metric-charts)
- - [Links](#adding-links)
- - [Groups](#adding-groups)
+ - [Text](#add-text)
+ - [Queries](#add-queries)
+ - [Parameters](#add-parameters)
+ - [Metric charts](#add-metric-charts)
+ - [Links](#add-links)
+ - [Groups](#add-groups)
- Configuration options
-## Adding text
+## Add text
+
+You can include text blocks in your workbooks. For example, the text can be human analysis of the telemetry, information to help users interpret the data, and section headings.
-Workbooks allow authors to include text blocks in their workbooks. The text can be human analysis of the telemetry, information to help users interpret the data, section headings, etc.
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-text-example.png" alt-text="Screenshot that shows adding text to a workbook.":::
- :::image type="content" source="media/workbooks-create-workbook/workbooks-text-example.png" alt-text="Screenshot of adding text to a workbook.":::
+Text is added through a Markdown control that you use to add your content. You can use the full formatting capabilities of Markdown like different heading and font styles, hyperlinks, and tables. By using Markdown, you can create rich Word- or portal-like reports or analytic narratives. Text can contain parameter values in the Markdown text. Those parameter references are updated as the parameters change.
-Text is added through a markdown control into which an author can add their content. An author can use the full formatting capabilities of markdown. These include different heading and font styles, hyperlinks, tables, etc. Markdown allows authors to create rich Word- or Portal-like reports or analytic narratives. Text can contain parameter values in the markdown text, and those parameter references will be updated as the parameters change.
+Edit mode:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode.png" alt-text="Screenshot that shows adding text to a workbook in edit mode.":::
-**Edit mode**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode.png" alt-text="Screenshot showing adding text to a workbook in edit mode.":::
+Preview mode:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode-preview.png" alt-text="Screenshot that shows adding text to a workbook in preview mode.":::
-**Preview mode**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode-preview.png" alt-text="Screenshot showing adding text to a workbook in preview mode.":::
+### Add text to a workbook
-### Add text to an Azure workbook
+1. Make sure you're in edit mode by selecting **Edit**.
+1. Add text by doing one of these steps:
-1. Make sure you are in **Edit** mode by selecting the **Edit** in the toolbar.
-1. Add text by doing either of these steps:
- - Select **Add**, and **Add text** below an existing element, or at the bottom of the workbook.
- - Select the ellipses (...) to the right of the **Edit** button next to one of the elements in the workbook, then select **Add** and then **Add text**.
-1. Enter markdown text into the editor field.
-1. Use the **Text Style** option to switch between plain markdown, and markdown wrapped with the Azure portal's standard info/warning/success/error styling.
+ * Select **Add** > **Add text** below an existing element or at the bottom of the workbook.
+ * Select the ellipsis (...) to the right of the **Edit** button next to one of the elements in the workbook. Then select **Add** > **Add text**.
+
+1. Enter Markdown text in the editor field.
+1. Use the **Text Style** option to switch between plain Markdown and Markdown wrapped with the Azure portal's standard info, warning, success, and error styling.
> [!TIP]
- > Use this [markdown cheat sheet](https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet) to see the different formatting options.
+ > Use this [Markdown cheat sheet](https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet) to see the different formatting options.
-1. Use the **Preview** tab to see how your content will look. The preview shows the content inside a scrollable area to limit its size, but when displayed at runtime, the markdown content will expand to fill whatever space it needs, without a scrollbar.
+1. Use the **Preview** tab to see how your content will look. The preview shows the content inside a scrollable area to limit its size. At runtime, the Markdown content expands to fill whatever space it needs, without a scrollbar.
1. Select **Done Editing**. ### Text styles
-These text styles are available:
+
+These text styles are available.
| Style | Description | | | |
-| plain| No formatting is applied |
-|info| The portal's "info" style, with a `ℹ` or similar icon and blue background |
-|error| The portal's "error" style, with a `❌` or similar icon and red background |
-|success| The portal's "success" style, with a `Γ£ö` or similar icon and green background |
-|upsell| The portal's "upsell" style, with a `🚀` or similar icon and purple background |
-|warning| The portal's "warning" style, with a `ΓÜá` or similar icon and blue background |
-
+|plain| No formatting is applied. |
+|info| The portal's info style, with an `ℹ` or similar icon and blue background. |
+|error| The portal's error style, with an `❌` or similar icon and red background. |
+|success| The portal's success style, with a `✔` or similar icon and green background. |
+|upsell| The portal's upsell style, with a `🚀` or similar icon and purple background. |
+|warning| The portal's warning style, with a `⚠` or similar icon and blue background. |
-You can also choose a text parameter as the source of the style. The parameter value must be one of the above text values. The absence of a value, or any unrecognized value will be treated as `plain` style.
+You can also choose a text parameter as the source of the style. The parameter value must be one of the preceding text values. The absence of a value or any unrecognized value is treated as plain style.
### Text style examples
-**Info style example**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode-preview.png" alt-text="Screenshot of adding text to a workbook in preview mode showing info style.":::
+Info style example:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-text-control-edit-mode-preview.png" alt-text="Screenshot that shows adding text to a workbook in preview mode showing info style.":::
-**Warning style example**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-text-example-warning.png" alt-text="Screenshot of a text visualization in warning style.":::
+Warning style example:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-text-example-warning.png" alt-text="Screenshot that shows a text visualization in warning style.":::
-## Adding queries
+## Add queries
-Azure Workbooks allow you to query any of the supported workbook [data sources](workbooks-data-sources.md).
+You can query any of the supported workbook [data sources](workbooks-data-sources.md).
For example, you can query Azure Resource Health to help you view any service problems affecting your resources. You can also query Azure Monitor metrics, which is numeric data collected at regular intervals. Azure Monitor metrics provide information about an aspect of a system at a particular time.
-### Add a query to an Azure Workbook
+### Add a query to a workbook
-1. Make sure you are in **Edit** mode by selecting the **Edit** in the toolbar.
-1. Add a query by doing either of these steps:
- - Select **Add**, and **Add query** below an existing element, or at the bottom of the workbook.
- - Select the ellipses (...) to the right of the **Edit** button next to one of the elements in the workbook, then select **Add** and then **Add query**.
+1. Make sure you're in edit mode by selecting **Edit**.
+1. Add a query by doing one of these steps:
+ - Select **Add** > **Add query** below an existing element or at the bottom of the workbook.
+ - Select the ellipsis (...) to the right of the **Edit** button next to one of the elements in the workbook. Then select **Add** > **Add query**.
1. Select the [data source](workbooks-data-sources.md) for your query. The other fields are determined based on the data source you choose. 1. Select any other values that are required based on the data source you selected. 1. Select the [visualization](workbooks-visualizations.md) for your workbook.
-1. In the query section, enter your query, or select from a list of sample queries by selecting **Samples**, and then edit the query to your liking.
+1. In the query section, enter your query, or select from a list of sample queries by selecting **Samples**. Then edit the query to your liking.
1. Select **Run Query**.
-1. When you're sure you have the query you want in your workbook, select **Done editing**.
+1. When you're sure you have the query you want in your workbook, select **Done Editing**.
+## Add parameters
-### Best practices for using resource centric log queries
+This section discusses how to add parameters.
-This video shows you how to use resource level logs queries in Azure Workbooks. It also has tips and tricks on how to enable advanced scenarios and improve performance.
+### Best practices for using resource-centric log queries
+
+This video shows you how to use resource-level logs queries in Azure Workbooks. It also has tips and tricks on how to enable advanced scenarios and improve performance.
> [!VIDEO https://www.youtube.com/embed/8CvjM0VvOA80]
-#### Using a dynamic resource type parameter
-Dynamic resource type parameters use dynamic scopes for more efficient querying. The snippet below uses this heuristic:
-1. _Individual resources_: if the count of selected resource is less than or equal to 5
-2. _Resource groups_: if the number of resources is over 5 but the number of resource groups the resources belong to is less than or equal to 3
-3. _Subscriptions_: otherwise
+#### Use a dynamic resource type parameter
+
+Dynamic resource type parameters use dynamic scopes for more efficient querying. The following snippet uses this heuristic:
+
+1. **Individual resources**: If the count of selected resources is less than or equal to 5
+1. **Resource groups**: If the number of resources is over 5 but the number of resource groups the resources belong to is less than or equal to 3
+1. **Subscriptions**: Otherwise
``` Resources
Dynamic resource type parameters use dynamic scopes for more efficient querying.
x == 'microsoft.resources/subscriptions' and resourceGroups > 3 and resourceCount > 5, true, false) ```
-#### Using a static resource scope for querying multiple resource types
+
+#### Use a static resource scope for querying multiple resource types
```json [
Dynamic resource type parameters use dynamic scopes for more efficient querying.
{ "value":"microsoft.compute/virtualmachinescaleset", "label":"Virtual machine scale set", "selected":true } ] ```
-#### Using resource parameters grouped by resource type
+
+#### Use resource parameters grouped by resource type
+ ``` Resources | where type =~ 'microsoft.compute/virtualmachines' or type =~ 'microsoft.compute/virtualmachinescalesets'
Resources
group = iff(type =~ 'microsoft.compute/virtualmachines', 'Virtual machines', 'Virtual machine scale sets') ```
-## Adding parameters
+## Add a parameter
-You can collect input from consumers and reference it in other parts of the workbook using parameters. Often, you would use parameters to scope the result set or to set the right visual. Parameters help you build interactive reports and experiences.
+You can control how your parameter controls are presented to consumers with workbooks. Examples include text box versus dropdown list, single- versus multi-select, or values from text, JSON, KQL, or Azure Resource Graph.
-Workbooks allow you to control how your parameter controls are presented to consumers ΓÇô text box vs. drop down, single- vs. multi-select, values from text, JSON, KQL, or Azure Resource Graph, etc.
+### Add a parameter to a workbook
-### Add a parameter to an Azure Workbook
+1. Make sure you're in edit mode by selecting **Edit**.
+1. Add a parameter by doing one of these steps:
+ - Select **Add** > **Add parameter** below an existing element or at the bottom of the workbook.
+ - Select the ellipsis (...) to the right of the **Edit** button next to one of the elements in the workbook. Then select **Add** > **Add parameter**.
+1. In the new parameter pane that appears, enter values for these fields:
-1. Make sure you are in **Edit** mode by selecting the **Edit** in the toolbar.
-1. Add a parameter by doing either of these steps:
- - Select **Add**, and **Add parameter** below an existing element, or at the bottom of the workbook.
- - Select the ellipses (...) to the right of the **Edit** button next to one of the elements in the workbook, then select **Add** and then **Add parameter**.
-1. In the new parameter pane that pops up enter values for these fields:
+ - **Parameter name**: Parameter names can't include spaces or special characters.
+ - **Display name**: Display names can include spaces, special characters, and emojis.
+ - **Parameter type**:
+ - **Required**:
- - Parameter name: Parameter names can't include spaces or special characters
- - Display name: Display names can include spaces, special characters, emoji, etc.
- - Parameter type:
- - Required:
-
-1. Select **Done editing**.
+1. Select **Done Editing**.
- :::image type="content" source="media/workbooks-parameters/workbooks-time-settings.png" alt-text="Screenshot showing the creation of a time range parameter.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-time-settings.png" alt-text="Screenshot that shows the creation of a time range parameter.":::
-## Adding metric charts
+## Add metric charts
-Most Azure resources emit metric data about state and health such as CPU utilization, storage availability, count of database transactions, failing app requests, etc. Using workbooks, you can create visualizations of the metric data as time-series charts.
+Most Azure resources emit metric data about state and health, such as CPU utilization, storage availability, count of database transactions, and failing app requests. You can create visualizations of this metric data as time-series charts in workbooks.
-The example below shows the number of transactions in a storage account over the prior hour. This allows the storage owner to see the transaction trend and look for anomalies in behavior.
+The following example shows the number of transactions in a storage account over the prior hour. This information allows you to see the transaction trend and look for anomalies in behavior.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-area.png" alt-text="Screenshot showing a metric area chart for storage transactions in a workbook.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-area.png" alt-text="Screenshot that shows a metric area chart for storage transactions in a workbook.":::
-### Add a metric chart to an Azure Workbook
+### Add a metric chart to a workbook
-1. Make sure you are in **Edit** mode by selecting the **Edit** in the toolbar.
-1. Add a metric chart by doing either of these steps:
- - Select **Add**, and **Add metric** below an existing element, or at the bottom of the workbook.
- - Select the ellipses (...) to the right of the **Edit** button next to one of the elements in the workbook, then select **Add** and then **Add metric**.
+1. Make sure you're in edit mode by selecting **Edit**.
+1. Add a metric chart by doing one of these steps:
+ - Select **Add** > **Add metric** below an existing element or at the bottom of the workbook.
+ - Select the ellipsis (...) to the right of the **Edit** button next to one of the elements in the workbook. Then select **Add** > **Add metric**.
1. Select a **resource type**, the resources to target, the metric namespace and name, and the aggregation to use.
-1. Set other parameters if needed such time range, split-by, visualization, size and color palette.
+1. Set parameters such as time range, split by, visualization, size, and color palette, if needed.
1. Select **Done Editing**.
-This is a metric chart in edit mode:
+Example of a metric chart in edit mode:
### Metric chart parameters
-| Parameter | Explanation | Example |
+| Parameter | Description | Examples |
| - |:-|:-|
-| Resource Type| The resource type to target | Storage or Virtual Machine. |
-| Resources| A set of resources to get the metrics value from | MyStorage1 |
-| Namespace | The namespace with the metric | Storage > Blob |
-| Metric| The metric to visualize | Storage > Blob > Transactions |
-| Aggregation | The aggregation function to apply to the metric | Sum, Count, Average, etc. |
-| Time Range | The time window to view the metric in | Last hour, Last 24 hours, etc. |
-| Visualization | The visualization to use | Area, Bar, Line, Scatter, Grid |
-| Split By| Optionally split the metric on a dimension | Transactions by Geo type |
-| Size | The vertical size of the control | Small, medium or large |
-| Color palette | The color palette to use in the chart. Ignored if the `Split by` parameter is used | Blue, green, red, etc. |
+| Resource type| The resource type to target. | Storage or Virtual Machine |
+| Resources| A set of resources to get the metrics value from. | MyStorage1 |
+| Namespace | The namespace with the metric. | Storage > Blob |
+| Metric| The metric to visualize. | Storage > Blob > Transactions |
+| Aggregation | The aggregation function to apply to the metric. | Sum, count, average |
+| Time range | The time window to view the metric in. | Last hour, last 24 hours |
+| Visualization | The visualization to use. | Area, bar, line, scatter, grid |
+| Split by| Optionally split the metric on a dimension. | Transactions by geo type |
+| Size | The vertical size of the control. | Small, medium, or large |
+| Color palette | The color palette to use in the chart. It's ignored if the **Split by** parameter is used. | Blue, green, red |
### Metric chart examples
-**Transactions split by API name as a line chart**
+The following examples show metric charts.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-split-line.png" alt-text="Screenshot showing a metric line chart for Storage transactions split by API name.":::
+#### Transactions split by API name as a line chart
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-split-line.png" alt-text="Screenshot that shows a metric line chart for storage transactions split by API name.":::
-**Transactions split by response type as a large bar chart**
+#### Transactions split by response type as a large bar chart
- :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-bar-large.png" alt-text="Screenshot showing a large metric bar chart for Storage transactions split by response type.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-bar-large.png" alt-text="Screenshot that shows a large metric bar chart for storage transactions split by response type.":::
-**Average latency as a scatter chart**
+#### Average latency as a scatter chart
- :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-scatter.png" alt-text="Screenshot showing a metric scatter chart for storage latency.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-metric-chart-storage-scatter.png" alt-text="Screenshot that shows a metric scatter chart for storage latency.":::
-## Adding links
+## Add links
-You can use links to create links to other views, workbooks, other components inside a workbook, or to create tabbed views within a workbook. The links can be styled as hyperlinks, buttons, and tabs.
+You can use links to create links to other views, workbooks, and other components inside a workbook, or to create tabbed views within a workbook. The links can be styled as hyperlinks, buttons, and tabs.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-empty-links.png" alt-text="Screenshot of adding a link to a workbook.":::
### Link styles
-You can apply styles to the link element itself and to individual links.
-**Link element styles**
+You can apply styles to the link element itself and to individual links.
+#### Link element styles
|Style |Sample |Notes | ||||
-|Bullet List | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-bullet.png" alt-text="Screenshot of bullet style workbook link."::: | The default, links, appears as a bulleted list of links, one on each line. The **Text before link** and **Text after link** fields can be used to add more text before or after the link components. |
-|List |:::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-list.png" alt-text="Screenshot of list style workbook link."::: | Links appear as a list of links, with no bullets. |
-|Paragraph | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-paragraph.png" alt-text="Screenshot of paragraph style workbook link."::: |Links appear as a paragraph of links, wrapped like a paragraph of text. |
-|Navigation | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-navigation.png" alt-text="Screenshot of navigation style workbook link."::: | Links appear as links, with vertical dividers, or pipes (`|`) between each link. |
-|Tabs | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-tabs.png" alt-text="Screenshot of tabs style workbook link."::: |Links appear as tabs. Each link appears as a tab, no link styling options apply to individual links. See the [tabs](#using-tabs) section below for how to configure tabs. |
-|Toolbar | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-toolbar.png" alt-text="Screenshot of toolbar style workbook link."::: | Links appear an Azure portal styled toolbar, with icons and text. Each link appears as a toolbar button. See the [toolbar](#using-toolbars) section below for how to configure toolbars. |
-
+|Bullet List | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-bullet.png" alt-text="Screenshot that shows a bullet-style workbook link."::: | The default, links, appears as a bulleted list of links, one on each line. The **Text before link** and **Text after link** fields can be used to add more text before or after the link components. |
+|List |:::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-list.png" alt-text="Screenshot that shows a list-style workbook link."::: | Links appear as a list of links, with no bullets. |
+|Paragraph | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-paragraph.png" alt-text="Screenshot that shows a paragraph-style workbook link."::: |Links appear as a paragraph of links, wrapped like a paragraph of text. |
+|Navigation | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-navigation.png" alt-text="Screenshot that shows a navigation-style workbook link."::: | Links appear as links with vertical dividers, or pipes, between each link. |
+|Tabs | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-tabs.png" alt-text="Screenshot that shows a tabs-style workbook link."::: |Links appear as tabs. Each link appears as a tab. No link styling options apply to individual links. To configure tabs, see the [Use tabs](#use-tabs) section. |
+|Toolbar | :::image type="content" source="media/workbooks-create-workbook/workbooks-link-style-toolbar.png" alt-text="Screenshot that shows a toolbar-style workbook link."::: | Links appear as an Azure portal-styled toolbar, with icons and text. Each link appears as a toolbar button. To configure toolbars, see the [Use toolbars](#use-toolbars) section. |
-**Link styles**
+#### Link styles
| Style | Description | |:- |:-|
-| Link | By default links appear as a hyperlink. URL links can only be link style. |
-| Button (Primary) | The link appears as a "primary" button in the portal, usually a blue color |
-| Button (Secondary) | The links appear as a "secondary" button in the portal, usually a "transparent" color, a white button in light themes and a dark gray button in dark themes. |
+| Link | By default, links appear as a hyperlink. URL links can only be link style. |
+| Button (primary) | The link appears as a "primary" button in the portal, usually a blue color. |
+| Button (secondary) | The links appear as a "secondary" button in the portal, usually a "transparent" color, a white button in light themes, and a dark gray button in dark themes. |
-If required parameters are used in button text, tooltip text, or value fields, and the required parameter is unset when using buttons, the button is disabled. You can use this capability, for example, to disable buttons when no value is selected in another parameter or control.
+If required parameters are used in button text, tooltip text, or value fields, and the required parameter is unset when you use buttons, the button is disabled. You can use this capability, for example, to disable buttons when no value is selected in another parameter or control.
### Link actions
-Links can use all of the link actions available in [link actions](workbooks-link-actions.md), and have two more available actions:
+
+Links can use all the link actions available in [link actions](workbooks-link-actions.md), and they have two more available actions.
| Action | Description | |:- |:-|
-|Set a parameter value | A parameter can be set to a value when selecting a link, button, or tab. Tabs are often configured to set a parameter to a value, which hides and shows other parts of the workbook based on that value.|
-|Scroll to a step| When selecting a link, the workbook will move focus and scroll to make another component visible. This action can be used to create a "table of contents", or a "go back to the top" style experience. |
+|Set a parameter value | A parameter can be set to a value when you select a link, button, or tab. Tabs are often configured to set a parameter to a value, which hides and shows other parts of the workbook based on that value.|
+|Scroll to a step| When you select a link, the workbook moves focus and scrolls to make another component visible. This action can be used to create a "table of contents" or a "go back to the top"-style experience. |
-### Using tabs
+### Use tabs
-Most of the time, tab links are combined with the **Set a parameter value** action. Here's an example showing the links component configured to create 2 tabs, where selecting either tab will set a **selectedTab** parameter to a different value (the example shows a third tab being edited to show the parameter name and parameter value placeholders):
+Most of the time, tab links are combined with the **Set a parameter value** action. This example shows the links step configured to create two tabs, where selecting either tab sets a **selectedTab** parameter to a different value. The example also shows a third tab being edited to show the parameter name and parameter value placeholders.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-creating-tabs.png" alt-text="Screenshot of creating tabs in workbooks.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-creating-tabs.png" alt-text="Screenshot that shows creating tabs in workbooks.":::
+You can then add other components in the workbook that are conditionally visible if the **selectedTab** parameter value is **1** by using the advanced settings.
-You can then add other components in the workbook that are conditionally visible if the **selectedTab** parameter value is "1" by using the advanced settings:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-selected-tab.png" alt-text="Screenshot that shows conditionally visible tab in workbooks.":::
- :::image type="content" source="media/workbooks-create-workbook/workbooks-selected-tab.png" alt-text="Screenshot of conditionally visible tab in workbooks.":::
+The first tab is selected by default, initially setting **selectedTab** to **1** and making that component visible. Selecting the second tab changes the value of the parameter to **2**, and different content is displayed.
-The first tab is selected by default, initially setting **selectedTab** to 1, and making that component visible. Selecting the second tab will change the value of the parameter to "2", and different content will be displayed:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-selected-tab2.png" alt-text="Screenshot that shows workbooks with content displayed when the selected tab is 2.":::
- :::image type="content" source="media/workbooks-create-workbook/workbooks-selected-tab2.png" alt-text="Screenshot of workbooks with content displayed when selected tab is 2.":::
-
-A sample workbook with the above tabs is available in [sample Azure Workbooks with links](workbooks-sample-links.md#sample-workbook-with-links).
+A sample workbook with the preceding tabs is available in [sample Azure workbooks with links](workbooks-sample-links.md#sample-workbook-with-links).
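If you prefer to inspect or set the conditional visibility directly in the workbook's JSON (through the **Advanced Editor**), it roughly corresponds to a `conditionalVisibility` setting on the component. The following snippet is an illustrative sketch rather than an exact export; the `name` and `content` values are placeholders:

```json
{
  "type": 1,
  "content": {
    "json": "Content that's shown only while the first tab is selected."
  },
  "conditionalVisibility": {
    "parameterName": "selectedTab",
    "comparison": "isEqualTo",
    "value": "1"
  },
  "name": "text - tab 1 content"
}
```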
### Tabs limitations

- URL links aren't supported in tabs. A URL link in a tab appears as a disabled tab.
- No component styling is supported in tabs. Components are displayed as tabs, and only the tab name (link text) field is displayed. Fields that aren't used in tab style are hidden while in edit mode.
- The first tab is selected by default, invoking whatever action that tab has specified. If the first tab's action opens another view, as soon as the tabs are created, a view appears.
+ - You can use tabs to open other views, but use this functionality sparingly. Most users won't expect to navigate by selecting a tab. If other tabs set a parameter to a specific value, a tab that opens a view wouldn't change that value, so the rest of the workbook content will continue to show the view/data for the previous tab.
-### Using toolbars
+### Use toolbars
-Use the Toolbar style to have your links appear styled as a toolbar. In toolbar style, the author must fill in fields for:
+Use the toolbar style to have your links appear styled as a toolbar. In toolbar style, you must fill in fields for:
+ - **Button text**: The text to display on the toolbar. Parameters can be used in this field.
+ - **Icons**: The icons to display on the toolbar.
+ - **Tooltip text**: The text to display in the toolbar button's tooltip. Parameters can be used in this field.
:::image type="content" source="media/workbooks-create-workbook/workbooks-links-create-toolbar.png" alt-text="Screenshot of creating links styled as a toolbar in workbooks.":::
-If any required parameters are used in button text, tooltip text, or value fields, and the required parameter is unset, the toolbar button will be disabled. For example, this can be used to disable toolbar buttons when no value is selected in another parameter/control.
+If any required parameters are used in button text, tooltip text, or value fields, and the required parameter is unset, the toolbar button will be disabled. For example, this functionality can be used to disable toolbar buttons when no value is selected in another parameter/control.
-A sample workbook with toolbars, globals parameters, and ARM Actions is available in [sample Azure Workbooks with links](workbooks-sample-links.md#sample-workbook-with-toolbar-links).
+A sample workbook with toolbars, global parameters, and Azure Resource Manager actions is available in [sample workbooks with links](workbooks-sample-links.md#sample-workbook-with-toolbar-links).
-## Adding groups
+## Add groups
-A group component in a workbook allows you to logically group a set of components in a workbook.
+You can logically group a set of components by using a group component in a workbook.
Groups in workbooks are useful for several things:

- **Layout**: When you want components to be organized vertically, you can create a group of components that will all stack up and set the styling of the group to be a percentage width instead of setting percentage width on all the individual components.
- - **Visibility**: When you want several components to hide or show together, you can set the visibility of the entire group of components, instead of setting visibility settings on each individual component. This can be useful in templates that use tabs, as you can use a group as the content of the tab, and the entire group can be hidden/shown based on a parameter set by the selected tab.
- - **Performance**: When you have a large template with many sections or tabs, you can convert each section into its own sub-template, and use groups to load all the sub-templates within the top-level template. The content of the sub-templates won't load or run until a user makes those groups visible. Learn more about [how to split a large template into many templates](#splitting-a-large-template-into-many-templates).
+ - **Visibility**: When you want several components to hide or show together, you can set the visibility of the entire group of components, instead of setting visibility settings on each individual component. This functionality can be useful in templates that use tabs. You can use a group as the content of the tab, and the entire group can be hidden or shown based on a parameter set by the selected tab.
+ - **Performance**: When you have a large template with many sections or tabs, you can convert each section into its own subtemplate. You can use groups to load all the subtemplates within the top-level template. The content of the subtemplates won't load or run until a user makes those groups visible. Learn more about [how to split a large template into many templates](#split-a-large-template-into-many-templates).
### Add a group to your workbook
-1. Make sure you are in **Edit** mode by selecting the **Edit** in the toolbar.
-1. Add a group by doing either of these steps:
- - Select **Add**, and **Add group** below an existing element, or at the bottom of the workbook.
- - Select the ellipses (...) to the right of the **Edit** button next to one of the elements in the workbook, then select **Add** and then **Add group**.
+1. Make sure you're in edit mode by selecting **Edit**.
+1. Add a group by doing one of these steps:
+ - Select **Add** > **Add group** below an existing element or at the bottom of the workbook.
+ - Select the ellipsis (...) to the right of the **Edit** button next to one of the elements in the workbook. Then select **Add** > **Add group**.
+
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-add-group.png" alt-text="Screenshot that shows adding a group to a workbook. ":::
- :::image type="content" source="media/workbooks-create-workbook/workbooks-add-group.png" alt-text="Screenshot showing selecting adding a group to a workbook. ":::
1. Select components for your group.
-1. Select **Done editing.**
+1. Select **Done Editing**.
- This is a group in read mode with two components inside: a text component and a query component.
+ This group is in read mode with two components inside: a text component and a query component.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-view.png" alt-text="Screenshot showing a group in read mode in a workbook.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-view.png" alt-text="Screenshot that shows a group in read mode in a workbook.":::
- In edit mode, you can see those two components are actually inside a group component. In the screenshot below, the group is in edit mode. The group contains two components inside the dashed area. Each component can be in edit or read mode, independent of each other. For example, the text step is in edit mode while the query step is in read mode.
+ In edit mode, you can see those two components are actually inside a group component. In the following screenshot, the group is in edit mode. The group contains two components inside the dashed area. Each component can be in edit or read mode, independent of each other. For example, the text step is in edit mode while the query step is in read mode.
:::image type="content" source="media/workbooks-create-workbook/workbooks-groups-edit.png" alt-text="Screenshot of a group in edit mode in a workbook.":::

### Scoping a group
-A group is treated as a new scope in the workbook. Any parameters created in the group are only visible inside the group. This is also true for merge - you can only see data inside their group or at the parent level.
+A group is treated as a new scope in the workbook. Any parameters created in the group are only visible inside the group. This is also true for merge. You can only see data inside the group or at the parent level.
### Group types

You can specify which type of group to add to your workbook. There are two types of groups:
+ - **Editable**: The group in the workbook allows you to add, remove, or edit the contents of the components in the group. This group is most commonly used for layout and visibility purposes.
+ - **From a template**: The group in the workbook loads from the contents of another workbook by its ID. The content of that workbook is loaded and merged into the workbook at runtime. In edit mode, you can't modify any of the contents of the group. They'll just load again from the template the next time the component loads. When you load a group from a template, use the full Azure Resource ID of an existing workbook.
### Load types
You can specify how and when the contents of a group are loaded.
#### Lazy loading
-Lazy loading is the default. In lazy loading, the group is only loaded when the component is visible. This allows a group to be used by tab components. If the tab is never selected, the group never becomes visible and therefore the content isn't loaded.
+Lazy loading is the default. In lazy loading, the group is only loaded when the component is visible. This functionality allows a group to be used by tab components. If the tab is never selected, the group never becomes visible, so the content isn't loaded.
For groups created from a template, the content of the template isn't retrieved and the components in the group aren't created until the group becomes visible. Users see progress spinners for the whole group while the content is retrieved.

#### Explicit loading
-In this mode, a button is displayed where the group would be, and no content is retrieved or created until the user explicitly clicks the button to load the content. This is useful in scenarios where the content might be expensive to compute or rarely used. The author can specify the text to appear on the button.
-
-This screenshot shows explicit load settings with a configured "Load more" button.
+In this mode, a button is displayed where the group would be. No content is retrieved or created until the user explicitly selects the button to load the content. This functionality is useful in scenarios where the content might be expensive to compute or rarely used. You can specify the text to appear on the button.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded.png" alt-text="Screenshot of explicit load settings for a group in workbooks.":::
+This screenshot shows explicit load settings with a configured **Load More** button:
-This is the group before being loaded in the workbook:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded.png" alt-text="Screenshot that shows explicit load settings for a group in the workbook.":::
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded-before.png" alt-text="Screenshot showing an explicit group before being loaded in the workbook.":::
+This screenshot shows the group before being loaded in the workbook:
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded-before.png" alt-text="Screenshot that shows an explicit group before being loaded in the workbook.":::
-The group after being loaded in the workbook:
+This screenshot shows the group after being loaded in the workbook:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded-after.png" alt-text="Screenshot showing an explicit group after being loaded in the workbook.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-explicitly-loaded-after.png" alt-text="Screenshot that shows an explicit group after being loaded in the workbook.":::
#### Always mode
-In **Always** mode, the content of the group is always loaded and created as soon as the workbook loads. This is most frequently used when you're using a group only for layout purposes, where the content will always be visible.
+In **Always** mode, the content of the group is always loaded and created as soon as the workbook loads. This functionality is most frequently used when you're using a group only for layout purposes, where the content is always visible.
-### Using templates inside a group
+### Use templates inside a group
-When a group is configured to load from a template, by default, that content will be loaded in lazy mode, and it will only load when the group is visible.
+When a group is configured to load from a template, by default, that content is loaded in lazy mode. It only loads when the group is visible.
-When a template is loaded into a group, the workbook attempts to merge any parameters declared in the template with parameters that already exist in the group. Any parameters that already exist in the workbook with identical names will be merged out of the template being loaded. If all parameters in a parameter component are merged out, the entire parameters component will disappear.
+When a template is loaded into a group, the workbook attempts to merge any parameters declared in the template with parameters that already exist in the group. Any parameters that already exist in the workbook with identical names are merged out of the template being loaded. If all parameters in a parameter component are merged out, the entire parameters component disappears.
#### Example 1: All parameters have identical names
-Suppose you have a template that has two parameters at the top, a time range parameter and a text parameter named "**Filter**":
+Suppose you have a template that has two parameters at the top, a time range parameter and a text parameter named **Filter**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-top-level-params.png" alt-text="Screenshot showing top level parameters in a workbook.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-top-level-params.png" alt-text="Screenshot that shows top-level parameters in a workbook.":::
Then a group component loads a second template that has its own two parameters and a text component, where the parameters are named the same:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-merged-away.png" alt-text="Screenshot of a workbook template with top level parameters.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-merged-away.png" alt-text="Screenshot that shows a workbook template with top-level parameters.":::
-When the second template is loaded into the group, the duplicate parameters are merged out. Since all of the parameters are merged away, the inner parameters component is also merged out, resulting in the group containing only the text component.
+When the second template is loaded into the group, the duplicate parameters are merged out. Because all the parameters are merged away, the inner parameters component is also merged out. The result is that the group contains only the text component.
#### Example 2: One parameter has an identical name
-Suppose you have a template that has two parameters at the top, a **time range** parameter and a text parameter named "**FilterB**" ():
+Suppose you have a template that has two parameters at the top, a time range parameter and a text parameter named **FilterB**:
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-wont-merge-away.png" alt-text="Screenshot of a group component with the result of parameters merged away.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-wont-merge-away.png" alt-text="Screenshot that shows a group component with the result of parameters merged away.":::
When the group component's template is loaded, the **TimeRange** parameter is merged out of the group. The workbook contains the initial parameters component with **TimeRange** and **Filter**, and the group's parameters component includes only **FilterB**.
- :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-wont-merge-away-result.png" alt-text="Screenshot of workbook group where parameters won't merge away.":::
+ :::image type="content" source="media/workbooks-create-workbook/workbooks-groups-wont-merge-away-result.png" alt-text="Screenshot that shows a workbook group where parameters won't merge away.":::
-If the loaded template had contained **TimeRange** and **Filter** (instead of **FilterB**), then the resulting workbook would have a parameters component and a group with only the text component remaining.
+If the loaded template had contained **TimeRange** and **Filter** (instead of **FilterB**), the resulting workbook would have a parameters component and a group with only the text component remaining.
-### Splitting a large template into many templates
+### Split a large template into many templates
-To improve performance, it's helpful to break up a large template into multiple smaller templates that loads some content in lazy mode or on demand by the user. This makes the initial load faster since the top-level template can be much smaller.
+To improve performance, it's helpful to break up a large template into multiple smaller templates that load some content in lazy mode or on demand by the user. This arrangement makes the initial load faster because the top-level template can be much smaller.
-When splitting a template into parts, you'll basically need to split the template into many templates (sub-templates) that all work individually. If the top-level template has a **TimeRange** parameter that other components use, the sub-template will need to also have a parameters component that defines a parameter with same exact name. The sub-templates will work independently and can load inside larger templates in groups.
+When you split a template into parts, you need to split the template into many templates (subtemplates) that all work individually. If the top-level template has a **TimeRange** parameter that other components use, the subtemplate also needs to have a parameters component that defines a parameter with the same exact name. The subtemplates work independently and can load inside larger templates in groups.
-To turn a larger template into multiple sub-templates:
+To turn a larger template into multiple subtemplates:
-1. Create a new empty group near the top of the workbook, after the shared parameters. This new group will eventually become a sub-template.
-1. Create a copy of the shared parameters component, and then use **move into group** to move the copy into the group created in step 1. This parameter allows the sub-template to work independently of the outer template, and will get merged out when loaded inside the outer template.
+1. Create a new empty group near the top of the workbook, after the shared parameters. This new group eventually becomes a subtemplate.
+1. Create a copy of the shared parameters component. Then use **move into group** to move the copy into the group created in step 1. This parameter allows the subtemplate to work independently of the outer template and is merged out when it's loaded inside the outer template.
> [!NOTE]
- > sub-templates don't technically need to have the parameters that get merged out if you never plan on the sub-templates being visible by themselves. However, if the sub-templates do not have the parameters, it will make them very hard to edit or debug if you need to do so later.
-
-1. Move each component in the workbook you want to be in the sub-template into the group created in step 1.
-1. If the individual components moved in step 3 had conditional visibilities, that will become the visibility of the outer group (like used in tabs). Remove them from the components inside the group and add that visibility setting to the group itself. Save here to avoid losing changes and/or export and save a copy of the json content.
-1. If you want that group to be loaded from a template, you can use the **Edit** toolbar button in the group. This will open just the content of that group as a workbook in a new window. You can then save it as appropriate and close this workbook view (don't close the browser, just that view to go back to the previous workbook you were editing).
-1. You can then change the group component to load from template and set the template ID field to the workbook/template you created in step 5. To work with workbooks IDs, the source needs to be the full Azure Resource ID of a shared workbook. Press *Load* and the content of that group will now be loaded from that sub-template instead of saved inside this outer workbook.
+ > Subtemplates don't technically need to have the parameters that get merged out if you never plan on the subtemplates being visible by themselves. If the subtemplates don't have the parameters, they'll be hard to edit or debug if you need to do so later.
+1. Move each component in the workbook you want to be in the subtemplate into the group created in step 1.
+1. If the individual components moved in step 3 had conditional visibilities, that visibility becomes the visibility of the outer group (as used in tabs). Remove the visibility settings from the components inside the group and add them to the group itself. Save here to avoid losing changes. You can also export and save a copy of the JSON content.
+1. If you want that group to be loaded from a template, you can use **Edit** in the group. This action opens only the content of that group as a workbook in a new window. You can then save it as appropriate and close this workbook view. Don't close the browser. Only close that view to go back to the previous workbook where you were editing.
+1. You can then change the group component to load from a template and set the template ID field to the workbook/template you created in step 5. To work with workbook IDs, the source needs to be the full Azure Resource ID of a shared workbook. Select **Load** and the content of that group is now loaded from that subtemplate instead of being saved inside this outer workbook.
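For reference, the full Azure Resource ID of a shared workbook generally follows this pattern; the subscription ID, resource group, and workbook GUID shown here are placeholders:

```
/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/microsoft.insights/workbooks/00000000-0000-0000-0000-000000000001
```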
## Next steps
-- [Common Workbook use cases](workbooks-commonly-used-components.md)
+
+[Common Azure Workbooks use cases](workbooks-commonly-used-components.md)
azure-monitor Workbooks Criteria https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-criteria.md
Title: Azure Workbooks criteria parameters.
-description: Learn about adding criteria parameters to your Azure workbook.
+ Title: Azure Workbooks criteria parameters
+description: Learn about adding criteria parameters to your workbook.
# Text parameter criteria
-When a query depends on many parameters, then the query will be stalled until each of its parameters have been resolved. Sometimes a parameter could have a simple query that concatenates a string or performs a conditional evaluation. However these queries still make network calls to services that perform these basic operations and that increases the time it takes for a parameter to resolve a value. This results in long load times for complex workbooks.
+When a query depends on many parameters, the query will be stalled until each of its parameters has been resolved. Sometimes a parameter could have a simple query that concatenates a string or performs a conditional evaluation. These queries still make network calls to services that perform these basic operations, and that process increases the time it takes for a parameter to resolve a value. The result is long load times for complex workbooks.
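For example, a text parameter that only needs to combine the values of two other parameters might be configured with a query such as the following sketch, where `paramA` and `paramB` are hypothetical parameters. Even this trivial query is sent to a service and adds a network round trip:

```kusto
print combined = strcat('{paramA}', ' - ', '{paramB}')
```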
-Using criteria parameters, you can define a set of criteria based on previously specified parameters which will be evaluated to provide a dynamic value. The main benefit of using criteria parameters is that criteria parameters can resolve values of previously specified parameters and perform simple conditional operations without making any network calls. Below is an example of such a use case.
+When you use criteria parameters, you can define a set of criteria based on previously specified parameters that will be evaluated to provide a dynamic value. The main benefit of using criteria parameters is that criteria parameters can resolve values of previously specified parameters and perform simple conditional operations without making any network calls. The following example is a criteria-parameters use case.
## Example
-Consider the conditional query below:
+Consider the following conditional query:
+
```
let metric = dynamic({Counter});
print tostring((metric.object == 'Network Adapter' and (metric.counter == 'Bytes Received/sec' or metric.counter == 'Bytes Sent/sec')) or (metric.object == 'Network' and (metric.counter == 'Total Bytes Received' or metric.counter == 'Total Bytes Transmitted')))
```
-If the user is focused on the `metric.counter` object, essentially the value of the parameter `isNetworkCounter` should be true, if the parameter `Counter` has `Bytes Received/sec`, `Bytes Sent/sec`, `Total Bytes Received`, or `Total Bytes Transmitted`.
+If you're focused on the `metric.counter` object, the value of the parameter `isNetworkCounter` should be true if the parameter `Counter` has `Bytes Received/sec`, `Bytes Sent/sec`, `Total Bytes Received`, or `Total Bytes Transmitted`.
-This can be translated to a criteria text parameter like so:
+This can be translated to a criteria text parameter:
-In the image above, the conditions will be evaluated from top to bottom and the value of the parameter `isNetworkCounter` will take the value of which ever condition evaluates to true first. All conditions except for the default condition (the 'else' condition) can be reordered to get the desired outcome.
+In the preceding screenshot, the conditions will be evaluated from top to bottom and the value of the parameter `isNetworkCounter` will take the value of whichever condition evaluates to true first. All conditions except for the default condition (the "else" condition) can be reordered to get the desired outcome.
## Set up criteria
+
1. Start with a workbook with at least one existing parameter in edit mode.
- 1. Choose Add parameters from the links within the workbook.
- 1. Select on the blue Add Parameter button.
- 1. In the new parameter pane that pops up enter:
- - Parameter name: rand
- - Parameter type: Text
- - Required: checked
- - Get data from: Query
- - Enter `print rand(0-1)` into the query editor. This parameter will output a value between 0-1.
- 1. Choose 'Save' from the toolbar to create the parameter.
+ 1. Select **Add parameters** > **Add Parameter**.
+ 1. In the new parameter pane that opens, enter:
+ - **Parameter name**: `rand`
+ - **Parameter type**: `Text`
+ - **Required**: `checked`
+ - **Get data from**: `Query`
+ - Enter `print rand(0-1)` in the query editor. This parameter will output a value between 0-1.
+ 1. Select **Save** to create the parameter.
> [!NOTE]
- > The first parameter in the workbook will not show the `Criteria` tab.
+ > The first parameter in the workbook won't show the **Criteria** tab.
- :::image type="content" source="media/workbooks-criteria/workbooks-criteria-first-param.png" alt-text="Screenshot showing the first parameter.":::
+ :::image type="content" source="media/workbooks-criteria/workbooks-criteria-first-param.png" alt-text="Screenshot that shows the first parameter.":::
-1. In the table with the 'rand' parameter, select on the blue Add Parameter button.
-1. In the new parameter pane that pops up enter:
- - Parameter name: randCriteria
- - Parameter type: Text
- - Required: checked
- - Get data from: Criteria
-1. A grid appears. Select **Edit** next to the blank text box to open the 'Criteria Settings' form. Refer to [Criteria Settings form](#criteria-settings-form) for the description of each field.
+1. In the table with the `rand` parameter, select **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ - **Parameter name**: `randCriteria`
+ - **Parameter type**: `Text`
+ - **Required**: `checked`
+ - **Get data from**: `Criteria`
+1. A grid appears. Select **Edit** next to the blank text box to open the **Criteria Settings** form. For a description of each field, see [Criteria Settings form](#criteria-settings-form).
- :::image type="content" source="media/workbooks-criteria/workbooks-criteria-setting.png" alt-text="Screenshot showing the criteria settings form.":::
+ :::image type="content" source="media/workbooks-criteria/workbooks-criteria-setting.png" alt-text="Screenshot that shows the Criteria Settings form.":::
-1. Enter the data below to populate the first Criteria, then select 'OK'.
- - First operand: rand
- - Operator: >
- - Value from: Static Value
- - Second Operand: 0.25
- - Value from: Static Value
- - Result is: is over 0.25
+1. Enter the following data to populate the first criteria, and then select **OK**:
+ - **First operand**: `rand`
+ - **Operator**: `>`
+ - **Value from**: `Static Value`
+ - **Second operand**: `0.25`
+ - **Value from**: `Static Value`
+ - **Result is**: `is over 0.25`
- :::image type="content" source="media/workbooks-criteria/workbooks-criteria-setting-filled.png" alt-text="Screenshot showing the criteria settings form filled.":::
+ :::image type="content" source="media/workbooks-criteria/workbooks-criteria-setting-filled.png" alt-text="Screenshot that shows the Criteria Settings form filled in.":::
-1. Select on edit, next to the condition `Click edit to specify a result for the default condition.`, this will edit the default condition.
+1. Select **Edit** next to the condition `Click edit to specify a result for the default condition` to edit the default condition.
> [!NOTE]
- > For the default condition, everthing should be disabled except for the last `Value from` and `Result is` fields.
+ > For the default condition, everything should be disabled except for the last `Value from` and `Result is` fields.
+
+1. Enter the following data to populate the default condition, and then select **OK**:
+ - **Value from**: Static Value
+ - **Result is**: is 0.25 or under
-1. Enter the data below to populate the default condition, then select 'OK'.
- - Value from: Static Value
- - Result is: is 0.25 or under
+ :::image type="content" source="media/workbooks-criteria/workbooks-criteria-default.png" alt-text="Screenshot that shows the Criteria Settings default form filled.":::
- :::image type="content" source="media/workbooks-criteria/workbooks-criteria-default.png" alt-text="Screenshot showing the criteria settings default form filled.":::
+1. Save the parameter.
+1. Refresh the workbook to see the `randCriteria` parameter in action. Its value will be based on the value of `rand`.
-1. Save the Parameter
-1. Select on the refresh button on the workbook, to see the `randCriteria` parameter in action. Its value will be based on the value of `rand`!
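To confirm the behavior, you can reference both parameters from a query step. The following is a minimal sketch; the column names are arbitrary, and the `category` column switches between the two result strings as `rand` changes:

```kusto
print randValue = toreal('{rand}'), category = '{randCriteria}'
```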
+## Criteria Settings form
-## Criteria settings form
|Form fields|Description|
|--|-|
-|First operand| This is a dropdown consisting of parameter names that have already been created. The value of the parameter will be used on the left hand side of the comparison |
-|Operator|The operator used to compare the first and the second operands. Can be a numerical or string evaluation. The operator `is empty` will disable the `Second operand` as only the `First operand` is required.|
-|Value from|If set to `Parameter`, a dropdown consisting of parameters that have already been created will be shown. The value of that parameter will be used on the right hand side of the comparison.<br/> If set to `Static Value`, a text box will be shown where an author can enter a value for the right hand side of the comparison.|
-|Second Operand| Will be either a dropdown menu consisting of created parameters, or a textbox depending on the above `Value from` selection.|
-|Value from|If set to `Parameter`, a dropdown consisting of parameters that have already been created will be shown. The value of that parameter will be used for the return value of the current parameter.<br/> If set to `Static Value`:<br>a text box will be shown where an author can enter a value for the result.<br>>An author can also dereference other parameters by using curly braces around the parameter name.<br>>It is possible concatenate multiple parameters and create a custom string, for example: "`{paramA}`, `{paramB}`, and some string" <br><br>If set to `Expression`:<br> a text box will be shown where an author can enter a mathematical expression that will be evaluated as the result<br>Like the `Static Value` case, multiple parameters may be dereferenced in this text box.<br>If the parameter value referenced in the text box is not a number, it will be treated as the value `0`|
-|Result is| Will be either a dropdown menu consisting of created parameters, or a textbox depending on the above Value from selection. The textbox will be evaluated as the final result of this Criteria Settings form.
+|First operand| This dropdown list consists of parameter names that have already been created. The value of the parameter will be used on the left side of the comparison. |
+|Operator|The operator used to compare the first and second operands. Can be a numerical or string evaluation. The operator `is empty` will disable the `Second operand` because only the `First operand` is required.|
+|Value from|If set to `Parameter`, a dropdown list consisting of parameters that have already been created appears. The value of that parameter will be used on the right side of the comparison.<br/> If set to `Static Value`, a text box appears where you can enter a value for the right side of the comparison.|
+|Second operand| Will be either a dropdown menu consisting of created parameters or a text box depending on the preceding `Value from` selection.|
+|Value from|If set to `Parameter`, a dropdown list consisting of parameters that have already been created appears. The value of that parameter will be used for the return value of the current parameter.<br/> If set to `Static Value`:<br>- A text box appears where you can enter a value for the result.<br>- You can also dereference other parameters by using curly braces around the parameter name.<br>- It's possible to concatenate multiple parameters and create a custom string, for example, "`{paramA}`, `{paramB}`, and some string." <br><br>If set to `Expression`:<br> - A text box appears where you can enter a mathematical expression that will be evaluated as the result.<br>- Like the `Static Value` case, multiple parameters might be dereferenced in this text box.<br>- If the parameter value referenced in the text box isn't a number, it will be treated as the value `0`.|
+|Result is| Will be either a dropdown menu consisting of created parameters or a text box, depending on the preceding `Value from` selection. The text box will be evaluated as the final result of this **Criteria Settings** form.|
azure-monitor Workbooks Dropdowns https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-dropdowns.md
Title: Azure Monitor Workbook drop down parameters
-description: Simplify complex reporting with prebuilt and custom parameterized workbooks containing dropdown parameters
+ Title: Azure Monitor workbook dropdown parameters
+description: Simplify complex reporting with prebuilt and custom parameterized workbooks containing dropdown parameters.
Last updated 07/05/2022
-# Workbook drop down parameters
+# Workbook dropdown parameters
-Drop downs allow user to collect one or more input values from a known set (for example, select one of your app's requests). Drop downs provide a user-friendly way to collect arbitrary inputs from users. Drop downs are especially useful in enabling filtering in your interactive reports.
+By using dropdown parameters, you can collect one or more input values from a known set. For example, you can use a dropdown parameter to select one of your app's requests. Dropdown parameters also provide a user-friendly way to collect arbitrary inputs from users. Dropdown parameters are especially useful in enabling filtering in your interactive reports.
-The easiest way to specify a drop-down is by providing a static list in the parameter setting. A more interesting way is to get the list dynamically via a KQL query. Parameter settings also allow you to specify whether it is single or multi-select, and if it is multi-select, how the result set should be formatted (delimiter, quotation, etc.).
+The easiest way to specify a dropdown parameter is by providing a static list in the parameter setting. A more interesting way is to get the list dynamically via a KQL query. You can also specify whether it's single or multi-select by using parameter settings. If it's multi-select, you can specify how the result set should be formatted, for example, as delimiter or quotation.
-## Creating a static drop-down parameter
+## Create a static dropdown parameter
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Click on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `Environment`
- 2. Parameter type: `Drop down`
- 3. Required: `checked`
- 4. Allow `multiple selection`: `unchecked`
- 5. Get data from: `JSON`
-5. In the JSON Input text block, insert this json snippet:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `Environment`
+ 1. **Parameter type**: `Drop down`
+ 1. **Required**: `checked`
+ 1. **Allow multiple selections**: `unchecked`
+ 1. **Get data from**: `JSON`
+1. In the **JSON Input** text block, insert this JSON snippet:
+
```json
[
  { "value":"dev", "label":"Development" },
The easiest way to specify a drop-down is by providing a static list in the para
{ "value":"prod", "label":"Production", "selected":true } ] ```
-6. Hit the blue `Update` button.
-7. Choose 'Save' from the toolbar to create the parameter.
-8. The Environment parameter will be a drop-down with the three values.
- ![Image showing the creation of a static drown down](./media/workbooks-dropdowns/dropdown-create.png)
+1. Select **Update**.
+1. Select **Save** to create the parameter.
+1. The **Environment** parameter will be a dropdown list with the three values.
+
+ ![Screenshot that shows the creation of a static dropdown parameter.](./media/workbooks-dropdowns/dropdown-create.png)
-## Creating a static dropdown with groups of items
+## Create a static dropdown list with groups of items
-If your query result/json contains a "group" field, the dropdown will display groups of values. Follow the above sample, but use the following json instead:
+If your query result/JSON contains a `group` field, the dropdown list will display groups of values. Follow the preceding sample, but use the following JSON instead:
```json
[
If your query result/json contains a "group" field, the dropdown will display gr
]
```
-![Image showing an example of a grouped dropdown](./media/workbooks-dropdowns/grouped-dropDown.png)
+![Screenshot that shows an example of a grouped dropdown list.](./media/workbooks-dropdowns/grouped-dropDown.png)
+## Create a dynamic dropdown parameter
-## Creating a dynamic drop-down parameter
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Click on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `RequestName`
- 2. Parameter type: `Drop down`
- 3. Required: `checked`
- 4. Allow `multiple selection`: `unchecked`
- 5. Get data from: `Query`
-5. In the JSON Input text block, insert this json snippet:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `RequestName`
+ 1. **Parameter type**: `Drop down`
+ 1. **Required**: `checked`
+ 1. **Allow multiple selections**: `unchecked`
+ 1. **Get data from**: `Query`
+1. In the query editor, insert this KQL snippet:
```kusto
requests
| summarize by name
| order by name asc
```
-1. Hit the blue `Run Query` button.
-2. Choose 'Save' from the toolbar to create the parameter.
-3. The RequestName parameter will be a drop-down the names of all requests in the app.
- ![Image showing the creation of a dynamic drop-down](./media/workbooks-dropdowns/dropdown-dynamic.png)
+1. Select **Run Query**.
+1. Select **Save** to create the parameter.
+1. The **RequestName** parameter will be a dropdown list with the names of all requests in the app.
+
+ ![Screenshot that shows the creation of a dynamic dropdown parameter.](./media/workbooks-dropdowns/dropdown-dynamic.png)
+
+## Reference a dropdown parameter
-## Referencing drop down parameter
+You can reference dropdown parameters.
### In KQL
-1. Add a query control to the workbook and select an Application Insights resource.
-2. In the KQL editor, enter this snippet
+
+1. Select **Add query** to add a query control, and then select an Application Insights resource.
+1. In the KQL editor, enter this snippet:
```kusto
requests
If your query result/json contains a "group" field, the dropdown will display gr
| summarize Requests = count() by bin(timestamp, 1h)
```
-3. This expands on query evaluation time to:
+
+1. The snippet expands on query evaluation time to:
```kusto
requests
If your query result/json contains a "group" field, the dropdown will display gr
| summarize Requests = count() by bin(timestamp, 1h)
```
-4. Run query to see the results. Optionally, render it as a chart.
+1. Run the query to see the results. Optionally, render it as a chart.
- ![Image showing a drop-down referenced in KQL](./media/workbooks-dropdowns/dropdown-reference.png)
+ ![Screenshot that shows a dropdown parameter referenced in KQL.](./media/workbooks-dropdowns/dropdown-reference.png)
+## Parameter value, label, selection, and group
-## Parameter value, label, selection and group
-The query used in the dynamic drop-down parameter above just returns a list of values that are rendered faithfully in the drop-down. But what if you wanted a different display name, or one of these to be selected? Drop down parameters allow this via the value, label, selection and group columns.
+The query used in the preceding dynamic dropdown parameter returns a list of values that are rendered faithfully in the dropdown list. But what if you wanted a different display name or one of the names to be selected? Dropdown parameters use value, label, selection, and group columns for this functionality.
-The sample below shows how to get a list of Application Insights dependencies whose display names are styled with an emoji, has the first one selected, and is grouped by operation names.
+The following sample shows how to get a list of Application Insights dependencies whose display names are styled with an emoji, with the first one selected, and grouped by operation name:
```kusto
dependencies
dependencies
| project value = name, label = strcat('🌐 ', name), selected = iff(Rank == 1, true, false), group = operation_Name
```
-![Image showing a drop-down parameter using value, label, selection and group options](./media/workbooks-dropdowns/dropdown-more-options.png)
+![Screenshot that shows a dropdown parameter using value, label, selection, and group options.](./media/workbooks-dropdowns/dropdown-more-options.png)
+## Dropdown parameter options
-## Drop down parameter options
-| Parameter | Explanation | Example |
+| Parameter | Description | Example |
| - |:-|:-|
| `{DependencyName}` | The selected value | GET fabrikamaccount |
| `{DependencyName:label}` | The selected label | 🌐 GET fabrikamaccount |
| `{DependencyName:value}` | The selected value | GET fabrikamaccount |

## Multiple selection
-The examples so far explicitly set the parameter to select only one value in the drop-down. Drop down parameters also support `multiple selection` - enabling this is as simple as checking the `Allow multiple selection` option.
-The user also has the option of specifying the format of the result set via the `delimiter` and `quote with` settings. The default just returns the values as a collection in this form: 'a', 'b', 'c'. They also have the option to limit the number of selections.
+The examples so far explicitly set the parameter to select only one value in the dropdown list. Dropdown parameters also support *multiple selection*. To enable this option, select the **Allow multiple selections** checkbox.
+
+You can specify the format of the result set via the **Delimiter** and **Quote with** settings. The default returns the values as a collection in the form of **a**, **b**, **c**. You can also limit the number of selections.
The KQL referencing the parameter will need to change to work with the format of the result. The most common way to enable it is via the `in` operator.
dependencies
| summarize Requests = count() by bin(timestamp, 1h), name
```
-Here is an example for multi-select drop-down at work:
+This example shows the multi-select dropdown parameter at work:
-![Image showing a multi-select drop-down parameter](./media/workbooks-dropdowns/dropdown-multiselect.png)
+![Screenshot that shows a multi-select dropdown parameter.](./media/workbooks-dropdowns/dropdown-multiselect.png)
## Next steps
+[Getting started with Azure Workbooks](workbooks-getting-started.md)
azure-monitor Workbooks Multi Value https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-multi-value.md
Title: Azure Workbooks multi value parameters.
-description: Learn about adding multi value parameters to your Azure workbook.
+ Title: Azure Workbooks multi-value parameters
+description: Learn about adding multi-value parameters to your workbook.
Last updated 07/05/2022
-# Multi-value Parameters
+# Multi-value parameters
-A multi-value parameter allows the user to set one or more arbitrary text values. Multi-value parameters are commonly used for filtering, often when a drop-down control may contain too many values to be useful.
+A multi-value parameter allows the user to set one or more arbitrary text values. Multi-value parameters are commonly used for filtering, often when a dropdown control might contain too many values to be useful.
+## Create a static multi-value parameter
-## Creating a static multi-value parameter
1. Start with an empty workbook in edit mode.
-1. Select **Add parameters** from the links within the workbook.
-1. Select the blue _Add Parameter_ button.
-1. In the new parameter pane that pops up enter:
- - Parameter name: `Filter`
- - Parameter type: `Multi-value`
- - Required: `unchecked`
- - Get data from: `None`
-1. Select **Save** from the toolbar to create the parameter.
-1. The Filter parameter will be a multi-value parameter, initially with no values:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ - **Parameter name**: `Filter`
+ - **Parameter type**: `Multi-value`
+ - **Required**: `unchecked`
+ - **Get data from**: `None`
+1. Select **Save** to create the parameter.
+1. The **Filter** parameter will be a multi-value parameter, initially with no values.
- :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-create.png" alt-text="Screenshot showing the creation of mulit-value parameter in workbooks.":::
+ :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-create.png" alt-text="Screenshot that shows the creation of a multi-value parameter in a workbook.":::
-1. You can then add multiple values:
+1. You can then add multiple values.
- :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-third-value.png" alt-text="Screenshot showing the user adding a third value in workbooks.":::
+ :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-third-value.png" alt-text="Screenshot that shows adding a third value in a workbook.":::
-
-A multi-value parameter behaves similarly to a multi-select [drop down parameter](workbooks-dropdowns.md). As such, it is commonly used in an "in" like scenario
+A multi-value parameter behaves similarly to a multi-select [dropdown parameter](workbooks-dropdowns.md) and is commonly used in an "in"-like scenario.
```
let computerFilter = dynamic([{Computer}]);
A multi-value parameter behaves similarly to a multi-select [drop down parameter
```

## Parameter field style
-Multi-value parameter supports following field style:
-1. Standard: Allows a user to add or remove arbitrary text items
- :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-standard.png" alt-text="Screenshot showing standard workbooks multi-value field.":::
+A multi-value parameter supports the following field styles:
+
+1. **Standard**: Allows you to add or remove arbitrary text items.
+
+ :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-standard.png" alt-text="Screenshot that shows the workbook standard multi-value field.":::
+
+1. **Password**: Allows you to add or remove arbitrary password fields. The password values are only hidden in the UI when you type. The values are still fully accessible as a parameter value when referred. They're stored unencrypted when the workbook is saved.
-1. Password: Allows a user to add or remove arbitrary password fields. The password values are only hidden on UI when user types. The values are still fully accessible as a param value when referred and they are stored unencrypted when workbook is saved.
+ :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-password.png" alt-text="Screenshot that shows a workbook password multi-value field.":::
- :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-password.png" alt-text="Screenshot showing a workbooks password multi-value field.":::
+## Create a multi-value parameter with initial values
-## Creating a multi-value with initial values
-You can use a query to seed the multi-value parameter with initial values. The user can then manually remove values, or add more values. If a query is used to populate the multi-value parameter, a restore defaults button will appear on the parameter to restore back to the originally queried values.
+You can use a query to seed the multi-value parameter with initial values. You can then manually remove values or add more values. If a query is used to populate the multi-value parameter, a restore defaults button appears on the parameter to restore back to the originally queried values.
1. Start with an empty workbook in edit mode.
-1. Select **add parameters** from the links within the workbook.
-1. Select **Add Parameter**.
-1. In the new parameter pane that pops up enter:
- - Parameter name: `Filter`
- - Parameter type: `Multi-value`
- - Required: `unchecked`
- - Get data from: `JSON`
-1. In the JSON Input text block, insert this json snippet:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ - **Parameter name**: `Filter`
+ - **Parameter type**: `Multi-value`
+ - **Required**: `unchecked`
+ - **Get data from**: `JSON`
+1. In the **JSON Input** text block, insert this JSON snippet:
+
```
["apple", "banana", "carrot" ]
```
- All of the items that are the result of the query will be shown as multi value items.
- (you are not limited to JSON, you can use any query provider to provide initial values, but will be limited to the first 100 results)
+
+ All the items that are the result of the query are shown as multi-value items.
+ You aren't limited to JSON. You can use any query provider to provide initial values, but you'll be limited to the first 100 results.
1. Select **Run Query**.
-1. Select **Save** from the toolbar to create the parameter.
-1. The Filter parameter will be a multi-value parameter with three initial values.
+1. Select **Save** to create the parameter.
+1. The **Filter** parameter will be a multi-value parameter with three initial values.
+
+ :::image type="content" source="media/workbooks-multi-value/workbooks-multi-value-initial-values.png" alt-text="Screenshot that shows the creation of a dynamic dropdown in a workbook.":::
- :::Screenshot type="content" source="media/workbooks-multi-value/workbooks-multi-value-initial-values.png" alt-text="Screenshot showing the creation of a dynamic drop-down in workbooks.":::
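As noted earlier, you aren't limited to JSON when seeding initial values. For example, if you set **Get data from** to **Query**, a short KQL query such as this sketch (the `requests` table is an Application Insights assumption) seeds the parameter with request names instead:

```kusto
requests
| summarize by name
| order by name asc
```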
## Next steps

-- [Workbook parameters](workbooks-parameters.md).
-- [Workbook drop down parameters](workbooks-dropdowns.md)
+- [Workbook parameters](workbooks-parameters.md)
+- [Workbook dropdown parameters](workbooks-dropdowns.md)
azure-monitor Workbooks Options Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-options-group.md
Title: Azure Workbooks options group parameters.
+ Title: Azure Workbooks options group parameters
description: Learn about adding options group parameters to your Azure workbook.
# Options group parameters
-An options group parameter allows the user to select one value from a known set (for example, select one of your app's requests). When there is a small number of values, an options group can be a better choice than a [drop-down parameter](workbooks-dropdowns.md), since the user can see all the possible values, and see which one is selected. Options groups are commonly used for yes/no or on/off style choices. When there are a large number of possible values, using a drop-down is a better choice. Unlike drop-down parameters, an options group always only allows one selected value.
+When you use an options group parameter, you can select one value from a known set. For example, you can select one of your app's requests. If you're working with a few values, an options group can be a better choice than a [dropdown parameter](workbooks-dropdowns.md). You can see all the possible values and see which one is selected.
+
+Options groups are commonly used for yes/no or on/off style choices. When there are many possible values, using a dropdown list is a better choice. Unlike dropdown parameters, an options group always allows only one selected value.
You can specify the list by:
-- providing a static list in the parameter setting
-- using a KQL query to retrieve the list dynamically
-## Creating a static options group parameter
+- Providing a static list in the parameter setting.
+- Using a KQL query to retrieve the list dynamically.
+
+## Create a static options group parameter
+ 1. Start with an empty workbook in edit mode.
-1. Choose **Add parameters** from the links within the workbook.
-1. Select **Add Parameter**.
-1. In the new parameter pane that pops up enter:
- - Parameter name: `Environment`
- - Parameter type: `Options Group`
- - Required: `checked`
- - Get data from: `JSON`
-1. In the JSON Input text block, insert this json snippet:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ - **Parameter name**: `Environment`
+ - **Parameter type**: `Options Group`
+ - **Required**: `checked`
+ - **Get data from**: `JSON`
+1. In the **JSON Input** text block, insert this JSON snippet:
+
```json
[
    { "value":"dev", "label":"Development" },
    { "value":"prod", "label":"Production", "selected":true }
]
```
- (you are not limited to JSON, you can use any query provider to provide initial values, but will be limited to the first 100 results)
+
+ You aren't limited to JSON. You can use any query provider to provide initial values, but you'll be limited to the first 100 results.
1. Select **Update**.
-1. Select **Save** from the toolbar to create the parameter.
-1. The Environment parameter will be an options group control with the three values.
+1. Select **Save** to create the parameter.
+1. The **Environment** parameter will be an options group control with the three values.
- :::image type="content" source="media/workbooks-options-group/workbooks-options-group-create.png" alt-text="Screenshot showing the creation of a static options group in a workbook.":::
+ :::image type="content" source="media/workbooks-options-group/workbooks-options-group-create.png" alt-text="Screenshot that shows the creation of a static options group in a workbook.":::
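For the dynamic, query-based option noted before the procedure, a minimal KQL sketch; the `requests` table and `client_Type` column are illustrative assumptions, and the `value`/`label` column names follow the convention used by query-backed parameters elsewhere in these articles:

```kusto
// Hypothetical query: each distinct value becomes one option in the group.
requests
| summarize by value = client_Type
| extend label = value
```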
## Next steps

-- [Workbook parameters](workbooks-parameters.md).
-- [Workbook drop down parameters](workbooks-dropdowns.md)
+- [Workbook parameters](workbooks-parameters.md)
+- [Workbook dropdown parameters](workbooks-dropdowns.md)
azure-monitor Workbooks Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-parameters.md
Title: Creating Workbook parameters
+ Title: Create workbook parameters
description: Learn how to add parameters to your workbook to collect input from the consumers and reference it in other parts of the workbook.
Last updated 07/05/2022
# Workbook parameters
-Parameters allow workbook authors to collect input from the consumers and reference it in other parts of the workbook – usually to scope the result set or setting the right visual. It is a key capability that allows authors to build interactive reports and experiences.
+By using parameters, you can collect input from consumers and reference it in other parts of a workbook. It's usually used to scope the result set or set the right visual. You can build interactive reports and experiences by using this key capability.
-Workbooks allow you to control how your parameter controls are presented to consumers – text box vs. drop down, single- vs. multi-select, values from text, JSON, KQL, or Azure Resource Graph, etc.
+When you use workbooks, you can control how your parameter controls are presented to consumers. They can be text box versus dropdown list, single- versus multi-select, and values from text, JSON, KQL, or Azure Resource Graph.
Supported parameter types include:
-* [Time](workbooks-time.md) - allows a user to select from pre-populated time ranges or select a custom range
-* [Drop down](workbooks-dropdowns.md) - allows a user to select from a value or set of values
-* [Options group](workbooks-options-group.md)
-* [Text](workbooks-text.md) - allows a user to enter arbitrary text
-* [Criteria](workbooks-criteria.md)
-* [Resource](workbooks-resources.md) - allows a user to select one or more Azure resources
-* [Subscription](workbooks-resources.md) - allows a user to select one or more Azure subscription resources
-* [Multi-value](workbooks-multi-value.md)
-* Resource Type - allows a user to select one or more Azure resource type values
-* Location - allows a user to select one or more Azure location values
+
+* [Time](workbooks-time.md): Allows you to select from pre-populated time ranges or select a custom range
+* [Drop down](workbooks-dropdowns.md): Allows you to select from a value or set of values
+* [Options group](workbooks-options-group.md): Allows you to select one value from a known set
+* [Text](workbooks-text.md): Allows you to enter arbitrary text
+* [Criteria](workbooks-criteria.md): Allows you to define a set of criteria based on previously specified parameters, which will be evaluated to provide a dynamic value
+* [Resource](workbooks-resources.md): Allows you to select one or more Azure resources
+* [Subscription](workbooks-resources.md): Allows you to select one or more Azure subscription resources
+* [Multi-value](workbooks-multi-value.md): Allows you to set one or more arbitrary text values
+* Resource type: Allows you to select one or more Azure resource type values
+* Location: Allows you to select one or more Azure location values
## Reference a parameter
-You can reference parameters values from other parts of workbooks either using bindings or value expansions.
-### Reference a parameter with Bindings
+
+You can reference parameter values from other parts of workbooks either by using bindings or value expansions.
+
+### Reference a parameter with bindings
This example shows how to reference a time range parameter with bindings:
-1. Add a query control to the workbook and select an Application Insights resource.
-2. Open the _Time Range_ drop-down and select the `Time Range` option from the Parameters section at the bottom.
-3. This binds the time range parameter to the time range of the chart. The time scope of the sample query is now Last 24 hours.
-4. Run query to see the results
+1. Select **Add query** to add a query control, and then select an Application Insights resource.
+1. Open the **Time Range** dropdown list and select the **Time Range** option from the **Parameters** section at the bottom:
+ - This option binds the time range parameter to the time range of the chart.
+ - The time scope of the sample query is now **Last 24 hours**.
+1. Run the query to see the results.
- :::image type="content" source="media/workbooks-parameters/workbooks-time-binding.png" alt-text="Screenshot showing a time range parameter referenced via bindings.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-time-binding.png" alt-text="Screenshot that shows a time range parameter referenced via bindings.":::
### Reference a parameter with KQL

This example shows how to reference a time range parameter with KQL:
-1. Add a query control to the workbook and select an Application Insights resource.
-2. In the KQL, enter a time scope filter using the parameter: `| where timestamp {TimeRange}`
-3. This expands on query evaluation time to `| where timestamp > ago(1d)`, which is the time range value of the parameter.
-4. Run query to see the results
+1. Select **Add query** to add a query control, and then select an Application Insights resource.
+1. In the KQL, enter a time scope filter by using the parameter `| where timestamp {TimeRange}`:
+ - This parameter expands on query evaluation time to `| where timestamp > ago(1d)`.
+ - This option is the time range value of the parameter.
+1. Run the query to see the results. A fuller example query appears after the screenshot below.
- :::image type="content" source="media/workbooks-parameters/workbooks-time-in-code.png" alt-text="Screenshot showing a time range referenced in the K Q L query.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-time-in-code.png" alt-text="Screenshot that shows a time range referenced in the KQL query.":::
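As a fuller illustration of the snippet above, a minimal sketch of a complete query that uses the time range parameter; the `requests` table is an assumption based on the Application Insights resource selected earlier:

```kusto
// {TimeRange} expands to a time filter (for example, "> ago(1d)") at query evaluation time.
requests
| where timestamp {TimeRange}
| summarize Requests = count() by name
| order by Requests desc
```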
-### Reference a parameter with Text
+### Reference a parameter with text
This example shows how to reference a time range parameter with text:

1. Add a text control to the workbook.
-2. In the markdown, enter `The chosen time range is {TimeRange:label}`
-3. Choose _Done Editing_
-4. The text control will show text: _The chosen time range is Last 24 hours_
+1. In the Markdown, enter `The chosen time range is {TimeRange:label}`.
+1. Select **Done Editing**.
+1. The text control shows the text *The chosen time range is Last 24 hours*.
## Parameter formatting options
-Each parameter type has its own formatting options. Use the **Previews** section of the **Edit Parameter** pane to see the formatting expansion options for your parameter:
+Each parameter type has its own formatting options. Use the **Previews** section of the **Edit Parameter** pane to see the formatting expansion options for your parameter.
+
+ :::image type="content" source="media/workbooks-parameters/workbooks-time-settings.png" alt-text="Screenshot that shows time range parameter options.":::
+
+You can use these options to format all parameter types except for **Time range picker**. For examples of formatting times, see [Time parameter options](workbooks-time.md#time-parameter-options).
- :::image type="content" source="media/workbooks-parameters/workbooks-time-settings.png" alt-text="Screenshot showing a time range parameter options.":::
+Other parameter types include:
-You can use these options to format all parameter types except for the time range picker. For examples of formatting times, see [Time parameter options](workbooks-time.md#time-parameter-options).
+ - **Resource picker**: Resource IDs are formatted.
+ - **Subscription picker**: Subscription values are formatted.
- - For Resource picker, resource IDs are formatted.
- - For Subscription picker, subscription values are formatted.
-
### Convert toml to json

**Syntax**: `{param:tomltojson}`
-**Original Value**:
+**Original value**:
``` name = "Sam Green"
state = "New York"
country = "USA" ```
-**Formatted Value**:
+**Formatted value**:
``` {
country = "USA"
} } ```
+
### Escape JSON

**Syntax**: `{param:escapejson}`
-**Original Value**:
+**Original value**:
``` {
country = "USA"
} ```
-**Formatted Value**:
+**Formatted value**:
``` {\r\n\t\"name\": \"Sam Green\",\r\n\t\"address\": {\r\n\t\t\"state\": \"New York\",\r\n\t\t\"country\": \"USA\"\r\n }\r\n}
country = "USA"
**Syntax**: `{param:base64}`
-**Original Value**:
+**Original value**:
``` Sample text to test base64 encoding ```
-**Formatted Value**:
+**Formatted value**:
``` U2FtcGxlIHRleHQgdG8gdGVzdCBiYXNlNjQgZW5jb2Rpbmc= ```
-## Formatting parameters using JSONPath
+## Format parameters by using JSONPath
+ For string parameters that are JSON content, you can use [JSONPath](workbooks-jsonpath.md) in the parameter format string.
-For example, you may have a string parameter named `selection` that was the result of a query or selection in a visualization that has the following value
+For example, you might have a string parameter named `selection` that was the result of a query or selection in a visualization that has the following value:
```json
{ "series":"Failures", "x": 5, "y": 10 }
```
-Using JSONPath, you could get individual values from that object:
+By using JSONPath, you could get individual values from that object:
-format | result
+Format | Result
| --- | --- |
`{selection:$.series}` | `Failures`
`{selection:$.x}` | `5`
`{selection:$.y}`| `10`

> [!NOTE]
-> If the parameter value is not valid json, the result of the format will be an empty value.
+> If the parameter value isn't valid JSON, the result of the format will be an empty value.
+
+## Parameter style
-## Parameter Style
The following styles are available for the parameters.
+
### Pills
-In pills style, the default style, the parameters look like text, and require the user to select them once to go into the edit mode.
- :::image type="content" source="media/workbooks-parameters/workbooks-pills-read-mode.png" alt-text="Screenshot showing Workbooks pill style read mode.":::
+Pills style is the default style. The parameters look like text and require the user to select them once to go into the edit mode.
+
+ :::image type="content" source="media/workbooks-parameters/workbooks-pills-read-mode.png" alt-text="Screenshot that shows Azure Workbooks pills-style read mode.":::
- :::image type="content" source="media/workbooks-parameters/workbooks-pills-edit-mode.png" alt-text="Screenshot that shows Workbooks pill style edit mode.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-pills-edit-mode.png" alt-text="Screenshot that shows Azure Workbooks pills-style edit mode.":::
### Standard
+
In standard style, the controls are always visible, with a label above the control.
- :::image type="content" source="media/workbooks-parameters/workbooks-standard.png" alt-text="Screenshot that shows Workbooks standard style.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-standard.png" alt-text="Screenshot that shows Azure Workbooks standard style.":::
+
+### Form horizontal
+
+In form horizontal style, the controls are always visible, with the label on the left side of the control.
-### Form Horizontal
-In horizontal style form, the controls are always visible, with label on left side of the control.
+ :::image type="content" source="media/workbooks-parameters/workbooks-form-horizontal.png" alt-text="Screenshot that shows Azure Workbooks form horizontal style.":::
- :::image type="content" source="media/workbooks-parameters/workbooks-form-horizontal.png" alt-text="Screenshot that shows Workbooks form horizontal style.":::
+### Form vertical
-### Form Vertical
-In vertical style from, the controls are always visible, with label above the control. Unlike standard style, there is only one label or control in one row.
+In form vertical style, the controls are always visible, with the label above the control. Unlike standard style, there's only one label or control in one row.
- :::image type="content" source="media/workbooks-parameters/workbooks-form-vertical.png" alt-text="Screenshot that shows Workbooks form vertical style.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-form-vertical.png" alt-text="Screenshot that shows Azure Workbooks form vertical style.":::
> [!NOTE]
-> In standard, form horizontal, and form vertical layouts, there's no concept of inline editing, the controls are always in edit mode.
+> In standard, form horizontal, and form vertical layouts, there's no concept of inline editing. The controls are always in edit mode.
## Global parameters
-Now that you've learned how parameters work, and the limitations about only being able to use a parameter "downstream" of where it is set, it is time to learn about global parameters, which change those rules.
-With a global parameter, the parameter must still be declared before it can be used, but any step that sets a value to that parameter will affect all instances of that parameter in the workbook.
+Now that you've learned how parameters work, and the limitations about only being able to use a parameter "downstream" of where it's set, it's time to learn about global parameters, which change those rules.
+
+With a global parameter, the parameter must still be declared before it can be used. But any step that sets a value to that parameter will affect all instances of that parameter in the workbook.
> [!NOTE]
-> Because changing a global parameter has this "update all" behavior, The global setting should only be turned on for parameters that require this behavior. A combination of global parameters that depend on each other can create a cycle or oscillation where the competing globals change each other over and over. In order to avoid cycles, you cannot "redeclare" a parameter that's been declared as global. Any subsequent declarations of a parameter with the same name will create a read only parameter that cannot be edited in that place.
+> Because changing a global parameter has this "update all" behavior, the global setting should only be turned on for parameters that require this behavior. A combination of global parameters that depend on each other can create a cycle or oscillation where the competing globals change each other over and over. To avoid cycles, you can't "redeclare" a parameter that's been declared as global. Any subsequent declarations of a parameter with the same name will create a read-only parameter that can't be edited in that place.
Common uses of global parameters:
-1. Synchronizing time ranges between many charts.
- - without a global parameter, any time range brush in a chart will only be exported after that chart, so selecting a time range in the third chart will only update the fourth chart
- - with a global parameter, you can create a global **timeRange** parameter, give it a default value, and have all the other charts use that as their bound time range and as their time brush output (additionally setting the "only export the parameter when the range is brushed" setting). Any change of time range in any chart will update the global **timeRange** parameter at the top of the workbook. This can be used to make a workbook act like a dashboard.
-
-1. Allowing changing the selected tab in a links step via links or buttons
- - without a global parameter, the links step only outputs a parameter for the selected tab
- - with a global parameter, you can create a global **selectedTab** parameter, and use that parameter name in the tab selections in the links step. This allows you to pass that parameter value into the workbook from a link, or by using another button or link to change the selected tab. Using buttons from a links step in this way can make a wizard-like experience, where buttons at the bottom of a step can affect the visible sections above it.
+1. Synchronize time ranges between many charts:
+ - Without a global parameter, any time range brush in a chart will only be exported after that chart. So, selecting a time range in the third chart will only update the fourth chart.
+ - With a global parameter, you can create a global **timeRange** parameter, give it a default value, and have all the other charts use that as their bound time range and time brush output. In addition, set the **Only export the parameter when a range is brushed** setting. Any change of time range in any chart updates the global **timeRange** parameter at the top of the workbook. This functionality can be used to make a workbook act like a dashboard.
+1. Allow changing the selected tab in a links step via links or buttons:
+ - Without a global parameter, the links step only outputs a parameter for the selected tab.
+ - With a global parameter, you can create a global **selectedTab** parameter. Then you can use that parameter name in the tab selections in the links step. You can pass that parameter value into the workbook from a link or by using another button or link to change the selected tab. Using buttons from a links step in this way can make a wizard-like experience, where buttons at the bottom of a step can affect the visible sections above it.
### Create a global parameter
-When creating the parameter in a parameters step, use the "Treat this parameter as a global" option in advanced settings. The only way to make a global parameter is to declare it with a parameters step. The other methods of creating parameters (via selections, brushing, links, buttons, tabs) can only update a global parameter, they cannot themselves declare one.
- :::image type="content" source="media/workbooks-parameters/workbooks-parameters-global-setting.png" alt-text="Screenshot of setting global parameters in Workbooks.":::
+When you create the parameter in a parameters step, use the **Treat this parameter as a global** option in **Advanced Settings**. The only way to make a global parameter is to declare it with a parameters step. The other methods of creating parameters, via selections, brushing, links, buttons, and tabs, can only update a global parameter. They can't declare one themselves.
+
+ :::image type="content" source="media/workbooks-parameters/workbooks-parameters-global-setting.png" alt-text="Screenshot that shows setting global parameters in a workbook.":::
The parameter will be available and function as normal parameters do.
-### Updating the value of an existing global parameter
-For the chart example above, the most common way to update a global parameter is by using Time Brushing.
+### Update the value of an existing global parameter
+
+For the chart example, the most common way to update a global parameter is by using time brushing.
-In this example, the **timerange** parameter above is declared as a global. In a query step below that, create and run a query that uses that **timerange** parameter in the query and returns a time chart result. In the advanced settings for the query step, enable the time range brushing setting, and use the same parameter name as the output for the time brush parameter, and also set the only export the parameter when brushed option.
+In this example, the **timerange** parameter is declared as global. In a query step below that, create and run a query that uses that **timerange** parameter in the query and returns a time chart result. In **Advanced Settings** for the query step, enable the time range brushing setting. Use the same parameter name as the output for the time brush parameter. Also, select the **Only export the parameter when a range is brushed** option.
- :::image type="content" source="media/workbooks-parameters/workbooks-global-time-range-brush.png" alt-text="Screenshot of global time brush setting in Workbooks.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-global-time-range-brush.png" alt-text="Screenshot that shows the global time brush setting in a workbook.":::
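A minimal sketch of such a query, assuming an Application Insights source; the table and the binning interval are illustrative assumptions:

```kusto
// Uses the global timerange parameter and returns a result that can be rendered as a time chart.
requests
| where timestamp {timerange}
| summarize Requests = count() by bin(timestamp, 5m)
```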
-Whenever a time range is brushed in this chart, it will also update the **timerange** parameter above this query, and the query step itself (since it also depends on **timerange**!):
+Whenever a time range is brushed in this chart, it also updates the **timerange** parameter above this query, and the query step itself, because it also depends on **timerange**.
1. Before brushing:
- - The time range is shown as "last hour".
+ - The time range is shown as **Last hour**.
- The chart shows the last hour of data.
- :::image type="content" source="media/workbooks-parameters/workbooks-global-before-brush.png" alt-text="Screenshot of setting global parameters before brushing.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-global-before-brush.png" alt-text="Screenshot that shows setting global parameters before brushing.":::
1. During brushing:
- - The time range is still last hour, and the brushing outlines are drawn.
- - No parameters/etc have changed. once you let go of the brush, the time range will be updated.
+ - The time range is still the last hour, and the brushing outlines are drawn.
+ - No parameters have changed. After you let go of the brush, the time range is updated.
- :::image type="content" source="media/workbooks-parameters/workbooks-global-during-brush.png" alt-text="Screenshot of setting global parameters during brushing.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-global-during-brush.png" alt-text="Screenshot that shows setting global parameters during brushing.":::
1. After brushing:
- - The time range specified by the time brush will be set by this step, overriding the global value (the timerange dropdown now displays that custom time range).
- - Because the global value at the top has changed, and because this chart depends on **timerange** as an input, the time range of the query used in the chart will also update, causing the query to and the chart to update.
+ - The time range specified by the time brush is set by this step. It overrides the global value. The **timerange** dropdown list now displays that custom time range.
+ - Because the global value at the top has changed, and because this chart depends on **timerange** as an input, the time range of the query used in the chart also updates. As a result, the query and the chart will update.
- Any other steps in the workbook that depend on **timerange** will also update.
- :::image type="content" source="media/workbooks-parameters/workbooks-global-after-brush.png" alt-text="Screenshot of setting global parameters after brushing.":::
+ :::image type="content" source="media/workbooks-parameters/workbooks-global-after-brush.png" alt-text="Screenshot that shows setting global parameters after brushing.":::
> [!NOTE]
- > If you do not use a global parameter, the **timerange** parameter value will only change below this query step, things above or this item itself would not update.
+ > If you don't use a global parameter, the **timerange** parameter value will only change below this query step. Things above this step or this item itself won't update.
azure-monitor Workbooks Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-resources.md
Title: Azure Monitor workbooks resource parameters
-description: Learn how to use resource parameters to allow picking of resources in workbooks. Use the resource parameters to set the scope from which to get the data from.
+ Title: Azure Monitor workbook resource parameters
+description: Learn how to use resource parameters to allow picking of resources in workbooks. Use the resource parameters to set the scope from which to get the data.
ibiza
Last updated 07/05/2022
# Workbook resource parameters
-Resource parameters allow picking of resources in workbooks. This is useful in setting the scope from which to get the data from. An example is allowing users to select the set of VMs, which the charts later will use when presenting the data.
+Resource parameters allow picking of resources in workbooks. This functionality is useful in setting the scope from which to get the data. An example would be allowing you to select the set of VMs, which charts use later when presenting the data.
-Values from resource pickers can come from the workbook context, static list or from Azure Resource Graph queries.
+Values from resource pickers can come from the workbook context, static list, or Azure Resource Graph queries.
## Create a resource parameter (workbook resources)
+
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Click on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `Applications`
- 2. Parameter type: `Resource picker`
- 3. Required: `checked`
- 4. Allow multiple selections: `checked`
-5. Get data from: `Workbook Resources`
-6. Include only resource types: `Application Insights`
-7. Choose 'Save' from the toolbar to create the parameter.
-
-![Image showing the creation of a resource parameter using workbook resources](./media/workbooks-resources/resource-create.png)
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `Applications`
+ 1. **Parameter type**: `Resource picker`
+ 1. **Required**: `checked`
+ 1. **Allow multiple selections**: `checked`
+ 1. **Get data from**: `Workbook Resources`
+ 1. **Include only resource types**: `Application Insights`
+1. Select **Save** to create the parameter.
+
+ ![Screenshot that shows the creation of a resource parameter by using workbook resources.](./media/workbooks-resources/resource-create.png)
## Create an Azure Resource Graph resource parameter
+
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Click on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `Applications`
- 2. Parameter type: `Resource picker`
- 3. Required: `checked`
- 4. Allow multiple selections: `checked`
-5. Get data from: `Query`
- 1. Query Type: `Azure Resource Graph`
- 2. Subscriptions: `Use default subscriptions`
- 3. In the query control, add this snippet
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `Applications`
+ 1. **Parameter type**: `Resource picker`
+ 1. **Required**: `checked`
+ 1. **Allow multiple selections**: `checked`
+ 1. **Get data from**: `Query`
+ 1. **Query Type**: `Azure Resource Graph`
+ 1. **Subscriptions**: `Use default subscriptions`
+ 1. In the query control, add this snippet:
```kusto
where type == 'microsoft.insights/components'
| project value = id, label = name, selected = false, group = resourceGroup
```
-7. Choose 'Save' from the toolbar to create the parameter.
-![Image showing the creation of a resource parameter using Azure Resource Graph](./media/workbooks-resources/resource-query.png)
+1. Select **Save** to create the parameter.
+
+ ![Screenshot that shows the creation of a resource parameter by using Azure Resource Graph.](./media/workbooks-resources/resource-query.png)
> [!NOTE]
-> Azure Resource Graph is not yet available in all clouds. Ensure that it is supported in your target cloud if you choose this approach.
+> Azure Resource Graph isn't yet available in all clouds. Ensure that it's supported in your target cloud if you choose this approach.
-[Azure Resource Graph documentation](../../governance/resource-graph/overview.md)
+For more information on Azure Resource Graph, see [What is Azure Resource Graph?](../../governance/resource-graph/overview.md).
## Create a JSON list resource parameter
+
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Click on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `Applications`
- 2. Parameter type: `Resource picker`
- 3. Required: `checked`
- 4. Allow multiple selections: `checked`
-5. Get data from: `JSON`
- 1. In the content control, add this json snippet
- ```json
- [
- { "value":"/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/<resource-type>/acmeauthentication", "label": "acmeauthentication", "selected":true, "group":"Acme Backend" },
- { "value":"/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/<resource-type>/acmeweb", "label": "acmeweb", "selected":false, "group":"Acme Frontend" }
- ]
- ```
- 2. Hit the blue _Update_ button.
-6. Optionally set the `Include only resource types` to _Application Insights_
-7. Choose 'Save' from the toolbar to create the parameter.
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `Applications`
+ 1. **Parameter type**: `Resource picker`
+ 1. **Required**: `checked`
+ 1. **Allow multiple selections**: `checked`
+ 1. **Get data from**: `JSON`
+ 1. In the content control, add this JSON snippet:
+
+ ```json
+ [
+ { "value":"/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/<resource-type>/acmeauthentication", "label": "acmeauthentication", "selected":true, "group":"Acme Backend" },
+ { "value":"/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/<resource-type>/acmeweb", "label": "acmeweb", "selected":false, "group":"Acme Frontend" }
+ ]
+ ```
+
+ 1. Select **Update**.
+1. Optionally, set `Include only resource types` to **Application Insights**.
+1. Select **Save** to create the parameter.
## Reference a resource parameter
-1. Add a query control to the workbook and select an Application Insights resource.
-2. Use the _Application Insights_ drop down to bind the parameter to the control. Doing this sets the scope of the query to the resources returned by the parameter at run time.
-4. In the KQL control, add this snippet
+
+1. Select **Add query** to add a query control, and then select an Application Insights resource.
+1. Use the **Application Insights** dropdown list to bind the parameter to the control. This step sets the scope of the query to the resources returned by the parameter at runtime.
+1. In the KQL control, add this snippet:
```kusto
requests
| summarize Requests = count() by appName, name
| order by Requests desc
```
-5. Run query to see the results.
-![Image showing a resource parameter referenced in a query control](./media/workbooks-resources/resource-reference.png)
+1. Run the query to see the results.
-> This approach can be used to bind resources to other controls like metrics.
+ ![Screenshot that shows a resource parameter referenced in a query control.](./media/workbooks-resources/resource-reference.png)
+
+This approach can be used to bind resources to other controls like metrics.
## Resource parameter options
-| Parameter | Explanation | Example |
+
+| Parameter | Description | Example |
| - |:-|:-|
-| `{Applications}` | The selected resource ID | _/subscriptions/\<sub-id\>/resourceGroups/\<resource-group\>/providers/\<resource-type\>/acmeauthentication_ |
-| `{Applications:label}` | The label of the selected resource | `acmefrontend` |
-| `{Applications:value}` | The value of the selected resource | _'/subscriptions/\<sub-id\>/resourceGroups/\<resource-group\>/providers/\<resource-type\>/acmeauthentication'_ |
-| `{Applications:name}` | The name of the selected resource | `acmefrontend` |
-| `{Applications:resourceGroup}` | The resource group of the selected resource | `acmegroup` |
-| `{Applications:resourceType}` | The type of the selected resource | _microsoft.insights/components_ |
-| `{Applications:subscription}` | The subscription of the selected resource | |
-| `{Applications:grid}` | A grid showing the resource properties. Useful to render in a text block while debugging | |
+| `{Applications}` | The selected resource ID. | _/subscriptions/\<sub-id\>/resourceGroups/\<resource-group\>/providers/\<resource-type\>/acmeauthentication_ |
+| `{Applications:label}` | The label of the selected resource. | `acmefrontend` |
+| `{Applications:value}` | The value of the selected resource. | _'/subscriptions/\<sub-id\>/resourceGroups/\<resource-group\>/providers/\<resource-type\>/acmeauthentication'_ |
+| `{Applications:name}` | The name of the selected resource. | `acmefrontend` |
+| `{Applications:resourceGroup}` | The resource group of the selected resource. | `acmegroup` |
+| `{Applications:resourceType}` | The type of the selected resource. | _microsoft.insights/components_ |
+| `{Applications:subscription}` | The subscription of the selected resource. | |
+| `{Applications:grid}` | A grid that shows the resource properties. Useful to render in a text block while debugging. | |
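As a hedged illustration of these expansions, the formatted values can be embedded directly in a KQL query; the quoting and column names below are assumptions for illustration only:

```kusto
// Renders the expanded parameter values as a single-row result for inspection.
// Each {Applications:...} token is replaced with plain text before the query runs.
print selectedApp = "{Applications:name}", resourceGroup = "{Applications:resourceGroup}"
```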
## Next steps
+[Getting started with Azure Workbooks](workbooks-getting-started.md)
azure-monitor Workbooks Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/visualize/workbooks-text.md
Title: Azure Monitor workbooks text parameters
+ Title: Azure Monitor workbook text parameters
description: Simplify complex reporting with prebuilt and custom parameterized workbooks. Learn more about workbook text parameters.
Last updated 07/05/2022
# Workbook text parameters
-Textbox parameters provide a simple way to collect text input from workbook users. They're used when it isn't practical to use a drop-down to collect the input (for example, an arbitrary threshold or generic filters). Workbooks allow authors to get the default value of the textbox from a query. This allows interesting scenarios like setting the default threshold based on the p95 of the metric.
+Text box parameters provide a simple way to collect text input from workbook users. They're used when it isn't practical to use a dropdown list to collect the input, for example, with an arbitrary threshold or generic filters. By using a workbook, you can get the default value of the text box from a query. This functionality allows for interesting scenarios like setting the default threshold based on the p95 of the metric.
-A common use of textboxes is as internal variables used by other workbook controls. This is done by using a query for default values, and making the input control invisible in read-mode. For example, a user may want a threshold to come from a formula (not a user) and then use the threshold in subsequent queries.
+A common use of text boxes is as internal variables used by other workbook controls. You use a query for default values and make the input control invisible in read mode. For example, you might want a threshold to come from a formula, not a user, and then use the threshold in subsequent queries.
## Create a text parameter
+
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Select on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `SlowRequestThreshold`
- 2. Parameter type: `Text`
- 3. Required: `checked`
- 4. Get data from: `None`
-5. Choose 'Save' from the toolbar to create the parameter.
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `SlowRequestThreshold`
+ 1. **Parameter type**: `Text`
+ 1. **Required**: `checked`
+ 1. **Get data from**: `None`
+1. Select **Save** to create the parameter.
- :::image type="content" source="./media/workbooks-text/text-create.png" alt-text="Screenshot showing the creation of a text parameter.":::
+ :::image type="content" source="./media/workbooks-text/text-create.png" alt-text="Screenshot that shows the creation of a text parameter.":::
-This is how the workbook will look like in read-mode.
+This screenshot shows how the workbook looks in read mode:
## Parameter field style
-Text parameter supports following field style:
-- Standard: A single line text field.
+The text parameter supports the following field styles:
+
+- **Standard**: A single line text field.
+
+ :::image type="content" source="./media/workbooks-text/standard-text.png" alt-text="Screenshot that shows a standard text field.":::
- :::image type="content" source="./media/workbooks-text/standard-text.png" alt-text="Screenshot showing standard text field.":::
+- **Password**: A single line password field. The password value is only hidden in the UI when you type. The value is fully accessible as a parameter value when referred. It's stored unencrypted when the workbook is saved.
-- Password: A single line password field. The password value is only hidden on UI when user types. The value is still fully accessible as a param value when referred and it's stored unencrypted when workbook is saved.
+ :::image type="content" source="./media/workbooks-text/password-text.png" alt-text="Screenshot that shows a password field.":::
- :::image type="content" source="./media/workbooks-text/password-text.png" alt-text="Screenshot showing password field.":::
+- **Multiline**: A multiline text field with support of rich IntelliSense and syntax colorization for the following languages:
-- Multiline: A multiline text field with support of rich intellisense and syntax colorization for following languages:
  - Text
  - Markdown
  - JSON
Text parameter supports following field style:
  - KQL
  - TOML
- User can also specify the height for the multiline editor.
+ You can also specify the height for the multiline editor.
- :::image type="content" source="./media/workbooks-text/kql-text.png" alt-text="Screenshot showing multiline text field.":::
+ :::image type="content" source="./media/workbooks-text/kql-text.png" alt-text="Screenshot that shows a multiline text field.":::
## Reference a text parameter
-1. Add a query control to the workbook by selecting the blue `Add query` link and select an Application Insights resource.
-2. In the KQL box, add this snippet:
+
+1. Select **Add query** to add a query control, and then select an Application Insights resource.
+1. In the KQL box, add this snippet:
```kusto
requests
| summarize AllRequests = count(), SlowRequests = countif(duration >= {SlowRequestThreshold}) by name
| extend SlowRequestPercent = 100.0 * SlowRequests / AllRequests
| order by SlowRequests desc
```
-3. By using the text parameter with a value of 500 coupled with the query control you effectively running the query below:
+
+1. By using the text parameter with a value of 500 coupled with the query control, you effectively run the following query:
```kusto
requests
| summarize AllRequests = count(), SlowRequests = countif(duration >= 500) by name
| extend SlowRequestPercent = 100.0 * SlowRequests / AllRequests
| order by SlowRequests desc
```
-4. Run query to see the results
- :::image type="content" source="./media/workbooks-text/text-reference.png" alt-text="Screenshot showing a text parameter referenced in KQL.":::
+1. Run the query to see the results.
+
+ :::image type="content" source="./media/workbooks-text/text-reference.png" alt-text="Screenshot that shows a text parameter referenced in KQL.":::
> [!NOTE]
-> In the example above, `{SlowRequestThreshold}` represents an integer value. If you were querying for a string like `{ComputerName}` you would need to modify your Kusto query to add quotes `"{ComputerName}"` in order for the parameter field to an accept input without quotes.
+> In the preceding example, `{SlowRequestThreshold}` represents an integer value. If you were querying for a string like `{ComputerName}`, you would need to modify your Kusto query to add quotation marks `"{ComputerName}"` in order for the parameter field to accept an input without quotation marks.
## Set the default values using queries
+
1. Start with an empty workbook in edit mode.
-2. Choose _Add parameters_ from the links within the workbook.
-3. Select on the blue _Add Parameter_ button.
-4. In the new parameter pane that pops up enter:
- 1. Parameter name: `SlowRequestThreshold`
- 2. Parameter type: `Text`
- 3. Required: `checked`
- 4. Get data from: `Query`
-5. In the KQL box, add this snippet:
+1. Select **Add parameters** > **Add Parameter**.
+1. In the new parameter pane that opens, enter:
+ 1. **Parameter name**: `SlowRequestThreshold`
+ 1. **Parameter type**: `Text`
+ 1. **Required**: `checked`
+ 1. **Get data from**: `Query`
+1. In the KQL box, add this snippet:
```kusto
requests
| summarize round(percentile(duration, 95), 2)
```
+
This query sets the default value of the text box to the 95th percentile duration for all requests in the app.
-6. Run query to see the result
-7. Choose 'Save' from the toolbar to create the parameter.
+1. Run the query to see the results.
+1. Select **Save** to create the parameter.
- :::image type="content" source="./media/workbooks-text/text-default-value.png" alt-text="Screenshot showing a text parameter with default value from KQL.":::
+ :::image type="content" source="./media/workbooks-text/text-default-value.png" alt-text="Screenshot that shows a text parameter with a default value from KQL.":::
> [!NOTE]
-> While this example queries Application Insights data, the approach can be used for any log based data source - Log Analytics, Azure Resource Graph, etc.
+> While this example queries Application Insights data, the approach can be used for any log-based data source, such as Log Analytics and Azure Resource Graph.
-## Add validations
+## Add validations
-For standard and password text parameters, user can add validation rules that are applied to the text field. Add a valid regex with error message. If message is set, it's shown as error when field is invalid.
+For standard and password text parameters, you can add validation rules that are applied to the text field. Add a valid regex with an error message. If the message is set, it's shown as an error when the field is invalid.
-If match is selected, the field is valid if value matches the regex and if match isn't selected then the field is valid if it doesn't match the regex.
+If the match is selected, the field is valid if the value matches the regex. If the match isn't selected, the field is valid if it doesn't match the regex.
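For example (a hypothetical rule, not from the article): a regex of `^[0-9]+$` with the match option selected and an error message of `Enter a whole number` would accept only digit input, while clearing the match option would invert the rule and reject digit-only input instead.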
-## Format JSON data
+## Format JSON data
-If JSON is selected as the language for the multiline text field, then the field will have a button that will format the JSON data of the field. User can also use the shortcut `(ctrl + \)` to format the JSON data.
+If JSON is selected as the language for the multiline text field, the field will have a button that formats the JSON data of the field. You can also use the shortcut Ctrl + \ to format the JSON data.
-If data is coming from a query, user can select the option to pre-format the JSON data returned by the query.
+If data is coming from a query, you can select the option to pre-format the JSON data that's returned by the query.
## Next steps
+[Get started with Azure Workbooks](workbooks-getting-started.md)
azure-monitor