Service | Microsoft Docs article | Related commit history on GitHub | Change details |
---|---|---|---|
active-directory-b2c | Find Help Open Support Ticket | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/find-help-open-support-ticket.md | |
active-directory-b2c | Idp Pass Through User Flow | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/idp-pass-through-user-flow.md | When testing your applications in Azure AD B2C, it can be useful to have the Azu  +## Pass the IDP refresh token (optional) ++The access token the identity provider returns is valid for a short period of time. Some identity providers also issue a refresh token along with the access token. Your client application can then exchange the identity provider's refresh token for a new access token when needed. ++Azure AD B2C custom policies support passing the refresh token of OAuth 2.0 identity providers, including [Facebook](https://github.com/azure-ad-b2c/unit-tests/tree/main/Identity-providers#facebook-with-access-token), [Google](https://github.com/azure-ad-b2c/unit-tests/tree/main/Identity-providers#google-with-access-token), and [GitHub](https://github.com/azure-ad-b2c/unit-tests/tree/main/Identity-providers#github-with-access-token). ++To pass the identity provider's refresh token, follow these steps: ++1. Open your *TrustframeworkExtensions.xml* file and add the following **ClaimType** element with an identifier of `identityProviderRefreshToken` to the **ClaimsSchema** element. + + ```xml + <ClaimType Id="identityProviderRefreshToken"> + <DisplayName>Identity provider refresh token</DisplayName> + <DataType>string</DataType> + </ClaimType> + ``` + +1. Add the **OutputClaim** element to the **TechnicalProfile** element for each OAuth 2.0 identity provider that you would like the refresh token for. The following example shows the element added to the Facebook technical profile: + + ```xml + <ClaimsProvider> + <DisplayName>Facebook</DisplayName> + <TechnicalProfiles> + <TechnicalProfile Id="Facebook-OAUTH"> + <OutputClaims> + <OutputClaim ClaimTypeReferenceId="identityProviderRefreshToken" PartnerClaimType="{oauth2:refresh_token}" /> + </OutputClaims> + ... + </TechnicalProfile> + </TechnicalProfiles> + </ClaimsProvider> + ``` ++1. Some identity providers require you to add metadata or scopes to the identity provider's technical profile. ++ - For the Google identity provider, add two claim types: `access_type` and `prompt`. Then add the following input claims to the identity provider's technical profile: ++ ```xml + <InputClaims> + <InputClaim ClaimTypeReferenceId="access_type" PartnerClaimType="access_type" DefaultValue="offline" AlwaysUseDefaultValue="true" /> + + <!-- The refresh_token is returned only on the first authorization for a given user. Subsequent authorization requests don't return the refresh_token. + To fix this issue, we add the prompt=consent query string parameter to the authorization request--> + <InputClaim ClaimTypeReferenceId="prompt" PartnerClaimType="prompt" DefaultValue="consent" AlwaysUseDefaultValue="true" /> + </InputClaims> + ``` + + - Other identity providers may have different methods to issue a refresh token. Follow the identity provider's guidance and add the necessary elements to your identity provider's technical profile. ++1. Save the changes you made in your *TrustframeworkExtensions.xml* file. +1. 
Open your relying party policy file, such as *SignUpOrSignIn.xml*, and add the **OutputClaim** element to the **TechnicalProfile**: ++ ```xml + <RelyingParty> + <DefaultUserJourney ReferenceId="SignUpOrSignIn" /> + <TechnicalProfile Id="PolicyProfile"> + <OutputClaims> + <OutputClaim ClaimTypeReferenceId="identityProviderRefreshToken" PartnerClaimType="idp_refresh_token"/> + </OutputClaims> + ... + </TechnicalProfile> + </RelyingParty> + ``` ++1. Save the changes you made in your relying party policy file. +1. Upload the *TrustframeworkExtensions.xml* file, and then the relying party policy file. +1. [Test your policy](#test-your-policy). + ++ + ::: zone-end ## Next steps |
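To make the new `idp_refresh_token` claim concrete, here is a minimal sketch of how a client application might redeem a Google-issued refresh token returned by the user flow. The endpoint and parameters are Google's standard OAuth 2.0 token API; the client ID, client secret, and token values are placeholders, not values from the article.

```bash
# Sketch: exchange the idp_refresh_token claim (issued by Google) for a new
# Google access token. All <...> values are placeholders - substitute your
# Google app's credentials and the refresh token from the B2C-issued token.
curl -X POST 'https://oauth2.googleapis.com/token' \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -d 'grant_type=refresh_token' \
  -d 'client_id=<google-client-id>' \
  -d 'client_secret=<google-client-secret>' \
  -d 'refresh_token=<idp_refresh_token-claim-value>'
```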
active-directory-b2c | Partner Haventec | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-haventec.md | Title: Tutorial to configure Azure Active Directory B2C with Haventec + Title: Configure Haventec Authenticate with Azure Active Directory B2C for single-step, multi-factor passwordless authentication -description: Learn how to integrate Azure AD B2C authentication with Haventec for multifactor passwordless authentication +description: Learn to integrate Azure AD B2C with Haventec Authenticate for multi-factor passwordless authentication -+ Previously updated : 12/02/2021 Last updated : 03/10/2023 -# Tutorial: Configure Haventec with Azure Active Directory B2C for single step, multifactor passwordless authentication +# Tutorial: Configure Haventec Authenticate with Azure Active Directory B2C for single-step, multi-factor passwordless authentication -In this sample tutorial, learn how to integrate Azure Active Directory (AD) B2C authentication with [Haventec](https://www.haventec.com/). Haventec provides decentralized identity platform that transform security, accessibility, and experience. Haventec Authenticate provides a passwordless technology that eliminates passwords, shared secrets, and friction. +Learn to integrate Azure Active Directory B2C (Azure AD B2C) with Haventec Authenticate, a passwordless technology that eliminates passwords, shared secrets, and friction. -## Scenario description +To learn more, go to haventec.com: [Haventec](https://www.haventec.com/) -The Haventec integration includes the following components: +## Scenario description -- Azure AD B2C - The authorization server, responsible for verifying the user's credentials, also known as the Identity Provider.+The Authenticate integration includes the following components: -- Web and mobile applications - Any Open ID Connect (OIDC) mobile or web applications protected by Haventec and Azure AD B2C.+* **Azure AD B2C** - authorization server that verifies user credentials + * Also known as the identity provider (IdP) +* **Web and mobile applications** - Open ID Connect (OIDC) mobile or web applications protected by Authenticate and Azure AD B2C +* **Haventec Authenticate service** - external IdP for the Azure AD B2C tenant -- Haventec Authenticate service - Acts as the external Identity Provider to your Azure AD B2C tenant.+The following diagram illustrates sign-up and sign-in user flows in the Haventec Authenticate integration. -The following architecture diagram shows the implementation. +  - +1. User selects sign-in or sign-up and enters a username. +2. The application sends user attributes to Azure AD B2C for identity verification. +3. Azure AD B2C collects user attributes and sends them to Haventec Authenticate. +4. For new users, Authenticate sends a push notification to the user's mobile device. It can send an email with a one-time password (OTP) for device registration. +5. User responds and is granted or denied access. New cryptographic keys are pushed to the user's device for a future session. -| Steps | Description | -|:-|:-| -| 1. | User arrives at a login page. Users select sign-in/sign-up and enter the username| -| 2. | The application sends the user attributes to Azure AD B2C for identity verification.| -| 3.| Azure AD B2C collects the user attributes and sends the attributes to Haventec to authenticate the user through the Haventec Authenticate app.| -| 4. |For new users only, Haventec Authenticate sends a push notification to the registered users' mobile device. 
It can also send an email with an OTP for device registration.| -| 5. | After the user responds to the push notification, the user is either granted or denied access to the customer application based on the verification results. New cryptographic keys are generated and pushed into the user's device to have it ready for the next session. | +## Get started with Authenticate -## Onboard with Haventec +Go to the haventec.com [Get a demo of Haventec Authenticate](https://www.haventec.com/products/get-started) page. In the personalized demo request form, indicate your interest in Azure AD B2C integration. An email arrives when the demo environment is ready. -Get in touch with Haventec to [request a demo](https://www.haventec.com/products/get-started). While filling out the request form, indicate that you want to onboard with Azure AD B2C. You'll be notified through email once your demo environment is ready. +## Integrate Authenticate with Azure AD B2C -## Integrate Haventec with Azure AD B2C +Use the following instructions to prepare for and integrate Azure AD B2C with Authenticate. ### Prerequisites -To get started, you'll need: --- An Azure AD subscription. If you don\'t have one, get a [free- account](https://azure.microsoft.com/free/). --- An [Azure AD B2C tenant](tutorial-create-tenant.md) that is linked to your Azure subscription.--- A Haventec Authenticate [demo environment](https://www.haventec.com/products/get-started).--### Part - 1 Create an application registration in Haventec --If you haven't already done so, [register](tutorial-register-applications.md) a web application. --### Part - 2 Add a new Identity provider in Azure AD B2C +To get started, you need: -1. Sign in to the [Azure portal](https://portal.azure.com/#home) as the global administrator of your Azure AD B2C tenant. +* An Azure AD subscription + * If you don't have one, get an [Azure free account](https://azure.microsoft.com/free/) +* An Azure AD B2C tenant linked to the Azure subscription + * See [Tutorial: Create an Azure Active Directory B2C tenant](tutorial-create-tenant.md) +* A Haventec Authenticate demo environment + * See [Get a demo of Haventec Authenticate](https://www.haventec.com/products/get-started) -2. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant. +### Create a web application registration -3. Choose **All services** in the top-left corner of the Azure portal, search for and select **Azure AD B2C**. +Before applications can interact with Azure AD B2C, register them in a tenant you manage. -4. Navigate to **Dashboard** > **Azure Active Directory B2C** > **Identity providers**. +See [Tutorial: Register a web application in Azure Active Directory B2C](tutorial-register-applications.md). -5. Select **New OpenID Connect Provider**. +### Add a new identity provider in Azure AD B2C -6. Select **Add**. +For the following instructions, use the directory with the Azure AD B2C tenant. -### Part - 3 Configure an Identity provider +1. Sign in to the [Azure portal](https://portal.azure.com/#home) as the Global Administrator of your Azure AD B2C tenant. +2. In the top menu, select **Directory + subscription**. +3. Select the directory with the tenant. +4. In the top-left corner of the Azure portal, select **All services**. +5. Search for and select **Azure AD B2C**. +6. Navigate to **Dashboard** > **Azure Active Directory B2C** > **Identity providers**. +7. 
Select **New OpenID Connect Provider**. +8. Select **Add**. -To configure an identity provider, follow these steps: +### Configure an identity provider -1. Select **Identity provider type** > **OpenID Connect** +To configure an identity provider: -2. Fill out the form to set up the Identity provider: -- | Property | Value| - |:--|:| - |Name |Enter Haventec or a name of your choice| - |Metadata URL| `https://iam.demo.haventec.com/auth/realms/*your\_realm\_name*/.well-known/openid-configuration`| - |Client ID | The application ID from the Haventec admin UI captured in Part - 1 | - |Client Secret | The application Secret from the Haventec admin UI captured in Part - 1 | - |Scope | OpenID email profile| - |Response type | Code | - |Response mode | forms_post | - |Domain hint | Blank | --3. Select **OK**. --4. Select **Map this identity provider's claims**. --5. Fill out the form to map the Identity provider: -- | Property | Value| - |:--|:| - | User ID | From subscription | - | Display name | From subscription | - | Given name | given_name | - | Surname | family_name | - | Email | Email | --6. Select **Save** to complete the setup for your new OIDC Identity provider. +1. Select **Identity provider type** > **OpenID Connect**. +2. For **Name**, enter **Haventec**, or another name. +3. For **Metadata URL**, use `https://iam.demo.haventec.com/auth/realms/<your_realm_name>/.well-known/openid-configuration`. +4. For **Client ID**, enter the application ID recorded from the Haventec admin UI. +5. For **Client Secret**, enter the application Secret recorded from the Haventec admin UI. +6. For **Scope**, select **OpenID email profile**. +7. For **Response type**, select **Code**. +8. For **Response mode**, select **forms_post**. +9. For **Domain hint**, leave blank. +10. Select **OK**. +11. Select **Map this identity provider's claims**. +12. For **User ID**, select **From subscription**. +13. For **Display name**, select **From subscription**. +14. For **Given name**, use **given_name**. +15. For **Surname**, use **family_name**. +16. For **Email**, use **Email**. +17. Select **Save**. ## Create a user flow policy -You should now see Haventec as a new OIDC Identity provider listed within your B2C identity providers. --1. In your Azure AD B2C tenant, under **Policies**, select **User flows**. +For the following instructions, Haventec is a new OIDC identity provider in the B2C identity providers list. +1. In the Azure AD B2C tenant, under **Policies**, select **User flows**. 2. Select **New user flow**.- 3. Select **Sign up and sign in** > **version** > **Create**.--4. Enter a **Name** for your policy. --5. In the Identity providers section, select your newly created Haventec Identity provider. --6. Select **None** for Local Accounts to disable email and password-based authentication. --7. Select **Run user flow** --8. In the form, enter the Replying URL, for example, `https://jwt.ms` --9. The browser will be redirected to the Haventec login page --10. User will be asked to register if new or enter a PIN for an existing user. --11. Once the authentication challenge is accepted, the browser will redirect the user to the replying URL. +4. Enter a **Name** for the policy. +5. In **Identity providers**, select the created Haventec identity provider. +6. For **Local Accounts**, select **None**. This selection disables email and password authentication. +7. Select **Run user flow**. +8. In the form, enter the reply URL, for example, `https://jwt.ms`. +9. 
The browser redirects to the Haventec sign-in page. +10. User is prompted to register or enter a PIN. +11. The authentication challenge is performed. +12. The browser redirects to the reply URL. ## Test the user flow -Open the Azure AD B2C tenant and under Policies select **User flows**. --1. Select your previously created **User Flow**. --2. Select **Run user flow** and select the settings: -- a. **Application**: select the registered app (sample is JWT) -- b. **Reply URL**: select the redirect URL -- c. Select **Run user flow**. --3. Go through sign-up flow and create an account --4. Haventec Authenticate will be called during the flow. --## Additional resources +1. In the Azure AD B2C tenant, under **Policies**, select **User flows**. +2. Select the created **User Flow**. +3. Select **Run user flow**. +4. For **Application**, select the registered app. The example is JWT. +5. For **Reply URL**, select the redirect URL. +6. Select **Run user flow**. +7. Perform a sign-up flow and create an account. +8. Haventec Authenticate is called. -For additional information, review the following articles: +## Next steps -- [Haventec](https://docs.haventec.com/) documentation+* Go to docs.haventec.com for [Haventec Documentation](https://docs.haventec.com/) +* [Azure AD B2C custom policy overview](custom-policy-overview.md) -- [Custom policies in Azure AD B2C](custom-policy-overview.md) -- [Get started with custom policies in Azure AD B2C](custom-policy-get-started.md?tabs=applications) |
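Before saving the Haventec identity provider, you can sanity-check the metadata URL from a terminal. This verification step is not part of the published tutorial; it is a minimal sketch, and the realm name is a placeholder you must replace.

```bash
# Sketch: confirm the Haventec OIDC discovery document resolves before
# configuring it in Azure AD B2C. Replace <your_realm_name> with your realm.
curl -s "https://iam.demo.haventec.com/auth/realms/<your_realm_name>/.well-known/openid-configuration"
# A valid response is a JSON document containing, among other fields,
# "issuer", "authorization_endpoint", and "token_endpoint".
```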
active-directory-b2c | Partner Nok Nok | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-nok-nok.md | Title: Tutorial to configure Azure Active Directory B2C with Nok Nok + Title: Tutorial to configure Nok Nok Passport with Azure Active Directory B2C for passwordless FIDO2 authentication -description: Tutorial to configure Nok Nok with Azure Active Directory B2C to enable passwordless FIDO2 authentication +description: Configure Nok Nok Passport with Azure AD B2C to enable passwordless FIDO2 authentication -+ - Previously updated : 09/20/2021 Last updated : 03/13/2023 -# Tutorial: Configure Nok Nok with Azure Active Directory B2C to enable passwordless FIDO2 authentication +# Tutorial: Configure Nok Nok Passport with Azure Active Directory B2C for passwordless FIDO2 authentication -In this sample tutorial, learn how to integrate the Nok Nok S3 authentication suite into your Azure Active Directory (AD) B2C tenant. [Nok Nok](https://noknok.com/) enables FIDO certified multifactor authentication such as FIDO UAF, FIDO U2F, WebAuthn, and FIDO2 for mobile and web applications. Using Nok Nok customers can improve their security posture while balancing user experience. +Learn to integrate the Nok Nok S3 Authentication Suite into your Azure Active Directory B2C (Azure AD B2C) tenant. Nok Nok solutions enable FIDO certified multi-factor authentication such as FIDO UAF, FIDO U2F, WebAuthn, and FIDO2 for mobile and web applications. Nok Nok solutions improve security posture while balancing user experience. -## Prerequisites -To get started, you'll need: +Go to noknok.com to learn more: [Nok Nok Labs, Inc.](https://noknok.com/) -- An Azure subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).+## Prerequisites -- [An Azure AD B2C tenant](tutorial-create-tenant.md) that is linked to your Azure subscription.+To get started, you need: -- Get a free Nok Nok [trial tenant](https://noknok.com/products/strong-authentication-service/).+* An Azure subscription + * If you don't have one, get an [Azure free account](https://azure.microsoft.com/free/) +* An Azure AD B2C tenant linked to the Azure subscription + * [Tutorial: Create an Azure Active Directory B2C tenant](tutorial-create-tenant.md) +* Go to [noknok.com](https://noknok.com/products/strong-authentication-service/). On the top menu, select **Demo**. ## Scenario description -To enable passwordless FIDO authentication to your users, enable Nok Nok as an Identity provider to your Azure AD B2C tenant. The Nok Nok integration includes the following components: --- **Azure AD B2C** - The authorization server, responsible for verifying the user's credentials.+To enable passwordless FIDO authentication for your users, enable Nok Nok as an identity provider (IdP) in your Azure AD B2C tenant. Nok Nok solution integration includes the following components: -- **Web and mobile applications** - Your mobile or web applications that you choose to protect with Nok Nok and Azure AD B2C.+* **Azure AD B2C** - authorization server that verifies user credentials +* **Web and mobile applications** - mobile or web apps to protect with Nok Nok solutions and Azure AD B2C +* **Nok Nok app SDK or Passport app** - authenticate Azure AD B2C enabled applications. 
+ * Go to the Apple App Store for [Nok Nok Passport](https://apps.apple.com/us/app/nok-nok-passport/id1050437340) + * Or, go to Google Play for [Nok Nok Passport](https://play.google.com/store/apps/details?id=com.noknok.android.passport2&hl=en&gl=US) -- **The Nok Nok app SDK or Nok Nok Passport app** - Applications used to authenticate Azure AD B2C enabled applications. These applications are available on [Apple app store](https://apps.apple.com/us/app/nok-nok-passport/id1050437340) and [Google play store](https://play.google.com/store/apps/details?id=com.noknok.android.passport2&hl=en&gl=US).+The following diagram illustrates the Nok Nok solution as IdP for Azure AD B2C using Open ID Connect (OIDC) for passwordless authentication. -The following architecture diagram shows the implementation. Nok Nok is acting as an Identity provider for Azure AD B2C using Open ID Connect (OIDC) to enable passwordless authentication. +  - +1. At the sign-in page, user selects sign-in or sign-up and enters the username. +2. Azure AD B2C redirects user to the Nok Nok OIDC authentication provider. +3. For mobile authentications, a QR code appears or a push notification goes to the user's device. For desktop sign-in, the user is redirected to the web app sign-in page for passwordless authentication. +4. User scans the QR code with the Nok Nok app SDK or Passport app. Or, the user enters the username on the sign-in page. +5. User is prompted for authentication on the smartphone or web application. User does passwordless authentication: biometrics, device PIN, or any roaming authenticator. +6. Nok Nok server validates FIDO assertion and sends OIDC authentication response to Azure AD B2C. +7. User is granted or denied access. -| Step | Description | -|:|:--| -| 1. | User arrives at a login page. Users select sign-in/sign-up and enter the username | -| 2. | Azure AD B2C redirects the user to the Nok Nok OIDC authentication provider. | -| 3a. | For mobile based authentications, Nok Nok either displays a QR code or sends a push notification request to the end user's mobile device. | -| 3b. | For Desktop/PC based login, Nok Nok redirects the end user to the web application login page to initiate a passwordless authentication prompt. | -|4a. | The user scans the displayed QR code in their smartphone using Nok Nok app SDK or Nok Nok Passport app.| -| 4b. | User provides username as an input on the login page of the web application and selects next. | -| 5a. | User is prompted for authentication on smartphone. <BR> User does passwordless authentication by using the user's preferred method, such as biometrics, device PIN, or any roaming authenticator.| -| 5b. | User is prompted for authentication on web application. <BR> User does passwordless authentication by using the user's preferred method, such as biometrics, device PIN, or any roaming authenticator. | -| 6. | Nok Nok server validates FIDO assertion and upon validation, sends OIDC authentication response to Azure AD B2C.| -| 7. | Based on the response user is granted or denied access. | +## Get started with Nok Nok -## Onboard with Nok Nok --Fill out the [Nok Nok cloud form](https://noknok.com/contact/) to create your own Nok Nok tenant. Once you submit the form, you'll receive an email explaining how to access your tenant. The email will also include access to Nok Nok guides. 
Follow the instructions provided in the Nok Nok integration guide to complete the OIDC configuration of your Nok Nok cloud tenant. +1. Go to the noknok.com [Contact](https://noknok.com/contact/) page. +2. Fill out the form for a Nok Nok tenant. +3. An email arrives with tenant access information and links to documentation. +4. Use the Nok Nok integration documentation to complete the tenant OIDC configuration. ## Integrate with Azure AD B2C +Use the following instructions to add and configure an IdP, and then configure a user flow. + ### Add a new Identity provider -To add a new Identity provider, follow these steps: +For the following instructions, use the directory with the Azure AD B2C tenant. To add a new IdP: -1. Sign in to the **[Azure portal](https://portal.azure.com/#home)** as the global administrator of your Azure AD B2C tenant. -1. Make sure you're using the directory that contains your Azure AD B2C tenant. Select the **Directories + subscriptions** icon in the portal toolbar. -1. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**. -1. Choose **All services** in the top-left corner of the Azure portal, search for and select **Azure AD B2C**. -1. Navigate to **Dashboard** > **Azure Active Directory B2C** > **Identity providers** -1. Select **Identity providers**. -1. Select **Add**. +1. Sign in to the **[Azure portal](https://portal.azure.com/#home)** as Global Administrator of the Azure AD B2C tenant. +2. In the portal toolbar, select the **Directories + subscriptions** icon. +3. On **Portal settings, Directories + subscriptions**, in the **Directory name** list, locate the Azure AD B2C directory. +4. Select **Switch**. +5. In the top-left corner of the Azure portal, select **All services**. +6. Search for and select **Azure AD B2C**. +7. Navigate to **Dashboard** > **Azure Active Directory B2C** > **Identity providers**. +8. Select **Identity providers**. +9. Select **Add**. ### Configure an Identity provider -To configure an Identity provider, follow these steps: --1. Select **Identity provider type** > **OpenID Connect (Preview)** -1. Fill out the form to set up the Identity provider: -- |Property | Value | - |:--| :--| - | Name | Nok Nok Authentication Provider | - | Metadata URL | Insert the URI of the hosted Nok Nok Authentication app, followed by the specific path such as 'https://demo.noknok.com/mytenant/oidc/.well-known/openid-configuration' | - | Client Secret | Use the client Secret provided by the Nok Nok platform.| - | Client ID | Use the client ID provided by the Nok Nok platform.| - | Scope | OpenID profile email | - | Response type | code | - | Response mode | form_post| --1. Select **OK**. --1. Select **Map this identity provider's claims**. --1. Fill out the form to map the Identity provider: -- |Property | Value | - |:--| :--| - | UserID | From subscription | - | Display name | From subscription | - | Response mode | From subscription | --1. Select **Save** to complete the setup for your new OIDC Identity provider. +To configure an IdP: ++1. Select **Identity provider type** > **OpenID Connect (Preview)**. +2. For **Name**, enter Nok Nok Authentication Provider, or another name. +3. For **Metadata URL**, enter the hosted Nok Nok Authentication app URI, followed by a path such as `https://demo.noknok.com/mytenant/oidc/.well-known/openid-configuration`. +4. For **Client Secret**, use the Client Secret from Nok Nok. +5. 
For **Client ID**, use the client ID provided by Nok Nok. +6. For **Scope**, use **OpenID profile email**. +7. For **Response type**, use **code**. +8. For **Response mode**, use **form_post**. +9. Select **OK**. +10. Select **Map this identity provider's claims**. +11. For **UserID**, select **From subscription**. +12. For **Display name**, select **From subscription**. +13. For **Response mode**, select **From subscription**. +14. Select **Save**. ### Create a user flow policy -You should now see Nok Nok as a new OIDC Identity provider listed within your B2C identity providers. +For the following instructions, Nok Nok is a new OIDC IdP in the B2C identity providers list. 1. In your Azure AD B2C tenant, under **Policies**, select **User flows**.--2. Select **New** user flow. --3. Select **Sign up and sign in**, select a **version**, and then select **Create**. --4. Enter a **Name** for your policy. --5. In the Identity providers section, select your newly created Nok Nok Identity provider. --6. Set up the parameters of your User flow. Insert a name and select the Identity provider you've created. You can also add email address. In this case, Azure won't redirect the login procedure directly to Nok Nok instead it will show a screen where the user can choose the option they would like to use. --7. Leave the **Multi-factor Authentication** field as is. --8. Select **Enforce conditional access policies** --9. Under **User attributes and token claims**, select **Email Address** in the Collect attribute option. You can add all the attributes that Azure AD can collect about the user alongside the claims that Azure AD B2C can return to the client application. --10. Select **Create**. --11. After a successful creation, select your new **User flow**. --12. On the left panel, select **Application Claims**. Under options, tick the **email** checkbox and select **Save**. +2. Select **New**. +3. Select **Sign up and sign in**. +4. Select a **version**. +5. Select **Create**. +6. Enter a policy **Name**. +7. In **Identity providers**, select the created Nok Nok IdP. +8. If you add an email address, Azure AD B2C doesn't redirect sign-in directly to Nok Nok; instead, a screen appears where the user chooses a sign-in option. +9. Leave the **Multi-factor Authentication** field as is. +10. Select **Enforce conditional access policies**. +11. Under **User attributes and token claims**, in the Collect attribute option, select **Email Address**. +12. Add user attributes for Azure AD to collect, with claims that Azure AD B2C returns to the client application. +13. Select **Create**. +14. Select the new **User flow**. +15. On the left panel, select **Application Claims**. +16. Under options, select the **email** checkbox. +17. Select **Save**. ## Test the user flow -1. Open the Azure AD B2C tenant and under Policies select Identity Experience Framework. +1. Open the Azure AD B2C tenant and under **Policies** select **Identity Experience Framework**. +2. Select the created **SignUpSignIn**. +3. Select **Run user flow**. +4. For **Application**, select the registered app. The example is JWT. +5. For **Reply URL**, select the redirect URL. +6. Select **Run user flow**. +7. Perform a sign-up flow and create an account. +8. After the user attribute is created, Nok Nok is called. -2. Select your previously created SignUpSignIn. --3. Select Run user flow and select the settings: -- a. Application: select the registered app (sample is JWT) -- b. Reply URL: select the redirect URL -- c. Select Run user flow. --4. Go through sign-up flow and create an account --5. 
Nok Nok will be called during the flow, after user attribute is created. If the flow is incomplete, check that user isn't saved in the directory. +If the flow is incomplete, confirm whether the user is saved in the directory. ## Next steps -For additional information, review the following articles: --- [Custom policies in Azure AD B2C](./custom-policy-overview.md)--- [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy)+* [Azure AD B2C custom policy overview](./custom-policy-overview.md) +* [Tutorial: Create user flows and custom policies in Azure Active Directory B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy) |
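As a rough illustration of what **Run user flow** does under the hood, the request below is the standard Azure AD B2C authorization URL for a user flow, with `https://jwt.ms` as the reply URL used in the tutorial. The tenant name, policy name, and app ID are placeholders, not values from the article.

```bash
# Sketch: the authorization request issued when testing a B2C user flow.
# Replace <tenant>, <policy-name>, and <app-id> with your own values.
curl -i "https://<tenant>.b2clogin.com/<tenant>.onmicrosoft.com/<policy-name>/oauth2/v2.0/authorize?client_id=<app-id>&response_type=id_token&redirect_uri=https%3A%2F%2Fjwt.ms&response_mode=form_post&scope=openid&nonce=defaultNonce"
# A 200 response returns the sign-in page; completing authentication
# posts an ID token to https://jwt.ms for inspection.
```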
active-directory-b2c | Relyingparty | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/relyingparty.md | The optional **RelyingParty** element contains the following elements: | UserJourneyBehaviors | 0:1 | The scope of the user journey behaviors. | | TechnicalProfile | 1:1 | A technical profile that's supported by the RP application. The technical profile provides a contract for the RP application to contact Azure AD B2C. | +You need to create the **RelyingParty** child elements in the order presented in the preceding table. + ## Endpoints The **Endpoints** element contains the following element: |
active-directory-b2c | Support Options | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/support-options.md | - Title: Support for Azure Active Directory B2C -description: How to file support requests for Azure Active Directory B2C. ------- Previously updated : 12/06/2016-----# Azure Active Directory B2C: File Support Requests -You can file support requests for Azure Active Directory B2C (Azure AD B2C) on the Azure portal using the following steps: --1. Switch from your B2C tenant to another tenant that has an Azure subscription associated with it. Typically, the latter is your employee tenant or the default tenant created for you when you signed up for an Azure subscription. To learn more, see [how an Azure subscription is related to Azure AD](../active-directory/fundamentals/active-directory-how-subscriptions-associated-directory.md). --1. After switching tenants, click **Help + support**. --1. Click **New support request**. --1. In the **Basics** blade, use these details and click **Next**. -- * **Issue type** is **Technical**. - * Choose the appropriate **Subscription**. - * **Service** is **Active Directory**. - * Choose the appropriate **Support plan**. If you don't have one, you can sign up for one [here](https://azure.microsoft.com/support/plans/). --1. In the **Problem** blade, use these details and click **Next**. -- * Choose the appropriate **Severity** level. - * **Problem type** is **B2C**. - * Choose the appropriate **Category**. - * Describe your issue in the **Details** field. Provide details such as the B2C tenant name, description of the problem, error messages, correlation IDs (if available), and so on. - * In the **Time frame** field, provide the date and time (including time zone) that the issue occurred. - * Under **File upload**, upload all screenshots and files that you think would assist in resolving the issue. --1. In the **Contact information** blade, add your contact information. Click **Create**. --1. After submitting your support request, you can monitor it by clicking **Help + support** on the Startboard, and then **Manage support requests**. --## Known issue: Filing a support request in the context of a B2C tenant --If you missed step 2 outlined above and try to create a support request in the context of your B2C tenant, you will see the following error. --> [!IMPORTANT] -> Don't attempt to sign up for a new Azure subscription in your B2C tenant. -- |
active-directory-b2c | Supported Azure Ad Features | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/supported-azure-ad-features.md | |
active-directory-b2c | Tutorial Register Applications | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/tutorial-register-applications.md | -A "web application" refers to a traditional web application that performs most of the application logic on the server. They may be built using frameworks like ASP.NET Core, Maven (Java), Flask (Python), and Express (Node.js). +A "web application" refers to a traditional web application that performs most of the application logic on the server. They may be built using frameworks like ASP.NET Core, Spring (Java), Flask (Python), and Express (Node.js). > [!IMPORTANT] > If you're using a single-page application ("SPA") instead (e.g. using Angular, Vue, or React), learn [how to register a single-page application](tutorial-register-spa.md). |
active-directory | User Provisioning | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-provisioning/user-provisioning.md | Title: What is automated app user provisioning in Azure Active Directory -description: An introduction to how you can use Azure Active Directory to automatically provision, de-provision, and continuously update user accounts across multiple third-party applications. +description: An introduction to how you can use Azure Active Directory to automatically provision, deprovision, and continuously update user accounts across multiple third-party applications. Previously updated : 02/21/2023 Last updated : 03/13/2023 Azure AD user provisioning can help address these challenges. To learn more abou ## What applications and systems can I use with Azure AD automatic user provisioning? -Azure AD features pre-integrated support for many popular SaaS apps and human resources systems, and generic support for apps that implement specific parts of the [SCIM 2.0 standard](https://techcommunity.microsoft.com/t5/Identity-Standards-Blog/Provisioning-with-SCIM-getting-started/ba-p/880010). +Azure AD features preintegrated support for many popular SaaS apps and human resources systems, and generic support for apps that implement specific parts of the [SCIM 2.0 standard](https://techcommunity.microsoft.com/t5/Identity-Standards-Blog/Provisioning-with-SCIM-getting-started/ba-p/880010). -* **Pre-integrated applications (gallery SaaS apps)**: You can find all applications for which Azure AD supports a pre-integrated provisioning connector in [Tutorials for integrating SaaS applications with Azure Active Directory](../saas-apps/tutorial-list.md). The pre-integrated applications listed in the gallery generally use SCIM 2.0-based user management APIs for provisioning. +* **Preintegrated applications (gallery SaaS apps)**: You can find all applications for which Azure AD supports a preintegrated provisioning connector in [Tutorials for integrating SaaS applications with Azure Active Directory](../saas-apps/tutorial-list.md). The preintegrated applications listed in the gallery generally use SCIM 2.0-based user management APIs for provisioning.  Azure AD features pre-integrated support for many popular SaaS apps and human re ## How do I set up automatic provisioning to an application? -For pre-integrated applications listed in the gallery, use existing step-by-step guidance to set up automatic provisioning, see [Tutorials for integrating SaaS applications with Azure Active Directory](../saas-apps/tutorial-list.md). The following video shows you how to set up automatic user provisioning for SalesForce. +For preintegrated applications listed in the gallery, use existing step-by-step guidance to set up automatic provisioning, see [Tutorials for integrating SaaS applications with Azure Active Directory](../saas-apps/tutorial-list.md). The following video shows you how to set up automatic user provisioning for Salesforce. > [!VIDEO https://www.youtube.com/embed/pKzyts6kfrw] |
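For apps outside the gallery, provisioning relies on the SCIM 2.0 protocol referenced above. The request below is a minimal sketch of the create-user call defined by SCIM 2.0 (RFC 7644) that a provisioning-compatible app's `/Users` endpoint is expected to accept; the endpoint URL, bearer token, and user name are placeholders, not values from the article.

```bash
# Sketch: the SCIM 2.0 create-user request a provisioning-compatible app
# must handle. Endpoint and token are placeholders for your SCIM service.
curl -X POST 'https://<your-scim-endpoint>/scim/v2/Users' \
  -H 'Authorization: Bearer <token>' \
  -H 'Content-Type: application/scim+json' \
  -d '{
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": "kim.akers@contoso.com",
        "active": true
      }'
```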
active-directory | Application Proxy Configure Complex Application | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/app-proxy/application-proxy-configure-complex-application.md | To publish complex distributed app through Application Proxy with application se 3. On the Manage and configure application segments page, select "+ Add app segment" - :::image type="content" source="./media/application-proxy-configure-complex-application/add-application-segment-1.png" alt-text="Screenshot pf Manage and configure application segment blade."::: + :::image type="content" source="./media/application-proxy-configure-complex-application/add-application-segment-1.png" alt-text="Screenshot of Manage and configure application segment blade."::: 4. In the Internal Url field, enter the internal URL for your app. 5. In the External Url field, drop down the list and select the custom domain you want to use. -6. Add CORS Rules (optional). For more information see [Configuring CORS Rule](https://learn.microsoft.com/graph/api/resources/corsconfiguration_v2?view=graph-rest-beta) +6. Add CORS Rules (optional). For more information, see [Configuring CORS Rule](/graph/api/resources/corsconfiguration_v2?view=graph-rest-beta). 7. Select Create. |
active-directory | Concept Certificate Based Authentication Technical Deep Dive | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/concept-certificate-based-authentication-technical-deep-dive.md | If CBA enabled user cannot use MF cert (such as on mobile device without smart c ## MFA with Single-factor certificate-based authentication -Azure AD CBA can be used as a second factor to meet MFA requirements with single-factor certificates. The supported combintaions are +Azure AD CBA can be used as a second factor to meet MFA requirements with single-factor certificates. +Some of the supported combinations are: -CBA (first factor) + passwordless phone sign-in (PSI as second factor) -CBA (first factor) + FIDO2 security keys -Password (first factor) + CBA (second factor) +1. CBA (first factor) + passwordless phone sign-in (PSI as second factor) +1. CBA (first factor) + FIDO2 security keys (second factor) +1. Password (first factor) + CBA (second factor) Users need to have another way to get MFA and register passwordless sign-in or FIDO2 in advance of signing in with Azure AD CBA. >[!IMPORTANT]->A user will be considered MFA capable when a user is in scope for Certificate-based authentication auth method. This means user will not be able to use proof up as part of their authentication to registerd other available methods. More info on [Azure AD MFA](../authentication/concept-mfa-howitworks.md) +>A user will be considered MFA capable when the user is in scope for the certificate-based authentication method. This means the user will not be able to use proof up as part of their authentication to register other available methods. Make sure users who do not have a valid certificate are not part of the CBA auth method scope. More info on [Azure AD MFA](../authentication/concept-mfa-howitworks.md) **Steps to set up passwordless phone sign-in (PSI) with CBA** |
active-directory | Usage Analytics Access Keys | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/cloud-infrastructure-entitlement-management/usage-analytics-access-keys.md | The **Analytics** dashboard in Permissions Management provides details about identities - **Access Keys**: Tracks the permission usage of access keys for a given user. - **Serverless Functions**: Tracks assigned permissions and usage of the serverless functions. +> [!NOTE] +> Currently, Microsoft Azure and Google Cloud Platform (GCP) don't provide enough access key information to return access key data. Access key analytics are currently available only for Amazon Web Services (AWS) accounts. + This article describes how to view usage analytics about access keys. ## Create a query to view access keys When you select **Access keys**, the **Analytics** dashboard provides a high-lev The following components make up the **Access Keys** dashboard: - - **Authorization System Type**: Select the authorization you want to use: Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP). - - **Authorization System**: Select from a **List** of accounts and **Folders***. + - **Authorization System Type**: Select **AWS**. + - **Authorization System**: Select from a **List** of accounts and **Folders**. - **Key Status**: Select **All**, **Active**, or **Inactive**. - **Key Activity State**: Select **All**, how long the access key has been used, or **Not Used**. - **Key Age**: Select **All** or how long ago the access key was created. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by authorization system type -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. Select **Apply** to run your query and display the information you selected. Select **Reset Filter** to discard your changes. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by authorization system -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. From the **Authorization System** dropdown, select accounts from a **List** of accounts and **Folders**. 1. Select **Apply** to run your query and display the information you selected. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by key status -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. From the **Authorization System** dropdown, select from a **List** of accounts and **Folders**. 1. From the **Key Status** dropdown, select the type of key: **All**, **Active**, or **Inactive**. 1. Select **Apply** to run your query and display the information you selected. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by key activity status -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. 
From the **Authorization System** dropdown, select from a **List** of accounts and **Folders**. 1. From the **Key Activity State** dropdown, select **All**, the duration for how long the access key has been used, or **Not Used**. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by key age -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. From the **Authorization System** dropdown, select from a **List** of accounts and **Folders**. 1. From the **Key Age** dropdown, select **All** or how long ago the access key was created. Filters can be applied in one, two, or all three categories depending on the typ ### Apply filters by task type -1. From the **Authorization System Type** dropdown, select the authorization system you want to use: **AWS**, **Azure**, or **GCP**. +1. From the **Authorization System Type** dropdown, select **AWS**. 1. From the **Authorization System** dropdown, select from a **List** of accounts and **Folders**. 1. From the **Task Type** dropdown, select **All** tasks, **High Risk Tasks** or, for a list of tasks where users have deleted data, select **Delete tasks**. 1. Select **Apply** to run your query and display the information you selected. |
active-directory | Concept Conditional Access Cloud Apps | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-cloud-apps.md | The following key applications are affected by the Office 365 cloud app: - Microsoft Flow - Microsoft Office 365 Portal - Microsoft Office client application-- Microsoft Stream - Microsoft To-Do WebApp - Microsoft Whiteboard Services - Office Delve |
active-directory | Custom Claims Provider Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/custom-claims-provider-overview.md | When a user authenticates to an application, a custom claims provider can be use Key data about a user is often stored in systems external to Azure AD. For example, secondary email, billing tier, or sensitive information. Some applications may rely on these attributes for the application to function as designed. For example, the application may block access to certain features based on a claim in the token. +The following short video provides an excellent overview of the Azure AD custom extensions and custom claims providers: +> [!VIDEO https://www.youtube.com/embed/BYOMshjlwbc] + Use a custom claims provider for the following scenarios: - **Migration of legacy systems** - You may have legacy identity systems such as Active Directory Federation Services (AD FS) or data stores (such as LDAP directory) that hold information about users. You'd like to migrate these applications, but can't fully migrate the identity data into Azure AD. Your apps may depend on certain information on the token, and can't be rearchitected. |
active-directory | Custom Extension Get Started | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/custom-extension-get-started.md | -This how-to guide demonstrates the token issuance start event with a REST API running in Azure Functions and a sample OpenID Connect application. +This how-to guide demonstrates the token issuance start event with a REST API running in Azure Functions and a sample OpenID Connect application. Before you start, take a look at the following video, which demonstrates how to configure an Azure AD custom claims provider with a Function App: ++> [!VIDEO https://www.youtube.com/embed/r-JEsMBJ7GE] ## Prerequisites To protect your Azure function, follow these steps to integrate Azure AD authent > [!NOTE] > If the Azure function app is hosted in a different Azure tenant than the tenant in which your custom extension is registered, skip to the [using OpenID Connect identity provider](#51-using-openid-connect-identity-provider) step. -1. In the [Azure portal](https://poral.azure.com), navigate and select the function app you previously published. +1. In the [Azure portal](https://portal.azure.com), navigate and select the function app you previously published. 1. Select **Authentication** in the menu on the left. 1. Select **Add Identity provider**. 1. Select **Microsoft** as the identity provider. To protect your Azure function, follow these steps to integrate Azure AD authent If you configured the [Microsoft identity provider](#step-5-protect-your-azure-function), skip this step. Otherwise, if the Azure Function is hosted under a different tenant than the tenant in which your custom extension is registered, follow these steps to protect your function: -1. In the [Azure portal](https://poral.azure.com), navigate and select the function app you previously published. +1. In the [Azure portal](https://portal.azure.com), navigate and select the function app you previously published. 1. Select **Authentication** in the menu on the left. 1. Select **Add Identity provider**. 1. Select **OpenID Connect** as the identity provider. |
active-directory | Troubleshoot Publisher Verification | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/troubleshoot-publisher-verification.md | Below are some common issues that may occur during the process. Go to the [MPN User Management page](https://partner.microsoft.com/pcv/users) and filter the user list to see what users are in various admin roles. - **I am getting an error saying that my MPN ID is invalid or that I do not have access to it.**- 1. Go to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) and verify that: - - The MPN ID is correct. - - There are no errors or "pending actions" shown, and the verification status under Legal business profile and Partner info both say "authorized" or "success". - 2. Go to the [MPN tenant management page](https://partner.microsoft.com/dashboard/account/v3/tenantmanagement) and confirm that the tenant the app is registered in and that you're signing with a user account from is on the list of associated tenants. To add another tenant, follow the instructions [here](/partner-center/multi-tenant-account). Be aware that all Global Admins of any tenant you add will be granted Global Administrator privileges on your Partner Center account. - 3. Go to the [MPN User Management page](https://partner.microsoft.com/pcv/users) and confirm the user you're signing in as is either a Global Administrator, MPN Admin, or Accounts Admin. To add a user to a role in Partner Center, follow the instructions [here](/partner-center/create-user-accounts-and-set-permissions). + Follow the [remediation guidance](#mpnaccountnotfoundornoaccess). - **When I sign into the Azure portal, I do not see any apps registered. Why?** Your app registrations may have been created using a different user account in this tenant, a personal/consumer account, or in a different tenant. Ensure you're signed in with the correct account in the tenant where your app registrations were created. The MPN ID you provided (`MPNID`) doesn't exist, or you don't have access to it. Most commonly caused by the signed-in user not being a member of the proper role for the MPN account in Partner Center- see [requirements](publisher-verification-overview.md#requirements) for a list of eligible roles and see [common issues](#common-issues) for more information. Can also be caused by the tenant the app is registered in not being added to the MPN account, or an invalid MPN ID. +**Remediation Steps** +1. Go to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) and verify that: ++ - The MPN ID is correct. + - There are no errors or "pending actions" shown, and the verification status under Legal business profile and Partner info both say "authorized" or "success". +2. Go to the [MPN tenant management page](https://partner.microsoft.com/dashboard/account/v3/tenantmanagement) and confirm that the tenant the app is registered in and that you're signing with a user account from is on the list of associated tenants. To add another tenant, follow the [multi-tenant-account instructions](/partner-center/multi-tenant-account). Be aware that all Global Admins of any tenant you add will be granted Global Administrator privileges on your Partner Center account. +3. Go to the [MPN User Management page](https://partner.microsoft.com/pcv/users) and confirm the user you're signing in as is either a Global Administrator, MPN Admin, or Accounts Admin. 
To add a user to a role in Partner Center, follow the instructions for [creating user accounts and setting permissions](/partner-center/create-user-accounts-and-set-permissions). + ### MPNGlobalAccountNotFound The MPN ID you provided (`MPNID`) isn't valid. Provide a valid MPN ID and try again. Most commonly caused when an MPN ID is provided which corresponds to a Partner Location Account (PLA). Only Partner Global Accounts are supported. See [Partner Center account structure](/partner-center/account-structure) for more details. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) > Identifiers blade > Microsoft Cloud Partners Program tab. +2. Use the Partner ID with type PartnerGlobal. + ### MPNAccountInvalid The MPN ID you provided (`MPNID`) isn't valid. Provide a valid MPN ID and try again. Most commonly caused by the wrong MPN ID being provided. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) > Identifiers blade > Microsoft Cloud Partners Program tab. +2. Use the Partner ID with type PartnerGlobal. + ### MPNAccountNotVetted The MPN ID (`MPNID`) you provided hasn't completed the vetting process. Complete this process in Partner Center and try again. Most commonly caused by when the MPN account hasn't completed the [verification](/partner-center/verification-responses) process. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) and verify that there are no errors or **pending actions** shown, and that the verification status under Legal business profile and Partner info both say **authorized** or **success**. +2. If not, view pending action items in Partner Center and troubleshoot using the guidance [here](/partner-center/verification-responses). + ### NoPublisherIdOnAssociatedMPNAccount The MPN ID you provided (`MPNID`) isn't valid. Provide a valid MPN ID and try again. Most commonly caused by the wrong MPN ID being provided. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) > Identifiers blade > Microsoft Cloud Partners Program tab. +2. Use the Partner ID with type PartnerGlobal. + ### MPNIdDoesNotMatchAssociatedMPNAccount The MPN ID you provided (`MPNID`) isn't valid. Provide a valid MPN ID and try again. Most commonly caused by the wrong MPN ID being provided. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) > Identifiers blade > Microsoft Cloud Partners Program tab. +2. Use the Partner ID with type PartnerGlobal. + ### ApplicationNotFound The target application (`AppId`) can't be found. Provide a valid application ID and try again. -Most commonly caused when verification is being performed via Graph API, and the ID of the application provided is incorrect. Note that the ID of the application must be provided, not the AppId/ClientId. +Most commonly caused when verification is being performed via Graph API, and the ID of the application provided is incorrect. ++**Remediation Steps** +1. The Object ID of the application must be provided, not the AppId/ClientId. See **id** on the list of application properties [here](/graph/api/resources/application). +2. 
Log in to [Azure Active Directory](https://aad.portal.azure.com/) with a user account in your organization's primary tenant > Azure Active Directory > App Registrations blade. +3. Find your app's registration to view the Object ID. + ### ApplicationObjectisInvalid The target application's object ID is invalid. Please provide a valid ID and try Most commonly caused when the verification is being performed via Graph API, and the ID of the application provided does not exist. -> [!NOTE] -> The Object ID of the application must be provided, not the AppId/ClientId. See "id" on the list of application properties at [application resource type - Microsoft Graph v1.0 | Microsoft Learn](/graph/api/resources/application). +**Remediation Steps** +1. The Object ID of the application must be provided, not the AppId/ClientId. See **id** on the list of application properties [here](/graph/api/resources/application). +2. Log in to [Azure Active Directory](https://aad.portal.azure.com/) with a user account in your organization's primary tenant > Azure Active Directory > App Registrations blade. +3. Find your app's registration to view the Object ID. - ### B2CTenantNotAllowed This capability isn't supported in an Azure AD B2C tenant. The target application (`AppId`) must have a Publisher Domain set. Set a Publish Occurs when a [Publisher Domain](howto-configure-publisher-domain.md) isn't configured on the app. +**Remediation Steps** +1. Follow the directions [here](/azure/active-directory/develop/howto-configure-publisher-domain#set-a-publisher-domain-in-the-azure-portal) to set a Publisher Domain. + ### PublisherDomainMismatch The target application's Publisher Domain (`publisherDomain`) either doesn't match the domain used to perform email verification in Partner Center (`pcDomain`) or has not been verified. Ensure these domains match and have been verified then try again. Occurs when neither the app's [Publisher Domain](howto-configure-publisher-domai See [requirements](publisher-verification-overview.md) for a list of allowed domain or sub-domain matches. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile), and view the email listed as Primary Contact. +2. The domain used to perform email verification in Partner Center is the portion after the "@" in the Primary Contact's email. +3. Log in to [Azure Active Directory](https://aad.portal.azure.com/) > Azure Active Directory > App Registrations blade > (`Your App`) > Branding and Properties. +4. Select **Update Publisher Domain** and follow the instructions to **Verify a New Domain**. +5. Add the domain used to perform email verification in Partner Center as a New Domain. ++ ### NotAuthorizedToVerifyPublisher You aren't authorized to set the verified publisher property on application (<`AppId`). Most commonly caused by the signed-in user not being a member of the proper role for the MPN account in Azure AD- see [requirements](publisher-verification-overview.md#requirements) for a list of eligible roles and see [common issues](#common-issues) for more information. +**Remediation Steps** +1. Sign in to the [Azure AD Portal](https://aad.portal.azure.com) using a user account in your organization's primary tenant. +2. Navigate to [Role Management](https://aad.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/RolesAndAdministrators). +3. Select the desired admin role and select **Add Assignment** if you have sufficient permissions. +4. 
If you do not have sufficient permissions, contact an administrator for assistance ++ ### MPNIdWasNotProvided The MPN ID wasn't provided in the request body or the request content type wasn't "application/json". Most commonly caused when the verification is being performed via Graph API, and the MPN ID wasn’t provided in the request. +**Remediation Steps** +1. Navigate to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) > Identifiers blade > Microsoft Cloud Partners Program Tab +2. Use the Partner ID with type PartnerGlobal in the request + ### MSANotSupported This feature isn't supported for Microsoft consumer accounts. Only applications registered in Azure AD by an Azure AD user are supported. Occurs when multi-factor authentication (MFA) hasn't been enabled and performed The error message displayed will be: "Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to proceed." +**Remediation Steps** +1. Ensure [multi-factor authentication](../fundamentals/concept-fundamentals-mfa-get-started.md) is enabled and **required** for the user you're signing in with and for this scenario +2. Retry Publisher Verification + ### UserUnableToAddPublisher When a request to add a verified publisher is made, many signals are used to make a security risk assessment. If the user risk state is determined to be ‘AtRisk’, an error, “You're unable to add a verified publisher to this application. Contact your administrator for assistance” will be returned. Please investigate the user risk and take the appropriate steps to remediate the risk (guidance below): +**Remediation Steps** > [Investigate risk](../identity-protection/howto-identity-protection-investigate-risk.md#risky-users)-+> > [Remediate risk/unblock users](../identity-protection/howto-identity-protection-remediate-unblock.md)-+> > [Self-remediation guidance](../identity-protection/howto-identity-protection-remediate-unblock.md)-+> > Self-serve password reset (SSPR): If the organization allows SSPR, use aka.ms/sspr to reset the password for remediation. Please choose a strong password; choosing a weak password may not reset the risk state. -+> > [!NOTE] > Please give some time after remediation for the risk state to update, and then try again. If you've reviewed all of the previous information and are still receiving an er - TenantId where app is registered - MPN ID - REST request being made -- Error code and message being returned+- Error code and message being returned |
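For reference, the verification request that these error codes come back from can be sketched with the Microsoft Graph PowerShell SDK. This is a minimal sketch rather than the article's own sample: the object ID and MPN ID below are placeholders, and `setVerifiedPublisher` is the documented Microsoft Graph action for this scenario.

```PowerShell
# Sign in as a user that meets the publisher verification requirements.
Connect-MgGraph -Scopes "Application.ReadWrite.All"

# Use the application's *object* ID (not the AppId/ClientId) in the URL,
# and the PartnerGlobal MPN ID in the request body. Both values are placeholders.
$appObjectId = "00000000-0000-0000-0000-000000000000"
$body = @{ verifiedPublisherId = "1234567" } | ConvertTo-Json

Invoke-MgGraphRequest -Method POST -Body $body -ContentType "application/json" `
    -Uri "https://graph.microsoft.com/v1.0/applications/$appObjectId/setVerifiedPublisher"
```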
active-directory | V2 Oauth2 Client Creds Grant Flow | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/v2-oauth2-client-creds-grant-flow.md | An error response (400 Bad Request) looks like this: Now that you've acquired a token, use the token to make requests to the resource. When the token expires, repeat the request to the `/token` endpoint to acquire a fresh access token. ```HTTP-GET /v1.0/me/messages +GET /v1.0/users Host: https://graph.microsoft.com Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Ik5HVEZ2ZEstZnl0aEV1Q... ``` Try the following command in your terminal, making sure to replace the token with your own. ```bash-curl -X GET -H "Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbG...." 'https://graph.microsoft.com/v1.0/me/messages' +curl -X GET -H "Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbG...." 'https://graph.microsoft.com/v1.0/users' ``` ## Code samples and other documentation |
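For context, the `/token` request that precedes these calls can be sketched as follows. This is a minimal sketch using placeholder tenant and client values; the `access_token` field in the response is what goes into the `Authorization` header above, and calling `/v1.0/users` requires an admin-consented application permission such as `User.Read.All`.

```PowerShell
# Acquire an app-only token with the client credentials grant (placeholder values).
$tenantId = "contoso.onmicrosoft.com"
$tokenResponse = Invoke-RestMethod -Method POST `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        client_id     = "00000000-0000-0000-0000-000000000000"
        client_secret = "your-client-secret"
        scope         = "https://graph.microsoft.com/.default"
        grant_type    = "client_credentials"
    }

# Use the token until it expires, then repeat the request above.
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }
```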
active-directory | Device Management Azure Portal | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/device-management-azure-portal.md | To view or copy BitLocker keys, you need to be the owner of the device or have o - Security Administrator - Security Reader -## Block users from viewing their BitLocker keys (preview) -In this preview, admins can block self-service BitLocker key access to the registered owner of the device. Default users without the BitLocker read permission will be unable to view or copy their BitLocker key(s) for their owned devices. --To disable/enable self-service BitLocker recovery: --```PowerShell -Connect-MgGraph -Scopes Policy.ReadWrite.Authorization -$authPolicyUri = "https://graph.microsoft.com/beta/policies/authorizationPolicy/authorizationPolicy" -$body = @{ - defaultUserRolePermissions = @{ - allowedToReadBitlockerKeysForOwnedDevice = $false #Set this to $true to allow BitLocker self-service recovery - } -}| ConvertTo-Json -Invoke-MgGraphRequest -Uri $authPolicyUri -Method PATCH -Body $body -# Show current policy setting -$authPolicy = Invoke-MgGraphRequest -Uri $authPolicyUri -$authPolicy.defaultUserRolePermissions -``` - ## View and filter your devices (preview) In this preview, you have the ability to infinitely scroll, reorder columns, and select all devices. You can filter the device list by these device attributes: In this preview, you have the ability to infinitely scroll, reorder columns, and - OS - Device type (printer, secure VM, shared device, registered device) - MDM+- Autopilot - Extension attributes - Administrative unit - Owner You must be assigned one of the following roles to view or manage device setting > [!NOTE] > The **Users may join devices to Azure AD** setting is applicable only to Azure AD join on Windows 10 or newer. This setting doesn't apply to hybrid Azure AD joined devices, [Azure AD joined VMs in Azure](./howto-vm-sign-in-azure-ad-windows.md#enable-azure-ad-login-for-a-windows-vm-in-azure), or Azure AD joined devices that use [Windows Autopilot self-deployment mode](/mem/autopilot/self-deploying) because these methods work in a userless context. -- **Additional local administrators on Azure AD joined devices**: This setting allows you to select the users who are granted local administrator rights on a device. These users are added to the Device Administrators role in Azure AD. Global Administrators in Azure AD and device owners are granted local administrator rights by default. -This option is a premium edition capability available through products like Azure AD Premium and Enterprise Mobility + Security. - **Users may register their devices with Azure AD**: You need to configure this setting to allow users to register Windows 10 or newer personal, iOS, Android, and macOS devices with Azure AD. If you select **None**, devices aren't allowed to register with Azure AD. Enrollment with Microsoft Intune or mobile device management for Microsoft 365 requires registration. If you've configured either of these services, **ALL** is selected, and **NONE** is unavailable. - **Require Multi-Factor Authentication to register or join devices with Azure AD**: - We recommend organizations use the [Register or join devices user](../conditional-access/concept-conditional-access-cloud-apps.md#user-actions) action in Conditional Access to enforce multifactor authentication. You must configure this toggle to **No** if you use a Conditional Access policy to require multifactor authentication.
This option is a premium edition capability available through products like Azur > [!NOTE] > The **Maximum number of devices** setting applies to devices that are either Azure AD joined or Azure AD registered. This setting doesn't apply to hybrid Azure AD joined devices. +- **Additional local administrators on Azure AD joined devices**: This setting allows you to select the users who are granted local administrator rights on a device. These users are added to the Device Administrators role in Azure AD. Global Administrators in Azure AD and device owners are granted local administrator rights by default. +This option is a premium edition capability available through products like Azure AD Premium and Enterprise Mobility + Security. ++- **Restrict non-admin users from recovering the BitLocker key(s) for their owned devices (preview)**: In this preview, admins can block self-service BitLocker key access to the registered owner of the device. Default users without the BitLocker read permission will be unable to view or copy their BitLocker key(s) for their owned devices. + - **Enterprise State Roaming**: For information about this setting, see [the overview article](enterprise-state-roaming-overview.md). ## Audit logs |
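The new toggle corresponds to the `allowedToReadBitlockerKeysForOwnedDevice` property on the beta authorization policy that the PowerShell removed earlier in this diff used to patch. Assuming that property remains exposed on the beta endpoint, a minimal sketch to check the current value:

```PowerShell
# Read the current self-service BitLocker recovery setting (beta endpoint).
Connect-MgGraph -Scopes "Policy.Read.All"
$authPolicy = Invoke-MgGraphRequest -Uri "https://graph.microsoft.com/beta/policies/authorizationPolicy/authorizationPolicy"

# $true allows self-service recovery; $false blocks it for default users.
$authPolicy.defaultUserRolePermissions.allowedToReadBitlockerKeysForOwnedDevice
```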
active-directory | Directory Delegated Administration Primer | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/directory-delegated-administration-primer.md | There are two types of delegated administration relationships that are visible i ## Granular delegated admin permission -When a Microsoft CSP creates a GDAP relationship request for your tenant, a GDAP relationship is created in the tenant when a global administrator approves the request. The GDAP relationship request specifies: +When a Microsoft CSP creates a GDAP relationship request for your tenant, a global administrator needs to approve the request. The GDAP relationship request specifies: * The CSP partner tenant * The roles that the partner needs to delegate to their technicians * The expiration date -If you have any GDAP relationships in your tenant, you will see a notification banner on the **Delegated Administration** page in the Azure portal. Select the notification banner to see and manage GDAP relationships in the **Partners** page in Microsoft Admin Center. +If you have GDAP relationships in your tenant, you will see a notification banner on the **Delegated Administration** page in the Azure AD admin portal. Select the notification banner to see and manage GDAP relationships in the **Partners** page in Microsoft Admin Center. ## Delegated admin permission -When a Microsoft CSP creates a DAP relationship request for your tenant, a GDAP relationship is created in the tenant when a global administrator approves the request. All DAP relationships enable the CSP to delegate Global administrator and Helpdesk administrator roles to their technicians. Unlike a GDAP relationship, a DAP relationship persists until it is revoked either by you or by your CSP. +All DAP relationships enable the CSP to delegate Global administrator and Helpdesk administrator roles to their technicians. Unlike a GDAP relationship, a DAP relationship persists until it is revoked either by you or by your CSP. If you have any DAP relationships in your tenant, you will see them in the list on the Delegated Administration page in the Azure portal. To remove a DAP relationship for a CSP, follow the link to the Partners page in the Microsoft Admin Center. ## Next steps -If you're a beginning Azure AD administrator, get the basics down in [Azure Active Directory Fundamentals](../fundamentals/index.yml). +If you're a beginning Azure AD administrator, get the basics down in [Azure Active Directory Fundamentals](../fundamentals/index.yml). ++- [Delegated administration privileges (DAP) FAQ](/partner-center/dap-faq) +- [Granular delegated admin privileges (GDAP) introduction](/partner-center/gdap-introduction) |
active-directory | Auth Ssh | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/auth-ssh.md | -Secure Shell (SSH) is a network protocol that provides encryption for operating network services securely over an unsecured network. It's commonly used in Unix-based systems such as Linux. SSH replaces the Telnet protocol, which doesn't provide encryption in an unsecured network. +Secure Shell (SSH) is a network protocol that provides encryption for operating network services securely over an unsecured network. It's commonly used in systems like Unix and Linux. SSH replaces the Telnet protocol, which doesn't provide encryption in an unsecured network. Azure Active Directory (Azure AD) provides a virtual machine (VM) extension for Linux-based systems that run on Azure. It also provides a client extension that integrates with the [Azure CLI](/cli/azure/) and the OpenSSH client. |
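As a rough sketch of how the two extensions mentioned above fit together (the resource group and VM names are placeholders; extension and command names follow the Azure AD SSH login documentation, so verify them against the current docs):

```PowerShell
# Install the Azure AD login VM extension on an existing Linux VM (placeholder names).
az vm extension set `
    --publisher Microsoft.Azure.ActiveDirectory `
    --name AADSSHLoginForLinux `
    --resource-group myResourceGroup `
    --vm-name myVM

# Add the SSH client extension to the Azure CLI, then sign in over SSH with Azure AD.
az extension add --name ssh
az ssh vm --resource-group myResourceGroup --name myVM
```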
active-directory | Azure Active Directory B2c Deployment Plans | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/azure-active-directory-b2c-deployment-plans.md | Use the following checklist for delivery. |Protocol information| Gather the base path, policies, and metadata URL of both variants. </br>Specify attributes such as sample sign-in, client application ID, secrets, and redirects.| |Application samples | See, [Azure Active Directory B2C code samples](../../active-directory-b2c/integrate-with-app-code-samples.md).| |Penetration testing | Inform your operations team about pen tests, then test user flows including the OAuth implementation. </br>See, [Penetration testing](../../security/fundamentals/pen-testing.md) and [Penetration testing rules of engagement](https://www.microsoft.com/msrc/pentest-rules-of-engagement).-| Unit testing | Unit test and generate tokens. </br>See, [Microsoft identity platform and OAuth 2.0 Resource Owner Password Credentials](../develop/v2-oauth-ropc.md). </br>If you reach the Azure AD B2C token limit, see [Azure AD B2C: File Support Requests](../../active-directory-b2c/support-options.md). </br>Reuse tokens to reduce investigation on your infrastructure. </br>[Set up a resource owner password credentials flow in Azure Active Directory B2C](../../active-directory-b2c/add-ropc-policy.md?pivots=b2c-user-flow&tabs=app-reg-ga).| +| Unit testing | Unit test and generate tokens. </br>See, [Microsoft identity platform and OAuth 2.0 Resource Owner Password Credentials](../develop/v2-oauth-ropc.md). </br>If you reach the Azure AD B2C token limit, see [Azure AD B2C: File Support Requests](../../active-directory-b2c/find-help-open-support-ticket.md). </br>Reuse tokens to reduce investigation on your infrastructure. </br>[Set up a resource owner password credentials flow in Azure Active Directory B2C](../../active-directory-b2c/add-ropc-policy.md?pivots=b2c-user-flow&tabs=app-reg-ga).| | Load testing | Learn about [Azure AD B2C service limits and restrictions](../../active-directory-b2c/service-limits.md). </br>Calculate the expected authentications and user sign-ins per month. </br>Assess high load traffic durations and business reasons: holiday, migration, and event. </br>Determine expected peak rates for sign-up, traffic, and geographic distribution, for example per second. ### Security |
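For the unit-testing row above, generating tokens through an ROPC user flow can be sketched as follows. All values are placeholders and the exact parameters depend on your tenant and policy; see the ROPC article linked in the checklist for the authoritative request format.

```PowerShell
# Rough sketch: generate a token through an Azure AD B2C ROPC user flow for unit tests.
# All values are placeholders; see the linked ROPC article for the exact request format.
$tenant = "contoso"
$policy = "B2C_1_ROPC_Auth"
$appId  = "00000000-0000-0000-0000-000000000000"

Invoke-RestMethod -Method POST `
    -Uri "https://$tenant.b2clogin.com/$tenant.onmicrosoft.com/$policy/oauth2/v2.0/token" `
    -Body @{
        client_id     = $appId
        scope         = "openid $appId offline_access"
        grant_type    = "password"
        response_type = "token id_token"
        username      = "test-user@$tenant.onmicrosoft.com"
        password      = "placeholder-password"
    }
```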
active-directory | Users Default Permissions | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/users-default-permissions.md | You can restrict default permissions for member users in the following ways: | **Create Microsoft 365 groups** | Setting this option to **No** prevents users from creating Microsoft 365 groups. Setting this option to **Some** allows a set of users to create Microsoft 365 groups. Global administrators and user administrators can still create Microsoft 365 groups. To learn how, see [Azure Active Directory cmdlets for configuring group settings](../enterprise-users/groups-settings-cmdlets.md). | | **Restrict access to Azure AD administration portal** | **What does this switch do?** <br>**No** lets non-administrators browse the Azure AD administration portal. <br>**Yes** restricts non-administrators from browsing the Azure AD administration portal. Non-administrators who are owners of groups or applications are unable to use the Azure portal to manage their owned resources. </p><p></p><p>**What does it not do?** <br> It doesn't restrict access to Azure AD data using PowerShell, Microsoft GraphAPI, or other clients such as Visual Studio. <br>It doesn't restrict access as long as a user is assigned a custom role (or any role). </p><p></p><p>**When should I use this switch?** <br>Use this option to prevent users from misconfiguring the resources that they own. </p><p></p><p>**When should I not use this switch?** <br>Don't use this switch as a security measure. Instead, create a Conditional Access policy that targets [Microsoft Azure Management](../conditional-access/concept-conditional-access-cloud-apps.md#microsoft-azure-management) and blocks non-administrators' access to it. </p><p></p><p> **How do I grant only specific non-administrator users the ability to use the Azure AD administration portal?** <br> Set this option to **Yes**, then assign them a role like global reader. </p><p></p><p>**Restrict access to the Entra administration portal** <br>A Conditional Access policy that targets Microsoft Azure Management targets access to all Azure management. | | **Restrict non-admin users from creating tenants** | Users can create tenants in the Azure AD and Entra administration portal under Manage tenant. The creation of a tenant is recorded in the Audit log as category DirectoryManagement and activity Create Company. Anyone who creates a tenant becomes the Global Administrator of that tenant. The newly created tenant doesn't inherit any settings or configurations. </p><p></p><p>**What does this switch do?** <br> Setting this option to **Yes** restricts creation of Azure AD tenants to the Global Administrator or tenant creator roles. Setting this option to **No** allows non-admin users to create Azure AD tenants. Tenant creation will continue to be recorded in the Audit log. </p><p></p><p>**How do I grant only specific non-administrator users the ability to create new tenants?** <br> Set this option to Yes, then assign them the tenant creator role.|+| **Restrict non-admin users from reading BitLocker key(s) for their owned devices** | Setting this option to **Yes** restricts users from being able to self-service recover BitLocker key(s) for their owned devices. Setting this option to **No** allows users to recover their BitLocker key(s). | | **Read other users** | This setting is available in Microsoft Graph and PowerShell only.
Setting this flag to `$false` prevents all non-admins from reading user information from the directory. This flag doesn't prevent reading user information in other Microsoft services like Exchange Online.</p><p>This setting is meant for special circumstances, so we don't recommend setting the flag to `$false`. | The **Restrict non-admin users from creating tenants** option is shown [below](https://portal.azure.com/#view/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/~/UserSettings) |
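One way to manage the **Read other users** flag, sketched with the legacy MSOnline PowerShell module (assuming that module is still available in your environment; the cmdlet and property names below come from it):

```PowerShell
# Check and change the "read other users" default permission (legacy MSOnline module).
Connect-MsolService

# Show the current value.
(Get-MsolCompanyInformation).UsersPermissionToReadOtherUsersEnabled

# Prevent all non-admins from reading user information from the directory.
# This setting is meant for special circumstances, so think twice before using $false.
Set-MsolCompanySettings -UsersPermissionToReadOtherUsersEnabled $false
```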
active-directory | Whats New Archive | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/whats-new-archive.md | A new policy API is available for the administrators to control tenant wide poli In some situations, you may want to restrict the ability for end users to self-service BitLocker keys. With this new functionality, you can now turn off self-service of BitLocker keys, so that only specific individuals with right privileges can recover a BitLocker key. -For more information, see: [Block users from viewing their BitLocker keys (preview)](../devices/device-management-azure-portal.md#block-users-from-viewing-their-bitlocker-keys-preview) +For more information, see: [Block users from viewing their BitLocker keys (preview)](../devices/device-management-azure-portal.md#configure-device-settings) For more information, see: - [Get started with Azure Active Directory Identity Protection and Microsoft Graph](../identity-protection/howto-identity-protection-graph-api.md) -+ |
active-directory | Choose Ad Authn | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/choose-ad-authn.md | description: This guide helps CEOs, CIOs, CISOs, Chief Identity Architects, Ente keywords: Previously updated : 01/26/2023 Last updated : 03/11/2023 Choosing the correct authentication method is the first concern for organization 1. It's the first decision for an organization that wants to move to the cloud. -2. The authentication method is a critical component of an organization’s presence in the cloud. It controls access to all cloud data and resources. +2. The authentication method is a critical component of an organization's presence in the cloud. It controls access to all cloud data and resources. 3. It's the foundation of all the other advanced security and user experience features in Azure AD. -Identity is the new control plane of IT security, so authentication is an organization’s access guard to the new cloud world. Organizations need an identity control plane that strengthens their security and keeps their cloud apps safe from intruders. +Identity is the new control plane of IT security, so authentication is an organization's access guard to the new cloud world. Organizations need an identity control plane that strengthens their security and keeps their cloud apps safe from intruders. > [!NOTE]-> Changing your authentication method requires planning, testing, and potentially downtime. [Staged rollout](./how-to-connect-staged-rollout.md) is a great way to test users migration from federation to cloud authentication. +> Changing your authentication method requires planning, testing, and potentially downtime. [Staged rollout](./how-to-connect-staged-rollout.md) is a great way to test users' migration from federation to cloud authentication. ### Out of scope-Organizations that don't have an existing on-premises directory footprint aren't the focus of this article. Typically, those businesses create identities only in the cloud, which doesn’t require a hybrid identity solution. Cloud-only identities exist solely in the cloud and aren't associated with corresponding on-premises identities. +Organizations that don't have an existing on-premises directory footprint aren't the focus of this article. Typically, those businesses create identities only in the cloud, which doesn't require a hybrid identity solution. Cloud-only identities exist solely in the cloud and aren't associated with corresponding on-premises identities. ## Authentication methods-When the Azure AD hybrid identity solution is your new control plane, authentication is the foundation of cloud access. Choosing the correct authentication method is a crucial first decision in setting up an Azure AD hybrid identity solution. Implement the authentication method that is configured by using Azure AD Connect, which also provisions users in the cloud. +When the Azure AD hybrid identity solution is your new control plane, authentication is the foundation of cloud access. Choosing the correct authentication method is a crucial first decision in setting up an Azure AD hybrid identity solution. The authentication method you choose is configured by using Azure AD Connect, which also provisions users in the cloud. To choose an authentication method, you need to consider the time, existing infrastructure, complexity, and cost of implementing your choice. These factors are different for every organization and might change over time.
To choose an authentication method, you need to consider the time, existing infr Azure AD supports the following authentication methods for hybrid identity solutions. ### Cloud authentication-When you choose this authentication method, Azure AD handles users' sign-in process. Coupled with seamless single sign-on (SSO), users can sign in to cloud apps without having to reenter their credentials. With cloud authentication, you can choose from two options: +When you choose this authentication method, Azure AD handles users' sign-in process. Coupled with single sign-on (SSO), users can sign in to cloud apps without having to reenter their credentials. With cloud authentication, you can choose from two options: -**Azure AD password hash synchronization**. The simplest way to enable authentication for on-premises directory objects in Azure AD. Users can use the same username and password that they use on-premises without having to deploy any additional infrastructure. Some premium features of Azure AD, like Identity Protection and [Azure AD Domain Services](../../active-directory-domain-services/tutorial-create-instance.md), require password hash synchronization, no matter which authentication method you choose. +**Azure AD password hash synchronization**. The simplest way to enable authentication for on-premises directory objects in Azure AD. Users can use the same username and password that they use on-premises without having to deploy any other infrastructure. Some premium features of Azure AD, like Identity Protection and [Azure AD Domain Services](../../active-directory-domain-services/tutorial-create-instance.md), require password hash synchronization, no matter which authentication method you choose. > [!NOTE] > Passwords are never stored in clear text or encrypted with a reversible algorithm in Azure AD. For more information on the actual process of password hash synchronization, see [Implement password hash synchronization with Azure AD Connect sync](../../active-directory/hybrid/how-to-connect-password-hash-synchronization.md). When you choose this authentication method, Azure AD handles users' sign-in proc Companies with a security requirement to immediately enforce on-premises user account states, password policies, and sign-in hours might use this authentication method. For more information on the actual pass-through authentication process, see [User sign-in with Azure AD pass-through authentication](../../active-directory/hybrid/how-to-connect-pta.md). ### Federated authentication-When you choose this authentication method, Azure AD hands off the authentication process to a separate trusted authentication system, such as on-premises Active Directory Federation Services (AD FS), to validate the user’s password. +When you choose this authentication method, Azure AD hands off the authentication process to a separate trusted authentication system, such as on-premises Active Directory Federation Services (AD FS), to validate the user's password. -The authentication system can provide additional advanced authentication requirements. Examples are smartcard-based authentication or third-party multifactor authentication. For more information, see [Deploying Active Directory Federation Services](/windows-server/identity/ad-fs/deployment/windows-server-2012-r2-ad-fs-deployment-guide). +The authentication system can provide other advanced authentication requirements, for example, third-party multifactor authentication.
The following section helps you decide which authentication method is right for you by using a decision tree. It helps you determine whether to deploy cloud or federated authentication for your Azure AD hybrid identity solution. The following section helps you decide which authentication method is right for Details on decision questions: 1. Azure AD can handle sign-in for users without relying on on-premises components to verify passwords.-2. Azure AD can hand off user sign-in to a trusted authentication provider such as Microsoft’s AD FS. +2. Azure AD can hand off user sign-in to a trusted authentication provider such as Microsoft's AD FS. 3. If you need to apply user-level Active Directory security policies such as account expired, disabled account, password expired, account locked out, and sign-in hours on each user sign-in, Azure AD requires some on-premises components. 4. Sign-in features not natively supported by Azure AD:- * Sign-in using on-premises MFA Server. * Sign-in using third-party authentication solution. * Multi-site on-premises authentication solution. 5. Azure AD Identity Protection requires Password Hash Sync regardless of which sign-in method you choose, to provide the *Users with leaked credentials* report. Organizations can fail over to Password Hash Sync if their primary sign-in method fails and it was configured before the failure event. Details on decision questions: * **Effort**. Password hash synchronization requires the least effort regarding deployment, maintenance, and infrastructure. This level of effort typically applies to organizations that only need their users to sign in to Microsoft 365, SaaS apps, and other Azure AD-based resources. When turned on, password hash synchronization is part of the Azure AD Connect sync process and runs every two minutes. -* **User experience**. To improve users' sign-in experience, deploy seamless SSO with password hash synchronization. Seamless SSO eliminates unnecessary prompts when users are signed in. +* **User experience**. To improve users' sign-in experience, use [Azure AD joined devices (AADJ)](../../active-directory/devices/concept-azure-ad-join.md) or [Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md). If you can't join your Windows devices to Azure AD, we recommend deploying seamless SSO with password hash synchronization. Seamless SSO eliminates unnecessary prompts when users are signed in. * **Advanced scenarios**. If organizations choose to, it's possible to use insights from identities with Azure AD Identity Protection reports with Azure AD Premium P2. An example is the leaked credentials report. Windows Hello for Business has [specific requirements when you use password hash synchronization](/windows/access-protection/hello-for-business/hello-identity-verification). [Azure AD Domain Services](../../active-directory-domain-services/tutorial-create-instance.md) requires password hash synchronization to provision users with their corporate credentials in the managed domain. Details on decision questions: > [!NOTE] > Azure AD Conditional Access requires [Azure AD Premium P1](https://www.microsoft.com/security/business/identity-access-management/azure-ad-pricing) licenses. -* **Business continuity**. Using password hash synchronization with cloud authentication is highly available as a cloud service that scales to all Microsoft datacenters.
To make sure password hash synchronization does not go down for extended periods, deploy a second Azure AD Connect server in staging mode in a standby configuration. +* **Business continuity**. Using password hash synchronization with cloud authentication is highly available as a cloud service that scales to all Microsoft datacenters. To make sure password hash synchronization doesn't go down for extended periods, deploy a second Azure AD Connect server in staging mode in a standby configuration. * **Considerations**. Currently, password hash synchronization doesn't immediately enforce changes in on-premises account states. In this situation, a user has access to cloud apps until the user account state is synchronized to Azure AD. Organizations might want to overcome this limitation by running a new synchronization cycle after administrators do bulk updates to on-premises user account states. An example is disabling accounts. Refer to [implementing password hash synchronization](../../active-directory/hyb Pass-through Authentication requires unconstrained network access to domain controllers. All network traffic is encrypted and limited to authentication requests. For more information on this process, see the [security deep dive](../../active-directory/hybrid/how-to-connect-pta-security-deep-dive.md) on pass-through authentication. -* **User experience**. To improve users' sign-in experience, deploy seamless SSO with Pass-through Authentication. Seamless SSO eliminates unnecessary prompts after users sign in. +* **User experience**. To improve users' sign-in experience, use [Azure AD joined devices (AADJ)](../../active-directory/devices/concept-azure-ad-join.md) or [Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md). If you can't join your Windows devices to Azure AD, we recommend deploying seamless SSO with password hash synchronization. Seamless SSO eliminates unnecessary prompts when users are signed in. -* **Advanced scenarios**. Pass-through Authentication enforces the on-premises account policy at the time of sign-in. For example, access is denied when an on-premises user’s account state is disabled, locked out, or their [password expires](../../active-directory/hybrid/how-to-connect-pta-faq.yml#what-happens-if-my-user-s-password-has-expired-and-they-try-to-sign-in-by-using-pass-through-authentication-) or the logon attempt falls outside the hours when the user is allowed to sign in. +* **Advanced scenarios**. Pass-through Authentication enforces the on-premises account policy at the time of sign-in. For example, access is denied when an on-premises user's account state is disabled, locked out, or their [password expires](../../active-directory/hybrid/how-to-connect-pta-faq.yml#what-happens-if-my-user-s-password-has-expired-and-they-try-to-sign-in-by-using-pass-through-authentication-) or the logon attempt falls outside the hours when the user is allowed to sign in. Organizations that require multi-factor authentication with pass-through authentication must use Azure AD Multi-Factor Authentication (MFA) or [Conditional Access custom controls](../../active-directory/conditional-access/controls.md#custom-controls-preview). Those organizations can't use a third-party or on-premises multifactor authentication method that relies on federation. Advanced features require that password hash synchronization is deployed whether or not you choose pass-through authentication. An example is the leaked credentials report of Identity Protection.
-* **Business continuity**. We recommend that you deploy two extra pass-through authentication agents. These extras are in addition to the first agent on the Azure AD Connect server. This additional deployment ensures high availability of authentication requests. When you have three agents deployed, one agent can still fail when another agent is down for maintenance. +* **Business continuity**. We recommend that you deploy two extra pass-through authentication agents. These extras are in addition to the first agent on the Azure AD Connect server. This other deployment ensures high availability of authentication requests. When you have three agents deployed, one agent can still fail when another agent is down for maintenance. There's another benefit to deploying password hash synchronization in addition to pass-through authentication. It acts as a backup authentication method when the primary authentication method is no longer available. Refer to [implementing pass-through authentication](../../active-directory/hybri * **Advanced scenarios**. A federated authentication solution is required when customers have an authentication requirement that Azure AD doesn't support natively. See detailed information to help you [choose the right sign-in option](/archive/blogs/samueld/choosing-the-right-sign-in-option-to-connect-to-azure-ad-office-365). Consider the following common requirements: - * Authentication that requires smartcards or certificates. - * On-premises MFA servers or third-party multifactor providers requiring a federated identity provider. + * Third-party multifactor providers requiring a federated identity provider. * Authentication by using third-party authentication solutions. See the [Azure AD federation compatibility list](../../active-directory/hybrid/how-to-connect-fed-compatibility.md). * Sign in that requires a sAMAccountName, for example DOMAIN\username, instead of a User Principal Name (UPN), for example, user@domain.com. Refer to [implementing pass-through authentication](../../active-directory/hybri * **Considerations**. Federated systems typically require a more significant investment in on-premises infrastructure. Most organizations choose this option if they already have an on-premises federation investment. And if it's a strong business requirement to use a single-identity provider. Federation is more complex to operate and troubleshoot compared to cloud authentication solutions. -For a non-routable domain that can't be verified in Azure AD, you need extra configuration to implement user ID sign in. This requirement is known as Alternate login ID support. See [Configuring Alternate Login ID](/windows-server/identity/ad-fs/operations/configuring-alternate-login-id) for limitations and requirements. If you choose to use a third-party multi-factor authentication provider with federation, ensure the provider supports WS-Trust to allow devices to join Azure AD. +For a nonroutable domain that can't be verified in Azure AD, you need extra configuration to implement user ID sign in. This requirement is known as Alternate login ID support. See [Configuring Alternate Login ID](/windows-server/identity/ad-fs/operations/configuring-alternate-login-id) for limitations and requirements. If you choose to use a third-party multi-factor authentication provider with federation, ensure the provider supports WS-Trust to allow devices to join Azure AD. Refer to [Deploying Federation Servers](/windows-server/identity/ad-fs/deployment/deploying-federation-servers) for deployment steps.
The following diagrams outline the high-level architecture components required f ## Comparing methods -|Consideration|Password hash synchronization + Seamless SSO|Pass-through Authentication + Seamless SSO|Federation with AD FS| +|Consideration|Password hash synchronization|Pass-through Authentication|Federation with AD FS| |:--|:--|:--|:--|-|Where does authentication happen?|In the cloud|In the cloud after a secure password verification exchange with the on-premises authentication agent|On-premises| +|Where does authentication happen?|In the cloud|In the cloud, after a secure password verification exchange with the on-premises authentication agent|On-premises| |What are the on-premises server requirements beyond the provisioning system: Azure AD Connect?|None|One server for each additional authentication agent|Two or more AD FS servers<br><br>Two or more WAP servers in the perimeter/DMZ network| |What are the requirements for on-premises Internet and networking beyond the provisioning system?|None|[Outbound Internet access](../../active-directory/hybrid/how-to-connect-pta-quick-start.md) from the servers running authentication agents|[Inbound Internet access](/windows-server/identity/ad-fs/overview/ad-fs-requirements) to WAP servers in the perimeter<br><br>Inbound network access to AD FS servers from WAP servers in the perimeter<br><br>Network load balancing| |Is there a TLS/SSL certificate requirement?|No|No|Yes| |Is there a health monitoring solution?|Not required|Agent status provided by [Azure portal](../../active-directory/hybrid/tshoot-connect-pass-through-authentication.md)|[Azure AD Connect Health](../../active-directory/hybrid/how-to-connect-health-adfs.md)|-|Do users get single sign-on to cloud resources from domain-joined devices within the company network?|Yes with [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)|Yes with [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)|Yes| -|What sign-in types are supported?|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)<br><br>[Alternate login ID](../../active-directory/hybrid/how-to-connect-install-custom.md)|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)<br><br>[Alternate login ID](../../active-directory/hybrid/how-to-connect-pta-faq.yml)|UserPrincipalName + password<br><br>sAMAccountName + password<br><br>Windows-Integrated Authentication<br><br>[Certificate and smart card authentication](/windows-server/identity/ad-fs/operations/configure-user-certificate-authentication)<br><br>[Alternate login ID](/windows-server/identity/ad-fs/operations/configuring-alternate-login-id)| +|Do users get single sign-on to cloud resources from domain-joined devices within the company network?|Yes with [Azure AD joined devices (AADJ)](../../active-directory/devices/concept-azure-ad-join.md), [Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md), the [Microsoft Enterprise SSO plug-in for Apple devices](../../active-directory/develop/apple-sso-plugin.md), or [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)|Yes with [Azure AD joined devices (AADJ)](../../active-directory/devices/concept-azure-ad-join.md), [Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md), the [Microsoft Enterprise SSO plug-in for Apple devices](../../active-directory/develop/apple-sso-plugin.md), or [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)|Yes| +|What sign-in types are supported?|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)<br><br>[Alternate login ID](../../active-directory/hybrid/how-to-connect-install-custom.md)<br><br>[Azure AD Joined Devices](../../active-directory/devices/concept-azure-ad-join.md)<br><br>[Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md)<br><br>[Certificate and smart card authentication](../../active-directory/authentication/concept-certificate-based-authentication-smartcard.md)|UserPrincipalName + password<br><br>Windows-Integrated Authentication by using [Seamless SSO](../../active-directory/hybrid/how-to-connect-sso.md)<br><br>[Alternate login ID](../../active-directory/hybrid/how-to-connect-pta-faq.yml)<br><br>[Azure AD Joined Devices](../../active-directory/devices/concept-azure-ad-join.md)<br><br>[Hybrid Azure AD joined devices (HAADJ)](../../active-directory/devices/howto-hybrid-azure-ad-join.md)<br><br>[Certificate and smart card authentication](../../active-directory/authentication/concept-certificate-based-authentication-smartcard.md)|UserPrincipalName + password<br><br>sAMAccountName + password<br><br>Windows-Integrated Authentication<br><br>[Certificate and smart card authentication](/windows-server/identity/ad-fs/operations/configure-user-certificate-authentication)<br><br>[Alternate login ID](/windows-server/identity/ad-fs/operations/configuring-alternate-login-id)| |Is Windows Hello for Business supported?|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)<br><br>*Both require Windows Server 2016 Domain functional level*|[Key trust model](/windows/security/identity-protection/hello-for-business/hello-identity-verification)<br><br>[Hybrid Cloud Trust](/windows/security/identity-protection/hello-for-business/hello-hybrid-cloud-trust)<br><br>[Certificate trust model](/windows/security/identity-protection/hello-for-business/hello-key-trust-adfs)|-|What are the multifactor authentication options?|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Azure MFA server](../../active-directory/authentication/howto-mfaserver-deploy.md)<br><br>[Third-party MFA](/windows-server/identity/ad-fs/operations/configure-additional-authentication-methods-for-ad-fs)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)| +|What are the multifactor authentication options?|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)|[Azure AD MFA](/azure/multi-factor-authentication/)<br><br>[Third-party MFA](/windows-server/identity/ad-fs/operations/configure-additional-authentication-methods-for-ad-fs)<br><br>[Custom Controls with Conditional Access*](../../active-directory/conditional-access/controls.md)| |What user account states are supported?|Disabled accounts<br>(up to 30-minute delay)|Disabled accounts<br><br>Account locked out<br><br>Account expired<br><br>Password expired<br><br>Sign-in hours|Disabled accounts<br><br>Account locked out<br><br>Account expired<br><br>Password expired<br><br>Sign-in hours| |What are the Conditional Access options?|[Azure AD Conditional Access, with Azure AD Premium](../../active-directory/conditional-access/overview.md)|[Azure AD Conditional Access, with Azure AD Premium](../../active-directory/conditional-access/overview.md)|[Azure AD Conditional Access, with Azure AD Premium](../../active-directory/conditional-access/overview.md)<br><br>[AD FS claim rules](https://adfshelp.microsoft.com/AadTrustClaims/ClaimsGenerator)| |Is blocking legacy protocols supported?|[Yes](../../active-directory/conditional-access/overview.md)|[Yes](../../active-directory/conditional-access/overview.md)|[Yes](/windows-server/identity/ad-fs/operations/access-control-policies-w2k12)| The following diagrams outline the high-level architecture components required f > Custom controls in Azure AD Conditional Access do not currently support device registration. ## Recommendations-Your identity system ensures your users' access to cloud apps and the line-of-business apps that you migrate and make available in the cloud. To keep authorized users productive and bad actors out of your organization’s sensitive data, authentication controls access to apps. --Use or enable password hash synchronization for whichever authentication method you choose, for the following reasons: +Your identity system ensures your users' access to apps that you migrate and make available in the cloud. Use or enable password hash synchronization with whichever authentication method you choose, for the following reasons: 1. **High availability and disaster recovery**. Pass-through Authentication and federation rely on on-premises infrastructure. For pass-through authentication, the on-premises footprint includes the server hardware and networking the Pass-through Authentication agents require. For federation, the on-premises footprint is even larger. It requires servers in your perimeter network to proxy authentication requests and the internal federation servers. - To avoid single points of failure, deploy redundant servers. Then authentication requests will always be serviced if any component fails. Both pass-through authentication and federation also rely on domain controllers to respond to authentication requests, which can also fail. Many of these components need maintenance to stay healthy. Outages are more likely when maintenance isn't planned and implemented correctly. Avoid outages by using password hash synchronization because the Microsoft Azure AD cloud authentication service scales globally and is always available. + To avoid single points of failure, deploy redundant servers. Then authentication requests will always be serviced if any component fails. Both pass-through authentication and federation also rely on domain controllers to respond to authentication requests, which can also fail.
Many of these components need maintenance to stay healthy. Outages are more likely when maintenance isn't planned and implemented correctly. 2. **On-premises outage survival**. The consequences of an on-premises outage due to a cyber-attack or disaster can be substantial, ranging from reputational brand damage to a paralyzed organization unable to deal with the attack. Recently, many organizations were victims of malware attacks, including targeted ransomware, which caused their on-premises servers to go down. When Microsoft helps customers deal with these kinds of attacks, it sees two categories of organizations: * Organizations that previously also turned on password hash synchronization on top of federated or pass-through authentication changed their primary authentication method to then use password hash synchronization. They were back online in a matter of hours. By using access to email via Microsoft 365, they worked to resolve issues and access other cloud-based workloads. - * Organizations that didn’t previously enable password hash synchronization had to resort to untrusted external consumer email systems for communications to resolve issues. In those cases, it took them weeks to restore their on-premises identity infrastructure, before users were able to sign in to cloud-based apps again. + * Organizations that didn't previously enable password hash synchronization had to resort to untrusted external consumer email systems for communications to resolve issues. In those cases, it took them weeks to restore their on-premises identity infrastructure, before users were able to sign in to cloud-based apps again. 3. **Identity protection**. One of the best ways to protect users in the cloud is Azure AD Identity Protection with Azure AD Premium P2. Microsoft continually scans the Internet for user and password lists that bad actors sell and make available on the dark web. Azure AD can use this information to verify if any of the usernames and passwords in your organization are compromised. Therefore, it's critical to enable password hash synchronization no matter which authentication method you use, whether it's federated or pass-through authentication. Leaked credentials are presented as a report. Use this information to block or force users to change their passwords when they try to sign in with leaked passwords. Consider each authentication method. Does the effort to deploy the solution, and ## Next steps -In today’s world, threats are present 24 hours a day and come from everywhere. Implement the correct authentication method, and it will mitigate your security risks and protect your identities. +In today's world, threats are present 24 hours a day and come from everywhere. Implement the correct authentication method, and it will mitigate your security risks and protect your identities. [Get started](../fundamentals/active-directory-whatis.md) with Azure AD and deploy the right authentication solution for your organization. -If you're thinking about migrating from federated to cloud authentication, learn more about [changing the sign-in method](../../active-directory/hybrid/plan-connect-user-signin.md). To help you plan and implement the migration, use [these project deployment plans](../fundamentals/active-directory-deployment-plans.md) or consider using the new [Staged Rollout](../../active-directory/hybrid/how-to-connect-staged-rollout.md) feature to migrate federated users to using cloud authentication in a staged approach.
+If you're thinking about migrating from federated to cloud authentication, learn more about [changing the sign-in method](../../active-directory/hybrid/plan-connect-user-signin.md). To help you plan and implement the migration, use [these project deployment plans](../fundamentals/active-directory-deployment-plans.md), or consider using the new [Staged Rollout](../../active-directory/hybrid/how-to-connect-staged-rollout.md) feature to migrate federated users to using cloud authentication in a staged approach. |
active-directory | Concept Workload Identity Risk | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/concept-workload-identity-risk.md | We detect risk on workload identities across sign-in behavior and offline indica | Suspicious Sign-ins | Offline | This risk detection indicates sign-in properties or patterns that are unusual for this service principal. <br><br> The detection learns the baseline sign-in behavior for workload identities in your tenant over a period of between 2 and 60 days, and fires if one or more of the following unfamiliar properties appear during a later sign-in: IP address / ASN, target resource, user agent, hosting/non-hosting IP change, IP country, credential type. <br><br> Because of the programmatic nature of workload identity sign-ins, we provide a timestamp for the suspicious activity instead of flagging a specific sign-in event. <br><br> Sign-ins that are initiated after an authorized configuration change may trigger this detection. | | Admin confirmed account compromised | Offline | This detection indicates an admin has selected 'Confirm compromised' in the Risky Workload Identities UI or using riskyServicePrincipals API. To see which admin has confirmed this account compromised, check the account’s risk history (via UI or API). | | Leaked Credentials | Offline | This risk detection indicates that the account's valid credentials have been leaked. This leak can occur when someone checks in the credentials in public code artifact on GitHub, or when the credentials are leaked through a data breach. <br><br> When the Microsoft leaked credentials service acquires credentials from GitHub, the dark web, paste sites, or other sources, they're checked against current valid credentials in Azure AD to find valid matches. |-| Malicious application | Offline | This detection indicates that Microsoft has disabled an application for violating our terms of service. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application.| -| Suspicious application | Offline | This detection indicates that Microsoft has identified an application that may be violating our terms of service, but hasn't disabled it. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application.| +| Malicious application | Offline | This detection indicates that Microsoft has disabled an application for violating our terms of service. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application. Note: These applications will show `DisabledDueToViolationOfServicesAgreement` on the `disabledByMicrosoftStatus` property on the related [application](/graph/api/resources/application) and [service principal](/graph/api/resources/serviceprincipal) resource types in Microsoft Graph. To prevent them from being instantiated in your organization again in the future, you cannot delete these objects. | +| Suspicious application | Offline | This detection indicates that Microsoft has identified an application that may be violating our terms of service, but hasn't disabled it. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application.| | Anomalous service principal activity | Offline | This risk detection baselines normal administrative service principal behavior in Azure AD, and spots anomalous patterns of behavior like suspicious changes to the directory. The detection is triggered against the administrative service principal making the change or the object that was changed. | ## Identify risky workload identities The [Azure AD Toolkit](https://github.com/microsoft/AzureADToolkit) is a PowerSh - [Azure AD sign-in logs](../reports-monitoring/concept-sign-ins.md) - [Simulate risk detections](howto-identity-protection-simulate-risk.md) + |
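Besides the Azure AD Toolkit, risky workload identities can also be listed directly from the beta Graph endpoint referenced in the table above. A minimal sketch, with the permission and property names taken from the Identity Protection Graph documentation:

```PowerShell
# List service principals currently flagged at risk (beta endpoint).
Connect-MgGraph -Scopes "IdentityRiskyServicePrincipal.Read.All"
$risky = Invoke-MgGraphRequest -Uri "https://graph.microsoft.com/beta/identityProtection/riskyServicePrincipals"
$risky.value | Select-Object displayName, riskState, riskLevel, riskLastUpdatedDateTime
```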
active-directory | Servicenow Provisioning Tutorial | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/servicenow-provisioning-tutorial.md | Title: Configure ServiceNow for automatic user provisioning with Azure Active Directory description: Learn how to automatically provision and deprovision user accounts from Azure AD to ServiceNow. -+ For more information on the Azure AD automatic user provisioning service, see [A 1. Obtain credentials for an admin in ServiceNow. Go to the user profile in ServiceNow and verify that the user has the admin role. + +1. Enable the SCIM v2 Plugin using the steps outlined in this [ServiceNow doc](https://docs.servicenow.com/en-US/bundle/utah-platform-security/page/integrate/authentication/task/activate-scim-plugin.html) ## Step 3: Add ServiceNow from the Azure AD application gallery To configure automatic user provisioning for ServiceNow in Azure AD: 1. In the **Admin Credentials** section, enter your ServiceNow tenant URL, Client ID, Client Secret and Authorization Endpoint. Select **Test Connection** to ensure that Azure AD can connect to ServiceNow. [This ServiceNow documentation](https://docs.servicenow.com/bundle/utah-platform-security/page/administer/security/task/t_CreateEndpointforExternalClients.html) outlines how to generate these values. +- Tenant URL: https://**InsertInstanceName**.service-now.com/api/now/scim +- Authorization Endpoint: https://**InsertInstanceName**.service-now.com/oauth_auth.do?response_type=code&client_id=**InsertClientID**&state=1&scope=useraccount&redirect_uri=https%3A%2F%2Fportal.azure.com%2FTokenAuthorize +- Token Endpoint: https://**InsertInstanceName**.service-now.com/api/now/scim ++ + 1. In the **Notification Email** box, enter the email address of a person or group that should receive the provisioning error notifications. Then select the **Send an email notification when a failure occurs** check box. 1. Select **Save**. POST https://graph.microsoft.com/beta/servicePrincipals/[object-id]/synchronizat 11. Restore any previous changes you made to the application (Authentication details, Scoping filters, Custom attribute mappings) and re-enable provisioning. > [!NOTE] -> Failure to restore the previous settings may results in attributes (name.formatted for example) updating in Workplace unexpectedly. Be sure to check the configuration before enabling provisioning +> Failure to restore the previous settings may result in attributes (name.formatted for example) updating in ServiceNow unexpectedly. Be sure to check the configuration before enabling provisioning ## Additional resources |
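If you want to sanity-check the ServiceNow SCIM endpoint before entering the credentials in Azure AD, a rough sketch follows. It assumes a standard SCIM `/Users` resource under the tenant URL above and an OAuth access token issued by your ServiceNow instance; both values are placeholders.

```PowerShell
# Rough connectivity check against the ServiceNow SCIM endpoint (placeholder values;
# assumes a standard SCIM /Users resource and a valid OAuth bearer token).
$instance = "InsertInstanceName"
$token    = "your-oauth-access-token"

Invoke-RestMethod -Uri "https://$instance.service-now.com/api/now/scim/Users?count=1" `
    -Headers @{ Authorization = "Bearer $token" }
```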
active-directory | Signiant Media Shuttle Tutorial | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/signiant-media-shuttle-tutorial.md | + + Title: Azure Active Directory SSO integration with Signiant Media Shuttle +description: Learn how to configure single sign-on between Azure Active Directory and Signiant Media Shuttle. ++++++++ Last updated : 03/13/2023+++++# Azure Active Directory SSO integration with Signiant Media Shuttle ++In this article, you learn how to integrate Signiant Media Shuttle with Azure Active Directory (Azure AD). Media Shuttle is a solution for securely moving large files and data sets to, and from, cloud-based or on-premises storage. Transfers are accelerated and can be up to 100s of times faster than FTP. When you integrate Signiant Media Shuttle with Azure AD, you can: ++* Control in Azure AD who has access to Signiant Media Shuttle. +* Enable your users to be automatically signed-in to Signiant Media Shuttle with their Azure AD accounts. +* Manage your accounts in one central location - the Azure portal. ++You need to configure and test Azure AD single sign-on for Signiant Media Shuttle in a test environment. Signiant Media Shuttle supports only **SP** initiated single sign-on and **Just In Time** user provisioning. ++## Prerequisites ++To integrate Azure Active Directory with Signiant Media Shuttle, you need: ++* An Azure AD user account. If you don't already have one, you can [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). +* One of the following roles: Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal. +* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). +* Signiant Media Shuttle single sign-on (SSO) enabled subscription. ++## Add application and assign a test user ++Before you begin the process of configuring single sign-on, you need to add the Signiant Media Shuttle application from the Azure AD gallery. You need a test user account to assign to the application and test the single sign-on configuration. ++### Add Signiant Media Shuttle from the Azure AD gallery ++Add Signiant Media Shuttle from the Azure AD application gallery to configure single sign-on with Signiant Media Shuttle. For more information on how to add an application from the gallery, see the [Quickstart: Add application from the gallery](../manage-apps/add-application-portal.md). ++### Create and assign Azure AD test user ++Follow the guidelines in the [create and assign a user account](../manage-apps/add-application-portal-assign-users.md) article to create a test user account in the Azure portal called B.Simon. ++Alternatively, you can also use the [Enterprise App Configuration Wizard](https://portal.office.com/AdminPortal/home?Q=Docs#/azureadappintegration). In this wizard, you can add an application to your tenant, add users/groups to the app, and assign roles. The wizard also provides a link to the single sign-on configuration pane in the Azure portal. [Learn more about Microsoft 365 wizards.](/microsoft-365/admin/misc/azure-ad-setup-guides). ++## Configure Azure AD SSO ++Complete the following steps to enable Azure AD single sign-on in the Azure portal. ++1. In the Azure portal, on the **Signiant Media Shuttle** application integration page, find the **Manage** section and select **single sign-on**. +1. On the **Select a single sign-on method** page, select **SAML**. 
+1. On the **Set up single sign-on with SAML** page, select the pencil icon for **Basic SAML Configuration** to edit the settings. ++  ++1. In the **Basic SAML Configuration** section, perform the following steps: ++ a. In the **Identifier** textbox, type a value or URL using one of the following patterns: ++ | **Identifier** | + || + | `https://<PORTALNAME>.mediashuttle.com` | + | `mediashuttle` | ++ b. In the **Reply URL** textbox, type a URL using one of the following patterns: ++ | **Reply URL**| + || + | `https://portals.mediashuttle.com/auth` | + | `https://<PORTALNAME>.mediashuttle.com/auth` | ++ c. In the **Sign on URL** textbox, type a URL using one of the following patterns: ++ | **Sign on URL**| + || + | `https://portals.mediashuttle.com/auth` | + | `https://<PORTALNAME>.mediashuttle.com/auth` | ++ > [!Note] + > These values are not real. Update these values with the actual Identifier, Reply URL and Sign on URL. Contact [Signiant Media Shuttle support team](mailto:support@signiant.com) to get these values. You can also refer to the patterns shown in the Basic SAML Configuration section in the Azure portal. ++1. Your Signiant Media Shuttle application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows an example. The default value of **Unique User Identifier** is **user.userprincipalname**, but Signiant Media Shuttle expects it to be mapped to the user's email address. For that, you can use the **user.mail** attribute from the list or use the appropriate attribute value based on your organization configuration. ++  ++1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, select the copy button to copy the **App Federation Metadata Url** and save it on your computer. ++  ++## Configure Signiant Media Shuttle SSO ++To configure single sign-on on the **Signiant Media Shuttle** side, you need to send the **App Federation Metadata Url** to the [Signiant Media Shuttle support team](mailto:support@signiant.com). They configure this setting so the SAML SSO connection is set properly on both sides. ++### Create Signiant Media Shuttle test user ++In this section, a user called Britta Simon is created in Signiant Media Shuttle. Signiant Media Shuttle supports just-in-time user provisioning, which is enabled by default. There's no action item for you in this section. If a user doesn't already exist in Signiant Media Shuttle, a new one is created after authentication. ++## Test SSO ++In this section, you test your Azure AD single sign-on configuration with the following options. ++* Select **Test this application** in the Azure portal. This redirects to the Signiant Media Shuttle Sign-on URL, where you can initiate the login flow. ++* Go to the Signiant Media Shuttle Sign-on URL directly and initiate the login flow from there. ++* You can use Microsoft My Apps. When you select the Signiant Media Shuttle tile in My Apps, you're redirected to the Signiant Media Shuttle Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md). ++## Additional resources ++* [What is single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md) +* [Plan a single sign-on deployment](../manage-apps/plan-sso-deployment.md). 
++## Next steps ++Once you configure Signiant Media Shuttle, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad). |
active-directory | Superannotate Tutorial | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/superannotate-tutorial.md | + + Title: Azure Active Directory SSO integration with SuperAnnotate +description: Learn how to configure single sign-on between Azure Active Directory and SuperAnnotate. ++++++++ Last updated : 03/13/2023+++++# Azure Active Directory SSO integration with SuperAnnotate ++In this article, you learn how to integrate SuperAnnotate with Azure Active Directory (Azure AD). SuperAnnotate is the all-in-one AI data infrastructure platform that helps ML and data teams save time on building accurate AI models with the highest quality training data - SuperData. When you integrate SuperAnnotate with Azure AD, you can: ++* Control in Azure AD who has access to SuperAnnotate. +* Enable your users to be automatically signed-in to SuperAnnotate with their Azure AD accounts. +* Manage your accounts in one central location - the Azure portal. ++You'll configure and test Azure AD single sign-on for SuperAnnotate in a test environment. SuperAnnotate supports only **SP** initiated single sign-on. ++## Prerequisites ++To integrate Azure Active Directory with SuperAnnotate, you need: ++* An Azure AD user account. If you don't already have one, you can [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). +* One of the following roles: Global Administrator, Cloud Application Administrator, Application Administrator, or owner of the service principal. +* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). +* SuperAnnotate single sign-on (SSO) enabled subscription. ++## Add application and assign a test user ++Before you begin the process of configuring single sign-on, you need to add the SuperAnnotate application from the Azure AD gallery. You need a test user account to assign to the application and test the single sign-on configuration. ++### Add SuperAnnotate from the Azure AD gallery ++Add SuperAnnotate from the Azure AD application gallery to configure single sign-on with SuperAnnotate. For more information on how to add an application from the gallery, see the [Quickstart: Add application from the gallery](../manage-apps/add-application-portal.md). ++### Create and assign Azure AD test user ++Follow the guidelines in the [create and assign a user account](../manage-apps/add-application-portal-assign-users.md) article to create a test user account in the Azure portal called B.Simon. ++Alternatively, you can also use the [Enterprise App Configuration Wizard](https://portal.office.com/AdminPortal/home?Q=Docs#/azureadappintegration). In this wizard, you can add an application to your tenant, add users/groups to the app, and assign roles. The wizard also provides a link to the single sign-on configuration pane in the Azure portal. [Learn more about Microsoft 365 wizards.](/microsoft-365/admin/misc/azure-ad-setup-guides). ++## Configure Azure AD SSO ++Complete the following steps to enable Azure AD single sign-on in the Azure portal. ++1. In the Azure portal, on the **SuperAnnotate** application integration page, find the **Manage** section and select **single sign-on**. +1. On the **Select a single sign-on method** page, select **SAML**. +1. On the **Set up single sign-on with SAML** page, select the pencil icon for **Basic SAML Configuration** to edit the settings. ++  ++1. In the **Basic SAML Configuration** section, perform the following steps: ++ a. 
In the **Identifier** textbox, type a value using the following pattern: + `urn:amazon:cognito:sp:<USER_POOL_ID>` ++ b. In the **Reply URL** textbox, type a URL using the following pattern: + `https://<DOMAIN PREFIX>.auth.<REGION>.amazoncognito.com/saml2/idpresponse` ++ c. In the **Sign on URL** textbox, type the URL: + `https://auth.superannotate.com/login` ++ > [!Note] + > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [SuperAnnotate support team](mailto:support@superannotate.com) to get these values. You can also refer to the patterns shown in the Basic SAML Configuration section in the Azure portal. ++1. The SuperAnnotate application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes. ++  ++1. In addition to the above, the SuperAnnotate application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also prepopulated, but you can review them per your requirements. ++ | Name | Source Attribute| + | | | + | groups | user.groups [ApplicationGroup] | ++1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, select the copy button to copy the **App Federation Metadata Url** and save it on your computer. ++  ++## Configure SuperAnnotate SSO ++To configure single sign-on on the **SuperAnnotate** side, you need to send the **App Federation Metadata Url** to the [SuperAnnotate support team](mailto:support@superannotate.com). They configure this setting so the SAML SSO connection is set properly on both sides. ++### Create SuperAnnotate test user ++In this section, you create a user called Britta Simon in SuperAnnotate. Work with the [SuperAnnotate support team](mailto:support@superannotate.com) to add the users in the SuperAnnotate platform. Users must be created and activated before you use single sign-on. ++## Test SSO ++In this section, you test your Azure AD single sign-on configuration with the following options. ++* Select **Test this application** in the Azure portal. This redirects to the SuperAnnotate Sign-on URL, where you can initiate the login flow. ++* Go to the SuperAnnotate Sign-on URL directly and initiate the login flow from there. ++* You can use Microsoft My Apps. When you select the SuperAnnotate tile in My Apps, you're redirected to the SuperAnnotate Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md). ++## Additional resources ++* [What is single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md) +* [Plan a single sign-on deployment](../manage-apps/plan-sso-deployment.md). ++## Next steps ++Once you configure SuperAnnotate, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad). |
active-directory | Verifiable Credentials Standards | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/verifiable-credentials-standards.md | Entra Verified ID supports the following open standards: | User authentication | [Self-Issued OpenID Provider v2](https://openid.net/specs/openid-connect-self-issued-v2-1_0.html)| OIDF | | Presentation | [OpenID for Verifiable Credentials](https://openid.net/specs/openid-connect-4-verifiable-presentations-1_0.html) | OIDF| | Query language | [Presentation Exchange v1.0](https://identity.foundation/presentation-exchange/spec/v1.0.0/)| DIF |-| User authentication | [Self-Issued OpenID Provider v2](https://openid.net/specs/openid-connect-self-issued-v2-1_0.html)| OIDF | | Trust in DID (decentralized identifier) owner | [Well Known DID Configuration](https://identity.foundation/.well-known/resources/did-configuration)| DIF |-| Revocation |[Verifiable Credential Status List 2021](https://github.com/w3c-ccg/vc-status-list-2021/tree/343b8b59cddba4525e1ef355356ae760fc75904e)| W3C CCG | +| Revocation |[Verifiable Credential Status List 2021](https://w3c.github.io/vc-status-list-2021/)| W3C CCG | ## Supported algorithms |
aks | Api Server Vnet Integration | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/api-server-vnet-integration.md | -An Azure Kubernetes Service (AKS) cluster configured with API Server VNet Integration (Preview) projects the API server endpoint directly into a delegated subnet in the VNet where AKS is deployed. API Server VNet Integartion enables network communication between the API server and the cluster nodes without requiring a private link or tunnel. The API server is available behind an Internal Load Balancer VIP in the delegated subnet, which the nodes are configured to utilize. By using API Server VNet Integration, you can ensure network traffic between your API server and your node pools remains on the private network only. +An Azure Kubernetes Service (AKS) cluster configured with API Server VNet Integration (Preview) projects the API server endpoint directly into a delegated subnet in the VNet where AKS is deployed. API Server VNet Integration enables network communication between the API server and the cluster nodes without requiring a private link or tunnel. The API server is available behind an Internal Load Balancer VIP in the delegated subnet, which the nodes are configured to utilize. By using API Server VNet Integration, you can ensure network traffic between your API server and your node pools remains on the private network only. ## API server connectivity |
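As a hedged sketch of how a cluster with API Server VNet Integration might be created, assuming the preview flag names (`--enable-apiserver-vnet-integration`, `--apiserver-subnet-id`) exposed by the `aks-preview` Azure CLI extension at the time of writing:

```azurecli
# Install or update the preview extension that exposes the flags below (names assumed).
az extension add --name aks-preview --upgrade

# Create a cluster whose API server is projected into a delegated subnet.
az aks create --name myAKSCluster --resource-group myResourceGroup \
    --network-plugin azure \
    --vnet-subnet-id <cluster-subnet-resource-id> \
    --enable-apiserver-vnet-integration \
    --apiserver-subnet-id <delegated-apiserver-subnet-resource-id>
```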
aks | Cluster Configuration | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-configuration.md | Mariner can be deployed on AKS through Azure CLI or ARM templates. ### Prerequisites -1. You need the latest version of Azure CLI. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install]. -2. You need the `aks-preview` Azure CLI extension for the ability to select the Mariner 2.0 operating system SKU. Run `az extension remove --name aks-preview` to clear any previous versions, then run `az extension add --name aks-preview`. -3. If you don't already have kubectl installed, install it through Azure CLI using `az aks install-cli` or follow the [upstream instructions](https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/). +1. You need the Azure CLI version 2.44.1 or later installed and configured. Run `az --version` to find the version currently installed. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install]. +1. If you don't already have kubectl installed, install it through Azure CLI using `az aks install-cli` or follow the [upstream instructions](https://kubernetes.io/docs/tasks/tools/install-kubectl-linux/). ### Deploy an AKS Mariner cluster with Azure CLI Use the following example commands to create a Mariner cluster. ```azurecli az group create --name MarinerTest --location eastus -az aks create --name testMarinerCluster --resource-group MarinerTest --os-sku mariner +az aks create --name testMarinerCluster --resource-group MarinerTest --os-sku mariner --generate-ssh-keys az aks get-credentials --resource-group MarinerTest --name testMarinerCluster kubectl get pods --all-namespaces ### Deploy an AKS Mariner cluster with an ARM template -To add Mariner to an existing ARM template, you need to add `"osSKU": "mariner"` and `"mode": "System"` to `agentPoolProfiles` and set the apiVersion to 2021-03-01 or newer (`"apiVersion": "2021-03-01"`). The following deployment uses the ARM template "marineraksarm.yml". +To add Mariner to an existing ARM template, you need to do the following: -```yml +- Add `"osSKU": "mariner"` and `"mode": "System"` to the `agentPoolProfiles` property. +- Set the apiVersion to 2021-03-01 or newer: `"apiVersion": "2021-03-01"` ++The following deployment uses the ARM template `marineraksarm.json`. ++```json { "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#", "contentVersion": "1.0.0.1", To add Mariner to an existing ARM template, you need to add `"osSKU": "mariner"` }, "dnsPrefix": { "type": "string",+ "defaultValue": "mariner", "metadata": { "description": "Optional DNS prefix to use with hosted Kubernetes API server FQDN." } To add Mariner to an existing ARM template, you need to add `"osSKU": "mariner"` } ``` -Create this file on your system and fill it with the contents of the Mariner AKS YAML file. +Create this file on your system as `marineraksarm.json` and include the template contents shown above. 
```azurecli az group create --name MarinerTest --location eastus -az deployment group create --resource-group MarinerTest --template-file marineraksarm.yml --parameters clusterName=testMarinerCluster dnsPrefix=marineraks1 linuxAdminUsername=azureuser sshRSAPublicKey=`<contents of your id_rsa.pub>` +az deployment group create --resource-group MarinerTest --template-file marineraksarm.json --parameters linuxAdminUsername=azureuser sshRSAPublicKey=`<contents of your id_rsa.pub>` az aks get-credentials --resource-group MarinerTest --name testMarinerCluster default_node_pool { name = "default" node_count = 2 vm_size = "Standard_D2_v2"- os_sku = "CBLMariner" + os_sku = "mariner" } ``` |
aks | Cluster Container Registry Integration | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-container-registry-integration.md | Title: Integrate Azure Container Registry with Azure Kubernetes Service description: Learn how to integrate Azure Kubernetes Service (AKS) with Azure Container Registry (ACR) Previously updated : 11/16/2022 Last updated : 03/13/2023 ms.tool: azure-cli, azure-powershell ms.devlang: azurecli -You need to establish an authentication mechanism when using [Azure Container Registry (ACR)][acr-intro] with Azure Kubernetes Service (AKS). This operation is implemented as part of the Azure CLI, Azure PowerShell, and Azure portal experiences by granting the required permissions to your ACR. This article provides examples for configuring authentication between these Azure services. +When using [Azure Container Registry (ACR)][acr-intro] with Azure Kubernetes Service (AKS), you need to establish an authentication mechanism. Configuring the required permissions between ACR and AKS can be accomplished using the Azure CLI, Azure PowerShell, and Azure portal. This article provides examples to configure authentication between these Azure services using the Azure CLI or Azure PowerShell. -You can set up the AKS to ACR integration using the Azure CLI or Azure PowerShell. The AKS to ACR integration assigns the [**AcrPull** role][acr-pull] to the [Azure Active Directory (Azure AD) **managed identity**][aad-identity] associated with the agent pool in your AKS cluster. For more information on AKS managed identities, see [Summary of managed identities][summary-msi]. +The AKS to ACR integration assigns the [**AcrPull** role][acr-pull] to the [Azure Active Directory (Azure AD) **managed identity**][aad-identity] associated with the agent pool in your AKS cluster. For more information on AKS managed identities, see [Summary of managed identities][summary-msi]. > [!IMPORTANT]-> There is a latency issue with Azure Active Directory groups when attaching ACR. If the AcrPull role is granted to an Azure AD group and the kubelet identity is added to the group to complete the RBAC configuration, there may be a delay before the RBAC group takes effect. If you are running automation that requires the RBAC configuration to be complete, we recommended you use the [Bring your own kubelet identity][byo-kubelet-identity] as a workaround. You can pre-create a user-assigned identity, add it to the Azure AD group, then use the identity as the kubelet identity to create an AKS cluster. This ensures the identity is added to the Azure AD group before a token is generated by kubelet, which avoids the latency issue. +> There is a latency issue with Azure Active Directory groups when attaching ACR. If the **AcrPull** role is granted to an Azure AD group and the kubelet identity is added to the group to complete the RBAC configuration, there may be a delay before the RBAC group takes effect. If you are running automation that requires the RBAC configuration to be complete, we recommended you use the [Bring your own kubelet identity][byo-kubelet-identity] as a workaround. You can pre-create a user-assigned identity, add it to the Azure AD group, then use the identity as the kubelet identity to create an AKS cluster. This ensures the identity is added to the Azure AD group before a token is generated by kubelet, which avoids the latency issue. > [!NOTE] > This article covers automatic authentication between AKS and ACR. 
If you need to pull an image from a private external registry, use an [image pull secret][image-pull-secret]. You can set up the AKS to ACR integration using the Azure CLI or Azure PowerShel * To avoid needing one of these roles, you can instead use an existing managed identity to authenticate ACR from AKS. For more information, see [Use an Azure managed identity to authenticate to an ACR](../container-registry/container-registry-authentication-managed-identity.md). * If you're using Azure CLI, this article requires that you're running Azure CLI version 2.7.0 or later. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install]. * If you're using Azure PowerShell, this article requires that you're running Azure PowerShell version 5.9.0 or later. Run `Get-InstalledModule -Name Az` to find the version. If you need to install or upgrade, see [Install Azure PowerShell][azure-powershell-install].+* Examples and syntax to use Terraform for configuring ACR can be found in the [Terraform reference][terraform-reference]. ## Create a new AKS cluster with ACR integration Alternatively, you can specify the ACR name using an ACR resource ID using the f > az aks create -n myAKSCluster -g myResourceGroup --generate-ssh-keys --attach-acr /subscriptions/<subscription-id>/resourceGroups/myContainerRegistryResourceGroup/providers/Microsoft.ContainerRegistry/registries/myContainerRegistry > ``` +This command may take several minutes to complete. + #### [Azure PowerShell](#tab/azure-powershell) ```azurepowershell $MYACR = 'myContainerRegistry' New-AzAksCluster -Name myAKSCluster -ResourceGroupName myResourceGroup -GenerateSshKey -AcrNameToAttach $MYACR ``` -+This command may take several minutes to complete. -This step may take several minutes to complete. + ## Configure ACR integration for existing AKS clusters nginx0-deployment-669dfc4d4b-xdpd6 1/1 Running 0 20s [cli-param]: /cli/azure/aks#az-aks-update-optional-parameters [ps-attach]: /powershell/module/az.aks/set-azakscluster#-acrnametoattach [byo-kubelet-identity]: use-managed-identity.md#use-a-pre-created-kubelet-managed-identity+[terraform-reference]: https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/container_registry |
aks | Ingress Basic | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/ingress-basic.md | To see the ingress controller in action, run two demo applications in your AKS c spec: type: ClusterIP ports:- - port: 80 + - port: 80 selector: app: aks-helloworld-one ``` To see the ingress controller in action, run two demo applications in your AKS c spec: type: ClusterIP ports:- - port: 80 + - port: 80 selector: app: aks-helloworld-two ``` In the following example, traffic to *EXTERNAL_IP/hello-world-one* is routed to spec: ingressClassName: nginx rules:- - http: + - http: paths: - path: /hello-world-one(/|$)(.*) pathType: Prefix In the following example, traffic to *EXTERNAL_IP/hello-world-one* is routed to spec: ingressClassName: nginx rules:- - http: + - http: paths: - path: /static(/|$)(.*) pathType: Prefix |
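After applying the demo manifests shown in the entry above, a quick way to verify the path-based routing is to query the routes through the controller's external IP. This is a sketch; it assumes the NGINX ingress controller service is named `ingress-nginx-controller` in the `ingress-basic` namespace:

```azurecli
# Look up the EXTERNAL-IP assigned to the ingress controller service.
kubectl get service ingress-nginx-controller --namespace ingress-basic

# Each path should return a response routed to the corresponding demo app.
curl http://<EXTERNAL_IP>/hello-world-one
curl http://<EXTERNAL_IP>/static
```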
aks | Quick Kubernetes Deploy Bicep | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/learn/quick-kubernetes-deploy-bicep.md | For more AKS samples, see the [AKS quickstart templates][aks-quickstart-template ```azurecli az group create --name myResourceGroup --location eastus- az deployment group create --resource-group myResourceGroup --template-file main.bicep --parameters clusterName=<cluster-name> dnsPrefix=<dns-previs> linuxAdminUsername=<linux-admin-username> sshRSAPublicKey='<ssh-key>' + az deployment group create --resource-group myResourceGroup --template-file main.bicep --parameters clusterName=<cluster-name> dnsPrefix=<dns-prefix> linuxAdminUsername=<linux-admin-username> sshRSAPublicKey='<ssh-key>' ``` # [PowerShell](#tab/PowerShell) |
aks | Node Pool Snapshot | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-pool-snapshot.md | This article assumes that you have an existing AKS cluster. If you need an AKS c ### Limitations - Any node pool or cluster created from a snapshot must use a VM from the same virtual machine family as the snapshot, for example, you can't create a new N-Series node pool based on a snapshot captured from a D-Series node pool because the node images in those cases are structurally different.-- Snapshots must be created in the same region as the source node pool.+- Snapshots must be created in the same region as the source node pool; however, those snapshots can be used to create or update clusters and node pools in other regions. + ## Take a node pool snapshot |
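As a minimal sketch of the snapshot workflow this entry describes (resource names are illustrative):

```azurecli
# Capture a snapshot of an existing node pool.
NODEPOOL_ID=$(az aks nodepool show --name nodepool1 --cluster-name myAKSCluster \
    --resource-group myResourceGroup --query id --output tsv)
az aks nodepool snapshot create --name MySnapshot --resource-group myResourceGroup \
    --nodepool-id $NODEPOOL_ID --location eastus

# Create a new node pool from that snapshot; the VM family must match the source.
SNAPSHOT_ID=$(az aks nodepool snapshot show --name MySnapshot \
    --resource-group myResourceGroup --query id --output tsv)
az aks nodepool add --name newpool --cluster-name myAKSCluster \
    --resource-group myResourceGroup --snapshot-id $SNAPSHOT_ID
```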
aks | Node Updates Kured | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/node-updates-kured.md | helm repo update kubectl create namespace kured # Install kured in that namespace with Helm 3 (only on Linux nodes; kured doesn't work on Windows nodes)-helm install my-release kubereboot/kured --namespace kured --set nodeSelector."kubernetes\.io/os"=linux +helm install my-release kubereboot/kured --namespace kured --set controller.nodeSelector."kubernetes\.io/os"=linux ``` You can also configure additional parameters for `kured`, such as integration with Prometheus or Slack. For more information about additional configuration parameters, see the [kured Helm chart][kured-install]. |
aks | Support Policies | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/support-policies.md | Microsoft and users share responsibility for Kubernetes agent nodes where: * `Kube-proxy` * Networking tunnels that provide communication paths to the Kubernetes master components * `Kubelet`- * Docker or `containerd` + * `containerd` > [!NOTE] > If an agent node is not operational, AKS might restart individual components or the entire agent node. These restart operations are automated and provide auto-remediation for common issues. If you want to know more about the auto-remediation mechanisms, see [Node Auto-Repair](node-auto-repair.md) |
aks | Use Managed Identity | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-managed-identity.md | The output should resemble the following: ### Add role assignment -For Vnet, attached Azure disk, static IP address, route table which are outside the default worker node resource group, you need to assign the `Contributor` role on custom resource group. +For a VNet, attached Azure disk, static IP address, or route table that is outside the default worker node resource group, you need to assign the `Contributor` role on the custom resource group. ```azurecli-interactive az role assignment create --assignee <control-plane-identity-principal-id> --role "Contributor" --scope "<custom-resource-group-resource-id>" |
aks | Use Multiple Node Pools | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-multiple-node-pools.md | az aks nodepool add \ Mariner is an open-source Linux distribution available as an AKS container host. It provides high reliability, security, and consistency. Mariner only includes the minimal set of packages needed for running container workloads, which improves boot times and overall performance. -You can add a Mariner node pool into your existing cluster using the `az aks nodepool add` command and specifying `--os-sku CBLMariner`. +You can add a Mariner node pool into your existing cluster using the `az aks nodepool add` command and specifying `--os-sku mariner`. ```azurecli az aks nodepool add \ --resource-group myResourceGroup \ --cluster-name myAKSCluster \ --name marinerpool \- --os-sku CBLMariner + --os-sku mariner ``` ### Migrate Ubuntu nodes to Mariner Use the following instructions to migrate your Ubuntu nodes to Mariner nodes. -1. Add a Mariner node pool into your existing cluster using the `az aks nodepool add` command and specifying `--os-sku CBLMariner`. +1. Add a Mariner node pool into your existing cluster using the `az aks nodepool add` command and specifying `--os-sku mariner`. > [!NOTE] > When adding a new Mariner node pool, you need to add at least one as `--mode System`. Otherwise, AKS won't allow you to delete your existing Ubuntu node pool. |
aks | Workload Identity Deploy Cluster | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/workload-identity-deploy-cluster.md | Serviceaccount/workload-identity-sa created Use the [az identity federated-credential create][az-identity-federated-credential-create] command to create the federated identity credential between the managed identity, the service account issuer, and the subject. ```azurecli-az identity federated-credential create --name myfederatedIdentity --identity-name "${USER_ASSIGNED_IDENTITY_NAME}" --resource-group "${RG_NAME}" --issuer "${AKS_OIDC_ISSUER}" --subject system:serviceaccount:"${SERVICE_ACCOUNT_NAMESPACE}":"${SERVICE_ACCOUNT_NAME}" +az identity federated-credential create --name myfederatedIdentity --identity-name "${USER_ASSIGNED_IDENTITY_NAME}" --resource-group "${RG_NAME}" --issuer "${AKS_OIDC_ISSUER}" --subject system:serviceaccount:"${SERVICE_ACCOUNT_NAMESPACE}":"${SERVICE_ACCOUNT_NAME}" --audience api://AzureADTokenExchange ``` > [!NOTE] |
app-service | Overview Disaster Recovery | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/overview-disaster-recovery.md | + + Title: Disaster recovery guide +description: Learn three common disaster recovery patterns for Azure App Service. +keywords: app service, azure app service, hadr, disaster recovery, business continuity, high availability, bcdr ++ Last updated : 03/07/2023++++# Strategies for business continuity and disaster recovery in Azure App Service ++Most organizations have a business continuity plan to maintain availability of their applications during downtime and to preserve their data in a regional disaster. This article covers some common strategies for web apps deployed to App Service. ++For example, when you create a web app in App Service and choose an Azure region during resource creation, it's a single-region app. When the region becomes unavailable during a disaster, your application also becomes unavailable. If you create an identical deployment in a secondary Azure region, your application becomes less susceptible to a single-region outage, which guarantees business continuity, and any data replication across the regions lets you recover your last application state. ++For IT, business continuity plans are largely driven by two metrics: + +- Recovery Time Objective (RTO) - the duration within which your application must come back online after an outage. +- Recovery Point Objective (RPO) - the acceptable amount of data loss in a disaster, expressed as a unit of time (for example, 1 minute of transactional database records). ++Normally, maintaining an SLA around RTO is impractical for regional disasters, and you would typically design your disaster recovery strategy around RPO alone (that is, focus on recovering data rather than on minimizing interruption). With Azure, however, it's not only practical but can even be straightforward to deploy App Service for automatic geo-failovers. This lets you disaster-proof your applications further by taking care of both RTO and RPO. ++Depending on your desired RTO and RPO metrics, three disaster recovery architectures are commonly used, as shown in the following table: + |.| Active-Active regions | Active-Passive regions | Passive/Cold region| +|-|-|-|-| +|RTO| Real-time or seconds| Minutes| Hours | +|RPO| Real-time or seconds| Minutes| Hours | +|Cost | $$$| $$| $| +|Scenarios| Mission-critical apps| High-priority apps| Low-priority apps| +|Ability to serve multi-region user traffic| Yes| Yes/maybe| No| +|Code deployment | CI/CD pipelines preferred| CI/CD pipelines preferred| Backup and restore | +|Creation of new App Service resources during downtime | Not required | Not required| Required | ++## Active-Active architecture ++In this disaster recovery approach, identical web apps are deployed in two separate regions and Azure Front Door is used to route traffic to both the active regions. +++With this example architecture: ++- Identical App Service apps are deployed in two separate regions, including pricing tier and instance count. +- Public traffic directly to the App Service apps is blocked. +- Azure Front Door is used to route traffic to both the active regions. +- During a disaster, one of the regions goes offline, and Azure Front Door routes traffic exclusively to the region that remains online. The RTO during such a geo-failover is near-zero. +- Application files should be deployed to both web apps with a CI/CD solution. This ensures that the RPO is practically zero. 
+- If your application actively modifies the file system, the best way to minimize RPO is to only write to a [mounted Azure Storage share](configure-connect-to-azure-storage.md) instead of writing directly to the web app's */home* content share. Then, use the Azure Storage redundancy features ([GZRS](../storage/common/storage-redundancy.md#geo-zone-redundant-storage) or [GRS](../storage/common/storage-redundancy.md#geo-redundant-storage)) for your mounted share, which has an [RPO of about 15 minutes](../storage/common/storage-redundancy.md#redundancy-in-a-secondary-region). +- Review [important considerations](#important-considerations) for disaster recovery guidance on the rest of your architecture, such as Azure SQL Database and Azure Storage. ++Steps to create an active-active architecture for your web app in App Service are summarized as follows: ++1. Create two App Service plans in two different Azure regions. Configure the two App Service plans identically. +1. Create two instances of your web app, with one in each App Service plan. +1. Create an Azure Front Door profile with: + - An endpoint. + - Two origin groups, each with a priority of 1. The equal priority tells Azure Front Door to route traffic to both regions equally (thus active-active). + - A route. +1. [Limit network traffic to the web apps only from the Azure Front Door instance](app-service-ip-restrictions.md#restrict-access-to-a-specific-azure-front-door-instance). +1. Set up and configure all other back-end Azure services, such as databases, storage accounts, and authentication providers. +1. Deploy code to both the web apps with [continuous deployment](deploy-continuous-deployment.md). ++[Tutorial: Create a highly available multi-region app in Azure App Service](tutorial-multi-region-app.md) shows you how to set up an *active-passive* architecture. The same steps with minimal changes (setting the priority to "1" for both origin groups in Azure Front Door) give you an *active-active* architecture. ++## Active-passive architecture ++In this disaster recovery approach, identical web apps are deployed in two separate regions and Azure Front Door is used to route traffic to one region only (the *active* region). +++With this example architecture: ++- Identical App Service apps are deployed in two separate regions. +- Public traffic directly to the App Service apps is blocked. +- Azure Front Door is used to route traffic to the primary region. +- To save cost, the secondary App Service plan is configured to have fewer instances and/or be in a lower pricing tier. There are three possible approaches: + - **Preferred** The secondary App Service plan has the same pricing tier as the primary, with the same number of instances or fewer. This approach ensures parity in both feature and VM sizing for the two App Service plans. The RTO during a geo-failover only depends on the time to scale out the instances. + - **Less preferred** The secondary App Service plan has the same pricing tier type (such as PremiumV3) but smaller VM sizes and fewer instances. For example, the primary region may be in P3V3 tier while the secondary region is in P1V3 tier. This approach still ensures feature parity for the two App Service plans, but the lack of size parity may require a manual scale-up when the secondary region becomes the active region. The RTO during a geo-failover depends on the time to both scale up and scale out the instances. 
+ - **Least-preferred** The secondary App Service plan has a different pricing tier than the primary and fewer instances. For example, the primary region may be in P3V3 tier while the secondary region is in S1 tier. Make sure that the secondary App Service plan has all the features your application needs in order to run. Differences in feature availability between the two may cause delays to your web app recovery. The RTO during a geo-failover depends on the time to both scale up and scale out the instances. +- Autoscale is configured on the secondary region in the event the active region becomes inactive. It's advisable to have similar autoscale rules in both active and passive regions. +- During a disaster, the primary region becomes inactive, and the secondary region starts receiving traffic and becomes the active region. +- Once the secondary region becomes active, the network load triggers preconfigured autoscale rules to scale out the secondary web app. +- You may need to scale up the pricing tier for the secondary region manually, if it doesn't already have the needed features to run as the active region. For example, [autoscaling requires Standard tier or higher](https://azure.microsoft.com/pricing/details/app-service/windows/). +- When the primary region is active again, Azure Front Door automatically directs traffic back to it, and the architecture is back to active-passive as before. +- Application files should be deployed to both web apps with a CI/CD solution. This ensures that the RPO is practically zero. +- If your application actively modifies the file system, the best way to minimize RPO is to only write to a [mounted Azure Storage share](configure-connect-to-azure-storage.md) instead of writing directly to the web app's */home* content share. Then, use the Azure Storage redundancy features ([GZRS](../storage/common/storage-redundancy.md#geo-zone-redundant-storage) or [GRS](../storage/common/storage-redundancy.md#geo-redundant-storage)) for your mounted share, which has an [RPO of about 15 minutes](../storage/common/storage-redundancy.md#redundancy-in-a-secondary-region). +- Review [important considerations](#important-considerations) for disaster recovery guidance on the rest of your architecture, such as Azure SQL Database and Azure Storage. ++Steps to create an active-passive architecture for your web app in App Service are summarized as follows (a sketch of the Azure Front Door portion follows this list): ++1. Create two App Service plans in two different Azure regions. The secondary App Service plan may be provisioned using one of the approaches mentioned previously. +1. Configure autoscaling rules for the secondary App Service plan so that it scales to the same instance count as the primary when the primary region becomes inactive. +1. Create two instances of your web app, with one in each App Service plan. +1. Create an Azure Front Door profile with: + - An endpoint. + - An origin group with a priority of 1 for the primary region. + - A second origin group with a priority of 2 for the secondary region. The difference in priority tells Azure Front Door to prefer the primary region when it's online (thus active-passive). + - A route. +1. [Limit network traffic to the web apps only from the Azure Front Door instance](app-service-ip-restrictions.md#restrict-access-to-a-specific-azure-front-door-instance). +1. Set up and configure all other back-end Azure services, such as databases, storage accounts, and authentication providers. +1. Deploy code to both the web apps with [continuous deployment](deploy-continuous-deployment.md). 
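The Azure Front Door portion of the preceding steps might look like the following minimal sketch. Resource and app names are illustrative, and note that in the Azure CLI, priority is configured per origin within a single origin group rather than per origin group:

```azurecli
# Create the Front Door profile and an endpoint.
az afd profile create --profile-name myFrontDoor --resource-group myResourceGroup --sku Standard_AzureFrontDoor
az afd endpoint create --endpoint-name myEndpoint --profile-name myFrontDoor --resource-group myResourceGroup

# One origin group with health probes; the primary origin gets priority 1, the secondary priority 2.
az afd origin-group create --origin-group-name myOriginGroup --profile-name myFrontDoor --resource-group myResourceGroup --probe-request-type GET --probe-protocol Https --probe-path / --probe-interval-in-seconds 60 --sample-size 4 --successful-samples-required 3 --additional-latency-in-milliseconds 50

az afd origin create --origin-name primaryApp --origin-group-name myOriginGroup --profile-name myFrontDoor --resource-group myResourceGroup --host-name <primary-app>.azurewebsites.net --origin-host-header <primary-app>.azurewebsites.net --priority 1 --weight 1000 --enabled-state Enabled

az afd origin create --origin-name secondaryApp --origin-group-name myOriginGroup --profile-name myFrontDoor --resource-group myResourceGroup --host-name <secondary-app>.azurewebsites.net --origin-host-header <secondary-app>.azurewebsites.net --priority 2 --weight 1000 --enabled-state Enabled

# Route all endpoint traffic to the origin group.
az afd route create --route-name myRoute --endpoint-name myEndpoint --profile-name myFrontDoor --resource-group myResourceGroup --origin-group myOriginGroup --supported-protocols Http Https --forwarding-protocol MatchRequest --link-to-default-domain Enabled
```

Setting both priorities to 1 instead turns the same topology into the active-active variant described earlier.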
++[Tutorial: Create a highly available multi-region app in Azure App Service](tutorial-multi-region-app.md) shows you how to set up an *active-passive* architecture. ++## Passive/cold region ++In this disaster recovery approach, you create regular backups of your web app to an Azure Storage account. ++With this example architecture: ++- A single web app is deployed to a single region. +- The web app is regularly backed up to an Azure Storage account in the same region. +- The cross-region replication of your backups depends on the data redundancy configuration in the Azure storage account. You should set your Azure Storage account as [GZRS](../storage/common/storage-redundancy.md#geo-zone-redundant-storage) if possible. GZRS offers both synchronous zone redundancy within a region and asynchronous replication to a secondary region. If GZRS isn't available, configure the account as [GRS](../storage/common/storage-redundancy.md#geo-redundant-storage). Both GZRS and GRS have an [RPO of about 15 minutes](../storage/common/storage-redundancy.md#redundancy-in-a-secondary-region). +- To ensure that you can retrieve backups when the storage account's primary region becomes unavailable, [**enable read only access to secondary region**](../storage/common/storage-redundancy.md#read-access-to-data-in-the-secondary-region) (making the storage account **RA-GZRS** or **RA-GRS**, respectively). For more information on designing your applications to take advantage of geo-redundancy, see [Use geo-redundancy to design highly available applications](../storage/common/geo-redundant-design.md). +- During a disaster in the web app's region, you must manually deploy all required App Service dependent resources by using the backups from the Azure Storage account, most likely from the secondary region with read access. The RTO may be hours or days. +- To minimize RTO, it's highly recommended that you have a comprehensive playbook outlining all the steps required to restore your web app backup to another Azure Region. For more information, see [Important considerations](#important-considerations). +- Review [important considerations](#important-considerations) for disaster recovery guidance on the rest of your architecture, such as Azure SQL Database and Azure Storage. ++Steps to create a passive-cold region for your web app in App Service are summarized as follows: ++1. Create an Azure storage account in the same region as your web app. Choose Standard performance tier and select redundancy as Geo-redundant storage (GRS) or Geo-Zone-redundant storage (GZRS). +1. Enable RA-GRS or RA-GZRS (read access for the secondary region). +1. [Configure custom backup](manage-backup.md) for your web app. You may decide to set a schedule for your web app backups, such as hourly. +1. Verify that the web app backup files can be retrieved from the secondary region of your storage account. ++#### What if my web app's region doesn't have GZRS or GRS storage? ++[Azure regions that don't have a regional pair](../reliability/cross-region-replication-azure.md#regions-with-availability-zones-and-no-region-pair) don't have GRS or GZRS. In this scenario, utilize zone-redundant storage (ZRS) or locally redundant storage (LRS) to create a similar architecture. For example, you can manually create a secondary region for the storage account as follows: +++Steps to create a passive-cold region without GRS and GZRS are summarized as follows: ++1. Create an Azure storage account in the same region as your web app. 
Choose Standard performance tier and select redundancy as zone-redundant storage (ZRS). +1. [Configure custom backup](manage-backup.md) for your web app. You may decide to set a schedule for your web app backups, such as hourly. +1. Verify that the web app backup files can be retrieved from the storage account. +1. Create a second Azure storage account in a different region. Choose Standard performance tier and select redundancy as locally redundant storage (LRS). +1. By using a tool like [AzCopy](../storage/common/storage-use-azcopy-v10.md#use-in-a-script), replicate your custom backup (Zip, XML and log files) from the primary region to the secondary storage account. For example: ++ ``` + azcopy copy 'https://<source-storage-account-name>.blob.core.windows.net/<container-name>/<blob-path>' 'https://<destination-storage-account-name>.blob.core.windows.net/<container-name>/<blob-path>' + ``` + You can use [Azure Automation with a PowerShell Workflow runbook](../automation/learn/automation-tutorial-runbook-textual.md) to run your replication script [on a schedule](../automation/shared-resources/schedules.md). Make sure that the replication schedule follows a similar schedule to the web app backups. ++## Important considerations ++- These disaster recovery strategies are applicable to both App Service multitenant and App Service Environments. +- Within the same region, an App Service app can be deployed into [availability zones (AZ)](../reliability/availability-zones-overview.md) to help you achieve high availability for your mission-critical workloads. For more information, see [Migrate App Service to availability zone support](../reliability/migrate-app-service.md). +- There are multiple ways to replicate your web app's content and configurations across Azure regions in an active-active or active-passive architecture, such as using [App service backup and restore](manage-backup.md). However, these options are point-in-time snapshots and eventually lead to web app versioning challenges across regions. To avoid these limitations, configure your CI/CD pipelines to deploy code to both the Azure regions. Consider using [Azure Pipelines](/azure/devops/pipelines/get-started/what-is-azure-pipelines) or [GitHub Actions](https://docs.github.com/actions). For more information, see [Continuous deployment to Azure App Service](deploy-continuous-deployment.md). +- Use an infrastructure-as-code (IaC) mechanism to manage your application resources in Azure. In a complex deployment across multiple regions, managing the regions independently and keeping the configuration synchronized across regions in a reliable manner requires a predictable, testable, and repeatable process. Consider an IaC tool such as [Azure Resource Manager templates](../azure-resource-manager/management/overview.md) or [Terraform](/azure/developer/terraform/overview). +- Your application most likely depends on other data services in Azure, such as Azure SQL Database and Azure Storage accounts. You should develop disaster recovery strategies for each of these dependent Azure Services as well. For SQL Database, see [Active geo-replication for Azure SQL Database](/azure/azure-sql/database/active-geo-replication-overview). For Azure Storage, see [Azure Storage redundancy](../storage/common/storage-redundancy.md). +- Aside from Azure Front Door, which is proposed in this article, Azure provides other load balancing options, such as Azure Traffic Manager. 
For a comparison of the various options, see [Load-balancing options - Azure Architecture Center](/azure/architecture/guide/technology-choices/load-balancing-overview). +- It's also recommended to set up monitoring and alerts for your web apps for timely notifications during a disaster. For more information, see [Application Insights availability tests](../azure-monitor/app/availability-overview.md). ++## Next steps ++[Tutorial: Create a highly available multi-region app in Azure App Service](tutorial-multi-region-app.md) |
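For the passive/cold approach described above, creating the backup storage account with geo-redundancy and read access to the secondary region might look like the following sketch (account and group names are illustrative):

```azurecli
# Storage account with geo-zone-redundant storage plus read access to the secondary region.
az storage account create --name <backup-storage-account> --resource-group myResourceGroup --location eastus --kind StorageV2 --sku Standard_RAGZRS

# Confirm the secondary region and its replication status.
az storage account show --name <backup-storage-account> --resource-group myResourceGroup --query "{secondary:secondaryLocation, status:statusOfSecondary}"
```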
application-gateway | Overview V2 | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/overview-v2.md | The Standard_v2 and WAF_v2 SKU is not currently available in the following regio - US DOD East - US DOD Central - US Gov Central-- Germany Northeast-- Germany Central - Qatar Central ## Pricing |
applied-ai-services | Concept Add On Capabilities | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-add-on-capabilities.md | + + Title: Add-on capabilities - Form Recognizer ++description: How to increase service limit capacity with add-on capabilities. +++++ Last updated : 03/03/2023++monikerRange: 'form-recog-3.0.0' +recommendations: false ++<!-- markdownlint-disable MD033 --> ++# Azure Form Recognizer add-on capabilities ++**This article applies to:**  **Form Recognizer v3.0**. ++> [!NOTE] +> +> Add-on capabilities for Form Recognizer Studio are only available within the Read and Layout models for the `2023-02-28-preview` release. ++Form Recognizer now supports more sophisticated analysis capabilities. These optional capabilities can be enabled and disabled depending on the scenario of the document extraction. There are three add-on capabilities available for the `2023-02-28-preview`: ++* [`ocr.highResolution`](#high-resolution-extraction) ++* [`ocr.formula`](#formula-extraction) ++* [`ocr.font`](#font-property-extraction) ++## High resolution extraction ++The task of recognizing small text from large-size documents, like engineering drawings, is a challenge. Often the text is mixed with other graphical elements and has varying fonts, sizes and orientations. Moreover, the text may be broken into separate parts or connected with other symbols. Form Recognizer now supports extracting content from these types of documents with the `ocr.highResolution` capability. You get improved quality of content extraction from A1/A2/A3 documents by enabling this add-on capability. ++## Formula extraction ++The `ocr.formula` capability extracts all identified formulas, such as mathematical equations, in the `formulas` collection as a top level object under `content`. Inside `content`, detected formulas are represented as `:formula:`. Each entry in this collection represents a formula that includes the formula type as `inline` or `display`, and its LaTeX representation as `value` along with its `polygon` coordinates. Initially, formulas appear at the end of each page. ++ > [!NOTE] + > The `confidence` score is hard-coded for the `2023-02-28` public preview release. ++ ```json + "content": ":formula:", + "pages": [ + { + "pageNumber": 1, + "formulas": [ + { + "kind": "inline", + "value": "\\frac { \\partial a } { \\partial b }", + "polygon": [...], + "span": {...}, + "confidence": 0.99 + }, + { + "kind": "display", + "value": "y = a \\times b + a \\times c", + "polygon": [...], + "span": {...}, + "confidence": 0.99 + } + ] + } + ] + ``` ++## Font property extraction ++The `ocr.font` capability extracts all font properties of text extracted in the `styles` collection as a top-level object under `content`. Each style object specifies a single font property, the text span it applies to, and its corresponding confidence score. The existing style property is extended with more font properties such as `similarFontFamily` for the font of the text, `fontStyle` for styles such as italic and normal, `fontWeight` for bold or normal, `color` for color of the text, and `backgroundColor` for color of the text bounding box. 
++ ```json + "content": "Foo bar", + "styles": [ + { + "similarFontFamily": "Arial, sans-serif", + "spans": [ { "offset": 0, "length": 3 } ], + "confidence": 0.98 + }, + { + "similarFontFamily": "Times New Roman, serif", + "spans": [ { "offset": 4, "length": 3 } ], + "confidence": 0.98 + }, + { + "fontStyle": "italic", + "spans": [ { "offset": 1, "length": 2 } ], + "confidence": 0.98 + }, + { + "fontWeight": "bold", + "spans": [ { "offset": 2, "length": 3 } ], + "confidence": 0.98 + }, + { + "color": "#FF0000", + "spans": [ { "offset": 4, "length": 2 } ], + "confidence": 0.98 + }, + { + "backgroundColor": "#00FF00", + "spans": [ { "offset": 5, "length": 2 } ], + "confidence": 0.98 + } + ] + ``` ++## Next steps ++> [!div class="nextstepaction"] +> Learn more: +> [**Read model**](concept-read.md) [**Layout model**](concept-layout.md). |
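As a hedged sketch of requesting these add-on capabilities over REST, assuming the `features` query parameter accepts the capability names listed above in the `2023-02-28-preview` (here `az rest` is used only as a generic HTTP client; the resource name, key, and document URL are placeholders):

```azurecli
az rest --method POST --skip-authorization-header \
    --url "https://<your-resource>.cognitiveservices.azure.com/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2023-02-28-preview&features=ocr.formula,ocr.font" \
    --headers "Ocp-Apim-Subscription-Key=<your-key>" "Content-Type=application/json" \
    --body '{"urlSource": "https://<document-url>"}'
```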
applied-ai-services | Concept Business Card | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-business-card.md | recommendations: false [!INCLUDE [applies to v2.1](includes/applies-to-v2-1.md)] ::: moniker-end -The Form Recognizer business card model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key information from business card images. The API analyzes printed business cards; extracts key information such as first name, last name, company name, email address, and phone number; and returns a structured JSON data representation. +The Form Recognizer business card model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract data from business card images. The API analyzes printed business cards; extracts key information such as first name, last name, company name, email address, and phone number; and returns a structured JSON data representation. ## Business card data extraction Business cards are a great way to represent a business or a professional. The co ::: moniker range="form-recog-3.0.0" -The following tools are supported by Form Recognizer v3.0: +Form Recognizer v3.0 supports the following tools: | Feature | Resources | Model ID | |-|-|--| The following tools are supported by Form Recognizer v3.0: ::: moniker range="form-recog-2.1.0" -The following tools are supported by Form Recognizer v2.1: +Form Recognizer v2.1 supports the following tools: | Feature | Resources | |-|-| The following tools are supported by Form Recognizer v2.1: ### Try business card data extraction -See how data, including name, job title, address, email, and company name, is extracted from business cards. You'll need the following resources: +See how data, including name, job title, address, email, and company name, is extracted from business cards. You need the following resources: * An Azure subscription - you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) See how data, including name, job title, address, email, and company name, is ex :::image type="content" source="media/fott-select-form-type.png" alt-text="Screenshot of the select-form-type dropdown menu."::: -1. Select **Run analysis**. The Form Recognizer Sample Labeling tool will call the Analyze Prebuilt API and analyze the document. +1. Select **Run analysis**. The Form Recognizer Sample Labeling tool calls the Analyze Prebuilt API and analyzes the document. 1. View the results - see the extracted key-value pairs, line items, highlighted text, and detected tables. |
applied-ai-services | Concept Composed Models | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-composed-models.md | With composed models, you can assign multiple custom models to a composed model * For ```Custom neural``` models, the best practice is to add all the different variations of a single document type into a single training dataset and train a custom neural model on it. Model compose is best suited for scenarios where you have documents of different types being submitted for analysis. -* Pricing is the same whether you're using a composed model or selecting a specific model. One model analyzes each document. With composed models, the system performs a classification to check which of the composed custom models should be invoked and invokes the single best model for the document. +++With the introduction of [**custom classifier models**](./concept-custom-classifier.md), you can choose to use [**composed models**](./concept-composed-models.md) or the classifier model as an explicit step before analysis. For a deeper understanding of when to use a classifier or composed model, _see_ [**Custom classifier models**](concept-custom-classifier.md). ## Compose model limits With composed models, you can assign multiple custom models to a composed model * To compose a model trained with a prior version of the API (v2.1 or earlier), train a model with the v3.0 API using the same labeled dataset. That addition ensures that the v2.1 model can be composed with other models. -* Models composed with v2.1 of the API continue to be supported, requiring no updates. +* Models composed with v2.1 of the API continue to be supported, requiring no updates. * The limit for the maximum number of custom models that can be composed is 100. Learn to create and compose custom models: > [!div class="nextstepaction"] > [**Build a custom model**](how-to-guides/build-a-custom-model.md)-> [**Compose custom models**](how-to-guides/compose-custom-models.md) +> [**Compose custom models**](how-to-guides/compose-custom-models.md) |
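To make the compose step concrete, here's a sketch of the v3.0 compose request. The component model IDs are hypothetical placeholders for custom models you've already trained:

```rest
POST https://{endpoint}/formrecognizer/documentModels:compose?api-version=2023-02-28-preview

{
  "modelId": "composed-loan-forms",
  "description": "All trained variations of the loan application form",
  "componentModels": [
    { "modelId": "loan-application-v1" },
    { "modelId": "loan-application-v2" }
  ]
}
```

Documents submitted to the composed model ID are then routed to whichever component model matches best.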
applied-ai-services | Concept Custom Classifier | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-classifier.md | + + Title: Custom classifier model - Form Recognizer ++description: Use the custom classifier model to train a model to identify and split the documents you process within your application. +++++ Last updated : 03/03/2023+++monikerRange: 'form-recog-3.0.0' +recommendations: false +++# Custom classifier model ++**This article applies to:**  **Form Recognizer v3.0**. ++Custom classifier models are deep-learning-model types that combine layout and language features to accurately detect and identify documents you process within your application. Custom classifier models can classify each page in an input file to identify the document(s) within and can also identify multiple documents or multiple instances of a single document within an input file. ++## Model capabilities ++Custom classifier models can analyze single- or multi-file documents to identify if any of the trained document types are contained within an input file. Here are the currently supported scenarios: ++* A single file containing one document. For instance, a loan application form. ++* A single file containing multiple documents. For instance, a loan application package containing a loan application form, payslip, and bank statement. ++* A single file containing multiple instances of the same document. For instance, a collection of scanned invoices. ++Training a custom classifier model requires at least two distinct classes and a minimum of five samples per class. ++### Compare custom classifier and composed models ++A custom classifier model can replace [a composed model](concept-composed-models.md) in some scenarios but there are a few differences to be aware of: ++| Capability | Custom classifier process | Composed model process | +|--|--|--| +|Analyze a single document of unknown type belonging to one of the types trained for extraction model processing.| ● Requires multiple calls. </br> ● Call the classifier model to identify the document class. This step allows for a confidence-based check before invoking the extraction model analysis.</br> ● Invoke the extraction model. | ● Requires a single call to a composed model containing the model corresponding to the input document type. | + |Analyze a single document of unknown type belonging to several types trained for extraction model processing.| ● Requires multiple calls.</br> ● Make a call to the classifier that ignores documents not matching a designated type for extraction.</br> ● Invoke the extraction model. | ● Requires a single call to a composed model. The service selects a custom model within the composed model with the highest match.</br> ● A composed model can't ignore documents.| +|Analyze a file containing multiple documents of known or unknown type belonging to one of the types trained for extraction model processing.| ● Requires multiple calls. </br> ● Call the classifier model to identify the documents within the input file.</br> ● Invoke the extraction model for each identified document. | ● Requires a single call to a composed model.</br> ● The composed model invokes the component model once on the first instance of the document. </br> ● The remaining documents are ignored. | ++## Language support ++Classifier models currently support English-language documents only. ++## Best practices ++Custom classifier models require a minimum of five samples per class to train. 
If the classes are similar, adding extra training samples improves model accuracy. ++## Training a model ++Custom classifier models are only available in the [v3.0 API](v3-migration-guide.md) starting with API version ```2023-02-28-preview```. [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio) provides a no-code user interface to interactively train a custom classifier. ++When using the REST API, if you've organized your documents by folders, you can use the ```azureBlobSource``` property of the request to train a classifier model. ++```rest +https://{endpoint}/formrecognizer/documentClassifiers:build?api-version=2023-02-28-preview ++{ + "classifierId": "demo2.1", + "description": "", + "docTypes": { + "car-maint": { + "azureBlobSource": { + "containerUrl": "SAS URL to container", + "prefix": "sample1/car-maint/" + } + }, + "cc-auth": { + "azureBlobSource": { + "containerUrl": "SAS URL to container", + "prefix": "sample1/cc-auth/" + } + }, + "deed-of-trust": { + "azureBlobSource": { + "containerUrl": "SAS URL to container", + "prefix": "sample1/deed-of-trust/" + } + } + } +} ++``` ++Alternatively, if you have a flat list of files or only plan to use a few select files within each folder to train the model, you can use the ```azureBlobFileListSource``` property to train the model. This step requires a ```file list``` in [JSON Lines](https://jsonlines.org/) format. For each class, add a new file with a list of files to be submitted for training. ++```rest +{ + "classifierId": "demo2", + "description": "", + "docTypes": { + "car-maint": { + "azureBlobFileListSource": { + "containerUrl": "SAS URL to container", + "fileList": "sample1/car-maint.jsonl" + } + }, + "cc-auth": { + "azureBlobFileListSource": { + "containerUrl": "SAS URL to container", + "fileList": "sample1/cc-auth.jsonl" + } + }, + "deed-of-trust": { + "azureBlobFileListSource": { + "containerUrl": "SAS URL to container", + "fileList": "sample1/deed-of-trust.jsonl" + } + } + } +} ++``` ++File list `car-maint.jsonl` contains the following files. ++```json +{"file":"sample1/car-maint/Commercial Motor Vehicle - Adatum.pdf"} +{"file":"sample1/car-maint/Commercial Motor Vehicle - Fincher.pdf"} +{"file":"sample1/car-maint/Commercial Motor Vehicle - Lamna.pdf"} +{"file":"sample1/car-maint/Commercial Motor Vehicle - Liberty.pdf"} +{"file":"sample1/car-maint/Commercial Motor Vehicle - Trey.pdf"} +``` ++## Next steps ++Learn to create custom classifier models: ++> [!div class="nextstepaction"] +> [**Build a custom classifier model**](how-to-guides/build-a-custom-classifier.md) +> [**Custom models overview**](concept-custom.md) |
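Once trained, the classifier is invoked with the same asynchronous analyze pattern the extraction models use. A sketch that reuses the `demo2.1` classifier built above (the document URL is a placeholder):

```rest
POST https://{endpoint}/formrecognizer/documentClassifiers/demo2.1:analyze?api-version=2023-02-28-preview
Content-Type: application/json

{
  "urlSource": "https://{your-storage-account}/mixed-loan-package.pdf"
}
```

Poll the URL returned in the `Operation-Location` response header; the result lists each detected document with its class and page range.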
applied-ai-services | Concept Custom Label Tips | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-label-tips.md | This article highlights the best methods for labeling custom model datasets in t * The following video is the second of two presentations intended to help you build custom models with higher accuracy (the first presentation explores [How to create a balanced data set](concept-custom-label.md#video-custom-label-tips-and-pointers)). -* Here, we'll examine best practices for labeling your selected documents. With semantically relevant and consistent labeling, you should see an improvement in model performance.</br></br> +* Here, we examine best practices for labeling your selected documents. With semantically relevant and consistent labeling, you should see an improvement in model performance.</br></br> > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE5fZKB ] |
applied-ai-services | Concept Custom Label | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-label.md | Custom models (template and neural) require a labeled dataset of at least five d A labeled dataset consists of several files: -* You'll provide a set of sample documents (typically PDFs or images). A minimum of five documents is needed to train a model. +* You provide a set of sample documents (typically PDFs or images). A minimum of five documents is needed to train a model. -* Additionally, the labeling process will generate the following files: +* Additionally, the labeling process generates the following files: * A `fields.json` file is created when the first field is added. There's one `fields.json` file for the entire training dataset; the field list contains the field name and associated subfields and types. A labeled dataset consists of several files: * The following video is the first of two presentations intended to help you build custom models with higher accuracy (The second presentation examines [Best practices for labeling documents](concept-custom-label-tips.md#video-custom-labels-best-practices)). -* Here, we'll explore how to create a balanced data set and select the right documents to label. This process will set you on the path to higher quality models.</br></br> +* Here, we explore how to create a balanced data set and select the right documents to label. This process sets you on the path to higher quality models.</br></br> > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWWHru] ## Create a balanced dataset -Before you start labeling, it's a good idea to look at a few different samples of the document to identify which samples you want to use in your labeled dataset. A balanced dataset represents all the typical variations you would expect to see for the document. Creating a balanced dataset will result in a model with the highest possible accuracy. A few examples to consider are: +Before you start labeling, it's a good idea to look at a few different samples of the document to identify which samples you want to use in your labeled dataset. A balanced dataset represents all the typical variations you would expect to see for the document. Creating a balanced dataset results in a model with the highest possible accuracy. A few examples to consider are: * **Document formats**: If you expect to analyze both digital and scanned documents, add a few examples of each type to the training dataset * **Variations (template model)**: Consider splitting the dataset into folders and training a model for each variation. Any variations in either structure or layout should be split into different models. You can then compose the individual models into a single [composed model](concept-composed-models.md). -* **Variations (Neural models)**: When your dataset has a manageable set of variations, about 15 or fewer, create a single dataset with a few samples of each of the different variations to train a single model. If the number of template variations is larger than 15, you'll train multiple models and [compose](concept-composed-models.md) them together. +* **Variations (Neural models)**: When your dataset has a manageable set of variations, about 15 or fewer, create a single dataset with a few samples of each of the different variations to train a single model. 
If the number of template variations is larger than 15, you train multiple models and [compose](concept-composed-models.md) them together. * **Tables**: For documents containing tables with a variable number of rows, ensure that the training dataset also represents documents with different numbers of rows. Use the following guidelines to define the fields: * For tabular fields spanning multiple pages, define and label the fields as a single table. -. [!NOTE] +> [!NOTE] > Custom neural models share the same labeling format and strategy as custom template models. Currently custom neural models only support a subset of the field types supported by custom template models. ## Model capabilities -Custom neural models currently only support key-value pairs, structured fields (tables), and selection marks. +Custom neural models currently only support key-value pairs, structured fields (tables), and selection marks. | Model type | Form fields | Selection marks | Tabular fields | Signature | Region | |--|--|--|--|--|--| Tabular fields are also useful when extracting repeating information within a do * **Consistent labeling**. If a value appears in multiple contexts within the document, consistently pick the same context across documents to label the value. -* **Visually repeating data**. Tables support visually repeating groups of information not just explicit tables. Explicit tables will be identified in tables section of the analyzed documents as part of the layout output and don't need to be labeled as tables. Only label a table field if the information is visually repeating and not identified as a table as part of the layout response. An example would be the repeating work experience section of a resume. +* **Visually repeating data**. Tables support visually repeating groups of information, not just explicit tables. Explicit tables are identified in the tables section of the analyzed documents as part of the layout output and don't need to be labeled as tables. Only label a table field if the information is visually repeating and not identified as a table as part of the layout response. An example would be the repeating work experience section of a resume. * **Region labeling (custom template)**. Labeling specific regions allows you to define a value when none exists. If the value is optional, ensure that you leave a few sample documents with the region not labeled. When labeling regions, don't include the surrounding text with the label. |
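Assembled on disk, a labeled training container typically ends up looking like the sketch below. The file names are illustrative, and the per-document `*.ocr.json` and `*.labels.json` files are generated by the labeling tool alongside the single `fields.json` described above:

```
training-data/
├── fields.json
├── loan-app-1.pdf
├── loan-app-1.pdf.ocr.json
├── loan-app-1.pdf.labels.json
├── loan-app-2.pdf
├── loan-app-2.pdf.ocr.json
└── loan-app-2.pdf.labels.json
```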
applied-ai-services | Concept Custom Neural | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-neural.md | Custom neural models share the same labeling format and strategy as [custom temp ## Model capabilities -Custom neural models currently only support key-value pairs and selection marks and structured fields (tables), future releases will include support for signatures. +Custom neural models currently support key-value pairs, selection marks, and structured fields (tables); support for signatures is planned for a future release. | Form fields | Selection marks | Tabular fields | Signature | Region | |:--:|:--:|:--:|:--:|:--:| | Supported | Supported | Supported | Unsupported | Supported <sup>1</sup> | -<sup>1</sup> Region labels in custom neural models will use the results from the Layout API for specified region. This feature is different from template models where, if no value is present, text is generated at training time. +<sup>1</sup> Region labels in custom neural models use the results from the Layout API for the specified region. This feature is different from template models where, if no value is present, text is generated at training time. ### Build mode The build custom model operation has added support for the *template* and *neural* custom models. Previous versions of the REST API and SDKs only supported a single build mode that is now known as the *template* mode. -Neural models support documents that have the same information, but different page structures. Examples of these documents include United States W2 forms, which share the same information, but may vary in appearance across companies. Neural models currently only support English text. For more information, *see* [Custom model build mode](concept-custom.md#build-mode). +Neural models support documents that have the same information, but different page structures. Examples of these documents include United States W2 forms, which share the same information, but may vary in appearance across companies. For more information, *see* [Custom model build mode](concept-custom.md#build-mode). ++## Language support ++Neural models support added languages in the ```2023-02-28-preview``` API. ++| Languages | API version | +|:--:|:--:| +| English | `2022-08-31` (GA), `2023-02-28-preview`| +| German | `2023-02-28-preview`| +| Italian | `2023-02-28-preview`| +| French | `2023-02-28-preview`| +| Spanish | `2023-02-28-preview`| +| Dutch | `2023-02-28-preview`| ## Tabular fields Custom neural models can generalize across different formats of a single documen ### Field naming -When you label the data, labeling the field relevant to the value will improve the accuracy of the key-value pairs extracted. For example, for a field value containing the supplier ID, consider naming the field "supplier_id". Field names should be in the language of the document. +When you label the data, labeling the field relevant to the value improves the accuracy of the key-value pairs extracted. For example, for a field value containing the supplier ID, consider naming the field "supplier_id". Field names should be in the language of the document. ### Labeling contiguous values Values in training cases should be diverse and representative. For example, if a ## Current Limitations * The model doesn't recognize values split across page boundaries.-* Custom neural models are only trained in English and model performance will be lower for documents in other languages. 
+* Custom neural models are only trained in English. Model performance is lower for documents in other languages. * If a dataset labeled for custom template models is used to train a custom neural model, the unsupported field types are ignored. * Custom neural models are limited to 10 build operations per month. Open a support request if you need the limit increased. |
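Training a custom neural model uses the standard v3.0 build request with `buildMode` set to `neural`. A minimal sketch, where the model ID and the SAS container URL are placeholders for your own values:

```rest
POST https://{endpoint}/formrecognizer/documentModels:build?api-version=2023-02-28-preview

{
  "modelId": "w2-neural-model",
  "buildMode": "neural",
  "azureBlobSource": {
    "containerUrl": "SAS URL to your labeled training data container"
  }
}
```

Neural builds count against the 10-build-operations-per-month limit noted above.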
applied-ai-services | Concept Custom | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom.md | -# Azure Form Recognizer Custom document model +# Azure Form Recognizer Custom document models ::: moniker range="form-recog-3.0.0" [!INCLUDE [applies to v3.0](includes/applies-to-v3-0.md)] recommendations: false [!INCLUDE [applies to v2.1](includes/applies-to-v2-1.md)] ::: moniker-end -Form Recognizer uses advanced machine learning technology to detect and extract information from forms and documents and returns the extracted data in a structured JSON output. With Form Recognizer, you can use prebuilt or pre-trained models or you can train standalone custom models. Custom models extract and analyze distinct data and use cases from forms and documents specific to your business. Standalone custom models can be combined to create [composed models](concept-composed-models.md). +Form Recognizer uses advanced machine learning technology to identify documents, detect and extract information from forms and documents, and return the extracted data in a structured JSON output. With Form Recognizer, you can use document analysis models, prebuilt (pre-trained) models, or your own trained standalone custom models. -To create a custom model, you label a dataset of documents with the values you want extracted and train the model on the labeled dataset. You only need five examples of the same form or document type to get started. +Custom models now include [custom classifier models](./concept-custom-classifier.md) for scenarios where you need to identify the document type prior to invoking the extraction model. Classifier models are available starting with the ```2023-02-28-preview``` API. A classifier model can be paired with a custom extraction model to analyze and extract fields from forms and documents specific to your business to create a document processing solution. Standalone custom extraction models can be combined to create [composed models](concept-composed-models.md). ::: moniker range="form-recog-3.0.0" To create a custom model, you label a dataset of documents with the values you w Custom document models can be one of two types: [**custom template**](concept-custom-template.md), also called custom form, and [**custom neural**](concept-custom-neural.md), also called custom document, models. The labeling and training process for both models is identical, but the models differ as follows: +### Custom extraction models ++To create a custom extraction model, label a dataset of documents with the values you want extracted and train the model on the labeled dataset. You only need five examples of the same form or document type to get started. + ### Custom template model -The custom template or custom form model relies on a consistent visual template to extract the labeled data. The accuracy of your model is affected by variances in the visual structure of your documents. Structured forms such as questionnaires or applications are examples of consistent visual templates. +The custom template or custom form model relies on a consistent visual template to extract the labeled data. Variances in the visual structure of your documents affect the accuracy of your model. Structured forms such as questionnaires or applications are examples of consistent visual templates. -Your training set will consist of structured documents where the formatting and layout are static and constant from one document instance to the next. 
Custom template models support key-value pairs, selection marks, tables, signature fields, and regions. Template models can be trained on documents in any of the [supported languages](language-support.md). For more information, *see* [custom template models](concept-custom-template.md). +Your training set consists of structured documents where the formatting and layout are static and constant from one document instance to the next. Custom template models support key-value pairs, selection marks, tables, signature fields, and regions. Template models can be trained on documents in any of the [supported languages](language-support.md). For more information, *see* [custom template models](concept-custom-template.md). > [!TIP] > This table provides links to the build mode programming language SDK references ## Compare model features -The table below compares custom template and custom neural features: +The following table compares custom template and custom neural features: |Feature|Custom template (form) | Custom neural (document) | |||| The table below compares custom template and custom neural features: |Training time | 1 to 5 minutes | 20 minutes to 1 hour | |Data extraction | Key-value pairs, tables, selection marks, coordinates, and signatures | Key-value pairs, selection marks and tables| |Document variations | Requires a model per each variation | Uses a single model for all variations |-|Language support | Multiple [language support](language-support.md#read-layout-and-custom-form-template-model) | United States English (en-US) [language support](language-support.md#custom-neural-model) | +|Language support | Multiple [language support](language-support.md#read-layout-and-custom-form-template-model) | English, with preview support for Spanish, French, German, Italian, and Dutch [language support](language-support.md#custom-neural-model) | ++### Custom classifier model ++ Document classification is a new scenario supported by Form Recognizer with the ```2023-02-28-preview``` API. Document classifier supports classification and splitting scenarios. Train a classifier model to identify the different types of documents your application supports. The input file for the classifier model can contain multiple documents and classifies each document within an associated page range. See [custom classifier](concept-custom-classifier.md) models to learn more. ## Custom model tools -The following tools are supported by Form Recognizer v3.0: +Form Recognizer v3.0 supports the following tools: | Feature | Resources | Model ID| |||:| The following tools are supported by Form Recognizer v3.0: ::: moniker range="form-recog-2.1.0" -The following tools are supported by Form Recognizer v2.1: +Form Recognizer v2.1 supports the following tools: > [!NOTE] > Custom model types [custom neural](concept-custom-neural.md) and [custom template](concept-custom-template.md) are only available with Form Recognizer version v3.0. 
The following tools are supported by Form Recognizer v2.1: ||| |Custom model| <ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net)</li><li>[REST API](./how-to-guides/use-sdk-rest-api.md?pivots=programming-language-rest-api&preserve-view=true&tabs=windows&view=form-recog-2.1.0#analyze-forms-with-a-custom-model)</li><li>[Client library SDK](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api)</li><li>[Form Recognizer Docker container](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>| - ::: moniker-end ## Build a custom model +### [Custom extraction](#tab/extraction) + Extract data from your specific or unique documents using custom models. You need the following resources: * An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/). Extract data from your specific or unique documents using custom models. You nee > [!NOTE] > Form Recognizer Studio is available with the v3.0 API. -1. On the **Form Recognizer Studio** home page, select **Custom form**. +1. On the **Form Recognizer Studio** home page, select **Custom extraction models**. 1. Under **My Projects**, select **Create a project**. Extract data from your specific or unique documents using custom models. You nee 1. Review and create your project. -1. Use the sample documents to build and test your custom model. +1. Add your sample documents to label, build, and test your custom model. > [!div class="nextstepaction"] > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects) +For a detailed walkthrough to create your first custom extraction model, see [how to create a custom extraction model](how-to-guides/build-a-custom-model.md). ++### [Custom classification](#tab/classification) ++Extract data from your specific or unique documents using custom models. You need the following resources: ++* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/). +* A [Form Recognizer instance](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your key and endpoint. ++ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot that shows the keys and endpoint location in the Azure portal."::: ++## Form Recognizer Studio ++> [!NOTE] +> Form Recognizer Studio is available with the v3.0 API. ++1. On the **Form Recognizer Studio** home page, select **Custom classification models**. ++1. Under **My Projects**, select **Create a project**. ++1. Complete the project details fields. ++1. Configure the service resource by adding your **Storage account** and **Blob container** to **Connect your training data source**. ++1. Review and create your project. ++1. Label your documents to build and test your custom classifier model. 
++ > [!div class="nextstepaction"] + > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/document-classifier/projects) ++For a detailed walkthrough to create your first custom classifier model, see [how to create a custom classifier model](how-to-guides/build-a-custom-classifier.md). +++ ## Custom model extraction summary This table compares the supported data extraction areas: This table compares the supported data extraction areas: |Model| Form fields | Selection marks | Structured fields (Tables) | Signature | Region labeling | |--|:--:|:--:|:--:|:--:|:--:| |Custom template| ✔ | ✔ | ✔ | ✔ | ✔ |-|Custom neural| ✔| ✔ | ✔ | **n/a** | **n/a** | +|Custom neural| ✔| ✔ | ✔ | **n/a** | * | -**Table symbols**: ✔—supported; **n/a**—currently unavailable +**Table symbols**: +✔—supported; +**n/a**—currently unavailable; +*—behaves differently. With template models, synthetic data is generated at training time. With neural models, existing text recognized in the region is selected. > [!TIP] > When choosing between the two model types, start with a custom neural model if it meets your functional needs. See [custom neural](concept-custom-neural.md) to learn more about custom neural models. |
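Build, compose, and classifier-training requests all run asynchronously: the initial call returns an `Operation-Location` header that you poll until the operation settles. A sketch of the polling call with a trimmed, illustrative response, where `{operationId}` stands in for the value your build call returned:

```rest
GET https://{endpoint}/formrecognizer/operations/{operationId}?api-version=2023-02-28-preview

{
  "operationId": "{operationId}",
  "status": "succeeded",
  "percentCompleted": 100,
  "kind": "documentModelBuild",
  "resourceLocation": "https://{endpoint}/formrecognizer/documentModels/{modelId}"
}
```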
applied-ai-services | Concept Form Recognizer Studio | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-form-recognizer-studio.md | recommendations: false **This article applies to:**  **Form Recognizer v3.0**. -[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pre-trained models. Build custom template models and reference the models in your applications using the [Python SDK v3.0](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and other quickstarts. +[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pretrained models. Build custom template models and reference the models in your applications using the [Python SDK v3.0](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and other quickstarts. The following image shows the Invoice prebuilt model feature at work. |
applied-ai-services | Concept General Document | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-general-document.md | recommendations: false The General document v3.0 model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to extract key-value pairs, tables, and selection marks from documents. General document is only available with the v3.0 API. For more information on using the v3.0 API, see our [migration guide](v3-migration-guide.md). -### Key-value pair extraction --The general document API supports most form types and will analyze your documents and extract keys and associated values. It's ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to training a custom model without labels. - > [!NOTE]-> The ```2022-06-30``` and subsequent versions of the general document model add support for selection marks. +> [!NOTE]+> The ```2023-02-28-preview``` version of the general document model adds support for **normalized keys**. ## General document features The general document API supports most form types and will analyze your document * The general document model supports structured, semi-structured, and unstructured documents. -* Key names are spans of text within the document that are associated with a value. +* Key names are spans of text within the document that are associated with a value. With the ```2023-02-28-preview``` API version, key names are normalized where applicable. -* Selection marks are identified as fields with a value of ```:selected:``` or ```:unselected:``` +* Selection marks are identified as fields with a value of ```:selected:``` or ```:unselected:``` ***Sample document processed in the Form Recognizer Studio*** ++## Key-value pair extraction ++The general document API supports most form types and analyzes your documents and extracts keys and associated values. It's ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to training a custom model without labels. ++### Key normalization (common name) ++When the service analyzes documents with variations in key names like ```Social Security Number```, ```Social Security Nbr```, ```SSN```, the output normalizes the key variations to a single common name, ```SocialSecurityNumber```. This normalization simplifies downstream processing because you no longer need to account for variations in the key name. + ## Development options -The following tools are supported by Form Recognizer v3.0: +Form Recognizer v3.0 supports the following tools: | Feature | Resources | Model ID |-|-|| The following tools are supported by Form Recognizer v3.0: Try extracting data from forms and documents using the Form Recognizer Studio. -You'll need the following resources: +You need the following resources: * An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) You'll need the following resources: Key-value pairs are specific spans within the document that identify a label or key and its associated response or value. In a structured form, these pairs could be the label and the value the user entered for that field. In an unstructured document, they could be the date a contract was executed on based on the text in a paragraph. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures. 
-Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are spans of text contained in the document. For documents where the same value is described in different ways, for example, customer/user, the associated key will be either customer or user (based on context). +Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are spans of text contained in the document. For documents where the same value is described in different ways, for example, customer/user, the associated key is either customer or user (based on context). ## Data extraction -| **Model** | **Text extraction** |**Key-Value pairs** |**Selection Marks** | **Tables** | -| | :: |::| :: | :: | -|General document | ✓ | ✓ | ✓ | ✓ | +| **Model** | **Text extraction** |**Key-Value pairs** |**Selection Marks** | **Tables** | **Common Names** | +| | :: |::| :: | :: | :: | +|General document | ✓ | ✓ | ✓ | ✓ | ✓* | ++✓* - Only available in the 2023-02-28-preview API version. ## Input requirements |
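A request sketch for the general document model (the source URL is a placeholder):

```rest
POST https://{endpoint}/formrecognizer/documentModels/prebuilt-document:analyze?api-version=2023-02-28-preview
Content-Type: application/json

{
  "urlSource": "https://{your-storage-account}/loan-application.pdf"
}
```

With the `2023-02-28-preview` API version, the normalized key names described above surface in the response's `keyValuePairs` collection; check the preview reference for the exact property that carries the common name.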
applied-ai-services | Concept Id Document | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-id-document.md | The following are the fields extracted per document type. The Azure Form Recogni |`LastName`|`string`|Surname|TALBOT| |`DateOfIssue`|`date`|Date of issue|08/12/2012| -#### `idDocument` field extracted +#### `idDocument` fields extracted | Field | Type | Description | Example | |:|:--|:|:--| |
applied-ai-services | Concept Insurance Card | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-insurance-card.md | + + Title: Form Recognizer insurance card prebuilt model ++description: Data extraction and analysis using the insurance card model +++++ Last updated : 03/03/2023++monikerRange: 'form-recog-3.0.0' +recommendations: false +++# Azure Form Recognizer health insurance card model ++**This article applies to:**  **Form Recognizer v3.0**. ++The Form Recognizer health insurance card model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key information from US health insurance cards. A health insurance card is a key document for care processing and can be digitally analyzed for patient onboarding, financial coverage information, cashless payments, and insurance claim processing. The health insurance card model analyzes health card images; extracts key information such as insurer, member, prescription, and group number; and returns a structured JSON representation. Health insurance cards can be presented in various formats and quality levels, including phone-captured images, scanned documents, and digital PDFs. ++***Sample health insurance card processed using Form Recognizer Studio*** +++## Development options ++Form Recognizer v3.0 supports the prebuilt health insurance card model with the following tools: ++| Feature | Resources | Model ID | +|-|-|--| +|**health insurance card model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model)</li><li>[**JavaScript SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model)</li></ul>|**prebuilt-healthInsuranceCard.us**| ++### Try Form Recognizer ++See how data is extracted from health insurance cards using the Form Recognizer Studio. You need the following resources: ++* An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) ++* A [Form Recognizer instance](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your key and endpoint. ++ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot of keys and endpoint location in the Azure portal."::: ++#### Form Recognizer Studio ++> [!NOTE] +> Form Recognizer Studio is available with API version v3.0. ++1. On the [Form Recognizer Studio home page](https://formrecognizer.appliedai.azure.com/studio), select **Health insurance cards**. ++1. You can analyze the sample insurance card document or select the **➕ Add** button to upload your own sample. ++1. 
Select the **Analyze** button: ++ :::image type="content" source="media/studio/insurance-card-analyze.png" alt-text="Screenshot: analyze health insurance card window in the Form Recognizer Studio."::: ++ > [!div class="nextstepaction"] + > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=healthInsuranceCard.us) ++## Input requirements +++## Supported languages and locales ++| Model | Language—Locale code | Default | +|--|:-|:| +|prebuilt-healthInsuranceCard.us| <ul><li>English (United States)</li></ul>|English (United States)—en-US| ++## Field extraction ++| Field | Type | Description | Example | +|:|:--|:|:--| +|`Insurer`|`string`|Health insurance provider name|PREMERA<br>BLUE CROSS| +|`Member`|`object`||| +|`Member.Name`|`string`|Member name|ANGEL BROWN| +|`Member.BirthDate`|`date`|Member date of birth|01/06/1958| +|`Member.Employer`|`string`|Member employer name|Microsoft| +|`Member.Gender`|`string`|Member gender|M| +|`Member.IdNumberSuffix`|`string`|Identification Number Suffix as it appears on some health insurance cards|01| +|`Dependents`|`array`|Array holding list of dependents, ordered where possible by membership suffix value|| +|`Dependents.*`|`object`||| +|`Dependents.*.Name`|`string`|Dependent name|01| +|`IdNumber`|`object`||| +|`IdNumber.Prefix`|`string`|Identification Number Prefix as it appears on some health insurance cards|ABC| +|`IdNumber.Number`|`string`|Identification Number|123456789| +|`GroupNumber`|`string`|Insurance Group Number|1000000| +|`PrescriptionInfo`|`object`||| +|`PrescriptionInfo.Issuer`|`string`|ANSI issuer identification number (IIN)|(80840) 300-11908-77| +|`PrescriptionInfo.RxBIN`|`string`|Prescription issued BIN number|987654| +|`PrescriptionInfo.RxPCN`|`string`|Prescription processor control number|63200305| +|`PrescriptionInfo.RxGrp`|`string`|Prescription group number|BCAAXYZ| +|`PrescriptionInfo.RxId`|`string`|Prescription identification number. If not present, defaults to membership ID number|P97020065| +|`PrescriptionInfo.RxPlan`|`string`|Prescription Plan number|A1| +|`Pbm`|`string`|Pharmacy Benefit Manager for the plan|CVS CAREMARK| +|`EffectiveDate`|`date`|Date from which the plan is effective|08/12/2012| +|`Copays`|`array`|Array holding list of CoPay Benefits|| +|`Copays.*`|`object`||| +|`Copays.*.Benefit`|`string`|Co-Pay Benefit name|Deductible| +|`Copays.*.Amount`|`currency`|Co-Pay required amount|$1,500| +|`Payer`|`object`||| +|`Payer.Id`|`string`|Payer ID Number|89063| +|`Payer.Address`|`address`|Payer address|123 Service St., Redmond WA, 98052| +|`Payer.PhoneNumber`|`phoneNumber`|Payer phone number|+1 (987) 213-5674| +|`Plan`|`object`||| +|`Plan.Number`|`string`|Plan number|456| +|`Plan.Name`|`string`|Plan name. If the card shows Medicaid, the value is Medicaid|HEALTH SAVINGS PLAN| +|`Plan.Type`|`string`|Plan type|PPO| ++### Migration guide and REST API v3.0 ++* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the v3.0 version in your applications and workflows. ++* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about the v3.0 version and new capabilities. 
++## Next steps ++* Try processing your own forms and documents with the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio) ++* Complete a [Form Recognizer quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and get started creating a document processing app in the development language of your choice. |
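For orientation, here's an illustrative, heavily trimmed fragment of an analyze result for a health insurance card. The values mirror the samples in the field table above, and the exact JSON shape should be verified against the REST reference:

```json
{
  "analyzeResult": {
    "documents": [
      {
        "docType": "healthInsuranceCard.us",
        "fields": {
          "Insurer": {
            "type": "string",
            "valueString": "PREMERA BLUE CROSS",
            "confidence": 0.98
          },
          "Member": {
            "type": "object",
            "valueObject": {
              "Name": { "type": "string", "valueString": "ANGEL BROWN", "confidence": 0.97 }
            }
          }
        }
      }
    ]
  }
}
```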
applied-ai-services | Concept Invoice | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-invoice.md | See how data, including customer information, vendor details, and line items, is | ServiceStartDate | Date | First date for the service period (for example, a utility bill service period) | yyyy-mm-dd | | ServiceEndDate | Date | End date for the service period (for example, a utility bill service period) | yyyy-mm-dd| | PreviousUnpaidBalance | Number | Explicit previously unpaid balance | Integer |-| PaymentOptions | Array | An array that holds Payment Option details such as `IBAN`and `SWIFT` | | +| CurrencyCode | String | The currency code associated with the extracted amount | | +| PaymentDetails | Array | An array that holds payment option details such as `IBAN` and `SWIFT` | | | TotalDiscount | Number | The total discount applied to an invoice | Integer | | TaxItems (en-IN only) | Array | An array that holds added tax information such as `CGST`, `IGST`, and `SGST`. This line item is currently only available for the en-in locale | | |
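To illustrate the new `CurrencyCode` output: in the v3.0 response, currency-typed fields carry a `valueCurrency` object, and the extracted code plausibly surfaces there. This is a hedged sketch only; confirm the property placement against the preview reference:

```json
"InvoiceTotal": {
  "type": "currency",
  "content": "$110.00",
  "valueCurrency": {
    "amount": 110.0,
    "currencySymbol": "$",
    "currencyCode": "USD"
  },
  "confidence": 0.98
}
```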
applied-ai-services | Concept Layout | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-layout.md | recommendations: false [!INCLUDE [applies to v2.1](includes/applies-to-v2-1.md)] ::: moniker-end -Form Recognizer layout model is an advanced machine-learning based document analysis API available in the Form Recognizer cloud. It enables you to take documents in a variety of formats and return structured data representations of the documents. It combines an enhanced version of our powerful [Optical Character Recognition (OCR)](../../cognitive-services/computer-vision/overview-ocr.md) capabilities with deep learning models to extract text, tables, selection marks, and document structure. +Form Recognizer layout model is an advanced machine-learning based document analysis API available in the Form Recognizer cloud. It enables you to take documents in various formats and return structured data representations of the documents. It combines an enhanced version of our powerful [Optical Character Recognition (OCR)](../../cognitive-services/computer-vision/overview-ocr.md) capabilities with deep learning models to extract text, tables, selection marks, and document structure. ## Document layout analysis The following illustration shows the typical components in an image of a sample ## Development options -The following tools are supported by Form Recognizer v3.0: +Form Recognizer v3.0 supports the following tools: | Feature | Resources | Model ID | |-||| The following tools are supported by Form Recognizer v3.0: ### Try layout extraction -See how data, including text, tables, table headers, selection marks, and structure information is extracted from documents using Form Recognizer. You'll need the following resources: +See how data, including text, tables, table headers, selection marks, and structure information is extracted from documents using Form Recognizer. You need the following resources: * An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) See how data, including text, tables, table headers, selection marks, and struct * Select the **Fetch** button. -1. Select **Run Layout**. The Form Recognizer Sample Labeling tool will call the Analyze Layout API and analyze the document. +1. Select **Run Layout**. The Form Recognizer Sample Labeling tool calls the Analyze Layout API and analyzes the document. :::image type="content" source="media/fott-layout.png" alt-text="Screenshot: Layout dropdown window."::: The paragraph roles are best used with unstructured documents. Paragraph roles | | | | | | Layout | ✓ | ✓| ✓ | -The following tools are supported by Form Recognizer v2.1: +Form Recognizer v2.1 supports the following tools: | Feature | Resources | |-|-| The response includes classifying whether each text line is of handwriting style } ``` -### Extracts selected pages from documents +### Annotations extraction ++The Layout model extracts annotations in documents, such as checks and crosses. The response includes the kind of annotation, along with a confidence score and bounding polygon. ++```json + { + "pages": [ + { + "annotations": [ + { + "kind": "cross", + "polygon": [...], + "confidence": 1 + } + ] + } + ] +} +``` ++### Extracting barcodes from documents ++The Layout model extracts all identified barcodes in the `barcodes` collection as a top-level object under `content`. Inside `content`, detected barcodes are represented as `:barcode:`. 
Each entry in this collection represents a barcode and includes the barcode type as `kind` and the embedded barcode content as `value` along with its `polygon` coordinates. Initially, barcodes appear at the end of each page. ++#### Supported barcode types ++| **Barcode Type** | **Example** | +| | | +| QR Code |:::image type="content" source="media/barcodes/qr-code.png" alt-text="Screenshot of the QR Code.":::| +| Code 39 |:::image type="content" source="media/barcodes/code-39.png" alt-text="Screenshot of the Code 39.":::| +| Code 128 |:::image type="content" source="media/barcodes/code-128.png" alt-text="Screenshot of the Code 128.":::| +| UPC (UPC-A & UPC-E) |:::image type="content" source="media/barcodes/upc.png" alt-text="Screenshot of the UPC.":::| +| PDF417 |:::image type="content" source="media/barcodes/pdf-417.png" alt-text="Screenshot of the PDF417.":::| ++ > [!NOTE] + > The `confidence` score is hard-coded for the `2023-02-28` public preview. ++ ```json + "content": ":barcode:", + "pages": [ + { + "pageNumber": 1, + "barcodes": [ + { + "kind": "QRCode", + "value": "http://test.com/", + "span": { ... }, + "polygon": [...], + "confidence": 1 + } + ] + } + ] + ``` ++### Extract selected pages from documents For large multi-page documents, use the `pages` query parameter to indicate specific page numbers or page ranges for text extraction. For large multi-page documents, use the `pages` query parameter to indicate spec ## The Get Analyze Layout Result operation -The second step is to call the [Get Analyze Layout Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/GetAnalyzeLayoutResult) operation. This operation takes as input the Result ID that was created by the Analyze Layout operation. It returns a JSON response that contains a **status** field with the following possible values. +The second step is to call the [Get Analyze Layout Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/GetAnalyzeLayoutResult) operation. This operation takes as input the Result ID the Analyze Layout operation created. It returns a JSON response that contains a **status** field with the following possible values. |Field| Type | Possible values | |:--|:-:|:-| The second step is to call the [Get Analyze Layout Result](https://westcentralus Call this operation iteratively until it returns the `succeeded` value. Use an interval of 3 to 5 seconds to avoid exceeding the requests per second (RPS) rate. -When the **status** field has the `succeeded` value, the JSON response will include the extracted layout, text, tables, and selection marks. The extracted data includes extracted text lines and words, bounding boxes, text appearance with handwritten indication, tables, and selection marks with selected/unselected indicated. +When the **status** field has the `succeeded` value, the JSON response includes the extracted layout, text, tables, and selection marks. The extracted data includes extracted text lines and words, bounding boxes, text appearance with handwritten indication, tables, and selection marks with selected/unselected indicated. ### Handwritten classification for text lines (Latin only) See here for a [sample document file](https://github.com/Azure-Samples/cognitive The JSON output has two parts: -* `readResults` node contains all of the recognized text and selection marks. Text is organized by page, then by line, then by individual words. 
+* `readResults` node contains all of the recognized text and selection marks. The text presentation hierarchy is page, then line, then individual words. * `pageResults` node contains the tables and cells extracted with their bounding boxes, confidence, and a reference to the lines and words in "readResults". ## Example Output Layout API also extracts selection marks from documents. Extracted selection mar * Complete a [Form Recognizer quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-2.1.0&preserve-view=true) and get started creating a document processing app in the development language of your choice. |
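A sketch of the selected-pages call described above, using the v3.0 layout model. The page list accepts single pages and ranges, and the document URL is a placeholder:

```rest
POST https://{endpoint}/formrecognizer/documentModels/prebuilt-layout:analyze?pages=1-3,5&api-version=2023-02-28-preview
Content-Type: application/json

{
  "urlSource": "https://{your-storage-account}/large-report.pdf"
}
```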
applied-ai-services | Concept Model Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-model-overview.md | +<!-- markdownlint-disable MD011 --> # Document processing models recommendations: false | [Layout analysis](#layout-analysis) | Extract text and document layout elements like tables, selection marks, titles, section headings, and more.| | [General document](#general-document) | Extract key-value pairs in addition to text and document structure information.| |**Prebuilt models**||+| [Health insurance card](#health-insurance-card) | Automate healthcare processes by extracting insurer, member, prescription, group number and other key information from US health insurance cards.| | [W-2](#w-2) | Process W2 forms to extract employee, employer, wage, and other information. |-| [Invoice](#invoice) | Automate invoice processing for English and Spanish invoices. | -| [Receipt](#receipt) | Extract receipt data from English receipts.| +| [Invoice](#invoice) | Automate invoice processing. | +| [Receipt](#receipt) | Extract data from receipts.| | [Identity document (ID)](#identity-document-id) | Extract identity (ID) fields from US driver licenses and international passports. | | [Business card](#business-card) | Scan business cards to extract key fields and data into your applications. | |**Custom models**||-| [Custom models](#custom-models) | Extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases. | +| [Custom model (overview)](#custom-models) | Extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases. | +| [Custom extraction models](#custom-extraction)| ● **Custom template models** use layout cues to extract values from documents and are suitable to extract fields from highly structured documents with defined visual templates.</br>● **Custom neural models** are trained on various document types to extract fields from structured, semi-structured and unstructured documents.| +| [Custom classifier model](#custom-classifier)| The **Custom classifier model** can classify each page in an input file to identify the document(s) within and can also identify multiple documents or multiple instances of a single document within an input file. | [Composed models](#composed-models) | Combine several custom models into a single model to automate processing of diverse document types with a single composed model. ### Read OCR The Layout analysis model analyzes and extracts text, tables, selection marks, a [:::image type="icon" source="media/studio/general-document.png":::](https://formrecognizer.appliedai.azure.com/studio/document) -The general document model is ideal for extracting common key-value pairs from forms and documents. It's a pre-trained model and can be directly invoked via the REST API and the SDKs. You can use the general document model as an alternative to training a custom model. +The general document model is ideal for extracting common key-value pairs from forms and documents. It's a pre-trained model and can be directly invoked via the REST API and the SDKs. You can use the general document model as an alternative to training a custom model. 
***Sample document processed using the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/document)***: > [!div class="nextstepaction"] > [Learn more: general document model](concept-general-document.md) +### Health insurance card -### W-2 ++The health insurance card model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key information from US health insurance cards. ++***Sample US health insurance card processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=healthInsuranceCard.us)***: +++> [!div class="nextstepaction"] +> [Learn more: Health insurance card model](concept-insurance-card.md) ++### W-2 [:::image type="icon" source="media/studio/w2.png":::](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2) Use the business card model to scan and extract key information from business ca [:::image type="icon" source="media/studio/custom.png":::](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) -Custom document models analyze and extract data from forms and documents specific to your business. They are trained to recognize form fields within your distinct content and extract key-value pairs and table data. You only need five examples of the same form type to get started. +Custom document models analyze and extract data from forms and documents specific to your business. They're trained to recognize form fields within your distinct content and extract key-value pairs and table data. You only need five examples of the same form type to get started. Version v3.0 custom model supports signature detection in custom forms (template model) and cross-page tables in both template and neural models. Version v3.0 custom model supports signature detection in custom forms (template > [!div class="nextstepaction"] > [Learn more: custom model](concept-custom.md) +#### Custom extraction ++[:::image type="icon" source="media/studio/custom-extraction.png":::](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) ++A custom extraction model can be one of two types: **custom template** or **custom neural**. To create a custom extraction model, label a dataset of documents with the values you want extracted and train the model on the labeled dataset. You only need five examples of the same form or document type to get started. ++***Sample custom extraction processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)***: +++> [!div class="nextstepaction"] +> [Learn more: custom template model](concept-custom-template.md) ++> [!div class="nextstepaction"] +> [Learn more: custom neural model](./concept-custom-neural.md) ++#### Custom classifier ++[:::image type="icon" source="media/studio/custom-classifier.png":::](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) ++The custom classifier model enables you to identify the document type prior to invoking the extraction model. The classifier model is available starting with the 2023-02-28-preview. Training a custom classifier model requires at least two distinct classes and a minimum of five samples per class. ++> [!div class="nextstepaction"] +> [Learn more: custom classifier model](concept-custom-classifier.md) + #### Composed models -A composed model is created by taking a collection of custom models and assigning them to a single model built from your form types. 
You can assign multiple custom models to a composed model called with a single model ID. You can assign up to 100 trained custom models to a single composed model. +A composed model is created by taking a collection of custom models and assigning them to a single model built from your form types. You can assign multiple custom models to a composed model called with a single model ID. You can assign up to 200 trained custom models to a single composed model. ***Composed model dialog window in [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)***: A composed model is created by taking a collection of custom models and assignin | **Model ID** | **Text extraction** | **Language detection** | **Selection Marks** | **Tables** | **Paragraphs** | **Structure** | **Key-Value pairs** | **Fields** | |:--|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:| | [prebuilt-read](concept-read.md#data-extraction) | ✓ | ✓ | | | ✓ | | | |+| [prebuilt-healthInsuranceCard.us](concept-insurance-card.md#field-extraction) | ✓ | | ✓ | | ✓ | | | ✓ | | [prebuilt-tax.us.w2](concept-w2.md#field-extraction) | ✓ | | ✓ | | ✓ | | | ✓ | | [prebuilt-document](concept-general-document.md#data-extraction)| ✓ | | ✓ | ✓ | ✓ | | ✓ | | | [prebuilt-layout](concept-layout.md#data-extraction) | ✓ | | ✓ | ✓ | ✓ | ✓ | | | |
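The model IDs in the table above are passed directly to the analyze operation. As a minimal sketch (assuming the `azure-ai-formrecognizer` Python SDK v3.2+, with placeholder endpoint, key, and file names), invoking the general document model looks like this:

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

# Placeholder values - substitute your own resource's endpoint and key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = DocumentAnalysisClient(endpoint, AzureKeyCredential(key))

# Any model ID from the table above can be passed here, for example
# "prebuilt-document" for the general document model.
with open("sample-form.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-document", document=f)
result = poller.result()

# The general document model returns key-value pairs in addition to text.
for kv in result.key_value_pairs:
    if kv.key and kv.value:
        print(f"{kv.key.content}: {kv.value.content} (confidence {kv.confidence})")
```

Swapping in any other model ID from the table, for example `prebuilt-layout`, changes what the result contains, not how the call is made.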
applied-ai-services | Concept Query Fields | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-query-fields.md | + + Title: Query field extraction - Form Recognizer ++description: Use Form Recognizer to extract query field data. +++++ Last updated : 03/07/2023++monikerRange: 'form-recog-3.0.0' +recommendations: false ++<!-- markdownlint-disable MD033 --> ++# Azure Form Recognizer query field extraction ++**This article applies to:** **Form Recognizer v3.0**. ++> [!IMPORTANT] +> +> * The Form Recognizer Studio query fields extraction feature is currently in gated preview. Features, approaches, and processes may change prior to General Availability (GA), based on user feedback. +> * Complete and submit the [**Form Recognizer private preview request form**](https://aka.ms/form-recognizer/preview/survey) to request access. ++Form Recognizer now supports query field extraction using Azure OpenAI capabilities. With query field extraction, you can add fields to the extraction process using a query request without the need for added training. ++> [!NOTE] +> +> Form Recognizer Studio query field extraction is currently available with the general document model for the `2023-02-28-preview` release. ++## Select query fields ++For query field extraction, specify the fields you want to extract and Form Recognizer analyzes the document accordingly. Here's an example: ++* If you're processing a contract in the Form Recognizer Studio, you can pass a list of field labels like `Party1`, `Party2`, `TermsOfUse`, `PaymentTerms`, `PaymentDate`, and `TermEndDate` as part of the analyze document request. ++ :::image type="content" source="media/studio/query-field-select.png" alt-text="Screenshot of query fields selection window in Form Recognizer Studio."::: ++* Form Recognizer utilizes the capabilities of both [**Azure OpenAI Service**](../../cognitive-services/openai/overview.md) and extraction models to analyze and extract the field data and return the values in a structured JSON output. ++* In addition to the query fields, the response includes text, tables, selection marks, general document key-value pairs, and other relevant data. ++## Next steps ++> [!div class="nextstepaction"] +> [Try the Form Recognizer Studio quickstart](./quickstarts/try-form-recognizer-studio.md) |
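Because this feature is in gated preview, the exact request shape may change. As a rough sketch of what the analyze request could look like over REST, the query parameter names below (`features=queryFields.premium` and `queryFields`), the endpoint, key, and document URL are all assumptions to verify against the preview documentation:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<your-key>"  # placeholder

# Assumed preview parameters: features=queryFields.premium enables the
# Azure OpenAI-backed add-on, and queryFields lists the labels to extract.
url = (
    f"{endpoint}/formrecognizer/documentModels/prebuilt-document:analyze"
    "?api-version=2023-02-28-preview"
    "&features=queryFields.premium"
    "&queryFields=Party1,Party2,TermsOfUse,PaymentTerms,PaymentDate,TermEndDate"
)
response = requests.post(
    url,
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"urlSource": "https://example.com/contract.pdf"},  # placeholder document
)
# The analyze call is asynchronous; poll the Operation-Location URL
# until the result JSON (including the query fields) is available.
operation_url = response.headers["Operation-Location"]
```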
applied-ai-services | Concept Read | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md | recommendations: false > [!NOTE] >-> For extracting text from in-the-wild images like labels, street signs, and posters, use the [Computer Vision v4.0 preview Read](../../cognitive-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios. +> For extracting text from external images like labels, street signs, and posters, use the [Computer Vision v4.0 preview Read](../../cognitive-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios. > +Form Recognizer v3.0's Read Optical Character Recognition (OCR) model runs at a higher resolution than Computer Vision Read and extracts print and handwritten text from PDF documents and scanned images. It also includes preview support for extracting text from Microsoft Word, Excel, PowerPoint, and HTML documents. It detects paragraphs, text lines, words, locations, and languages. The Read model is the underlying OCR engine for other Form Recognizer prebuilt models like Layout, General Document, Invoice, Receipt, Identity (ID) document, in addition to custom models. + ## What is OCR for documents? Optical Character Recognition (OCR) for documents is optimized for large text-heavy documents in multiple file formats and global languages. It includes features like higher-resolution scanning of document images for better handling of smaller and dense text; paragraph detection; and fillable form management. OCR capabilities also include advanced scenarios like single character boxes and accurate extraction of key fields commonly found in invoices, receipts, and other prebuilt scenarios. -## OCR in Form Recognizer - Read model --Form Recognizer v3.0's Read Optical Character Recognition (OCR) model runs at a higher resolution than Computer Vision Read and extracts print and handwritten text from PDF documents and scanned images. It also includes preview support for extracting text from Microsoft Word, Excel, PowerPoint, and HTML documents. It detects paragraphs, text lines, words, locations, and languages. The read model is the underlying OCR engine for other Form Recognizer prebuilt models like Layout, General Document, Invoice, Receipt, Identity (ID) document, in addition to custom models. --## OCR supported document types +## Read OCR supported document types > [!NOTE] > Form Recognizer v3.0's Read Optical Character Recognition (OCR) model runs at a ## Development options -The following resources are supported by Form Recognizer v3.0: +Form Recognizer v3.0 supports the following resources: | Model | Resources | Model ID | |-||| The following resources are supported by Form Recognizer v3.0: ## Try OCR in Form Recognizer -Try extracting text from forms and documents using the Form Recognizer Studio. You'll need the following assets: +Try extracting text from forms and documents using the Form Recognizer Studio. You need the following assets: * An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) Try extracting text from forms and documents using the Form Recognizer Studio.
Y :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal."::: -### Form Recognizer Studio +### Form Recognizer Studio > [!NOTE] > Currently, Form Recognizer Studio doesn't support Microsoft Word, Excel, PowerPoint, and HTML file formats in the Read version v3.0. The page units in the model output are computed as shown: |PowerPoint | Each slide = 1 page unit, Each embedded image = 1 page unit | Total slides + Total images |HTML | Up to 3,000 characters = 1 page unit, embedded or linked images not supported | Total pages of up to 3,000 characters each | +### Extracting barcodes from documents ++The Read OCR model extracts all identified barcodes in the `barcodes` collection under each page in `pages`. Inside the top-level `content`, detected barcodes are represented as `:barcode:`. Each entry in this collection represents a barcode and includes the barcode type as `kind` and the embedded barcode content as `value` along with its `polygon` coordinates. Initially, barcodes appear at the end of each page. The `confidence` is hard-coded for the public preview (`2023-02-28`) release. ++#### Supported barcode types ++| **Barcode Type** | **Example** | +| | | +| QR Code |:::image type="content" source="media/barcodes/qr-code.png" alt-text="Screenshot of the QR Code.":::| +| Code 39 |:::image type="content" source="media/barcodes/code-39.png" alt-text="Screenshot of the Code 39.":::| +| Code 128 |:::image type="content" source="media/barcodes/code-128.png" alt-text="Screenshot of the Code 128.":::| +| UPC (UPC-A & UPC-E) |:::image type="content" source="media/barcodes/upc.png" alt-text="Screenshot of the UPC.":::| +| PDF417 |:::image type="content" source="media/barcodes/pdf-417.png" alt-text="Screenshot of the PDF417.":::| ++```json +"content": ":barcode:", + "pages": [ + { + "pageNumber": 1, + "barcodes": [ + { + "kind": "QRCode", + "value": "http://test.com/", + "span": { ... }, + "polygon": [...], + "confidence": 1 + } + ] + } + ] +``` + ### Paragraphs extraction The Read OCR model in Form Recognizer extracts all identified blocks of text in the `paragraphs` collection as a top-level object under `analyzeResult`. Each entry in this collection represents a text block and includes the extracted text as `content` and the bounding `polygon` coordinates. The `span` information points to the text fragment within the top-level `content` property that contains the full text from the document. The Read OCR model in Form Recognizer extracts all identified blocks of text in ] ``` -### Language detection -The Read OCR model in Form Recognizer adds [language detection](language-support.md#detected-languages-read-api) as a new feature for text lines. Read will predict the detected primary language for each text line along with the `confidence` in the `languages` collection under `analyzeResult`. +### Language detection +The Read OCR model in Form Recognizer adds [language detection](language-support.md#detected-languages-read-api) as a new feature for text lines. Read predicts the detected primary language for each text line along with the `confidence` in the `languages` collection under `analyzeResult`. ```json "languages": [ The page units in the model output are computed as shown: The Read OCR model extracts print and handwritten style text as `lines` and `words`. The model outputs bounding `polygon` coordinates and `confidence` for the extracted words.
The `styles` collection includes any handwritten style for lines, if detected, along with the spans pointing to the associated text. This feature applies to [supported handwritten languages](language-support.md). -For the preview of Microsoft Word, Excel, PowerPoint, and HTML file support, Read will extract all embedded text as is. For any embedded images, it will run OCR on the images to extract text and append the text from each image as an added entry to the `pages` collection. These added entries will include the extracted text lines and words, their bounding polygons, confidences, and the spans pointing to the associated text. +For the preview of Microsoft Word, Excel, PowerPoint, and HTML file support, Read extracts all embedded text as is. For any embedded images, it runs OCR on the images to extract text and appends the text from each image as an added entry to the `pages` collection. These added entries include the extracted text lines and words, their bounding polygons, confidences, and the spans pointing to the associated text. ```json "words": [ |
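A minimal sketch of traversing the Read model output with the `azure-ai-formrecognizer` Python SDK (v3.2+; endpoint, key, and file name are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

client = DocumentAnalysisClient("<endpoint>", AzureKeyCredential("<key>"))

with open("scanned-page.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-read", document=f)
result = poller.result()

# Detected primary language per text line, with confidence.
for language in result.languages:
    print(language.locale, language.confidence)

# Handwritten style spans, when detected.
for style in result.styles:
    if style.is_handwritten:
        print("handwritten span:", style.spans)

# Words with bounding polygons and per-word confidence.
for page in result.pages:
    for word in page.words:
        print(word.content, word.confidence, word.polygon)
```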
applied-ai-services | Concept Receipt | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-receipt.md | See how Form Recognizer extracts data, including time and date of transactions, ::: moniker range="form-recog-3.0.0" -## Supported languages and locales v3.0 +## Supported languages and locales >[!NOTE] > Form Recognizer auto-detects language and locale data. -The receipt model supports all English receipts and the following locales: --|Supported Languages| Details | -|:--|:-:| -|• English| United States (-US), Australia (-AU), Great Britain (-GB), India (-IN), United Arab Emirates (-AE)| -|• Dutch| Netherlands (nl-NL)| -|• French | France (fr-FR), Canada (fr-CA) | -|• German | Germany (de-DE) | -|• Italian | Italy (it-IT) | -|• Japanese | Japan (ja-JP)| -|• Portuguese| Portugal (pt-PT), Brazil (pt-BR)| -|• Spanish | Spain (es-ES) | +### [**2022-08-31 (GA)**](#tab/2022-08-31) ++#### Thermal receipts (retail, meal, parking, etc.) ++| Supported Languages | Details | +|:--|:-:| +|English|United States (`en-US`), Australia (`en-AU`), Canada (`en-CA`), United Kingdom (`en-GB`), India (`en-IN`), United Arab Emirates (`en-AE`)| +|Croatian|Croatia (`hr-HR`)| +|Czech|Czechia (`cs-CZ`)| +|Danish|Denmark (`da-DK`)| +|Dutch|Netherlands (`nl-NL`)| +|Finnish|Finland (`fi-FI`)| +|French|Canada (`fr-CA`), France (`fr-FR`)| +|German|Germany (`de-DE`)| +|Hungarian|Hungary (`hu-HU`)| +|Italian|Italy (`it-IT`)| +|Japanese|Japan (`ja-JP`)| +|Latvian|Latvia (`lv-LV`)| +|Lithuanian|Lithuania (`lt-LT`)| +|Norwegian|Norway (`no-NO`)| +|Portuguese|Brazil (`pt-BR`), Portugal (`pt-PT`)| +|Spanish|Spain (`es-ES`)| +|Swedish|Sweden (`sv-SE`)| +|Vietnamese|Vietnam (`vi-VN`)| ++#### Hotel receipts ++| Supported Languages | Details | +|:--|:-:| +|English|United States (`en-US`)| +|French|France (`fr-FR`)| +|German|Germany (`de-DE`)| +|Italian|Italy (`it-IT`)| +|Japanese|Japan (`ja-JP`)| +|Portuguese|Portugal (`pt-PT`)| +|Spanish|Spain (`es-ES`)| ++### [2023-02-28-preview](#tab/2023-02-28-preview) ++#### Thermal receipts (retail, meal, parking, etc.) 
++| Supported Languages | Details | +|:--|:-:| +|English|United States (`en-US`), Australia (`en-AU`), Canada (`en-CA`), United Kingdom (`en-GB`), India (`en-IN`), United Arab Emirates (`en-AE`)| +|Croatian|Croatia (`hr-HR`)| +|Czech|Czechia (`cs-CZ`)| +|Danish|Denmark (`da-DK`)| +|Dutch|Netherlands (`nl-NL`)| +|Finnish|Finland (`fi-FI`)| +|French|Canada (`fr-CA`), France (`fr-FR`)| +|German|Germany (`de-DE`)| +|Hungarian|Hungary (`hu-HU`)| +|Italian|Italy (`it-IT`)| +|Japanese|Japan (`ja-JP`)| +|Latvian|Latvia (`lv-LV`)| +|Lithuanian|Lithuania (`lt-LT`)| +|Norwegian|Norway (`no-NO`)| +|Portuguese|Brazil (`pt-BR`), Portugal (`pt-PT`)| +|Spanish|Spain (`es-ES`)| +|Swedish|Sweden (`sv-SE`)| +|Vietnamese|Vietnam (`vi-VN`)| ++#### Hotel receipts ++| Supported Languages | Details | +|:--|:-:| +|English|United States (`en-US`)| +|French|France (`fr-FR`)| +|German|Germany (`de-DE`)| +|Italian|Italy (`it-IT`)| +|Japanese|Japan (`ja-JP`)| +|Portuguese|Portugal (`pt-PT`)| +|Spanish|Spain (`es-ES`)| ++ ::: moniker-end ::: moniker range="form-recog-2.1.0" The receipt model supports all English receipts and the following locales: ## Field extraction + |Name| Type | Description | Standardized output | |:--|:-|:-|:-| | ReceiptType | String | Type of sales receipt | Itemized | The receipt model supports all English receipts and the following locales: | Price | Number | Individual price of each item unit| Two-decimal float | | TotalPrice | Number | Total price of line item | Two-decimal float | -- Form Recognizer v3.0 introduces several new features and capabilities. The **Receipt** model supports single-page hotel receipt processing. -### Hotel receipt field extraction -|Name| Type | Description | Standardized output | -|:--|:-|:-|:-| -| ArrivalDate | Date | Date of arrival | yyyy-mm-dd | -| Currency | Currency | Currency unit of receipt amounts. For example USD, EUR, or MIXED if multiple values are found || -| DepartureDate | Date | Date of departure | yyyy-mm-dd | -| Items | Array | | | -| Items.*.Category | String | Item category, for example, Room, Tax, etc. | | -| Items.*.Date | Date | Item date | yyyy-mm-dd | -| Items.*.Description | String | Item description | | -| Items.*.TotalPrice | Number | Item total price | Two-decimal float | -| MerchantAddress | String | Listed address of merchant | | -| MerchantAliases | Array| | | -| MerchantAliases.* | String | Alternative name of merchant | | -| MerchantName | String | Name of the merchant issuing the receipt | | -| MerchantPhoneNumber | phoneNumber | Listed phone number of merchant | +1 xxx xxx xxxx| -| ReceiptType | String | Type of receipt, for example, Hotel, Itemized | | -| Total | Number | Full transaction total of receipt | Two-decimal float | --### Hotel receipt supported languages and locales + Form Recognizer v3.0 introduces several new features and capabilities. In addition to thermal receipts, the **Receipt** model supports single-page hotel receipt processing and tax detail extraction for all receipt types. ++### [**2022-08-31 (GA)**](#tab/2022-08-31) ++#### Thermal receipts (receipt, receipt.retailMeal, receipt.creditCard, receipt.gas, receipt.parking) ++| Field | Type | Description | Example | +|:|:--|:|:--| +|`MerchantName`|`string`|Name of the merchant issuing the receipt|Contoso| +|`MerchantPhoneNumber`|`phoneNumber`|Listed phone number of merchant|987-654-3210| +|`MerchantAddress`|`address`|Listed address of merchant|123 Main St. 
Redmond WA 98052| +|`Total`|`number`|Full transaction total of receipt|$14.34| +|`TransactionDate`|`date`|Date the receipt was issued|June 06, 2019| +|`TransactionTime`|`time`|Time the receipt was issued|4:49 PM| +|`Subtotal`|`number`|Subtotal of receipt, often before taxes are applied|$12.34| +|`TotalTax`|`number`|Tax on receipt, often sales tax or equivalent|$2.00| +|`Tip`|`number`|Tip included by buyer|$1.00| +|`Items`|`array`||| +|`Items.*`|`object`|Extracted line item|1<br>Surface Pro 6<br>$999.00<br>$999.00| +|`Items.*.TotalPrice`|`number`|Total price of line item|$999.00| +|`Items.*.Description`|`string`|Item description|Surface Pro 6| +|`Items.*.Quantity`|`number`|Quantity of each item|1| +|`Items.*.Price`|`number`|Individual price of each item unit|$999.00| +|`Items.*.ProductCode`|`string`|Product code, product number, or SKU associated with the specific line item|A123| +|`Items.*.QuantityUnit`|`string`|Quantity unit of each item|| +|`TaxDetails`|`array`||| +|`TaxDetails.*`|`object`|Extracted tax detail|| +|`TaxDetails.*.Amount`|`currency`|The amount of the tax detail|$2.00| +#### Hotel receipts (receipt.hotel) ++| Field | Type | Description | Example | +|:|:--|:|:--| +|`MerchantName`|`string`|Name of the merchant issuing the receipt|Contoso| +|`MerchantPhoneNumber`|`phoneNumber`|Listed phone number of merchant|987-654-3210| +|`MerchantAddress`|`address`|Listed address of merchant|123 Main St. Redmond WA 98052| +|`Total`|`number`|Full transaction total of receipt|$14.34| +|`ArrivalDate`|`date`|Date of arrival|27Mar21| +|`DepartureDate`|`date`|Date of departure|28Mar21| +|`Currency`|`string`|Currency unit of receipt amounts (ISO 4217), or 'MIXED' if multiple values are found|USD| +|`MerchantAliases`|`array`||| +|`MerchantAliases.*`|`string`|Alternative name of merchant|Contoso (R)| +|`Items`|`array`||| +|`Items.*`|`object`|Extracted line item|1<br>Surface Pro 6<br>$999.00<br>$999.00| +|`Items.*.TotalPrice`|`number`|Total price of line item|$999.00| +|`Items.*.Description`|`string`|Item description|Room Charge| +|`Items.*.Date`|`date`|Item date|27Mar21| +|`Items.*.Category`|`string`|Item category|Room| ++### [2023-02-28-preview](#tab/2023-02-28-preview) ++#### Thermal receipts (receipt, receipt.retailMeal, receipt.creditCard, receipt.gas, receipt.parking) +| Field | Type | Description | Example | +|:|:--|:|:--| +|`MerchantName`|`string`|Name of the merchant issuing the receipt|Contoso| +|`MerchantPhoneNumber`|`phoneNumber`|Listed phone number of merchant|987-654-3210| +|`MerchantAddress`|`address`|Listed address of merchant|123 Main St.
Redmond WA 98052| +|`Total`|`number`|Full transaction total of receipt|$14.34| +|`TransactionDate`|`date`|Date the receipt was issued|June 06, 2019| +|`TransactionTime`|`time`|Time the receipt was issued|4:49 PM| +|`Subtotal`|`number`|Subtotal of receipt, often before taxes are applied|$12.34| +|`TotalTax`|`number`|Tax on receipt, often sales tax or equivalent|$2.00| +|`Tip`|`number`|Tip included by buyer|$1.00| +|`Items`|`array`||| +|`Items.*`|`object`|Extracted line item|1<br>Surface Pro 6<br>$999.00<br>$999.00| +|`Items.*.TotalPrice`|`number`|Total price of line item|$999.00| +|`Items.*.Description`|`string`|Item description|Surface Pro 6| +|`Items.*.Quantity`|`number`|Quantity of each item|1| +|`Items.*.Price`|`number`|Individual price of each item unit|$999.00| +|`Items.*.ProductCode`|`string`|Product code, product number, or SKU associated with the specific line item|A123| +|`Items.*.QuantityUnit`|`string`|Quantity unit of each item|| +|`TaxDetails`|`array`||| +|`TaxDetails.*`|`object`|Extracted tax detail|| +|`TaxDetails.*.Amount`|`currency`|The amount of the tax detail|$2.00| ++#### Hotel receipts (receipt.hotel) ++| Field | Type | Description | Example | +|:|:--|:|:--| +|`MerchantName`|`string`|Name of the merchant issuing the receipt|Contoso| +|`MerchantPhoneNumber`|`phoneNumber`|Listed phone number of merchant|987-654-3210| +|`MerchantAddress`|`address`|Listed address of merchant|123 Main St. Redmond WA 98052| +|`Total`|`number`|Full transaction total of receipt|$14.34| +|`ArrivalDate`|`date`|Date of arrival|27Mar21| +|`DepartureDate`|`date`|Date of departure|28Mar21| +|`Currency`|`string`|Currency unit of receipt amounts (ISO 4217), or 'MIXED' if multiple values are found|USD| +|`MerchantAliases`|`array`||| +|`MerchantAliases.*`|`string`|Alternative name of merchant|Contoso (R)| +|`Items`|`array`||| +|`Items.*`|`object`|Extracted line item|1<br>Surface Pro 6<br>$999.00<br>$999.00| +|`Items.*.TotalPrice`|`number`|Total price of line item|$999.00| +|`Items.*.Description`|`string`|Item description|Room Charge| +|`Items.*.Date`|`date`|Item date|27Mar21| +|`Items.*.Category`|`string`|Item category|Room| -| Model | Language—Locale code | Default | -|--|:-|:| -|Receipt (hotel) | <ul><li>English (United States)—en-US</li></ul>| English (United States)—en-US| + ::: moniker-end |
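For reference, a minimal sketch of reading the receipt fields listed in the tables above with the `azure-ai-formrecognizer` Python SDK (v3.2+; endpoint, key, and file name are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

client = DocumentAnalysisClient("<endpoint>", AzureKeyCredential("<key>"))

with open("receipt.jpg", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-receipt", document=f)
receipt = poller.result().documents[0]

merchant = receipt.fields.get("MerchantName")
if merchant:
    print("Merchant:", merchant.value, "confidence:", merchant.confidence)

# Items is an array field; each entry is an object with sub-fields
# like Description and TotalPrice, matching the tables above.
items = receipt.fields.get("Items")
if items:
    for item in items.value:
        description = item.value.get("Description")
        total_price = item.value.get("TotalPrice")
        print(description and description.value, total_price and total_price.value)
```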
applied-ai-services | Concept W2 | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-w2.md | The Form Recognizer W-2 model combines Optical Character Recognition (OCR) with ## Automated W-2 form processing -Form W-2, also known as the Wage and Tax Statement, is sent by an employer to each employee and the Internal Revenue Service (IRS) at the end of the year. A W-2 form reports employees' annual wages and the amount of taxes withheld from their paychecks. The IRS also uses W-2 forms to track individuals' tax obligations. The Social Security Administration (SSA) uses the information on this and other forms to compute the Social Security benefits for all workers. +An employer sends form W-2, also known as the Wage and Tax Statement, to each employee and the Internal Revenue Service (IRS) at the end of the year. A W-2 form reports employees' annual wages and the amount of taxes withheld from their paychecks. The IRS also uses W-2 forms to track individuals' tax obligations. The Social Security Administration (SSA) uses the information on this and other forms to compute the Social Security benefits for all workers. ***Sample W-2 tax form processed using Form Recognizer Studio*** Form W-2, also known as the Wage and Tax Statement, is sent by an employer to ea ## Development options -The prebuilt W-2 model is supported by Form Recognizer v3.0 with the following tools: +Form Recognizer v3.0 supports the following tools: | Feature | Resources | Model ID | |-|-|--| The prebuilt W-2 model is supported by Form Recognizer v3.0 with the following t ### Try W-2 data extraction -Try extracting data from W-2 forms using the Form Recognizer Studio. You'll need the following resources: +Try extracting data from W-2 forms using the Form Recognizer Studio. You need the following resources: * An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) Try extracting data from W-2 forms using the Form Recognizer Studio. You'll need | MedicareTaxWithheld | 6 | Number | Medicare tax withheld | 1111 | | SocialSecurityTips | 7 | Number | Social security tips | 1111 | | AllocatedTips | 8 | Number | Allocated tips | 1111 |-| VerificationCode | 9 | String | Verification Code on Form W-2 | A123-B456-C789-DXYZ | +| VerificationCode | 9 | String | Verification Code on Form W-2 | A123-B456-C789-DXYZ | | DependentCareBenefits | 10 | Number | Dependent care benefits | 1111 | | NonqualifiedPlans | 11 | Number | The non-qualified plan, a type of retirement savings plan that is employer-sponsored and tax-deferred | 1111 | | AdditionalInfo | | Array of objects | An array of LetterCode and Amount | | Try extracting data from W-2 forms using the Form Recognizer Studio.
You'll need ## Next steps -* Complete a Form Recognizer quickstart: -> [!div class="checklist"] -> -> * [**REST API**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) -> * [**C# SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model) -> * [**Python SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model) -> * [**Java SDK**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model) -> * [**JavaScript**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true#prebuilt-model)</li></ul> +* Try processing your own forms and documents with the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio) ++* Complete a [Form Recognizer quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and get started creating a document processing app in the development language of your choice. |
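A minimal sketch of retrieving a few of the W-2 fields listed in the extraction table above with the `azure-ai-formrecognizer` Python SDK (v3.2+; endpoint, key, and file name are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

client = DocumentAnalysisClient("<endpoint>", AzureKeyCredential("<key>"))

with open("w2-form.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-tax.us.w2", document=f)
w2 = poller.result().documents[0]

# Field names follow the field extraction table above.
for name in ("MedicareTaxWithheld", "SocialSecurityTips", "AllocatedTips"):
    field = w2.fields.get(name)
    if field:
        print(name, "=", field.value, "confidence:", field.confidence)
```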
applied-ai-services | Build A Custom Classifier | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/build-a-custom-classifier.md | + + Title: "Build and train a custom classifier" ++description: Learn how to label and build a custom document classifier model. +++++ Last updated : 03/03/2023++monikerRange: 'form-recog-3.0.0' +recommendations: false +++# Build and train a custom classifier model +++Custom classifier models can classify each page in an input file to identify the document(s) within. Classifier models can also identify multiple documents or multiple instances of a single document in the input file. Form Recognizer custom models require as few as five training documents per document class. To get started training a custom classifier model, you need at least **five documents** for each class and **two classes** of documents. ++## Custom classifier model input requirements ++Make sure your training data set follows the input requirements for Form Recognizer. +++## Training data tips ++Follow these tips to further optimize your data set for training: ++* If possible, use text-based PDF documents instead of image-based documents. Scanned PDFs are handled as images. ++* If your form images are of lower quality, use a larger data set (10-15 images, for example). ++## Upload your training data ++Once you've put together the set of forms or documents for training, you need to upload it to an Azure blob storage container. If you don't know how to create an Azure storage account with a container, follow the [Azure Storage quickstart for Azure portal](../../../storage/blobs/storage-quickstart-blobs-portal.md). You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production. If your dataset is organized as folders, preserve that structure as the Studio can use your folder names for labels to simplify the labeling process. ++## Create a classification project in the Form Recognizer Studio ++The Form Recognizer Studio provides and orchestrates all the API calls required to complete your dataset and train your model. ++1. Start by navigating to the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio). The first time you use the Studio, you need to [initialize your subscription, resource group, and resource](../quickstarts/try-v3-form-recognizer-studio.md). Then, follow the [prerequisites for custom projects](../quickstarts/try-v3-form-recognizer-studio.md#additional-prerequisites-for-custom-projects) to configure the Studio to access your training dataset. ++1. In the Studio, select the **Custom classifier models** tile in the custom models section of the page, and select the **Create a project** button. ++ :::image type="content" source="../media/how-to/studio-create-classifier-project.png" alt-text="Screenshot of how to create a classifier project in the Form Recognizer Studio."::: ++ 1. On the create project dialog, provide a name for your project, optionally a description, and select continue. ++ 1. Next, choose or create a Form Recognizer resource before you select continue. ++ :::image type="content" source="../media/how-to/studio-select-resource.png" alt-text="Screenshot showing the project setup dialog window."::: ++1. Next, select the storage account you used to upload your custom model training dataset. The **Folder path** should be empty if your training documents are in the root of the container.
If your documents are in a subfolder, enter the relative path from the container root in the **Folder path** field. Once your storage account is configured, select continue. ++ > [!IMPORTANT] + > You can either organize the training dataset by folders, where the folder name is the label or class for the documents, or create a flat list of documents that you can assign a label to in the Studio. ++ :::image type="content" source="../media/how-to/studio-select-storage.png" alt-text="Screenshot showing how to select the Form Recognizer resource."::: ++1. Training a custom classifier requires the output from the Layout model for each document in your dataset. As an optional step, run layout on all documents ahead of time to speed up the model training process. ++1. Finally, review your project settings and select **Create Project** to create a new project. You should now be in the labeling window and see the files in your dataset listed. ++## Label your data ++In your project, you only need to label each document with the appropriate class label. +++You see the files you uploaded to storage in the file list, ready to be labeled. You have a few options to label your dataset. ++1. If the documents are organized in folders, the Studio prompts you to use the folder names as labels. This step simplifies your labeling down to a single selection. ++1. To assign a label to a document, select the add label selection mark. ++1. Control-select to multi-select documents and assign them a label. ++You should now have all the documents in your dataset labeled. If you look at the storage account, you find *.ocr.json* files that correspond to each document in your training dataset and a new **class-name.jsonl** file for each class labeled. This training dataset is submitted to train the model. ++## Train your model ++With your dataset labeled, you're now ready to train your model. Select the train button in the upper-right corner. ++1. On the train model dialog, provide a unique classifier ID and, optionally, a description. The classifier ID accepts a string data type. ++1. Select **Train** to initiate the training process. ++1. Classifier models train in a few minutes. ++1. Navigate to the *Models* menu to view the status of the train operation. ++## Test the model ++Once the model training is complete, you can test your model by selecting the model on the models list page. ++1. Select the model, and then select the **Test** button. ++1. Add a new file by browsing for a file or dropping a file into the document selector. ++1. With a file selected, choose the **Analyze** button to test the model. ++1. The model results are displayed with the list of identified documents, a confidence score for each identified document, and the page range for each. ++1. Validate your model by evaluating the results for each document identified. ++Congratulations! You've trained a custom classifier model in the Form Recognizer Studio! Your model is ready for use with the REST API or the SDK to analyze documents. ++## Next steps ++> [!div class="nextstepaction"] +> [Learn about custom model types](../concept-custom.md) ++> [!div class="nextstepaction"] +> [Learn about accuracy and confidence with custom models](../concept-accuracy-confidence.md) |
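Once trained, the classifier can also be called outside the Studio. A minimal sketch, assuming a version of the `azure-ai-formrecognizer` Python SDK that exposes classification (the 3.3.x releases) and placeholder endpoint, key, classifier ID, and file names:

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

client = DocumentAnalysisClient("<endpoint>", AzureKeyCredential("<key>"))

with open("mixed-documents.pdf", "rb") as f:
    poller = client.begin_classify_document("<classifier-id>", document=f)
result = poller.result()

# One entry per document identified in the input file, with the class
# label (doc_type), a confidence score, and the pages it spans.
for doc in result.documents:
    pages = [region.page_number for region in (doc.bounding_regions or [])]
    print(doc.doc_type, doc.confidence, pages)
```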
applied-ai-services | Compose Custom Models | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/compose-custom-models.md | recommendations: false ::: moniker range="form-recog-3.0.0" -A composed model is created by taking a collection of custom models and assigning them to a single model ID. You can assign up to 100 trained custom models to a single composed model ID. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis. Composed models are useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction. +A composed model is created by taking a collection of custom models and assigning them to a single model ID. You can assign up to 200 trained custom models to a single composed model ID. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis. Composed models are useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction. To learn more, see [Composed custom models](../concept-composed-models.md). -In this article, you'll learn how to create and use composed custom models to analyze your forms and documents. +In this article, you learn how to create and use composed custom models to analyze your forms and documents. ## Prerequisites -To get started, you'll need the following resources: +To get started, you need the following resources: * **An Azure subscription**. You can [create a free Azure subscription](https://azure.microsoft.com/free/cognitive-services/). To get started, you'll need the following resources: 1. After the resource deploys, select **Go to resource**. - 1. Copy the **Keys and Endpoint** values from the Azure portal and paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API. + 1. Copy the **Keys and Endpoint** values from the Azure portal and paste them in a convenient location, such as *Microsoft Notepad*. You need the key and endpoint values to connect your application to the Form Recognizer API. :::image type="content" source="../media/containers/keys-and-endpoint.png" alt-text="Still photo showing how to access resource key and endpoint URL."::: To get started, you'll need the following resources: ## Create your custom models -First, you'll need a set of custom models to compose. You can use the Form Recognizer Studio, REST API, or client-library SDKs. The steps are as follows: +First, you need a set of custom models to compose. You can use the Form Recognizer Studio, REST API, or client-library SDKs. 
The steps are as follows: * [**Assemble your training dataset**](#assemble-your-training-dataset) * [**Upload your training set to Azure blob storage**](#upload-your-training-dataset) First, you'll need a set of custom models to compose. You can use the Form Recog ## Assemble your training dataset -Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](../how-to-guides/build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#custom-model-input-requirements) for Form Recognizer. +Building a custom model begins with establishing your training dataset. You need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](../how-to-guides/build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#custom-model-input-requirements) for Form Recognizer. >[!TIP] > Follow these tips to optimize your data set for training: See [Build a training data set](../how-to-guides/build-a-custom-model.md?view=fo ## Upload your training dataset -When you've gathered a set of training documents, you'll need to [upload your training data](../how-to-guides/build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#upload-your-training-data) to an Azure blob storage container. +When you've gathered a set of training documents, you need to [upload your training data](../how-to-guides/build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#upload-your-training-data) to an Azure blob storage container. -If you want to use manually labeled data, you'll also have to upload the *.labels.json* and *.ocr.json* files that correspond to your training documents. +If you want to use manually labeled data, you have to upload the *.labels.json* and *.ocr.json* files that correspond to your training documents. ## Train your custom model See [Form Recognizer Studio: labeling as tables](../quickstarts/try-v3-form-reco Training with labels leads to better performance in some scenarios. To train with labels, you need to have special label information files (*\<filename\>.pdf.labels.json*) in your blob storage container alongside the training documents. -Label files contain key-value associations that a user has entered manually. They're needed for labeled data training, but not every source file needs to have a corresponding label file. Source files without labels will be treated as ordinary training documents. We recommend five or more labeled files for reliable training. You can use a UI tool like [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to generate these files. +Label files contain key-value associations that a user has entered manually. They're needed for labeled data training, but not every source file needs to have a corresponding label file. Source files without labels are treated as ordinary training documents. We recommend five or more labeled files for reliable training. You can use a UI tool like [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to generate these files. 
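For orientation, a label file pairs each tag with the text and position of its value in one training document. The following abbreviated sketch is illustrative only (the tag name, text, and coordinates are made up, and real label files should be generated by a labeling tool rather than written by hand); the document's own `.ocr.json` companion files are produced automatically:

```json
{
  "document": "purchase-order-1.pdf",
  "labels": [
    {
      "label": "Total",
      "value": [
        {
          "page": 1,
          "text": "$123.45",
          "boundingBoxes": [
            [0.632, 0.781, 0.731, 0.781, 0.731, 0.803, 0.632, 0.803]
          ]
        }
      ]
    }
  ]
}
```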
Once you have your label files, you can include them by calling the training method with the *useLabelFile* parameter set to `true`. When you train models using the [**Form Recognizer Studio**](https://formrecogni 1. In the pop-up window, name your newly composed model and select **Compose**. -1. When the operation completes, your newly composed model will appear in the list. +1. When the operation completes, your newly composed model appears in the list. 1. Once the model is ready, use the **Test** command to validate it with your test documents and observe the results. Form Recognizer uses advanced machine-learning technology to detect and extract * **Composed models**. A composed model is created by taking a collection of custom models and assigning them to a single model that encompasses your form types. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis. -In this article, you'll learn how to create Form Recognizer custom and composed models using our [Form Recognizer Sample Labeling tool](../label-tool.md), [REST APIs](../how-to-guides/use-sdk-rest-api.md?view=form-recog-2.1.0&preserve-view=true#train-a-custom-model), or [client-library SDKs](../how-to-guides/use-sdk-rest-api.md?view=form-recog-2.1.0&preserve-view=true#train-a-custom-model). +In this article, you learn how to create Form Recognizer custom and composed models using our [Form Recognizer Sample Labeling tool](../label-tool.md), [REST APIs](../how-to-guides/use-sdk-rest-api.md?view=form-recog-2.1.0&preserve-view=true#train-a-custom-model), or [client-library SDKs](../how-to-guides/use-sdk-rest-api.md?view=form-recog-2.1.0&preserve-view=true#train-a-custom-model). ## Sample Labeling tool -Try extracting data from custom forms using our Sample Labeling tool. You'll need the following resources: +Try extracting data from custom forms using our Sample Labeling tool. You need the following resources: * An Azure subscription—you can [create one for free](https://azure.microsoft.com/free/cognitive-services/) The steps for building, training, and using custom and composed models are as fo ## Assemble your training dataset -Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#custom-model-input-requirements) for Form Recognizer. +Building a custom model begins with establishing your training dataset. You need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#custom-model-input-requirements) for Form Recognizer. ## Upload your training dataset -You'll need to [upload your training data](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#upload-your-training-data) +You need to [upload your training data](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#upload-your-training-data) to an Azure blob storage container.
If you don't know how to create an Azure storage account with a container, *see* [Azure Storage quickstart for Azure portal](../../../storage/blobs/storage-quickstart-blobs-portal.md). You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production. ## Train your custom model Form Recognizer uses the [Layout](../concept-layout.md) API to learn the expecte > [!NOTE] > **Model Compose is only available for custom models trained *with* labels.** Attempting to compose unlabeled models will produce an error. -With the Model Compose operation, you can assign up to 100 trained custom models to a single model ID. When you call Analyze with the composed model ID, Form Recognizer will first classify the form you submitted, choose the best matching assigned model, and then return results for that model. This operation is useful when incoming forms may belong to one of several templates. +With the Model Compose operation, you can assign up to 200 trained custom models to a single model ID. When you call Analyze with the composed model ID, Form Recognizer first classifies the form you submitted, chooses the best matching assigned model, and then returns results for that model. This operation is useful when incoming forms may belong to one of several templates. -Using the Form Recognizer Sample Labeling tool, the REST API, or the Client-library SDKs, follow the steps below to set up a composed model: +Using the Form Recognizer Sample Labeling tool, the REST API, or the Client-library SDKs, follow the steps to set up a composed model: 1. [**Gather your custom model IDs**](#gather-your-custom-model-ids) 1. [**Compose your custom models**](#compose-your-custom-models) ### Gather your custom model IDs -Once the training process has successfully completed, your custom model will be assigned a model ID. You can retrieve a model ID as follows: +Once the training process has successfully completed, your custom model is assigned a model ID. You can retrieve a model ID as follows: <!-- Applies to FOTT but labeled studio to eliminate tab grouping warning --> ### [**Form Recognizer Sample Labeling tool**](#tab/studio) When you train models using the [**Form Recognizer Sample Labeling tool**](https ### [**REST API**](#tab/rest) -The [**REST API**](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#train-your-model) will return a `201 (Success)` response with a **Location** header. The value of the last parameter in this header is the model ID for the newly trained model: +The [**REST API**](build-a-custom-model.md?view=form-recog-2.1.0&preserve-view=true#train-your-model) returns a `201 (Success)` response with a **Location** header. The value of the last parameter in this header is the model ID for the newly trained model: :::image type="content" source="../media/model-id.png" alt-text="Screenshot: the returned location header containing the model ID."::: After you have completed training, compose your models as follows: 1. In the pop-up window, name your newly composed model and select **Compose**. -When the operation completes, your newly composed model will appear in the list. +When the operation completes, your newly composed model appears in the list. :::image type="content" source="../media/custom-model-compose.png" alt-text="Screenshot of the model compose window."
lightbox="../media/custom-model-compose-expanded.png"::: Using the **REST API**, you can make a [**Compose Custom Model**](https://westu ### [**Client-library SDKs**](#tab/sdks) -Use the programming language code of your choice to create a composed model that will be called with a single model ID. Below are links to code samples that demonstrate how to create a composed model from existing custom models: +Use the programming language of your choice to create a composed model that is called with a single model ID. The following links point to code samples that demonstrate how to create a composed model from existing custom models: * [**C#/.NET**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md). Use the programming language code of your choice to create a composed model that 1. Select the **Run Analysis** button. -1. The tool will apply tags in bounding boxes and report the confidence percentage for each tag. +1. The tool applies tags in bounding boxes and reports the confidence percentage for each tag. :::image type="content" source="../media/analyze.png" alt-text="Screenshot: Form Recognizer tool analyze-a-custom-form window."::: Using the REST API, you can make an [Analyze Document](https://westus.dev.cognit ### [**Client-library SDKs**](#tab/sdks) -Using the programming language of your choice to analyze a form or document with a custom or composed model. You'll need your Form Recognizer endpoint, key, and model ID. +Use the programming language of your choice to analyze a form or document with a custom or composed model. You need your Form Recognizer endpoint, key, and model ID. * [**C#/.NET**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md) |
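As a concrete companion to the SDK links above, a minimal compose sketch with the `azure-ai-formrecognizer` Python SDK (v3.2+; the endpoint, key, and component model IDs are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentModelAdministrationClient

admin_client = DocumentModelAdministrationClient(
    "<endpoint>", AzureKeyCredential("<key>")
)

# IDs of previously trained custom models (placeholders).
component_model_ids = ["supplies-model", "equipment-model", "furniture-model"]

poller = admin_client.begin_compose_document_model(
    component_model_ids, description="Purchase order models"
)
composed_model = poller.result()

# Submit documents to this single ID; the service routes each one
# to the best matching component model.
print(composed_model.model_id)
```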
applied-ai-services | Use Sdk Rest Api | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/use-sdk-rest-api.md | recommendations: false ::: moniker-end ::: moniker range="form-recog-3.0.0"- In this guide, you'll learn how to add Form Recognizer models to your applications and workflows using a programming language SDK of your choice or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key text and structure elements from documents. We recommend that you use the free service as you're learning the technology. Remember that the number of free pages is limited to 500 per month. + In this guide, you learn how to add Form Recognizer models to your applications and workflows using a programming language SDK of your choice or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key text and structure elements from documents. We recommend that you use the free service as you're learning the technology. Remember that the number of free pages is limited to 500 per month. Choose from the following Form Recognizer models to analyze and extract data and values from forms and documents: Choose from the following Form Recognizer models to analyze and extract data and > > * The [prebuilt-document](../concept-general-document.md) model extracts key-value pairs, tables, and selection marks from documents and can be used as an alternative to training a custom model without labels. >+> * The [prebuilt-healthInsuranceCard.us](../concept-insurance-card.md) model extracts key information from US health insurance cards. +> > * The [prebuilt-tax.us.w2](../concept-w2.md) model extracts information reported on US Internal Revenue Service (IRS) tax forms. > > * The [prebuilt-invoice](../concept-invoice.md) model extracts key fields and line items from sales invoices in various formats and quality including phone-captured images, scanned documents, and digital PDFs. Congratulations! You've learned to use Form Recognizer models to analyze various ::: moniker-end ::: moniker range="form-recog-2.1.0"-In this how-to guide, you'll learn how to add Form Recognizer to your applications and workflows using an SDK, in a programming language of your choice, or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month. +In this how-to guide, you learn how to add Form Recognizer to your applications and workflows using an SDK, in a programming language of your choice, or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month. -You'll use the following APIs to extract structured data from forms and documents: +You use the following APIs to extract structured data from forms and documents: * [Authenticate the client](#authenticate-the-client) * [Analyze Layout](#analyze-layout) |
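A minimal sketch of one such call, analyzing an invoice from a URL with the `azure-ai-formrecognizer` Python SDK (v3.2+; the endpoint, key, and document URL are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient

client = DocumentAnalysisClient("<endpoint>", AzureKeyCredential("<key>"))

poller = client.begin_analyze_document_from_url(
    "prebuilt-invoice", "https://example.com/sample-invoice.pdf"
)
invoice = poller.result().documents[0]

# Print every extracted invoice field with its confidence.
for name, field in invoice.fields.items():
    print(name, "=", field.content, "confidence:", field.confidence)
```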
applied-ai-services | Label Tool | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/label-tool.md | recommendations: false > * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0. > * *See* our [**REST API**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) or [**C#**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**Java**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**JavaScript**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), or [Python](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) SDK quickstarts to get started with the V3.0. -In this article, you'll use the Form Recognizer REST API with the Sample Labeling tool to train a custom model with manually labeled data. +In this article, you use the Form Recognizer REST API with the Sample Labeling tool to train a custom model with manually labeled data. > [!VIDEO https://learn.microsoft.com/Shows/Docs-Azure/Azure-Form-Recognizer/player] ## Prerequisites - You'll need the following resources to complete this project: + You need the following resources to complete this project: * Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services) * Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer" title="Create a Form Recognizer resource" target="_blank">create a Form Recognizer resource </a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.- * You'll need the key and endpoint from the resource you create to connect your application to the Form Recognizer API. You'll paste your key and endpoint into the code later in the quickstart. + * You need the key and endpoint from the resource you create to connect your application to the Form Recognizer API. You paste your key and endpoint into the code later in the quickstart. * You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.-* A set of at least six forms of the same type. You'll use this data to train the model and test a form. You can use a [sample data set](https://go.microsoft.com/fwlink/?linkid=2090451) (download and extract *sample_data.zip*) for this quickstart. Upload the training files to the root of a blob storage container in a standard-performance-tier Azure Storage account. +* A set of at least six forms of the same type. You use this data to train the model and test a form. You can use a [sample data set](https://go.microsoft.com/fwlink/?linkid=2090451) (download and extract *sample_data.zip*) for this quickstart. Upload the training files to the root of a blob storage container in a standard-performance-tier Azure Storage account. ## Create a Form Recognizer resource Try out the [**Form Recognizer Sample Labeling tool**](https://fott-2-1.azureweb > [!div class="nextstepaction"] > [Try Prebuilt Models](https://fott-2-1.azurewebsites.net/) -You'll need an Azure subscription ([create one for free](https://azure.microsoft.com/free/cognitive-services)) and a [Form Recognizer resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) endpoint and key to try out the Form Recognizer service. 
+You need an Azure subscription ([create one for free](https://azure.microsoft.com/free/cognitive-services)) and a [Form Recognizer resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) endpoint and key to try out the Form Recognizer service. ## Set up the Sample Labeling tool You'll need an Azure subscription ([create one for free](https://azure.microsoft > > If your storage data is behind a VNet or firewall, you must deploy the **Form Recognizer Sample Labeling tool** behind your VNet or firewall and grant access by creating a [system-assigned managed identity](managed-identity-byos.md "Azure managed identity is a service principal that creates an Azure Active Directory (Azure AD) identity and specific permissions for Azure managed resources"). -You'll use the Docker engine to run the Sample Labeling tool. Follow these steps to set up the Docker container. For a primer on Docker and container basics, see the [Docker overview](https://docs.docker.com/engine/docker-overview/). +You use the Docker engine to run the Sample Labeling tool. Follow these steps to set up the Docker container. For a primer on Docker and container basics, see the [Docker overview](https://docs.docker.com/engine/docker-overview/). > [!TIP] > The OCR Form Labeling Tool is also available as an open source project on GitHub. The tool is a TypeScript web application built using React + Redux. To learn more or contribute, see the [OCR Form Labeling Tool](https://github.com/microsoft/OCR-Form-Tools/blob/master/README.md#run-as-web-application) repo. To try out the tool online, go to the [Form Recognizer Sample Labeling tool website](https://fott-2-1.azurewebsites.net/). -1. First, install Docker on a host computer. This guide will show you how to use local computer as a host. If you want to use a Docker hosting service in Azure, see the [Deploy the Sample Labeling tool](deploy-label-tool.md) how-to guide. +1. First, install Docker on a host computer. This guide shows you how to use a local computer as a host. If you want to use a Docker hosting service in Azure, see the [Deploy the Sample Labeling tool](deploy-label-tool.md) how-to guide. The host computer must meet the following hardware requirements: You use the Docker engine to run the Sample Labeling tool. Follow these steps docker run -it -p 3000:80 mcr.microsoft.com/azure-cognitive-services/custom-form/labeltool:latest-2.1 eula=accept ``` - This command will make the sample-labeling tool available through a web browser. Go to `http://localhost:3000`. + This command makes the sample-labeling tool available through a web browser. Go to `http://localhost:3000`. > [!NOTE] > You can also label documents and train models using the Form Recognizer REST API. To train and Analyze with the REST API, see [Train with labels using the REST API and Python](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/python/FormRecognizer/rest/python-labeled-data.md). ## Set up input data -First, make sure all the training documents are of the same format. If you have forms in multiple formats, organize them into subfolders based on common format. When you train, you'll need to direct the API to a subfolder. +First, make sure all the training documents are of the same format. If you have forms in multiple formats, organize them into subfolders based on common format. When you train, you need to direct the API to a subfolder.
### Configure cross-domain resource sharing (CORS) Fill in the fields with the following values: In the Sample Labeling tool, projects store your configurations and settings. Create a new project and fill in the fields with the following values: * **Display Name** - the project display name-* **Security Token** - Some project settings can include sensitive values, such as keys or other shared secrets. Each project will generate a security token that can be used to encrypt/decrypt sensitive project settings. You can find security tokens in the Application Settings by selecting the gear icon at the bottom of the left navigation bar. +* **Security Token** - Some project settings can include sensitive values, such as keys or other shared secrets. Each project generates a security token that can be used to encrypt/decrypt sensitive project settings. You can find security tokens in the Application Settings by selecting the gear icon at the bottom of the left navigation bar. * **Source Connection** - The Azure Blob Storage connection you created in the previous step that you would like to use for this project. * **Folder Path** - Optional - If your source forms are located in a folder on the blob container, specify the folder name here. * **Form Recognizer Service Uri** - Your Form Recognizer endpoint URL. When you create or open a project, the main tag editor window opens. The tag edi ### Identify text and tables -Select **Run Layout on unvisited documents** on the left pane to get the text and table layout information for each document. The labeling tool will draw bounding boxes around each text element. +Select **Run Layout on unvisited documents** on the left pane to get the text and table layout information for each document. The labeling tool draws bounding boxes around each text element. -The labeling tool will also show which tables have been automatically extracted. Select the table/grid icon on the left hand of the document to see the extracted table. In this quickstart, because the table content is automatically extracted, we won't be labeling the table content, but rather rely on the automated extraction. +The labeling tool also shows which tables have been automatically extracted. Select the table/grid icon on the left-hand side of the document to see the extracted table. In this quickstart, because the table content is automatically extracted, we don't label the table content, but rather rely on the automated extraction. :::image type="content" source="media/label-tool/table-extraction.png" alt-text="Table visualization in Sample Labeling tool."::: In v2.1, if your training document doesn't have a value filled in, you can draw ### Apply labels to text -Next, you'll create tags (labels) and apply them to the text elements that you want the model to analyze. +Next, you create tags (labels) and apply them to the text elements that you want the model to analyze. 1. First, use the tags editor pane to create the tags you'd like to identify. 1. Select **+** to create a new tag. Next, you'll create tags (labels) and apply them to the text elements that you w ### Specify tag value types 
Value type information is saved in the **fields.json** file in the same path as your label files. +You can set the expected data type for each tag. Open the context menu to the right of a tag and select a type from the menu. This feature allows the detection algorithm to make assumptions that improve the text-detection accuracy. It also ensures that the detected values are returned in a standardized format in the final JSON output. Value type information is saved in the **fields.json** file in the same path as your label files. > [!div class="mx-imgBorder"] >  The following value types and variations are currently supported: * `number` * default, `currency` * Formatted as a Floating point value.- * Example: 1234.98 on the document will be formatted into 1234.98 on the output + * Example: 1234.98 on the document is formatted into 1234.98 on the output * `date` * default, `dmy`, `mdy`, `ymd` The following value types and variations are currently supported: * `time` * `integer` * Formatted as an integer value.- * Example: 1234.98 on the document will be formatted into 123498 on the output. + * Example: 1234.98 on the document is formatted into 123498 on the output. * `selectionMark` > [!NOTE] The following value types and variations are currently supported: ### Label tables (v2.1 only) -At times, your data might lend itself better to being labeled as a table rather than key-value pairs. In this case, you can create a table tag by selecting **Add a new table tag**. Specify whether the table will have a fixed number of rows or variable number of rows depending on the document and define the schema. +At times, your data might lend itself better to being labeled as a table rather than key-value pairs. In this case, you can create a table tag by selecting **Add a new table tag**. Specify whether the table has a fixed number of rows or variable number of rows depending on the document and define the schema. :::image type="content" source="media/label-tool/table-tag.png" alt-text="Configuring a table tag."::: Once you've defined your table tag, tag the cell values. ## Train a custom model -Choose the Train icon on the left pane to open the Training page. Then select the **Train** button to begin training the model. Once the training process completes, you'll see the following information: +Choose the Train icon on the left pane to open the Training page. Then select the **Train** button to begin training the model. Once the training process completes, you see the following information: -* **Model ID** - The ID of the model that was created and trained. Each training call creates a new model with its own ID. Copy this string to a secure location; you'll need it if you want to do prediction calls through the [REST API](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api?pivots=programming-language-rest-api&tabs=preview%2cv2-1) or [client library guide](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api). +* **Model ID** - The ID of the model that was created and trained. Each training call creates a new model with its own ID. Copy this string to a secure location; you need it if you want to do prediction calls through the [REST API](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api?pivots=programming-language-rest-api&tabs=preview%2cv2-1) or [client library guide](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api). * **Average Accuracy** - The model's average accuracy. 
You can improve model accuracy by adding and labeling more forms, then retraining to create a new model. We recommend starting by labeling five forms and adding more forms as needed. * The list of tags, and the estimated accuracy per tag. :::image type="content" source="media/label-tool/train-screen.png" alt-text="Training view."::: -After training finishes, examine the **Average Accuracy** value. If it's low, you should add more input documents and repeat the labeling steps. The documents you've already labeled will remain in the project index. +After training finishes, examine the **Average Accuracy** value. If it's low, you should add more input documents and repeat the labeling steps. The documents you've already labeled remain in the project index. > [!TIP] > You can also run the training process with a REST API call. To learn how to do this, see [Train with labels using Python](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/python/FormRecognizer/rest/python-labeled-data.md). ## Compose trained models -With Model Compose, you can compose up to 100 models to a single model ID. When you call Analyze with the composed `modelID`, Form Recognizer will first classify the form you submitted, choose the best matching model, and then return results for that model. This operation is useful when incoming forms may belong to one of several templates. +With Model Compose, you can compose up to 200 models to a single model ID. When you call Analyze with the composed `modelID`, Form Recognizer classifies the form you submitted, chooses the best matching model, and then returns results for that model. This operation is useful when incoming forms may belong to one of several templates (see the sketch after this entry). * To compose models in the Sample Labeling tool, select the Model Compose (merging arrow) icon from the navigation bar. * Select the models you wish to compose together. Models with the arrows icon are already composed models. With Model Compose, you can compose up to 100 models to a single model ID. When ## Analyze a form -Select the Analyze icon from the navigation bar to test your model. Select source 'Local file'. Browse for a file and select a file from the sample dataset that you unzipped in the test folder. Then choose the **Run analysis** button to get key/value pairs, text and tables predictions for the form. The tool will apply tags in bounding boxes and will report the confidence of each tag. +Select the Analyze icon from the navigation bar to test your model. Select source 'Local file'. Browse for a file and select a file from the sample dataset that you unzipped in the test folder. Then choose the **Run analysis** button to get key/value pair, text, and table predictions for the form. The tool applies tags in bounding boxes and reports the confidence of each tag. :::image type="content" source="media/analyze.png" alt-text="Screenshot: analyze-a-custom-form window"::: When you want to resume your project, you first need to create a connection to t ### Resume a project -Finally, go to the main page (house icon) and select **Open Cloud Project**. Then select the blob storage connection, and select your project's `.fott` file. The application will load all of the project's settings because it has the security token. +Finally, go to the main page (house icon) and select **Open Cloud Project**. Then select the blob storage connection, and select your project's `.fott` file. The application loads all of the project's settings because it has the security token. ## Next steps |
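As a companion to the Model Compose description in the entry above, here's a minimal sketch of composing trained models programmatically with the `azure-ai-formrecognizer` Python package (3.1+, which targets the v2.1 service API); the endpoint, key, and model IDs are placeholders:

```python
from azure.ai.formrecognizer import FormTrainingClient
from azure.core.credentials import AzureKeyCredential

# Placeholder values -- substitute your own resource details and trained model IDs.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-form-recognizer-key>"
model_ids = ["<model-id-1>", "<model-id-2>"]

training_client = FormTrainingClient(endpoint, AzureKeyCredential(key))

# Compose the trained models under a single model ID; at analysis time the
# service picks the best matching component model for the submitted form.
poller = training_client.begin_create_composed_model(model_ids, model_name="composed-model")
composed_model = poller.result()
print(composed_model.model_id)
```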
applied-ai-services | Language Support | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/language-support.md | This article covers the supported languages for text and field **extraction (by ## Read, layout, and custom form (template) model -The following lists include the currently GA languages in the most recent v3.0 version. These languages are supported by Read, Layout, and Custom form (template) model features. +The following lists include the currently GA languages in the most recent v3.0 version for Read, Layout, and Custom template (form) models. > [!NOTE] > **Language code optional** Use the parameter `api-version=2022-06-30-preview` when using the REST API or th ## Custom neural model -Language| Locale code | +Language| API Version | |:--|:-:|-|English (United States)|en-us| +|English | `2022-08-31` (GA), `2023-02-28-preview`| +|Spanish | `2023-02-28-preview`| +|German | `2023-02-28-preview`| +|French | `2023-02-28-preview`| +|Italian | `2023-02-28-preview`| +|Dutch | `2023-02-28-preview`| ## Receipt model >[!NOTE] > It's not necessary to specify a locale. This is an optional parameter. The Form Recognizer deep-learning technology will auto-detect the language of the text in your image. +Receipt supports all English receipts and the following locales: + |Language| Locale code | |:--|:-:|-|English (Australia)|`en-au`| +|English |`en-au`| |English (Canada)|`en-ca`| |English (United Kingdom)|`en-gb`| |English (India)|`en-in`| |English (United States)| `en-us`|-|French (France) | `fr` | -|French (Canada)| `fr-ca`| -|German | `de`| -|Italian| `it`| -|Spanish | `es` | +|French | `fr` | +| Spanish | `es` | ## Business card model Business Card supports all English business cards with the following locales: |Language| Locale code | |:--|:-:|-|English (Australia)|`en-au`| -|English (Canada)|`en-ca`| -|English (United Kingdom)|`en-gb`| -|English (India|`en-in`| -|English (United States)| `en-us`| +|English |`en-US`, `en-CA`, `en-GB`, `en-IN`| +|German | de| +|French | fr| +|Italian |it| +|Portuguese |pt| +|Dutch | nl| The **2022-06-30** and later releases include Japanese language support: The **2022-06-30** and later releases include Japanese language support: Language| Locale code | |:--|:-:|-|English |en-US, en-IN, en-GB, en-CA, en-AU| +|English |`en-US`, `en-CA`, `en-GB`, `en-IN`| |Spanish| es|-|German (**2022-06-30** and later)| de| -|French (**2022-06-30** and later)| fr| -|Italian (**2022-06-30** and later)|it| -|Portuguese (**2022-06-30** and later)|pt| -|Dutch (**2022-06-30** and later)| nl| +|German | de| +|French | fr| +|Italian |it| +|Portuguese |pt| +|Dutch | nl| ## ID document model This table lists the written languages supported by each Form Recognizer service ## Prebuilt receipt and business card >[!NOTE]- >The Form Recognizer deep-learning technology will auto-detect the language of the text in your image. + > It's not necessary to specify a locale. This is an optional parameter. The Form Recognizer deep-learning technology will auto-detect the language of the text in your image. 
Prebuilt Receipt and Business Cards support all English receipts and business cards with the following locales: -|Supported Languages| Details | +|Language| Locale code | |:--|:-:|-|English| United States (-us), Australia (-au), Great Britain (-gb), India (-in| -|French | France (FR) | -|Spanish | Spain (ES) | +|English (Australia)|`en-au`| +|English (Canada)|`en-ca`| +|English (United Kingdom)|`en-gb`| +|English (India)|`en-in`| +|English (United States)| `en-us`| ## Prebuilt invoice ->[!NOTE] - >The Form Recognizer deep-learning technology will auto-detect the language of the text in your image. --| Supported languages | Details | -|:-|:| -| <ul><li>English</li></ul>| United States (-us), Australia (-au), Canada (-ca), Great Britain (-gb), India (-in)| -| <ul><li>Spanish</li></ul>|Spain (ES)| -| <ul><li>German</li></ul>| Germany (DE)| -| <ul><li>French</li></ul>| France (FR) | -| <ul><li>Italian</li></ul>| Italy (IT)| -| <ul><li>Portuguese</li></ul>| Portugal (-pt), Brazil (-br)| -| <ul><li>Dutch</li></ul>| Netherlands (DE)| +Language| Locale code | +|:--|:-:| +|English (United States)|en-us| ## Prebuilt identity documents This technology is currently available for US driver licenses and the biographic ::: moniker range="form-recog-2.1.0" > [!div class="nextstepaction"] > [Try Form Recognizer Sample Labeling tool](https://aka.ms/fott-2.1-ga) |
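As the notes in the entry above mention, the locale is an optional hint rather than a required parameter. For example, with the `azure-ai-formrecognizer` Python package (3.1+), a receipt locale can be passed like this; the resource values and file name are placeholders:

```python
from azure.ai.formrecognizer import FormRecognizerClient
from azure.core.credentials import AzureKeyCredential

# Placeholder values -- substitute your own resource details.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-form-recognizer-key>"

client = FormRecognizerClient(endpoint, AzureKeyCredential(key))

with open("receipt.jpg", "rb") as f:
    # locale is optional; omit it to let the service auto-detect the language.
    poller = client.begin_recognize_receipts(f, locale="en-GB")

for receipt in poller.result():
    for name, field in receipt.fields.items():
        print(name, field.value)
```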
applied-ai-services | Managed Identities Secured Access | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/managed-identities-secured-access.md | |
applied-ai-services | Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/overview.md | recommendations: false ::: moniker range="form-recog-3.0.0" -Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) for developers to build intelligent document processing solutions. Form Recognizer applies machine-learning-based optical character recognition (OCR) and document understanding technologies to extract text, tables, structure, and key-value pairs from documents. You can also label and train custom models to automate data extraction from structured, semi-structured, and unstructured documents. To learn more about each model, *see* the Concepts articles: --| Model type | Model name | -||--| -|**Document analysis models**| ● [**Read OCR model**](concept-read.md)</br> ● [**General document model**](concept-general-document.md)</br> ● [**Layout analysis model**](concept-layout.md) </br> | -| **Prebuilt models** | ● [**W-2 form model**](concept-w2.md) </br>● [**Invoice model**](concept-invoice.md)</br>● [**Receipt model**](concept-receipt.md) </br>● [**Identity (ID) document model**](concept-id-document.md) </br>● [**Business card model**](concept-business-card.md) </br> -| **Custom models** | ● [**Custom model**](concept-custom.md) </br>● [**Composed model**](concept-model-overview.md)| --## Video: Form Recognizer models --The following video introduces Form Recognizer models and their associated output to help you choose the best model to address your document scenario needs.</br></br> +Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) for developers to build intelligent document processing solutions. Form Recognizer applies machine-learning-based optical character recognition (OCR) and document understanding technologies to classify documents, extract text, tables, structure, and key-value pairs from documents. You can also label and train custom models to automate data extraction from structured, semi-structured, and unstructured documents. To learn more about each model, *see* the Concepts articles: - > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE5fX1b] ## Which Form Recognizer model should I use? 
-This section will help you decide which **Form Recognizer v3.0** supported model you should use for your application: +This section helps you decide which **Form Recognizer v3.0** supported model you should use for your application: | Type of document | Data to extract |Document format | Your best solution | | --|-| -|-|-|**A generic document** like a contract or letter.|You want to extract primarily text lines, words, locations, and detected languages.|</li></ul>The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).| [**Read OCR model**](concept-read.md)| +|**A generic document** like a contract or letter.|You want to extract primarily text lines, words, locations, and detected languages.|The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).| [**Read OCR model**](concept-read.md)| |**A document that includes structural information** like a report or study.|In addition to text, you need to extract structural information like tables, selection marks, paragraphs, titles, headings, and subheadings.|The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model)| [**Layout analysis model**](concept-layout.md) |**A structured or semi-structured document that includes content formatted as fields and values**, like a credit application or survey form.|You want to extract fields and values including ones not covered by the scenario-specific prebuilt models **without having to train a custom model**.| The form or document is a standardized format commonly used in your business or industry and printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).|[**General document model**](concept-general-document.md)-|**U.S. W-2 form**|You want to extract key information such as salary, wages, and taxes withheld from US W2 tax forms.</li></ul> |The W-2 document is in United States English (en-US) text.|[**W-2 model**](concept-w2.md) -|**Invoice**|You want to extract key information such as customer name, billing address, and amount due from invoices.</li></ul> |The invoice document is written or printed in a [supported language](language-support.md#invoice-model).|[**Invoice model**](concept-invoice.md) - |**Receipt**|You want to extract key information such as merchant name, transaction date, and transaction total from a sales or single-page hotel receipt.</li></ul> |The receipt is written or printed in a [supported language](language-support.md#receipt-model). |[**Receipt model**](concept-receipt.md)| +|**U.S. W-2 form**|You want to extract key information such as salary, wages, and taxes withheld from US W2 tax forms. |The W-2 document is in United States English (en-US) text.|[**W-2 model**](concept-w2.md) +|**Invoice**|You want to extract key information such as customer name, billing address, and amount due from invoices. |The invoice document is written or printed in a [supported language](language-support.md#invoice-model).|[**Invoice model**](concept-invoice.md) + |**Receipt**|You want to extract key information such as merchant name, transaction date, and transaction total from a sales or single-page hotel receipt. |The receipt is written or printed in a [supported language](language-support.md#receipt-model). |[**Receipt model**](concept-receipt.md)| |**Identity document (ID)** like a passport or driver's license. 
|You want to extract key information such as first name, last name, and date of birth from US drivers' licenses or international passports. |Your ID document is a US driver's license or the biographical page from an international passport (not a visa).| [**Identity document (ID) model**](concept-id-document.md)|-|**Business card**|You want to extract key information such as first name, last name, company name, email address, and phone number from business cards.</li></ul>|The business card document is in English or Japanese text. | [**Business card model**](concept-business-card.md)| -|**Mixed-type document(s)**| You want to extract key-value pairs, selection marks, tables, signature fields, and selected regions not extracted by prebuilt or general document models.| You have various documents with structured, semi-structured, and/or unstructured elements.| [**Custom model**](concept-custom.md)| +|**Business card**|You want to extract key information such as first name, last name, company name, email address, and phone number from business cards.|The business card document is in English or Japanese text. | [**Business card model**](concept-business-card.md)| +|**Application-specific documents**| You want to extract key-value pairs, selection marks, tables, signature fields, and selected regions not extracted by prebuilt or general document models.| You have various documents with structured, semi-structured, and/or unstructured elements.| [**Custom extraction model**](concept-custom.md)| +|**Mixed-type document(s)**| You want to classify documents or split a file into individual documents.| You have various documents with structured, semi-structured, and/or unstructured elements.| [**Custom classification model**](concept-custom.md)| >[!Tip] > You can use Form Recognizer to automate your document processing in applications | Model | Description |Automation use cases | Development options | |-|--|-|--|-|[**Read OCR model**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.| <ul><li>Contract processing. 
</li><li>Financial or medical report processing.</li></ul>|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</li><li>[**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</li><li>[**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</li><li>[**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</li><li>[**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript)</li></ul> | -|[**General document model**](concept-general-document.md)|Extract text, tables, structure, and key-value pairs.|<ul><li>Key-value pair extraction.</li><li>Form processing.</li><li>Survey data collection and analysis.</li></ul>|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li></ul> | -|[**Layout analysis model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. |<ul><li>Document indexing and retrieval by structure.</li><li>Preprocessing prior to OCR analysis.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li></ul>| -|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.</br></br>Custom model API v3.0 supports **signature detection for custom template (custom form) models**.</br></br>Custom model API v3.0 now supports two model types:<ul><li>[**Custom Template model**](concept-custom-template.md) (custom form) is used to analyze structured and semi-structured documents.</li><li> [**Custom Neural model**](concept-custom-neural.md) (custom document) is used to analyze unstructured documents.</li></ul>|<ul><li>Identification and compilation of data, unique to your business, impacted by a regulatory change or market event.</li><li>Identification and analysis of previously overlooked unique data.</li></ul> |[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>| -|[**W-2 Form**](concept-w2.md) | Extract information 
reported in each box on a W-2 form.|<ul><li>Automated tax document management.</li><li>Mortgage loan application processing.</li></ul> |<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)<li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul> | -|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. |<ul><li>Accounts payable processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>| -|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.|<ul><li>Expense management.</li><li>Consumer behavior data analysis.</li><li>Customer loyalty program.</li><li>Merchandise return processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>| -|[**Identity document (ID) model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. 
|<ul><li>Know your customer (KYC) financial services guidelines compliance.</li><li>Medical account management.</li><li>Identity checkpoints and gateways.</li><li>Hotel registration.</li></ul> |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>| -|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.|<ul><li>Sales lead and marketing management.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>| +|[**Read OCR model**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.| ● Contract processing. </br>● Financial or medical report processing.|● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</br>● [**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</br>● [**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</br>● [**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</br>● [**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</br>● [**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript) | +|[**General document model**](concept-general-document.md)|Extract text, tables, structure, and key-value pairs.|● Key-value pair extraction.</br>● Form processing.</br>● Survey data collection and analysis.|● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model) | +|[**Layout analysis model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. |● Document indexing and retrieval by structure.</br>● Preprocessing prior to OCR analysis. 
|● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)| +|[**Custom model (updated)**](concept-custom.md) | Classification, extraction, and analysis of data from forms and documents specific to distinct business data and use cases. Custom model API v3.0 supports two model types: ● [**Custom Classifier model**](concept-custom-classifier.md) is used to identify and split document types.</br>● [**Custom Extraction model**](concept-custom.md) is used to analyze forms or documents and extract specific fields and tables. [Custom template](concept-custom-template.md) and [custom neural](concept-custom-neural.md) are the two types of custom extraction models.|● Identification and extraction of data from documents unique to your business, impacted by a regulatory change or market event.</br>● Identification and analysis of previously overlooked unique data. |● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md)| +|[**W-2 Form**](concept-w2.md) | Extract information reported in each box on a W-2 form.|● Automated tax document management.</br>● Mortgage loan application processing. |● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)</br>● [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model) | +|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. |● Accounts payable processing.</br>● Automated tax recording and reporting. |● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</br>● [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)| +|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.|● Expense management.</br>● Consumer behavior data analysis.</br>● Customer loyalty program.</br>● Merchandise return processing.</br>● Automated tax recording and reporting. 
|● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)| +|[**Identity document (ID) model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |● Know your customer (KYC) financial services guidelines compliance.</br>● Medical account management.</br>● Identity checkpoints and gateways.</br>● Hotel registration. |● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)| +|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.|● Sales lead and marketing management. |● [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</br>● [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</br>● [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</br>● [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)| ::: moniker-end Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied- ## Which document processing model should I use? 
-This section will help you decide which Form Recognizer v2.1 supported model you should use for your application: +This section helps you decide which Form Recognizer v2.1 supported model you should use for your application: | Type of document | Data to extract |Document format | Your best solution | | --|-| -|-| |**A document that includes structural information** like a report or study.|In addition to text, you need to extract structural information like tables and selection marks.|The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model)| [**Layout analysis model**](concept-layout.md?view=form-recog-2.1.0&preserve-view=true)-|**Invoice**|You want to extract key information such as customer name, billing address, and amount due from invoices.</li></ul> |The invoice document is written or printed in a [supported language](language-support.md#invoice-model).|[**Invoice model**](concept-invoice.md?view=form-recog-2.1.0&preserve-view=true) - |**Receipt**|You want to extract key information such as merchant name, transaction date, and transaction total from a sales or single-page hotel receipt.</li></ul> |The receipt is written or printed in a [supported language](language-support.md#receipt-model). |[**Receipt model**](concept-receipt.md?view=form-recog-2.1.0&preserve-view=true)| +|**Invoice**|You want to extract key information such as customer name, billing address, and amount due from invoices. |The invoice document is written or printed in a [supported language](language-support.md#invoice-model).|[**Invoice model**](concept-invoice.md?view=form-recog-2.1.0&preserve-view=true) + |**Receipt**|You want to extract key information such as merchant name, transaction date, and transaction total from a sales or single-page hotel receipt. |The receipt is written or printed in a [supported language](language-support.md#receipt-model). |[**Receipt model**](concept-receipt.md?view=form-recog-2.1.0&preserve-view=true)| |**Identity document (ID)** like a passport or driver's license. |You want to extract key information such as first name, last name, and date of birth from US drivers' licenses or international passports. |Your ID document is a US driver's license or the biographical page from an international passport (not a visa).| [**ID document model**](concept-id-document.md?view=form-recog-2.1.0&preserve-view=true)|-|**Business card**|You want to extract key information such as first name, last name, company name, email address, and phone number from business cards.</li></ul>|The business card document is in English or Japanese text. | [**Business card model**](concept-business-card.md?view=form-recog-2.1.0&preserve-view=true)| +|**Business card**|You want to extract key information such as first name, last name, company name, email address, and phone number from business cards.|The business card document is in English or Japanese text. 
| [**Business card model**](concept-business-card.md?view=form-recog-2.1.0&preserve-view=true)| |**Mixed-type document(s)**| You want to extract key-value pairs, selection marks, tables, signature fields, and selected regions not extracted by prebuilt or general document models.| You have various documents with structured, semi-structured, and/or unstructured elements.| [**Custom model**](concept-custom.md?view=form-recog-2.1.0&preserve-view=true)| ## Form Recognizer models and development options Use the links in the table to learn more about each model and browse the API ref | Model| Description | Development options | |-|--|-|-|[**Layout analysis**](concept-layout.md?view=form-recog-2.1.0&preserve-view=true) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-layout-model)</li><li>[**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>| -|[**Custom model**](concept-custom.md?view=form-recog-2.1.0&preserve-view=true) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model)</li><li>[**REST API**](quickstarts/get-started-sdks-rest-api.md)</li><li>[**Sample Labeling Tool**](concept-custom.md?view=form-recog-2.1.0&preserve-view=true#build-a-custom-model)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>| -|[**Invoice model**](concept-invoice.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from sales invoices. 
| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md#try-it-prebuilt-model)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>| -|[**Receipt model**](concept-receipt.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from sales receipts.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>| -|[**Identity document (ID) model**](concept-id-document.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from US driver's licenses and international passports.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>| -|[**Business card model**](concept-business-card.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>| +|[**Layout analysis**](concept-layout.md?view=form-recog-2.1.0&preserve-view=true) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. 
| ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</br>● [**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-layout-model)</br>● [**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</br>● [**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)| +|[**Custom model**](concept-custom.md?view=form-recog-2.1.0&preserve-view=true) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.| ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model)</br>● [**REST API**](quickstarts/get-started-sdks-rest-api.md)</br>● [**Sample Labeling Tool**](concept-custom.md?view=form-recog-2.1.0&preserve-view=true#build-a-custom-model)</br>● [**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)| +|[**Invoice model**](concept-invoice.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from sales invoices. | ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</br>● [**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</br>● [**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md#try-it-prebuilt-model)</br>● [**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)| +|[**Receipt model**](concept-receipt.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from sales receipts.| ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</br>● [**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</br>● [**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</br>● [**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)| +|[**Identity document (ID) model**](concept-id-document.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from US driver's licenses and international passports.| ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</br>● [**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</br>● [**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</br>● [**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)| +|[**Business card model**](concept-business-card.md?view=form-recog-2.1.0&preserve-view=true) | Automated data processing and extraction of key information from business cards.| ● [**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</br>● [**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</br>● [**Client-library SDK**](quickstarts/get-started-sdks-rest-api.md)</br>● [**Form Recognizer Docker 
container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)| ::: moniker-end |
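Whichever model the decision tables in the entry above point you to, the v3.0 analyze call has the same shape: you pass a model ID and a document. A minimal sketch with the `azure-ai-formrecognizer` Python package (3.2+); the resource values and file name are placeholders:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

# Placeholder values -- substitute your own resource details.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-form-recognizer-key>"

client = DocumentAnalysisClient(endpoint, AzureKeyCredential(key))

# Swap in the model ID the decision tables point you to, for example
# "prebuilt-read", "prebuilt-layout", "prebuilt-receipt", or a custom model ID.
with open("invoice.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-invoice", f)

result = poller.result()
for document in result.documents:
    for name, field in document.fields.items():
        print(name, field.value, field.confidence)
```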
applied-ai-services | Try Form Recognizer Studio | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-form-recognizer-studio.md | Title: "Quickstart: Form Recognizer Studio | v3.0" -description: Form and document processing, data extraction, and analysis using Form Recognizer Studio +description: Form and document processing, data extraction, and analysis using Form Recognizer Studio Previously updated : 02/02/2023 Last updated : 03/03/2023 monikerRange: 'form-recog-3.0.0' -# Get started: Form Recognizer Studio +<!-- markdownlint-disable MD001 --> ++# Get started: Form Recognizer Studio [!INCLUDE [applies to v3.0](../includes/applies-to-v3-0.md)] -[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service in your applications. You can get started by exploring the pre-trained models with sample or your own documents. You can also create projects to build custom template models and reference the models in your applications using the [Python SDK](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and other quickstarts. +[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service in your applications. You can get started by exploring the pretrained models with sample documents or your own. You can also create projects to build custom template models and reference the models in your applications using the [Python SDK](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) and other quickstarts. > [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE56n49] monikerRange: 'form-recog-3.0.0' > [!TIP] > Create a Cognitive Services resource if you plan to access multiple cognitive services under a single endpoint/key. For Form Recognizer access only, create a Form Recognizer resource. Please note that you'll need a single-service resource if you intend to use [Azure Active Directory authentication](../../../active-directory/authentication/overview-authentication.md). -## Prebuilt models +## Models ++Prebuilt models help you add Form Recognizer features to your apps without having to build, train, and publish your own models. You can choose from several prebuilt models, each of which has its own set of supported data fields. The choice of model to use for the analyze operation depends on the type of document to be analyzed. Form Recognizer currently supports the following prebuilt models: -Prebuilt models help you add Form Recognizer features to your apps without having to build, train, and publish your own models. You can choose from several prebuilt models, each of which has its own set of supported data fields. The choice of model to use for the analyze operation depends on the type of document to be analyzed. The following prebuilt models are currently supported by Form Recognizer: +#### Document analysis * [**General document**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=document): extract text, tables, structure, key-value pairs and named entities.-* [**W-2**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2): extract text and key information from W-2 tax forms. 
-* [**Read**](https://formrecognizer.appliedai.azure.com/studio/read): extract text lines, words, their locations, detected languages, and handwritten style if detected from documents (PDF, TIFF) and images (JPG, PNG, BMP). * [**Layout**](https://formrecognizer.appliedai.azure.com/studio/layout): extract text, tables, selection marks, and structure information from documents (PDF, TIFF) and images (JPG, PNG, BMP).+* [**Read**](https://formrecognizer.appliedai.azure.com/studio/read): extract text lines, words, their locations, detected languages, and handwritten style if detected from documents (PDF, TIFF) and images (JPG, PNG, BMP). ++#### Prebuilt + * [**Invoice**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice): extract text, selection marks, tables, key-value pairs, and key information from invoices. * [**Receipt**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt): extract text and key information from receipts.+* [**Health insurance card**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=healthInsuranceCard.us): extract insurer, member, prescription, group number and other key information from US health insurance cards. +* [**W-2**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2): extract text and key information from W-2 tax forms. * [**ID document**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument): extract text and key information from driver licenses and international passports. * [**Business card**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard): extract text and key information from business cards. -After you've completed the prerequisites, navigate to [Form Recognizer Studio General Documents](https://formrecognizer.appliedai.azure.com/studio/document). +#### Custom ++* [**Custom extraction models**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects): extract information from forms and documents with custom extraction models. Quickly train a model by labeling as few as five sample documents. +* [**Custom classifier model**](https://formrecognizer.appliedai.azure.com/studio/document-classifier/projects): train a custom classifier to distinguish between the different document types within your applications. Quickly train a model with as few as two classes and five samples per class. ++#### Gated preview models ++> [!NOTE] +> To request access for gated preview models in Form Recognizer Studio, complete and submit the [**Form Recognizer private preview request form**](https://aka.ms/form-recognizer/preview/survey). ++* [**General document with query fields**](https://formrecognizer.appliedai.azure.com/studio): extract labels, values such as names, dates, and amounts from documents. +* [**Contract**](https://formrecognizer.appliedai.azure.com/studio): extract the title and signatory party information (including names, references, and addresses) from contracts. +* [**Vaccination card**](https://formrecognizer.appliedai.azure.com/studio): extract card holder name, health provider, and vaccination records from US COVID-19 vaccination cards. +* [**US 1098 tax form**](https://formrecognizer.appliedai.azure.com/studio): extract mortgage interest information from US 1098 tax forms. +* [**US 1098-E tax form**](https://formrecognizer.appliedai.azure.com/studio): extract student loan information from US 1098-E tax forms. 
+* [**US 1098-T tax form**](https://formrecognizer.appliedai.azure.com/studio): extract tuition information from US 1098-T forms. ++After you've completed the prerequisites, navigate to [Form Recognizer Studio General Documents](https://formrecognizer.appliedai.azure.com/studio/document). -In the following example, we use the General Documents feature. The steps to use other pre-trained features like [W2 tax form](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2), [Read](https://formrecognizer.appliedai.azure.com/studio/read), [Layout](https://formrecognizer.appliedai.azure.com/studio/layout), [Invoice](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice), [Receipt](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt), [Business card](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard), and [ID documents](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument) models are similar. +In the following example, we use the General Documents feature. The steps to use other pretrained features like [W2 tax form](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2), [Read](https://formrecognizer.appliedai.azure.com/studio/read), [Layout](https://formrecognizer.appliedai.azure.com/studio/layout), [Invoice](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice), [Receipt](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt), [Business card](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard), and [ID documents](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument) models are similar. :::image border="true" type="content" source="../media/quickstarts/form-recognizer-general-document-demo-preview3.gif" alt-text="Selecting the General Document API to analyze a document in the Form Recognizer Studio."::: In the following example, we use the General Documents feature. The steps to use ## Added prerequisites for custom projects -In addition to the Azure account and a Form Recognizer or Cognitive Services resource, you'll need: +In addition to the Azure account and a Form Recognizer or Cognitive Services resource, you need: ### Azure Blob Storage container -A **standard performance** [**Azure Blob Storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). You'll create containers to store and organize your training documents within your storage account. If you don't know how to create an Azure storage account with a container, following these quickstarts: +A **standard performance** [**Azure Blob Storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). You create containers to store and organize your training documents within your storage account. If you don't know how to create an Azure storage account with a container, follow these quickstarts: * [**Create a storage account**](../../../storage/common/storage-account-create.md). When creating your storage account, make sure to select **Standard** performance in the **Instance details → Performance** field. * [**Create a container**](../../../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). 
When creating your container, set the **Public access level** field to **Container** (anonymous read access for containers and blobs) in the **New Container** window. ### Configure CORS -[CORS (Cross Origin Resource Sharing)](/rest/api/storageservices/cross-origin-resource-sharing--cors--support-for-the-azure-storage-services) needs to be configured on your Azure storage account for it to be accessible from the Form Recognizer Studio. To configure CORS in the Azure portal, you'll need access to the CORS tab of your storage account. +[CORS (Cross Origin Resource Sharing)](/rest/api/storageservices/cross-origin-resource-sharing--cors--support-for-the-azure-storage-services) needs to be configured on your Azure storage account for it to be accessible from the Form Recognizer Studio. To configure CORS in the Azure portal, you need access to the CORS tab of your storage account. 1. Select the CORS tab for the storage account. CORS should now be configured to use the storage account from Form Recognizer St :::image border="true" type="content" source="../media/sas-tokens/container-upload-button.png" alt-text="Screenshot: container upload button in the Azure portal."::: -1. The **Upload blob** window will appear. +1. The **Upload blob** window appears. 1. Select your file(s) to upload. |
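If you'd rather script the CORS rule than set it in the portal, here's a minimal sketch using the `@azure/storage-blob` package. The storage account URL is a placeholder, and the exact origin and method list are assumptions for illustration; use the values your scenario requires.

```typescript
import { BlobServiceClient } from "@azure/storage-blob";
import { DefaultAzureCredential } from "@azure/identity";

// Hypothetical account URL; replace with your own storage account.
const client = new BlobServiceClient(
  "https://<your-storage-account>.blob.core.windows.net",
  new DefaultAzureCredential()
);

async function configureCors(): Promise<void> {
  // Only the properties included in the request are changed; omitted
  // service properties keep their current values.
  await client.setProperties({
    cors: [
      {
        allowedOrigins: "https://formrecognizer.appliedai.azure.com",
        allowedMethods: "DELETE,GET,HEAD,MERGE,OPTIONS,POST,PUT",
        allowedHeaders: "*",
        exposedHeaders: "*",
        maxAgeInSeconds: 120,
      },
    ],
  });
}

configureCors().catch(console.error);
```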
applied-ai-services | Sdk Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/sdk-overview.md | recommendations: false [!INCLUDE [applies to v3.0 and v2.1](includes/applies-to-v3-0-and-v2-1.md)] +> [!IMPORTANT] +> The **2023-02-28-preview** version is currently only available through the [**Form Recognizer 2023-02-28-preview REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2023-02-28-preview/operations/AnalyzeDocument). + Azure Cognitive Services Form Recognizer is a cloud service that uses machine learning to analyze text and structured data from documents. The Form Recognizer software development kit (SDK) is a set of libraries and tools that enables you to integrate Form Recognizer models and capabilities into your applications. The Form Recognizer SDK is available for the C#/.NET, Java, JavaScript, and Python programming languages. ## Supported languages |
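To show what that integration looks like in practice, here's a minimal TypeScript sketch using the GA `@azure/ai-form-recognizer` package; the endpoint, key, and document URL are placeholders, not values from the article.

```typescript
import { AzureKeyCredential, DocumentAnalysisClient } from "@azure/ai-form-recognizer";

// Placeholder endpoint and key; use your own Form Recognizer resource.
const client = new DocumentAnalysisClient(
  "https://<your-resource>.cognitiveservices.azure.com/",
  new AzureKeyCredential("<your-key>")
);

async function analyzeLayout(): Promise<void> {
  // Analyze a publicly reachable document with the prebuilt layout model.
  const poller = await client.beginAnalyzeDocumentFromUrl(
    "prebuilt-layout",
    "https://example.com/sample-invoice.pdf" // placeholder document URL
  );
  const result = await poller.pollUntilDone();
  for (const table of result.tables ?? []) {
    console.log(`Table with ${table.rowCount} rows and ${table.columnCount} columns`);
  }
}

analyzeLayout().catch(console.error);
```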
applied-ai-services | Service Limits | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/service-limits.md | This article contains both a quick reference and detailed description of Azure F > > * [**Custom template model**](concept-custom-template.md) > * [**Custom neural model**](concept-custom-neural.md)+> * [**Composed classification models**](concept-custom-classifier.md) > * [**Composed custom models**](concept-composed-models.md) |Quota|Free (F0) <sup>1</sup>|Standard (S0)| This article contains both a quick reference and detailed description of Azure F | Adjustable | No | No | | **Training dataset size * Neural** | 1 GB <sup>3</sup> | 1 GB (default value) | | Adjustable | No | No |-| **Training file size * Template** | 50 MB <sup>4</sup> | 50 MB (default value) | -| Adjustable | No | No | -| **Total Training dataset size * Template** | 150 MB <sup>4</sup> | 150 MB (default value) | +| **Training dataset size * Template** | 50 MB <sup>4</sup> | 50 MB (default value) | | Adjustable | No | No | | **Max number of pages (Training) * Template** | 500 | 500 (default value) | | Adjustable | No | No | This article contains both a quick reference and detailed description of Azure F | Adjustable | No | No | | **Custom neural model train** | 10 per month | 10 per month | | Adjustable | No |Yes <sup>3</sup>|+| **Max number of pages (Training) * Classifier** | 10,000 | 10,000 (default value) | +| Adjustable | No | No | +| **Training dataset size * Classifier** | 1 GB | 1 GB (default value) | +| Adjustable | No | No | ::: moniker-end This article contains both a quick reference and detailed description of Azure F | Quota | Free (F0) <sup>1</sup> | Standard (S0) | |--|--|--|-| **Compose Model limit** | 5 | 100 (default value) | +| **Compose Model limit** | 5 | 200 (default value) | | Adjustable | No | No | | **Training dataset size** | 50 MB | 50 MB (default value) | | Adjustable | No | No | To minimize issues related to throttling (Response Code 429), we recommend using * Implement retry logic in your application * Avoid sharp changes in the workload. Increase the workload gradually <br/>-*Example.* Your application is using Form Recognizer and your current workload is 10 TPS (transactions per second). The next second you increase the load to 40 TPS (that is four times more). The Service immediately starts scaling up to fulfill the new load, but likely it will not be able to do it within a second, so some of the requests will get Response Code 429. +*Example.* Your application is using Form Recognizer and your current workload is 10 TPS (transactions per second). The next second you increase the load to 40 TPS (that is four times more). The service immediately starts scaling up to fulfill the new load, but likely it can't do it within a second, so some of the requests get Response Code 429. The next sections describe specific cases of adjusting quotas.
Jump to [Form Recognizer: increasing concurrent request limit](#create-and-submit-support-request) Initiate the increase of transactions per second (TPS) limit for your resource by * Go to [Azure portal](https://portal.azure.com/) * Select the Form Recognizer Resource for which you would like to increase the TPS limit * Select *New support request* (*Support + troubleshooting* group)-* A new window will appear with auto-populated information about your Azure Subscription and Azure Resource +* A new window appears with auto-populated information about your Azure Subscription and Azure Resource * Enter *Summary* (like "Increase Form Recognizer TPS limit") * In *Problem type*, select "Quota or usage validation" * Select *Next: Solutions* * Proceed further with the request creation-* Under the *Details* tab enters the following in the *Description* field: +* Under the *Details* tab, enter the following information in the *Description* field: * A note that the request is about **Form Recognizer** quota. * The TPS expectation you would like to scale to meet. * Azure resource information you [collected](#have-the-required-information-ready). * Complete entering the required information and select the *Create* button in the *Review + create* tab- * Note the support request number in Azure portal notifications. You'll be contacted shortly for further processing + * Note the support request number in Azure portal notifications. You're contacted shortly for further processing ## Example of a workload pattern best practice This example presents the approach we recommend following to mitigate possible r Let us suppose that a Form Recognizer resource has the default limit set. Start the workload to submit your analyze requests. If you find that you're seeing frequent throttling with response code 429, start by implementing an exponential backoff on the GET analyze response request, using a progressively longer wait time between retries for consecutive error responses, for example a 2-5-13-34 pattern of delays between requests. In general, it's recommended to not call the get analyze response more than once every 2 seconds for a corresponding POST request. -If you find that you're being throttled on the number of POST requests for documents being submitted, consider adding a delay between the requests. If your workload requires a higher degree of concurrent processing, you'll then need to create a support request to increase your service limits on transactions per second. +If you find that you're being throttled on the number of POST requests for documents being submitted, consider adding a delay between the requests. If your workload requires a higher degree of concurrent processing, you then need to create a support request to increase your service limits on transactions per second. Generally, it's highly recommended to test the workload and the workload patterns before going to production. |
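A minimal sketch of that backoff guidance, using the suggested 2-5-13-34 second delay pattern when polling the GET analyze response; the operation URL and key are placeholders, and the terminal status values assume the REST API's `succeeded`/`failed` states.

```typescript
// Delays (in seconds) follow the 2-5-13-34 pattern suggested above.
const delaysInSeconds = [2, 5, 13, 34];

async function getAnalyzeResult(operationUrl: string, apiKey: string): Promise<unknown> {
  for (const delay of delaysInSeconds) {
    await new Promise((resolve) => setTimeout(resolve, delay * 1000));
    const response = await fetch(operationUrl, {
      headers: { "Ocp-Apim-Subscription-Key": apiKey },
    });
    if (response.status === 429) {
      continue; // Throttled: wait longer before the next attempt.
    }
    const body = await response.json();
    if (body.status === "succeeded" || body.status === "failed") {
      return body;
    }
    // Still running: keep polling with a progressively longer delay.
  }
  throw new Error("Operation did not complete within the retry budget.");
}
```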
applied-ai-services | Studio Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/studio-overview.md | The studio supports Form Recognizer v3.0 models and v3.0 model training. Previou 1. After you've tried Form Recognizer Studio, use the [**C#**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**Java**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**JavaScript**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) or [**Python**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) client libraries or the [**REST API**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) to get started incorporating Form Recognizer models into your own applications. - To learn more about each model, *see* concepts pages. +To learn more about each model, *see* concept pages. - | Model type| Models | - |--|--| - |Document analysis models| <ul><li>[**Read model**](concept-read.md)</li><li>[**Layout model**](concept-layout.md)</li><li>[**General document model**](concept-general-document.md)</li></ul>.</br></br> - |**Prebuilt models**|<ul><li>[**W-2 form model**](concept-w2.md)</li><li>[**Invoice model**](concept-invoice.md)</li><li>[**Receipt model**](concept-receipt.md)</li><li>[**ID document model**](concept-id-document.md)</li><li>[**Business card model**](concept-business-card.md)</li></ul> - |Custom models|<ul><li>[**Custom model**](concept-custom.md)</li><ul><li>[**Template model**](concept-custom-template.md)</li><li>[**Neural model**](concept-custom-template.md)</li></ul><li>[**Composed model**](concept-model-overview.md)</li></ul> ### Manage your resource |
applied-ai-services | Whats New | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/whats-new.md | -Form Recognizer service updates on an ongoing basis. Bookmark this page to stay up to date with release notes, feature enhancements, and our newest documentation. +Form Recognizer service is updated on an ongoing basis. Bookmark this page to stay up to date with release notes, feature enhancements, and our newest documentation. >[!NOTE] > With the release of the 2022-08-31 GA API, the associated preview APIs are being deprecated. If you are using the 2021-09-30-preview or the 2022-01-30-preview API versions, please update your applications to target the 2022-08-31 API version. There are a few minor changes involved, for more information, _see_ the [migration guide](v3-migration-guide.md). -## February 2023 +## March 2023 ++> [!IMPORTANT] +> Document classification, Query fields, and Add-on capabilities are currently only available in the following regions: +> +> * West Europe +> * West US2 +> * East US ++* **Document classification** is now a new capability within Form Recognizer starting with the ```2023-02-28-preview``` API. Try out the document classification capability in the [Studio](https://formrecognizer.appliedai.azure.com/studio/) or the REST API. +* **Query fields** added to the General Document model use an Open AI model to extract specific fields from documents. See the [general document](concept-general-document.md) model to learn more or try the feature in the [Studio](https://formrecognizer.appliedai.azure.com/studio/). Query fields are currently only active for resources in the East US region. +* **Additions to the Read and Layout APIs** + * **Barcodes** are now supported with the ```2023-02-28-preview``` API. + * **Fonts** are now recognized with the ```2023-02-28-preview``` API. + * **Formulas** are now recognized with the ```2023-02-28-preview``` API. +* **Common name**: normalizing key variations to a common name makes the General Document model more useful in processing forms with variations in key names. Learn more about the common name feature in the [General Document model](concept-general-document.md). +* **Custom extraction model updates** + * Custom neural models now support added languages for training and analysis. Train neural models for Dutch, French, German, Italian and Spanish. + * Custom template models now have an improved signature detection capability. +* **Service Updates** + * Support for high resolution documents +* **Studio updates** + * In addition to support for all the new features like classification and query fields, the Studio now enables project sharing for custom model projects. +* **Receipt model updates** + * Receipt model has added support for thermal receipts. + * Receipt model now has added language support for 18 languages and three language dialects (English, French, Portuguese). + * Receipt model now supports `TaxDetails` extraction. +* **Layout model** now has improved table recognition. +* **Read model** now has added improvement for single-digit character recognition. -* Form Recognizer v3.0 container support +++## February 2023 - * The v3.0 [**Read**](concept-read.md) and [**Layout**](concept-layout.md) containers are now available for use! +* Select Form Recognizer v3.0 containers are now available for use! +* Currently **Read v3.0** and **Layout v3.0** containers are available.
- * For more information on containers, _see_ [Install and run containers](containers/form-recognizer-container-install-run.md) + For more information, _see_ [Install and run Form Recognizer containers](containers/form-recognizer-container-install-run.md?view=form-recog-3.0.0&preserve-view=true) ## January 2023 +* Prebuilt receipt model - added languages supported. The receipt model now supports these added languages and locales + * Japanese - Japan (ja-JP) + * French - Canada (fr-CA) + * Dutch - Netherlands (nl-NL) + * English - United Arab Emirates (en-AE) + * Portuguese - Brazil (pt-BR) ++* Prebuilt invoice model - added languages supported. The invoice model now supports these added languages and locales + * English - United States (en-US), Australia (en-AU), Canada (en-CA), Great Britain (en-GB), India (en-IN) + * Spanish - Spain (es-ES) + * French - France (fr-FR) + * Italian - Italy (it-IT) + * Portuguese - Portugal (pt-PT) + * Dutch - Netherlands (nl-NL) ++* Prebuilt invoice model - added fields recognized. The invoice model now recognizes these added fields + * Currency code + * Payment options + * Total discount + * Tax items (en-IN only) ++* Prebuilt ID model - added document types supported. The ID model now supports these added document types + * US Military ID + > [!TIP] > All January 2023 updates are available with [REST API version **2022-08-31 (GA)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument). -* **[Prebuilt receipt model](concept-receipt.md#supported-languages-and-locales-v30) - additional language support**: +* **[Prebuilt receipt model](concept-receipt.md#supported-languages-and-locales) - additional language support**: The **prebuilt receipt model** now has added support for the following languages: Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * Canada ID cards and documents (identification card, Maple card) * United Kingdom ID cards and documents (national identity card) ++ ## December 2022 * [**Form Recognizer Studio updates**](https://formrecognizer.appliedai.azure.com/studio) Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * **Label subtypes and second-level subtypes** The Studio now supports subtypes for table columns, table rows, and second-level subtypes for types such as dates and numbers. -* The US Gov Virginia region now supports building custom neural models. +* Building custom neural models is now supported in the US Gov Virginia region. -* Preview API versions ```2022-01-30-preview``` and ```2021-09-30-preview``` retires January 31 2023. Update to the ```2022-08-31``` API version to avoid any service disruptions. +* Preview API versions ```2022-01-30-preview``` and ```2021-09-30-preview``` will be retired on January 31, 2023. Update to the ```2022-08-31``` API version to avoid any service disruptions.
:::image type="content" source="media/versioning-and-monikers.png" alt-text="Screenshot of the Form Recognizer landing page denoting the version dropdown menu."::: Form Recognizer service updates on an ongoing basis. Bookmark this page to stay > * UK South > * West US2 - * For a complete list of supported training regions, see [custom neural models](concept-custom-neural.md). + * For a complete list of regions where training is supported, see [custom neural models](concept-custom-neural.md). * Form Recognizer SDK version 4.0.0 GA release * **Form Recognizer SDKs version 4.0.0 (.NET/C#, Java, JavaScript) and version 3.2.0 (Python) are generally available and ready for use in production applications!** Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * [**prebuilt-read**](concept-read.md). Read OCR model is now also available in Form Recognizer with paragraphs and language detection as the two new features. Form Recognizer Read targets advanced document scenarios aligned with the broader document intelligence capabilities in Form Recognizer. * [**prebuilt-layout**](concept-layout.md). The Layout model extracts paragraphs and whether the extracted text is a paragraph, title, section heading, footnote, page header, page footer, or page number.- * [**prebuilt-invoice**](concept-invoice.md). The TotalVAT and Line/VAT fields now resolve to the existing fields TotalTax and Line/Tax respectively. - * [**prebuilt-idDocument**](concept-id-document.md). Data extraction support for US state ID, social security, and green cards. + * [**prebuilt-invoice**](concept-invoice.md). The TotalVAT and Line/VAT fields now resolve to the existing fields TotalTax and Line/Tax respectively. + * [**prebuilt-idDocument**](concept-id-document.md). Data extraction support for US state ID, social security, and green cards. Support for passport visa information. * [**prebuilt-receipt**](concept-receipt.md). Expanded locale support for French (fr-FR), Spanish (es-ES), Portuguese (pt-PT), Italian (it-IT) and German (de-DE). * [**prebuilt-businessCard**](concept-business-card.md). Address parsing support to extract subfields for address components like address, city, state, country, and zip code. Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * [**Invoice language expansion**](concept-invoice.md). The invoice model includes expanded language support. _See_ [supported languages](concept-invoice.md#supported-languages-and-locales). * [**Prebuilt business card**](concept-business-card.md) now includes Japanese language support. _See_ [supported languages](concept-business-card.md#supported-languages-and-locales). * [**Prebuilt ID document model**](concept-id-document.md). The ID document model now extracts DateOfIssue, Height, Weight, EyeColor, HairColor, and DocumentDiscriminator from US driver's licenses. _See_ [field extraction](concept-id-document.md).- * [**Read model now supports common Microsoft Office document types**](concept-read.md). Read API supports document types like Word (docx) and PowerPoint (ppt). See [Microsoft Office and HTML text extraction](concept-read.md#microsoft-office-and-html-text-extraction). + * [**Read model now supports common Microsoft Office document types**](concept-read.md). Document types like Word (docx) and PowerPoint (ppt) are now supported with the Read API. See [Microsoft Office and HTML text extraction](concept-read.md#microsoft-office-and-html-text-extraction). Form Recognizer service updates on an ongoing basis.
Bookmark this page to stay * [**Custom neural model**](concept-custom-neural.md) or custom document model is a new custom model to extract text and selection marks from structured forms, semi-structured and **unstructured documents**. * [**W-2 prebuilt model**](concept-w2.md) is a new prebuilt model to extract fields from W-2 forms for tax reporting and income verification scenarios. * [**Read**](concept-read.md) API extracts printed text lines, words, text locations, detected languages, and handwritten text, if detected.- * [**General document**](concept-general-document.md) pre-trained model now support selection marks in addition to API text, tables, structure, key-value pairs, and named entities from forms and documents. + * [**General document**](concept-general-document.md) pre-trained model is now updated to support selection marks in addition to API text, tables, structure, key-value pairs, and named entities from forms and documents. * [**Invoice API**](language-support.md#invoice-model) Invoice prebuilt model expands support to Spanish invoices. * [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com) adds new demos for Read, W2, Hotel receipt samples, and support for training the new custom neural models. * [**Language Expansion**](language-support.md) Form Recognizer Read, Layout, and Custom Form add support for 42 new languages including Arabic, Hindi, and other languages using Arabic and Devanagari scripts to expand the coverage to 164 languages. Handwritten language support expands to Japanese and Korean. Form Recognizer service updates on an ongoing basis. Bookmark this page to stay -* Form Recognizer containers v2.1 released in gated preview and now supports six feature containers - **Layout**, **Business Card**, **ID Document**, **Receipt**, **Invoice**, and **Custom**. To use them, you must submit an [online request](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUNlpBU1lFSjJUMFhKNzVHUUVLN1NIOEZETiQlQCN0PWcu), and receive approval. +* Form Recognizer containers v2.1 released in gated preview, and six feature containers are now supported - **Layout**, **Business Card**, **ID Document**, **Receipt**, **Invoice**, and **Custom**. To use them, you must submit an [online request](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUNlpBU1lFSjJUMFhKNzVHUUVLN1NIOEZETiQlQCN0PWcu), and receive approval. * *See* [**Install and run Docker containers for Form Recognizer**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout) and [**Configure Form Recognizer containers**](containers/form-recognizer-container-configuration.md?branch=main) Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * To get started, try the [Form Recognizer Sample Tool](https://fott-2-1.azurewebsites.net/) and follow the [quickstart](./quickstarts/try-sample-label-tool.md). -* The updated Layout API table feature adds header recognition with column headers that can span multiple rows. Each table cell has an attribute that indicates whether it's part of a header or not. This update identifies which rows make up the table header. +* The updated Layout API table feature adds header recognition with column headers that can span multiple rows. Each table cell has an attribute that indicates whether it's part of a header or not. This update can be used to identify which rows make up the table header.
Form Recognizer service updates on an ongoing basis. Bookmark this page to stay For a list of field values, _see_ [Fields extracted](./concept-id-document.md) in our Form Recognizer documentation. -* Expanded the set of document languages provided to the **[StartRecognizeContent](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient.startrecognizecontent?view=azure-dotnet-preview&preserve-view=true)** method. +* Expanded the set of document languages that can be provided to the **[StartRecognizeContent](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient.startrecognizecontent?view=azure-dotnet-preview&preserve-view=true)** method. * **New property `Pages` supported by the following classes**: Form Recognizer service updates on an ongoing basis. Bookmark this page to stay **[RecognizeContentOptions](/dotnet/api/azure.ai.formrecognizer.recognizecontentoptions?view=azure-dotnet-preview&preserve-view=true)** - The `ReadingOrder` property is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) applies to order the extraction of text elements. If not specified, the default value is `basic`. + The `ReadingOrder` property is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) should be applied to order the extraction of text elements. If not specified, the default value is `basic`. ### [**Java**](#tab/java) Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * **[beginRecognizeContent](/java/api/com.azure.ai.formrecognizer.formrecognizerclient.beginrecognizecontent?preserve-view=true&view=azure-java-preview)**</br> * **[beginRecognizeContentFromUrl](/java/api/com.azure.ai.formrecognizer.formrecognizerclient.beginrecognizecontentfromurl?view=azure-java-preview&preserve-view=true)**</br>- * The `ReadingOrder` keyword argument is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) applies to order the extraction of text elements. If not specified, the default value is `basic`. + * The `ReadingOrder` keyword argument is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) should be applied to order the extraction of text elements. If not specified, the default value is `basic`. * The client defaults to the latest supported service version, which currently is **2.1-preview.3**. Form Recognizer service updates on an ongoing basis. Bookmark this page to stay * New option `pages` supported by all form recognition methods (custom forms and all prebuilt models). The argument allows you to select individual or a range of pages for multi-page PDF and TIFF documents. For individual pages, enter the page number, for example, `3`. For a range of pages (like page 2 and pages 5-7) enter the page numbers and ranges separated by commas: `2, 5-7`. -* Added support for a **[ReadingOrder](/javascript/api/@azure/ai-form-recognizer/formreadingorder?view=azure-node-latest&preserve-view=true)** type to the content recognition methods. This option enables you to control the algorithm that the service uses to determine how the order of recognized lines of text. You can specify which reading order algorithm (`basic` or `natural`) applies to order the extraction of text elements. If not specified, the default value is `basic`.
+* Added support for a **[ReadingOrder](/javascript/api/@azure/ai-form-recognizer/formreadingorder?view=azure-node-latest&preserve-view=true)** type to the content recognition methods. This option enables you to control the algorithm that the service uses to determine how recognized lines of text should be ordered. You can specify which reading order algorithm (`basic` or `natural`) should be applied to order the extraction of text elements. If not specified, the default value is `basic`. * Split **FormField** type into several different interfaces. This update shouldn't cause any API compatibility issues except in certain edge cases (undefined valueType). Form Recognizer service updates on an ongoing basis. Bookmark this page to stay **[begin_recognize_content_from_url](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient?view=azure-python-preview&preserve-view=true#azure-ai-formrecognizer-formrecognizerclient-begin-recognize-content-from-url)** - The `readingOrder` keyword argument is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) applies to order the extraction of text elements. If not specified, the default value is `basic`. + The `readingOrder` keyword argument is an optional parameter that allows you to specify which reading order algorithm (`basic` or `natural`) should be applied to order the extraction of text elements. If not specified, the default value is `basic`. Form Recognizer service updates on an ongoing basis. Bookmark this page to stay [Learn more about the invoice model](./concept-invoice.md) -* **Supervised table labeling and training, empty-value labeling** - In addition to Form Recognizer's [state-of-the-art deep learning automatic table extraction capabilities](https://techcommunity.microsoft.com/t5/azure-ai/enhanced-table-extraction-from-documents-with-form-recognizer/ba-p/2058011), it now enables customers to label and train on tables. This new release includes the ability to label and train on line items/tables (dynamic and fixed) and train a custom model to extract key-value pairs and line items. A trained model extracts line items as part of the JSON output in the documentResults section. +* **Supervised table labeling and training, empty-value labeling** - In addition to Form Recognizer's [state-of-the-art deep learning automatic table extraction capabilities](https://techcommunity.microsoft.com/t5/azure-ai/enhanced-table-extraction-from-documents-with-form-recognizer/ba-p/2058011), it now enables customers to label and train on tables. This new release includes the ability to label and train on line items/tables (dynamic and fixed) and train a custom model to extract key-value pairs and line items. Once a model is trained, the model extracts line items as part of the JSON output in the documentResults section. :::image type="content" source="./media/table-labeling.png" alt-text="Screenshot of the table labeling feature." lightbox="./media/table-labeling.png"::: Form Recognizer service updates on an ongoing basis. Bookmark this page to stay > [Learn more about Layout extraction](concept-layout.md) * **Client library update** - The latest versions of the [client libraries](/azure/applied-ai-services/form-recognizer/how-to-guides/v2-1-sdk-rest-api) for .NET, Python, Java, and JavaScript support the Form Recognizer 2.1 API.- * **New language supported: Japanese** - Language support for `AnalyzeLayout` and `AnalyzeCustomForm`: Japanese (`ja`).
[Language support](language-support.md) - * **Text line style indication (handwritten/other) (Latin languages only)** - Form Recognizer now outputs an `appearance` object classifying whether each text line is handwritten style or not, along with a confidence score. This feature supports only Latin languages. + * **New language supported: Japanese** - The following new language is now supported for `AnalyzeLayout` and `AnalyzeCustomForm`: Japanese (`ja`). [Language support](language-support.md) + * **Text line style indication (handwritten/other) (Latin languages only)** - Form Recognizer now outputs an `appearance` object classifying whether each text line is handwritten style or not, along with a confidence score. This feature is supported only for Latin languages. * **Quality improvements** - Extraction improvements including single digit extraction improvements.- * **New try-it-out feature in the Form Recognizer Sample and Labeling Tool** - Ability to try out prebuilt Invoice, Receipt, and Business Card models and the Layout API using the Form Recognizer Sample Labeling tool. See how to extract your data without writing any code. + * **New try-it-out feature in the Form Recognizer Sample and Labeling Tool** - Ability to try out prebuilt Invoice, Receipt, and Business Card models and the Layout API using the Form Recognizer Sample Labeling tool. See how your data is extracted without writing any code. * [**Try the Form Recognizer Sample Labeling tool**](https://fott-2-1.azurewebsites.net) Form Recognizer service updates on an ongoing basis. Bookmark this page to stay ## August 2020 -* **The Form Recognizer v2.1-preview.1** release includes the following features: +* **Form Recognizer v2.1-preview.1 has been released** and includes the following features: * **REST API reference is available** - View the [`v2.1-preview.1 reference`](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-1/operations/AnalyzeBusinessCardAsync)- * **New languages supported In addition to English**, supported [languages](language-support.md) for `Layout` and `Train Custom Model`: English (`en`), Chinese (Simplified) (`zh-Hans`), Dutch (`nl`), French (`fr`), German (`de`), Italian (`it`), Portuguese (`pt`) and Spanish (`es`). - * **Checkbox / Selection Mark detection** - Form Recognizer supports detection and extraction of selection marks such as check boxes and radio buttons. Extract selection marks with `Layout` and you can now also label and train in `Train Custom Model` - _Train with Labels_ to extract key-value pairs for selection marks. - * **Model Compose** - allows you to compose multiple models called with a single model ID. When you submit a document with a composed model ID, an initial classification step routes it to the correct custom model. Model Compose is available for `Train Custom Model` - _Train with labels_. + * **New languages supported in addition to English**: the following [languages](language-support.md) are now supported for `Layout` and `Train Custom Model`: English (`en`), Chinese (Simplified) (`zh-Hans`), Dutch (`nl`), French (`fr`), German (`de`), Italian (`it`), Portuguese (`pt`) and Spanish (`es`). + * **Checkbox / Selection Mark detection** - Form Recognizer supports detection and extraction of selection marks such as check boxes and radio buttons. Selection Marks are extracted in `Layout` and you can now also label and train in `Train Custom Model` - _Train with Labels_ to extract key-value pairs for selection marks.
+ * **Model Compose** - allows multiple models to be composed and called with a single model ID. When you submit a document to be analyzed with a composed model ID, a classification step is first performed to route it to the correct custom model. Model Compose is available for `Train Custom Model` - _Train with labels_. * **Model name** - add a friendly name to your custom models for easier management and tracking. * **[New prebuilt model for Business Cards](./concept-business-card.md)** for extracting common fields in English-language business cards. * **[New locales for prebuilt Receipts](./concept-receipt.md)** in addition to EN-US, support is now available for EN-AU, EN-CA, EN-GB, EN-IN Form Recognizer service updates on an ongoing basis. Bookmark this page to stay **New samples** are available on GitHub. * The [Knowledge Extraction Recipes - Forms Playbook](https://github.com/microsoft/knowledge-extraction-recipes-forms) collects best practices from real Form Recognizer customer engagements and provides usable code samples, checklists, and sample pipelines used in developing these projects.- * The [Sample Labeling tool](https://github.com/microsoft/OCR-Form-Tools) update supports the new v2.1 functionality. See this [quickstart](label-tool.md) for getting started with the tool. + * The [Sample Labeling tool](https://github.com/microsoft/OCR-Form-Tools) has been updated to support the new v2.1 functionality. See this [quickstart](label-tool.md) for getting started with the tool. * The [Intelligent Kiosk](https://github.com/microsoft/Cognitive-Samples-IntelligentKiosk/blob/master/Documentation/FormRecognizer.md) Form Recognizer sample shows how to integrate `Analyze Receipt` and `Train Custom Model` - _Train without Labels_. The new SDK supports all the features of the v2.0 REST API for Form Recognizer. ## March 2020 -* **Value types for labeling** You can now specify the types of values you're labeling with the Form Recognizer Sample Labeling tool. Supported value types and variations: +* **Value types for labeling** You can now specify the types of values you're labeling with the Form Recognizer Sample Labeling tool. The following value types and variations are currently supported: * `string` * default, `no-whitespaces`, `alphanumeric` * `number` The new SDK supports all the features of the v2.0 REST API for Form Recognizer. See the [Sample Labeling tool](label-tool.md#specify-tag-value-types) guide to learn how to use this feature. -* **Table visualization** The Sample Labeling tool now displays recognized tables in the document. This feature lets you view recognized and extracted tables from the document prior to labeling and analyzing. This feature can be toggled on/off using the layers option. +* **Table visualization** The Sample Labeling tool now displays tables that were recognized in the document. This feature lets you view recognized and extracted tables from the document prior to labeling and analyzing. This feature can be toggled on/off using the layers option. -* The following image is an example of recognized and extracted tables: +* The following image is an example of how tables are recognized and extracted: :::image type="content" source="media/whats-new/table-viz.png" alt-text="Screenshot of table visualization using the Sample Labeling tool."::: See the [Sample Labeling tool](label-tool.md#specify-tag-value-types) guide to l * TLS 1.2 enforcement -* TLS 1.2 enforces for all HTTP requests to this service.
For more information, see [Azure Cognitive Services security](../../cognitive-services/security-features.md). +* TLS 1.2 is now enforced for all HTTP requests to this service. For more information, see [Azure Cognitive Services security](../../cognitive-services/security-features.md). This release introduces the Form Recognizer 2.0. In the next sections, you'll fi * Custom model API changes - All of APIs for training and using custom models renamed and some synchronous methods are now asynchronous. The following are major changes: + All of the APIs for training and using custom models have been renamed, and some synchronous methods are now asynchronous. The following are major changes: * The process of training a model is now asynchronous. You initiate training through the **/custom/models** API call. This call returns an operation ID, which you can pass into **custom/models/{modelID}** to return the training results.- * The **/custom/models/{modelID}/analyze** API call initiates key-value pair extraction. This call returns an operation ID, which you can pass into **custom/models/{modelID}/analyzeResults/{resultID}** to return the extraction results. - * Operation IDs for the Train operation are now in the **Location** header of HTTP responses, not the **Operation-Location** header. + * Key/value extraction is now initiated by the **/custom/models/{modelID}/analyze** API call. This call returns an operation ID, which you can pass into **custom/models/{modelID}/analyzeResults/{resultID}** to return the extraction results. + * Operation IDs for the Train operation are now found in the **Location** header of HTTP responses, not the **Operation-Location** header. * Receipt API changes - * Renamed APIs for reading sales receipts. + * The APIs for reading sales receipts have been renamed. - * The **/prebuilt/receipt/analyze** API call initiates receipt data extraction. This call returns an operation ID, which you can pass into **/prebuilt/receipt/analyzeResults/{resultID}** to return the extraction results. + * Receipt data extraction is now initiated by the **/prebuilt/receipt/analyze** API call. This call returns an operation ID, which you can pass into **/prebuilt/receipt/analyzeResults/{resultID}** to return the extraction results. * Output format changes - * The JSON responses for all API calls have new formats and some keys and values added, removed, or renamed. See the quickstarts for examples of the current JSON formats. + * The JSON responses for all API calls have new formats. Some keys and values have been added, removed, or renamed. See the quickstarts for examples of the current JSON formats. This release introduces the Form Recognizer 2.0. In the next sections, you'll fi * Complete a [Form Recognizer quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-2.1.0&preserve-view=true) and get started creating a document processing app in the development language of your choice. ::: moniker-end+ |
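To make the v2.0 asynchronous pattern described above concrete, here's a hedged sketch: a POST starts the analyze operation, the response header points at the result, and a later GET retrieves it. The endpoint, key, model ID, and document URL are placeholders, not values from the release notes.

```typescript
// Sketch of the v2.0 asynchronous analyze pattern described above.
const endpoint = "https://<your-resource>.cognitiveservices.azure.com"; // placeholder
const apiKey = "<your-key>"; // placeholder

async function analyzeWithCustomModel(modelId: string, documentUrl: string): Promise<unknown> {
  const submit = await fetch(`${endpoint}/formrecognizer/v2.0/custom/models/${modelId}/analyze`, {
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ source: documentUrl }),
  });
  // Analyze operations return the result URL in the Operation-Location header.
  const resultUrl = submit.headers.get("operation-location");
  if (!resultUrl) {
    throw new Error("No operation location returned.");
  }
  // Poll until the service reports a terminal status.
  for (;;) {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    const poll = await fetch(resultUrl, { headers: { "Ocp-Apim-Subscription-Key": apiKey } });
    const body = await poll.json();
    if (body.status === "succeeded" || body.status === "failed") {
      return body;
    }
  }
}
```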
automation | Runbooks | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/troubleshoot/runbooks.md | Title: Troubleshoot Azure Automation runbook issues description: This article tells how to troubleshoot and resolve issues with Azure Automation runbooks. Previously updated : 02/21/2023 Last updated : 03/06/2023 It fails with the following error: ### Cause -Code that was introduced in [1.9.0 version](https://www.powershellgallery.com/packages/Az.Automation/1.9.0) of the Az.Automation module verifies the names of the runbooks to start and incorrectly flags runbooks with multiple "-" characters or with an "_" character in the name as invalid. +The runbook name doesn't follow the naming convention. Ensure that your runbook name starts with a letter and contains only letters, numbers, underscores, and dashes. Starting with Az module version 1.9, these naming requirements are enforced through both the portal and the cmdlets. ### Workaround-We recommend that you revert to [1.8.0 version](https://www.powershellgallery.com/packages/Az.Automation/1.8.0) of the module. -### Resolution -Currently, we are working to deploy a fix to address this issue. +We recommend that you follow the runbook naming convention or revert to [1.8.0 version](https://www.powershellgallery.com/packages/Az.Automation/1.8.0) of the module, where the naming convention isn't enforced. + ## Diagnose runbook issues |
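As a quick client-side check before calling the cmdlets, a name can be validated against the stated convention; the exact regular expression below is an assumption derived from that rule, not an official pattern.

```typescript
// Validate a runbook name against the convention described above:
// must start with a letter; may contain letters, numbers, underscores,
// and dashes. The pattern is an assumption based on that rule.
const runbookNamePattern = /^[A-Za-z][A-Za-z0-9_-]*$/;

function isValidRunbookName(name: string): boolean {
  return runbookNamePattern.test(name);
}

console.log(isValidRunbookName("Invoke-Cleanup_v2")); // true
console.log(isValidRunbookName("2023-cleanup"));      // false: starts with a digit
```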
azure-arc | Troubleshooting | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/troubleshooting.md | Title: "Troubleshoot common Azure Arc-enabled Kubernetes issues" Previously updated : 01/23/2023 Last updated : 03/13/2023 description: "Learn how to resolve common issues with Azure Arc-enabled Kubernetes clusters and GitOps." If you encounter this issue, and your cluster is behind an outbound proxy server Problems retrieving the MSI certificate are usually due to network issues. Check to make sure all of the [network requirements](network-requirements.md) have been met, then try again. -### Azure CLI is unable to download Helm chart for Azure Arc agents --With Helm version >= 3.7.0, you may run into the following error when using `az connectedk8s connect` to connect the cluster to Azure Arc: --```azurecli -az connectedk8s connect -n AzureArcTest -g AzureArcTest -``` --```output -Unable to pull helm chart from the registry 'mcr.microsoft.com/azurearck8s/batch1/stable/azure-arc-k8sagents:1.4.0': Error: unknown command "chart" for "helm" -Run 'helm --help' for usage. -``` --To resolve this issue, you'll need to install a prior version of [Helm 3](https://helm.sh/docs/intro/install/), where the version is less than 3.7.0. After you've installed that version, run the `az connectedk8s connect` command again to connect the cluster to Azure Arc. - ### Insufficient cluster permissions If the provided kubeconfig file doesn't have sufficient permissions to install the Azure Arc agents, the Azure CLI command will return an error. |
azure-arc | System Requirements | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/resource-bridge/system-requirements.md | Arc resource bridge uses a MOC login credential called [KVA token](/azure-stack/ To use AKS and Arc resource bridge together on Azure Stack HCI, the AKS cluster must be deployed prior to deploying Arc resource bridge. If Arc resource bridge has already been deployed, AKS can't be deployed unless you delete Arc resource bridge first. Once your AKS cluster is deployed to Azure Stack HCI, you can deploy Arc resource bridge. -The following example shows a network configuration setup for Arc resource bridge and AKS clusters when deployed on Azure Stack HCI. Key details are that Arc resource bridge and AKS share the same switch and `ipaddressprefix`, but require different IP addresses for `vippoolstart/end` and `k8snodeippoolstart/end`. --### AKS hybrid --``` -azurestackhciprovider: - virtualnetwork: -   name: "mgmtvnet" -   vswitchname: "Default Switch" -   type: "Transparent" -   macpoolname:  -   vlanid: 0 -   ipaddressprefix: 172.16.0.0/16 -   gateway: 17.16.1.1  -   dnsservers: 17.16.1.1 -   vippoolstart: 172.16.255.0 -   vippoolend: 172.16.255.254 -   k8snodeippoolstart: 172.16.10.0 -   k8snodeippoolend: 172.16.10.254  -``` --### Arc resource bridge --``` -azurestackhciprovider: - virtualnetwork: -      name: "mgmtvnet" -      vswitchname: "Default Switch" -      type: "Transparent" -      macpoolname:  -      vlanid: 0 -      ipaddressprefix: 172.16.0.0/16 -      gateway: 17.16.1.1 -      dnsservers: 17.16.0.1 -      vippoolstart: 172.16.250.0 -      vippoolend: 172.16.250.254 -      k8snodeippoolstart: 172.16.30.0 -      k8snodeippoolend: 172.16.30.254 -``` --For instructions for how to deploy Arc resource bridge on Hybrid AKS, see [How to install Azure Arc Resource Bridge on Windows Server - AKS hybrid](/azure/aks/hybrid/deploy-arc-resource-bridge-windows-server). +When deploying Arc resource bridge with AKS on Azure Stack HCI (AKS Hybrid), the resource bridge should share the same `vswitchname` and `ipaddressprefix` but requires different IP addresses for `vippoolstart/end` and `k8snodeippoolstart/end`. Arc resource bridge should also be given a unique `vnetname` that is different from the one used for AKS Hybrid. For full instructions to deploy Arc resource bridge on AKS Hybrid, see [How to install Azure Arc Resource Bridge on Windows Server - AKS hybrid](/azure/aks/hybrid/deploy-arc-resource-bridge-windows-server). ## Next steps |
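As an illustrative sketch only (the names and address ranges below are assumptions, not prescribed values), an Arc resource bridge network configuration that satisfies those constraints alongside an existing AKS Hybrid deployment might look like this:

```
# Illustrative sketch: values are assumptions showing the constraints above.
# The resource bridge reuses the AKS Hybrid vswitchname and ipaddressprefix,
# uses its own vnetname, and gets IP pools that don't overlap the AKS pools.
azurestackhciprovider:
  virtualnetwork:
    name: "arcvnet"                   # unique; differs from the AKS Hybrid vnetname
    vswitchname: "Default Switch"     # same switch as AKS Hybrid
    type: "Transparent"
    vlanid: 0
    ipaddressprefix: 172.16.0.0/16    # same prefix as AKS Hybrid
    gateway: 172.16.0.1
    dnsservers: 172.16.0.1
    vippoolstart: 172.16.250.0        # outside the AKS Hybrid VIP pool
    vippoolend: 172.16.250.254
    k8snodeippoolstart: 172.16.30.0   # outside the AKS Hybrid node pool
    k8snodeippoolend: 172.16.30.254
```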
azure-cache-for-redis | Cache Best Practices Enterprise Tiers | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-best-practices-enterprise-tiers.md | -We strongly recommended that you deploy new caches in a [zone redundant](cache-high-availability.md) configuration. Zone redundancy ensures that Redis Enterprise nodes are spread among three availability zones, boosting redundancy from data center-level outages. Using zone redundancy increases availability. For more information, see [Service Level Agreements (SLA) for Online Services](https://azure.microsoft.com/support/legal/sla/cache/v1_1/). +We strongly recommend that you deploy new caches in a [zone redundant](cache-high-availability.md) configuration. Zone redundancy ensures that Redis Enterprise nodes are spread among three availability zones, boosting redundancy from data center-level outages. Using zone redundancy increases availability. For more information, see [Service Level Agreements (SLA) for Online Services](https://azure.microsoft.com/support/legal/sla/cache/v1_1/). Zone redundancy is important on the Enterprise tier because your cache instance always uses at least three nodes. Two nodes are data nodes that hold your data, and the third is a _quorum node_. Increasing capacity scales the number of data nodes in even-number increments. There's also another node called a quorum node. This node monitors the data node ## Scaling -In the Enterprise and Enterprise Flash tiers of Azure Cache for Redis, we recommended prioritizing scaling up over scaling out. Prioritize scaling up because the Enterprise tiers are built on Redis Enterprise, which is able to utilize more CPU cores in larger VMs. +In the Enterprise and Enterprise Flash tiers of Azure Cache for Redis, we recommend prioritizing scaling up over scaling out. Prioritize scaling up because the Enterprise tiers are built on Redis Enterprise, which is able to utilize more CPU cores in larger VMs. Conversely, the opposite recommendation is true for the Basic, Standard, and Premium tiers, which are built on open-source Redis. In those tiers, prioritizing scaling out over scaling up is recommended in most cases. |
azure-cache-for-redis | Cache Network Isolation | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-network-isolation.md | VNet is the fundamental building block for your private network in Azure. VNet e ### Limitations of VNet injection - VNet injected caches are only available for Premium Azure Cache for Redis.-- When using a VNet injected cache, you must change your VNet to cache dependencies such as CRLs/PKI, AKV, Azure Storage, Azure Monitor, and more. +- When using a VNet injected cache, you must update your VNet to allow access to cache dependencies such as CRLs/PKI, AKV, Azure Storage, Azure Monitor, and more. +- You can't inject an existing Azure Cache for Redis instance into a virtual network. You can select this option only when you _create_ the cache. ## Azure Firewall rules |
azure-fluid-relay | Azure Function Token Provider | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-fluid-relay/how-tos/azure-function-token-provider.md | TokenProviders can be implemented in many ways, but must implement two separate To ensure that the tenant secret key is kept secure, it's stored in a secure backend location and is only accessible from within the Azure Function. To retrieve tokens, you need to make a `GET` or `POST` request to your deployed Azure Function, providing the `tenantID`, `documentId`, and `userID`/`userName`. The Azure Function is responsible for the mapping between the tenant ID and a tenant key secret to appropriately generate and sign the token. -The example implementation below handles making these requests to your Azure Function. It uses the [axios](https://www.npmjs.com/package/axios) library to make HTTP requests. You can use other libraries or approaches to making an HTTP request from server code. +The example implementation below handles making these requests to your Azure Function. It uses the [axios](https://www.npmjs.com/package/axios) library to make HTTP requests. You can use other libraries or approaches to make an HTTP request from server code. This specific implementation is also provided for you as an export from the `@fluidframework/azure-client` package. ```typescript import { ITokenProvider, ITokenResponse } from "@fluidframework/routerlicious-driver"; |
azure-fluid-relay | Quickstart Dice Roll | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-fluid-relay/quickstarts/quickstart-dice-roll.md | const serviceConfig = { ``` > [!WARNING]-> During development, you can use `InsecureTokenProvider` to generate and sign authentication tokens that the Azure Fluid Relay service will accept. However, as the name implies, this is insecure and should not be used in production environments. The Azure Fluid Relay resource creation process provides you with a secret key which can be used to sign secure requests. **To ensure that this secret doesn't get exposed, this should be replaced with another implementation of ITokenProvider that fetches the token from a secure, developer-provided backend service prior to releasing to production.** An example implementation is [AzureFunctionTokenProvider](https://fluidframework.com/docs/apis/azure-client/azurefunctiontokenprovider-class). For more information, see [How to: Write a TokenProvider with an Azure Function](../how-tos/azure-function-token-provider.md). +> During development, you can use `InsecureTokenProvider` to generate and sign authentication tokens that the Azure Fluid Relay service will accept. However, as the name implies, this is insecure and should not be used in production environments. The Azure Fluid Relay resource creation process provides you with a secret key which can be used to sign secure requests. **To ensure that this secret doesn't get exposed, this should be replaced with another implementation of ITokenProvider that fetches the token from a secure, developer-provided backend service prior to releasing to production.** +> +> One secure approach is outlined in ["How to: Write a TokenProvider with an Azure Function"](../how-tos/azure-function-token-provider.md). ### Build and run the client only |
azure-functions | Functions Host Json | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-host-json.md | For more information about these settings, see [Sampling in Application Insights ### applicationInsights.snapshotConfiguration -For more information on snapshots, see [Debug snapshots on exceptions in .NET apps](../azure-monitor/app/snapshot-debugger.md) and [Troubleshoot problems enabling Application Insights Snapshot Debugger or viewing snapshots](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot). +For more information on snapshots, see [Debug snapshots on exceptions in .NET apps](../azure-monitor/app/snapshot-debugger.md) and [Troubleshoot problems enabling Application Insights Snapshot Debugger or viewing snapshots](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). |Property | Default | Description | | | | | |
azure-monitor | Azure Monitor Agent Manage | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-manage.md | The following prerequisites must be met prior to installing Azure Monitor Agent. - **Networking**: If you use network firewalls, the [Azure Resource Manager service tag](../../virtual-network/service-tags-overview.md) must be enabled on the virtual network for the virtual machine. The virtual machine must also have access to the following HTTPS endpoints: - global.handler.control.monitor.azure.com- - `<virtual-machine-region-name>`.handler.control.monitor.azure.com (example: westus.handler.control.azure.com) + - `<virtual-machine-region-name>`.handler.control.monitor.azure.com (example: westus.handler.control.monitor.azure.com) - `<log-analytics-workspace-id>`.ods.opinsights.azure.com (example: 12345a01-b1cd-1234-e1f2-1234567g8h99.ods.opinsights.azure.com) (If you use private links on the agent, you must also add the [dce endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint)). |
azure-monitor | Action Groups | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/action-groups.md | When you create or update an action group in the Azure portal, you can test the 1. On the page that lists the information you entered, select **Test action group**. - :::image type="content" source="./media/action-groups/test-action-group.png" alt-text="Screenshot that shows the test action group start page with the Test option."::: + :::image type="content" source="./media/action-groups/test-action-group.png" alt-text="Screenshot that shows the test action group page with the Test option."::: 1. Select a sample type and the notification and action types that you want to test. Then select **Test**. The following table describes the role membership requirements that are needed f > > When you configure an action group in the portal, you can opt in or out of the common alert schema: >-> - To find common schema samples for all sample types, see [Common alert schema definitions for Test Action Group](./alerts-common-schema-test-action-definitions.md). +> - To find common schema samples for all sample types, see [Alert payload samples](./alerts-payload-samples.md). > - To find non-common schema alert definitions, see [Non-common alert schema definitions for Test Action Group](./alerts-non-common-schema-definitions.md). ## Create an action group with a Resource Manager template If you use the webhook action, your target webhook endpoint must be able to proc 1. Copy the `$myApp.ObjectId` value that's in the script. 1. In the webhook action definition, in the **Object Id** box, enter the value that you copied. - :::image type="content" source="./media/action-groups/action-groups-secure-webhook.png" alt-text="Screenshot that shows the Secured Webhook dialog in the Azure portal with the Object Id box." border="true"::: + :::image type="content" source="./media/action-groups/action-groups-secure-webhook.png" alt-text="Screenshot that shows the Secured Webhook dialog in the Azure portal with the Object ID box." border="true"::: #### Secure webhook PowerShell script |
azure-monitor | Activity Log Alerts Webhook | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/activity-log-alerts-webhook.md | For more information on activity log alerts, see how to [create Azure activity l For information on action groups, see how to [create action groups](./action-groups.md). > [!NOTE]-> You can also use the [common alert schema](./alerts-common-schema.md) for your webhook integrations. It provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor. [Learn about the common alert schema definitions](./alerts-common-schema-definitions.md). +> You can also use the [common alert schema](./alerts-common-schema.md) for your webhook integrations. It provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor. [Learn about the common alert schema](./alerts-common-schema.md). ## Authenticate the webhook For specific schema details on service health notification activity log alerts, | resourceProviderName |The resource provider of the affected resource. | | conditionType |Always `Event`. | | name |Name of the alert rule. |-| id |Resource ID of the alert. | +| ID |Resource ID of the alert. | | description |Alert description set when the alert is created. | | subscriptionId |Azure subscription ID. | | timestamp |Time at which the event was generated by the Azure service that processed the request. | |
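Pieced together from the field table above, a hypothetical (non-common schema) activity log webhook payload fragment might look like the following sketch. The `schemaId` value and the `data.context.activityLog` nesting are assumptions based on the legacy activity log layout this article describes, and all values are placeholders:

```json
{
  "schemaId": "Microsoft.Insights/activityLogs",
  "data": {
    "status": "Activated",
    "context": {
      "activityLog": {
        "name": "my-activity-log-alert",
        "description": "Alert description set when the alert is created",
        "subscriptionId": "<subscription ID>",
        "timestamp": "2023-02-16T09:00:00.000Z",
        "resourceProviderName": "Microsoft.Resources",
        "conditionType": "Event"
      }
    }
  }
}
```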
azure-monitor | Alerts Common Schema | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-common-schema.md | Title: Common alert schema for Azure Monitor alerts description: Understand the common alert schema, why you should use it, and how to enable it. Previously updated : 12/22/2022 Last updated : 02/16/2023 -The common alert schema standardizes the consumption experience for alert notifications in Azure. Historically, activity log, metric, and log alerts each had their own email templates and webhook schemas. The common alert schema provides one standardized schema for all alert notifications. +The common alert schema standardizes the consumption of Azure Monitor alert notifications. Historically, activity log, metric, and log alerts each had their own email templates and webhook schemas. The common alert schema provides one standardized schema for all alert notifications. A standardized schema can help you minimize the number of integrations, which simplifies the process of managing and maintaining your integrations. The common alert schema provides a consistent structure for: - Azure Functions - Azure Automation runbook -The new schema enables a richer alert consumption experience across both the Azure portal and the Azure mobile app. +The new schema enables a richer alert consumption experience in both the Azure portal and the Azure mobile app. > [!NOTE] > Alerts generated by [VM insights](../vm/vminsights-overview.md) do not support the common schema. The common schema includes information about the affected resource and the cause } ``` +For sample alerts that use the common schema, see [Sample alert payloads](alerts-payload-samples.md). ## Essentials fields | Field | Description| The common schema includes information about the affected resource and the cause | Severity | The severity of the alert. Possible values are Sev0, Sev1, Sev2, Sev3, or Sev4. | | signalType | Identifies the signal on which the alert rule was defined. Possible values are Metric, Log, or Activity Log. | | monitorCondition | When an alert fires, the alert's monitor condition is set to **Fired**. When the underlying condition that caused the alert to fire clears, the monitor condition is set to **Resolved**. |-| monitoringService | The monitoring service or solution that generated the alert. The fields for the alert context are dictated by the monitoring service. | +| monitoringService | The monitoring service or solution that generated the alert. The monitoring service determines which fields are in the alert context. | | alertTargetIds | The list of the Azure Resource Manager IDs that are affected targets of an alert. For a log alert defined on a Log Analytics workspace or Application Insights instance, it's the respective workspace or application. | | configurationItems |The list of affected resources of an alert.<br>In some cases, the configuration items can be different from the alert targets. 
For example, in metric-for-log or log alerts defined on a Log Analytics workspace, the configuration items are the actual resources sending the telemetry and not the workspace.<br><ul><li>In the log alerts API (Scheduled Query Rules) v2021-08-01, the `configurationItem` values are taken from explicitly defined dimensions in this priority: `Computer`, `_ResourceId`, `ResourceId`, `Resource`.</li><li>In earlier versions of the log alerts API, the `configurationItem` values are taken implicitly from the results in this priority: `Computer`, `_ResourceId`, `ResourceId`, `Resource`.</li></ul>In ITSM systems, the `configurationItems` field is used to correlate alerts to resources in a configuration management database. | | originAlertId | The ID of the alert instance, as generated by the monitoring service that generated it. | The common schema includes information about the affected resource and the cause |alertContextVersion | The version number for the `alertContext` section. | -## Alert context fields for metric alerts --### Sample metric alert with a static threshold and the monitoringService = `Platform` +## Alert context fields for metric alerts +
+|Field |Description |
+|||
+|properties |(Optional.) A collection of customer-defined properties. |
+|conditionType |The type of condition selected for the alert rule:<br> - static threshold<br> - dynamic threshold<br> - webtest |
+|condition |The condition evaluated by the alert rule. |
+|windowSize |The time period analyzed by the alert rule.|
+|allOf |Indicates that all conditions defined in the alert rule must be met to trigger an alert.|
+|alertSensitivity |In an alert rule with a dynamic threshold, indicates how sensitive the rule is, or how much the value can deviate from the upper or lower threshold.|
+|failingPeriods |In an alert rule with a dynamic threshold, the number of evaluation periods that must breach the alert threshold to trigger an alert. For example, you can indicate that an alert is triggered when three out of the last five evaluation periods aren't within the alert thresholds. |
+|numberOfEvaluationPeriods|The total number of evaluations. |
+|minFailingPeriodsToAlert|The minimum number of evaluations that don't meet the alert rule conditions.|
+|ignoreDataBefore |(Optional.) In an alert rule with a dynamic threshold, the date from which the threshold is calculated. Use this value to indicate that the rule shouldn't calculate the dynamic threshold using data from before the specified date. |
+|metricName |The name of the metric monitored by the alert rule. |
+|metricNamespace |The namespace of the metric monitored by the alert rule. |
+|operator |The logical operator of the alert rule. |
+|threshold |The threshold defined in the alert rule. For an alert rule with a dynamic threshold, this value is the calculated threshold. |
+|timeAggregation |The aggregation type of the alert rule. |
+|dimensions |The metric dimension that triggered the alert. |
+|name |The dimension name. |
+|value |The dimension value. |
+|metricValue |The metric value at the time that it violated the threshold. |
+|webTestName |If the condition type is `webtest`, the name of the webtest. |
+|windowStartTime |The start time of the evaluation window in which the alert fired. |
+|windowEndTime |The end time of the evaluation window in which the alert fired. |
++### Sample metric alert with a static threshold when the monitoringService = `Platform` ```json { The common schema includes information about the affected resource and the cause } ``` -### Sample metric alert with a dynamic threshold and the monitoringService = Platform +### Sample metric alert with a dynamic threshold when the monitoringService = `Platform` ```json { The common schema includes information about the affected resource and the cause } } ```-### Sample metric alert for availability tests and the monitoringService = Platform +### Sample metric alert for availability tests when the monitoringService = `Platform` ```json { The common schema includes information about the affected resource and the cause } ``` > - The common schema is not supported for log alerts using webhooks with a custom email subject and/or JSON payload, since the common schema overwrites the custom configurations. > - Alerts using the common schema have an upper size limit of 256 KB per alert. If the log alerts payload includes search results that cause the alert to exceed the maximum size, the search results aren't embedded in the log alerts payload. You can check if the payload includes the search results with the `IncludedSearchResults` flag. Use `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results with the [Log Analytics API](/rest/api/loganalytics/dataaccess/query/get) if the search results are not included. +
+|Field |Description |
+|||
+|SearchQuery |The query defined in the alert rule. |
+|SearchIntervalStartTimeUtc |The start time of the evaluation window in which the alert fired, in UTC. |
+|SearchIntervalEndTimeUtc |The end time of the evaluation window in which the alert fired, in UTC. |
+|ResultCount |The number of records returned by the query. For metric measurement rules, the number of records that match the specific dimension combination. |
+|LinkToSearchResults |A link to the search results. |
+|LinkToFilteredSearchResultsUI |For metric measurement rules, the link to the search results after they've been filtered by the dimension combinations. |
+|LinkToSearchResultsAPI |A link to the query results using the Log Analytics API. |
+|LinkToFilteredSearchResultsAPI |For metric measurement rules, the link to the search results using the Log Analytics API after they've been filtered by the dimension combinations. |
+|SearchIntervalDurationMin |The total number of minutes in the search interval. |
+|SearchIntervalInMin |The total number of minutes in the search interval. |
+|Threshold |The threshold defined in the alert rule. |
+|Operator |The operator defined in the alert rule. |
+|ApplicationID |The Application Insights ID on which the alert was triggered. |
+|Dimensions |For metric measurement rules, the metric dimensions on which the alert was triggered. |
+|name |The dimension name. |
+|value |The dimension value. |
+|SearchResults |The complete search results. |
+|table |The table of results in the search results. |
+|name |The name of the table in the search results. |
+|columns |The columns in the table. |
+|name |The name of the column. |
+|type |The type of the column. |
+|rows |The rows in the table. |
+|DataSources |The data sources on which the alert was triggered. |
+|resourceID |The resource ID affected by the alert. |
+|tables |The draft response tables included in the query. |
+|IncludedSearchResults | Flag that indicates if the payload should contain the results.
| +|AlertType |The alert type:<br> - Metric Measurement<br> - Number Of Results | +++ ### Sample log alert when the monitoringService = Platform ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample log alert when the monitoringService = Log Alerts V2 > [!NOTE] The common schema includes information about the affected resource and the cause } } ```- ## Alert context fields for activity log alerts +See [Azure activity log event schema](../essentials/activity-log-schema.md) for detailed information about the fields in activity log alerts. ### Sample activity log alert when the monitoringService = Activity Log - Administrative ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample activity log alert when the monitoringService = Activity Log - Policy ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample activity log alert when the monitoringService = Activity Log - Autoscale ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample activity log alert when the monitoringService = Activity Log - Security ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample activity log alert when the monitoringService = ServiceHealth ```json The common schema includes information about the affected resource and the cause } } ```- ### Sample activity log alert when the monitoringService = ResourceHealth ```json The common schema includes information about the affected resource and the cause } } ```- ## Alert context fields for Prometheus alerts +See [Azure Monitor managed service for Prometheus rule groups (preview)](../essentials/prometheus-rule-groups.md) for detailed information about the fields in Prometheus alerts. ### Sample Prometheus alert ```json The common schema includes information about the affected resource and the cause } } ```- ## Enable the common alert schema Use action groups in the Azure portal or use the REST API to enable the common alert schema. Schemas are defined at the action level. For example, you must separately enable the schema for an email action and a webhook action. Use action groups in the Azure portal or use the REST API to enable the common a 1. Open any existing action or a new action in an action group. 1. Select **Yes** to enable the common alert schema.- ### Enable the common schema using the REST API You can also use the [Action Groups API](/rest/api/monitor/actiongroups) to opt in to the common alert schema. In the [create or update](/rest/api/monitor/actiongroups/createorupdate) REST API call, - Set the "useCommonAlertSchema" flag to `true` to enable the common schema - Set the "useCommonAlertSchema" flag to `false` to use the non-common schema for email, webhook, Logic Apps, Azure Functions, or Automation runbook actions.-- #### Sample REST API call for using the common schema The following [create or update](/rest/api/monitor/actiongroups/createorupdate) REST API request: The following [create or update](/rest/api/monitor/actiongroups/createorupdate) "tags": {} } ```- ## Next steps - [Learn how to create a logic app that uses the common alert schema to handle all your alerts](./alerts-common-schema-integrations.md) |
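To make the flag concrete, here's a trimmed, hedged sketch of a [create or update](/rest/api/monitor/actiongroups/createorupdate) request body with one email receiver and one webhook receiver opted in to the common schema. The receiver names, email address, and service URI are illustrative placeholders:

```json
{
  "location": "Global",
  "properties": {
    "groupShortName": "sample",
    "enabled": true,
    "emailReceivers": [
      {
        "name": "Sample email",
        "emailAddress": "user@contoso.com",
        "useCommonAlertSchema": true
      }
    ],
    "webhookReceivers": [
      {
        "name": "Sample webhook",
        "serviceUri": "https://www.example.com/webhook",
        "useCommonAlertSchema": true
      }
    ]
  },
  "tags": {}
}
```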
azure-monitor | Alerts Log Webhook | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-log-webhook.md | Last updated 2/23/2022 [Log alert](alerts-log.md) supports [configuring webhook action groups](./action-groups.md#webhook). In this article, we'll describe what properties are available. Webhook actions allow you to invoke a single HTTP POST request. The service that's called should support webhooks and know how to use the payload it receives. > [!NOTE]-> It is recommended you use [common alert schema](../alerts/alerts-common-schema.md) for your webhook integrations. The common alert schema provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor. For log alerts rules that have a custom JSON payload defined, enabling the common alert schema reverts the payload schema to the one described [here](../alerts/alerts-common-schema-definitions.md#log-alerts). This means that if you want to have a custom JSON payload defined, the webhook can't use the common alert schema. Alerts with the common schema enabled have an upper size limit of 256 KB per alert, bigger alert will not include search results. When the search results aren't included, you should use the `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results via the Log Analytics API. +> It is recommended that you use the [common alert schema](../alerts/alerts-common-schema.md) for your webhook integrations. The common alert schema provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor. For log alert rules that have a custom JSON payload defined, enabling the common alert schema reverts the payload schema to the one described [here](../alerts/alerts-common-schema.md#alert-context-fields-for-log-alerts). This means that if you want to have a custom JSON payload defined, the webhook can't use the common alert schema. Alerts with the common schema enabled have an upper size limit of 256 KB per alert; larger alerts don't include search results. When the search results aren't included, use `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results via the Log Analytics API. ## Sample payloads This section shows sample webhook payloads for log alerts. The sample payloads include examples of both standard and custom payloads. |
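As a hedged sketch of the custom-payload alternative mentioned in that note, a custom JSON webhook body can combine static text with substitution tokens. The token names used here (`#alertrulename`, `#searchresultcount`, `#linktosearchresults`) are assumed from this article's parameter list, so verify them against the published table:

```json
{
  "alertname": "#alertrulename",
  "resultcount": "#searchresultcount",
  "resultslink": "#linktosearchresults"
}
```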
azure-monitor | Alerts Metric Near Real Time | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-metric-near-real-time.md | Here's the full list of Azure Monitor metric sources supported by the newer aler ## Payload schema > [!NOTE]-> You can also use the [common alert schema](./alerts-common-schema.md), which provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor, for your webhook integrations. [Learn about the common alert schema definitions](./alerts-common-schema-definitions.md). +> You can also use the [common alert schema](./alerts-common-schema.md), which provides the advantage of having a single extensible and unified alert payload across all the alert services in Azure Monitor, for your webhook integrations. The POST operation contains the following JSON payload and schema for all newer metric alerts when an appropriately configured [action group](./action-groups.md) is used: |
azure-monitor | Alerts Payload Samples | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-payload-samples.md | + + Title: Samples of Azure Monitor alert payloads +description: See examples of payloads for Azure Monitor alerts. + Last updated : 01/23/2023++++++# Sample alert payloads ++The common alert schema standardizes the consumption experience for alert notifications in Azure. Historically, activity log, metric, and log alerts each had their own email templates and webhook schemas. The common alert schema provides one standardized schema for all alert notifications. ++A standardized schema can help you minimize the number of integrations, which simplifies the process of managing and maintaining your integrations. ++The common schema includes information about the affected resource and the cause of the alert in these sections:
+- **Essentials**: Standardized fields, used by all alert types, that describe the resource affected by the alert and common alert metadata, such as severity or description.
+
+ If you want to route alert instances to specific teams based on criteria such as a resource group, you can use the fields in the **Essentials** section to provide routing logic for all alert types. The teams that receive the alert notification can then use the context fields for their investigation.
+- **Alert context**: Fields that vary depending on the type of the alert. The alert context fields describe the cause of the alert. For example, a metric alert would have fields like the metric name and metric value in the alert context. An activity log alert would have information about the event that generated the alert.
+
+## Sample alert payload
+
+```json
+{
+  "schemaId": "azureMonitorCommonAlertSchema",
+  "data": {
+    "essentials": {
+      "alertId": "/subscriptions/<subscription ID>/providers/Microsoft.AlertsManagement/alerts/b9569717-bc32-442f-add5-83a997729330",
+      "alertRule": "WCUS-R2-Gen2",
+      "severity": "Sev3",
+      "signalType": "Metric",
+      "monitorCondition": "Resolved",
+      "monitoringService": "Platform",
+      "alertTargetIDs": [
+        "/subscriptions/<subscription ID>/resourcegroups/pipelinealertrg/providers/microsoft.compute/virtualmachines/wcus-r2-gen2"
+      ],
+      "configurationItems": [
+        "wcus-r2-gen2"
+      ],
+      "originAlertId": "3f2d4487-b0fc-4125-8bd5-7ad17384221e_PipeLineAlertRG_microsoft.insights_metricAlerts_WCUS-R2-Gen2_-117781227",
+      "firedDateTime": "2019-03-22T13:58:24.3713213Z",
+      "resolvedDateTime": "2019-03-22T14:03:16.2246313Z",
+      "description": "",
+      "essentialsVersion": "1.0",
+      "alertContextVersion": "1.0"
+    },
+    "alertContext": {
+      "properties": null,
+      "conditionType": "SingleResourceMultipleMetricCriteria",
+      "condition": {
+        "windowSize": "PT5M",
+        "allOf": [
+          {
+            "metricName": "Percentage CPU",
+            "metricNamespace": "Microsoft.Compute/virtualMachines",
+            "operator": "GreaterThan",
+            "threshold": "25",
+            "timeAggregation": "Average",
+            "dimensions": [
+              {
+                "name": "ResourceId",
+                "value": "3efad9dc-3d50-4eac-9c87-8b3fd6f97e4e"
+              }
+            ],
+            "metricValue": 7.727
+          }
+        ]
+      }
+    }
+  }
+}
+```
+
+
+## Sample metric alerts
+The following are sample metric alert payloads.
++### Metric alert with a static threshold and the monitoringService = `Platform` ++```json +{ + "alertContext": { + "properties": null, + "conditionType": "SingleResourceMultipleMetricCriteria", + "condition": { + "windowSize": "PT5M", + "allOf": [ + { + "metricName": "Percentage CPU", + "metricNamespace": "Microsoft.Compute/virtualMachines", + "operator": "GreaterThan", + "threshold": "25", + "timeAggregation": "Average", + "dimensions": [ + { + "name": "ResourceId", + "value": "3efad9dc-3d50-4eac-9c87-8b3fd6f97e4e" + } + ], + "metricValue": 31.1105 + } + ], + "windowStartTime": "2019-03-22T13:40:03.064Z", + "windowEndTime": "2019-03-22T13:45:03.064Z" + } + } +} +``` ++### Metric alert with a dynamic threshold and the monitoringService = Platform ++```json +{ + "alertContext": { + "properties": null, + "conditionType": "DynamicThresholdCriteria", + "condition": { + "windowSize": "PT5M", + "allOf": [ + { + "alertSensitivity": "High", + "failingPeriods": { + "numberOfEvaluationPeriods": 1, + "minFailingPeriodsToAlert": 1 + }, + "ignoreDataBefore": null, + "metricName": "Egress", + "metricNamespace": "microsoft.storage/storageaccounts", + "operator": "GreaterThan", + "threshold": "47658", + "timeAggregation": "Total", + "dimensions": [], + "metricValue": 50101 + } + ], + "windowStartTime": "2021-07-20T05:07:26.363Z", + "windowEndTime": "2021-07-20T05:12:26.363Z" + } + } +} +``` +### Metric alert for availability tests and the monitoringService = Platform ++```json +{ + "alertContext": { + "properties": null, + "conditionType": "WebtestLocationAvailabilityCriteria", + "condition": { + "windowSize": "PT5M", + "allOf": [ + { + "metricName": "Failed Location", + "metricNamespace": null, + "operator": "GreaterThan", + "threshold": "2", + "timeAggregation": "Sum", + "dimensions": [], + "metricValue": 5, + "webTestName": "myAvailabilityTest-myApplication" + } + ], + "windowStartTime": "2019-03-22T13:40:03.064Z", + "windowEndTime": "2019-03-22T13:45:03.064Z" + } + } +} +``` ++## Sample log alerts ++> [!NOTE] +> When you enable the common schema, the fields in the payload are reset to the common schema fields. Therefore, log alerts have these limitations regarding the common schema: +> - The common schema is not supported for log alerts using webhooks with a custom email subject and/or JSON payload, since the common schema overwrites the custom configurations. +> - Alerts using the common schema have an upper size limit of 256 KB per alert. If the log alerts payload includes search results that cause the alert to exceed the maximum size, the search results aren't embedded in the log alerts payload. You can check if the payload includes the search results with the `IncludedSearchResults` flag. Use `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results with the [Log Analytics API](/rest/api/loganalytics/dataaccess/query/get) if the search results are not included. 
++### Log alert with monitoringService = Platform
+
+```json
+{
+  "alertContext": {
+    "SearchQuery": "Perf | where ObjectName == \"Processor\" and CounterName == \"% Processor Time\" | summarize AggregatedValue = avg(CounterValue) by bin(TimeGenerated, 5m), Computer",
+    "SearchIntervalStartTimeUtc": "3/22/2019 1:36:31 PM",
+    "SearchIntervalEndtimeUtc": "3/22/2019 1:51:31 PM",
+    "ResultCount": 2,
+    "LinkToSearchResults": "https://portal.azure.com/#Analyticsblade/search/index?_timeInterval.intervalEnd=2018-03-26T09%3a10%3a40.0000000Z&_timeInterval.intervalDuration=3600&q=Usage",
+    "LinkToFilteredSearchResultsUI": "https://portal.azure.com/#Analyticsblade/search/index?_timeInterval.intervalEnd=2018-03-26T09%3a10%3a40.0000000Z&_timeInterval.intervalDuration=3600&q=Usage",
+    "LinkToSearchResultsAPI": "https://api.loganalytics.io/v1/workspaces/workspaceID/query?query=Heartbeat&timespan=2020-05-07T18%3a11%3a51.0000000Z%2f2020-05-07T18%3a16%3a51.0000000Z",
+    "LinkToFilteredSearchResultsAPI": "https://api.loganalytics.io/v1/workspaces/workspaceID/query?query=Heartbeat&timespan=2020-05-07T18%3a11%3a51.0000000Z%2f2020-05-07T18%3a16%3a51.0000000Z",
+    "SeverityDescription": "Warning",
+    "WorkspaceId": "12345a-1234b-123c-123d-12345678e",
+    "SearchIntervalDurationMin": "15",
+    "AffectedConfigurationItems": [
+      "INC-Gen2Alert"
+    ],
+    "SearchIntervalInMinutes": "15",
+    "Threshold": 10000,
+    "Operator": "Less Than",
+    "Dimensions": [
+      {
+        "name": "Computer",
+        "value": "INC-Gen2Alert"
+      }
+    ],
+    "SearchResults": {
+      "tables": [
+        {
+          "name": "PrimaryResult",
+          "columns": [
+            {
+              "name": "$table",
+              "type": "string"
+            },
+            {
+              "name": "Computer",
+              "type": "string"
+            },
+            {
+              "name": "TimeGenerated",
+              "type": "datetime"
+            }
+          ],
+          "rows": [
+            [
+              "Fabrikam",
+              "33446677a",
+              "2018-02-02T15:03:12.18Z"
+            ],
+            [
+              "Contoso",
+              "33445566b",
+              "2018-02-02T15:16:53.932Z"
+            ]
+          ]
+        }
+      ],
+      "dataSources": [
+        {
+          "resourceId": "/subscriptions/a5ea55e2-7482-49ba-90b3-60e7496dd873/resourcegroups/test/providers/microsoft.operationalinsights/workspaces/test",
+          "tables": [
+            "Heartbeat"
+          ]
+        }
+      ]
+    },
+    "IncludedSearchResults": "True",
+    "AlertType": "Metric measurement"
+  }
+}
+```
+### Log alert with monitoringService = Application Insights
+
+```json
+{
+  "alertContext": {
+    "SearchQuery": "requests | where resultCode == \"500\" | summarize AggregatedValue = Count by bin(Timestamp, 5m), IP",
+    "SearchIntervalStartTimeUtc": "3/22/2019 1:36:33 PM",
+    "SearchIntervalEndtimeUtc": "3/22/2019 1:51:33 PM",
+    "ResultCount": 2,
+    "LinkToSearchResults": "https://portal.azure.com/AnalyticsBlade/subscriptions/12345a-1234b-123c-123d-12345678e/?query=search+*+&timeInterval.intervalEnd=2018-03-26T09%3a10%3a40.0000000Z&_timeInterval.intervalDuration=3600&q=Usage",
+    "LinkToFilteredSearchResultsUI": "https://portal.azure.com/AnalyticsBlade/subscriptions/12345a-1234b-123c-123d-12345678e/?query=search+*+&timeInterval.intervalEnd=2018-03-26T09%3a10%3a40.0000000Z&_timeInterval.intervalDuration=3600&q=Usage",
+    "LinkToSearchResultsAPI": "https://api.applicationinsights.io/v1/apps/0MyAppId0/metrics/requests/count",
+    "LinkToFilteredSearchResultsAPI": "https://api.applicationinsights.io/v1/apps/0MyAppId0/metrics/requests/count",
+    "SearchIntervalDurationMin": "15",
+    "SearchIntervalInMinutes": "15",
+    "Threshold": 10000.0,
+    "Operator": "Less Than",
+    "ApplicationId": "8e20151d-75b2-4d66-b965-153fb69d65a6",
+    "Dimensions": [
+      {
+        "name": "IP",
+        "value": "1.1.1.1"
+      }
+    ],
+    "SearchResults": {
+      "tables": [
+        {
+          "name": "PrimaryResult",
+          "columns": [
+            {
+              "name": "$table",
+              "type": "string"
+            },
+            {
+              "name": "Id",
+              "type": "string"
+            },
+            {
+              "name": "Timestamp",
+              "type": "datetime"
+            }
+          ],
+          "rows": [
+            [
+              "Fabrikam",
+              "33446677a",
+              "2018-02-02T15:03:12.18Z"
+            ],
+            [
+              "Contoso",
+              "33445566b",
+              "2018-02-02T15:16:53.932Z"
+            ]
+          ]
+        }
+      ],
+      "dataSources": [
+        {
+          "resourceId": "/subscriptions/a5ea27e2-7482-49ba-90b3-52e7496dd873/resourcegroups/test/providers/microsoft.operationalinsights/workspaces/test",
+          "tables": [
+            "Heartbeat"
+          ]
+        }
+      ]
+    },
+    "IncludedSearchResults": "True",
+    "AlertType": "Metric measurement"
+  }
+}
+```
+
+### Log alert with monitoringService = Log Alerts V2
+
+> [!NOTE]
+> Log alert rules from API version 2020-05-01 use this payload type, which only supports common schema. Search results aren't embedded in the log alerts payload when you use this version. Use [dimensions](./alerts-unified-log.md#split-by-alert-dimensions) to provide context to fired alerts. You can also use `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results with the [Log Analytics API](/rest/api/loganalytics/dataaccess/query/get). If you must embed the results, use a logic app with the provided links to generate a custom payload.
+
+```json
+{
+  "alertContext": {
+    "properties": {
+      "name1": "value1",
+      "name2": "value2"
+    },
+    "conditionType": "LogQueryCriteria",
+    "condition": {
+      "windowSize": "PT10M",
+      "allOf": [
+        {
+          "searchQuery": "Heartbeat",
+          "metricMeasureColumn": "CounterValue",
+          "targetResourceTypes": "['Microsoft.Compute/virtualMachines']",
+          "operator": "LowerThan",
+          "threshold": "1",
+          "timeAggregation": "Count",
+          "dimensions": [
+            {
+              "name": "Computer",
+              "value": "TestComputer"
+            }
+          ],
+          "metricValue": 0.0,
+          "failingPeriods": {
+            "numberOfEvaluationPeriods": 1,
+            "minFailingPeriodsToAlert": 1
+          },
+          "linkToSearchResultsUI": "https://portal.azure.com#@12345a-1234b-123c-123d-12345678e/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%212345a-1234b-123c-123d-12345678e%2FresourceGroups%2FContoso%2Fproviders%2FMicrosoft.Compute%2FvirtualMachines%2FContoso%22%7D%5D%7D/q/eJzzSE0sKklKTSypUSjPSC1KVQjJzE11T81LLUosSU1RSEotKU9NzdNIAfJKgDIaRgZGBroG5roGliGGxlYmJlbGJnoGEKCpp4dDmSmKMk0A/prettify/1/timespan/2020-07-07T13%3a54%3a34.0000000Z%2f2020-07-09T13%3a54%3a34.0000000Z",
+          "linkToFilteredSearchResultsUI": "https://portal.azure.com#@12345a-1234b-123c-123d-12345678e/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%212345a-1234b-123c-123d-12345678e%2FresourceGroups%2FContoso%2Fproviders%2FMicrosoft.Compute%2FvirtualMachines%2FContoso%22%7D%5D%7D/q/eJzzSE0sKklKTSypUSjPSC1KVQjJzE11T81LLUosSU1RSEotKU9NzdNIAfJKgDIaRgZGBroG5roGliGGxlYmJlbGJnoGEKCpp4dDmSmKMk0A/prettify/1/timespan/2020-07-07T13%3a54%3a34.0000000Z%2f2020-07-09T13%3a54%3a34.0000000Z",
+          "linkToSearchResultsAPI": "https://api.loganalytics.io/v1/subscriptions/12345a-1234b-123c-123d-12345678e/resourceGroups/Contoso/providers/Microsoft.Compute/virtualMachines/Contoso/query?query=Heartbeat%7C%20where%20TimeGenerated%20between%28datetime%282020-07-09T13%3A44%3A34.0000000%29..datetime%282020-07-09T13%3A54%3A34.0000000%29%29&timespan=2020-07-07T13%3a54%3a34.0000000Z%2f2020-07-09T13%3a54%3a34.0000000Z",
+          "linkToFilteredSearchResultsAPI": "https://api.loganalytics.io/v1/subscriptions/12345a-1234b-123c-123d-12345678e/resourceGroups/Contoso/providers/Microsoft.Compute/virtualMachines/Contoso/query?query=Heartbeat%7C%20where%20TimeGenerated%20between%28datetime%282020-07-09T13%3A44%3A34.0000000%29..datetime%282020-07-09T13%3A54%3A34.0000000%29%29&timespan=2020-07-07T13%3a54%3a34.0000000Z%2f2020-07-09T13%3a54%3a34.0000000Z"
+        }
+      ],
+      "windowStartTime": "2020-07-07T13:54:34Z",
+      "windowEndTime": "2020-07-09T13:54:34Z"
+    }
+  }
+}
+```
+
+## Sample activity log alerts
+
+### Activity log alert with monitoringService = `Activity Log - Administrative`
+
+```json
+{
+  "alertContext": {
+    "authorization": {
+      "action": "Microsoft.Compute/virtualMachines/restart/action",
+      "scope": "/subscriptions/<subscription ID>/resourceGroups/PipeLineAlertRG/providers/Microsoft.Compute/virtualMachines/WCUS-R2-ActLog"
+    },
+    "channels": "Operation",
+    "claims": "{\"aud\":\"https://management.core.windows.net/\",\"iss\":\"https://sts.windows.net/12345a-1234b-123c-123d-12345678e/\",\"iat\":\"1553260826\",\"nbf\":\"1553260826\",\"exp\":\"1553264726\",\"aio\":\"42JgYNjdt+rr+3j/dx68v018XhuFAwA=\",\"appid\":\"e9a02282-074f-45cf-93b0-50568e0e7e50\",\"appidacr\":\"2\",\"http://schemas.microsoft.com/identity/claims/identityprovider\":\"https://sts.windows.net/12345a-1234b-123c-123d-12345678e/\",\"http://schemas.microsoft.com/identity/claims/objectidentifier\":\"9778283b-b94c-4ac6-8a41-d5b493d03aa3\",\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\":\"9778283b-b94c-4ac6-8a41-d5b493d03aa3\",\"http://schemas.microsoft.com/identity/claims/tenantid\":\"12345a-1234b-123c-123d-12345678e\",\"uti\":\"v5wYC9t9ekuA2rkZSVZbAA\",\"ver\":\"1.0\"}",
+    "caller": "9778283b-b94c-4ac6-8a41-d5b493d03aa3",
+    "correlationId": "8ee9c32a-92a1-4a8f-989c-b0ba09292a91",
+    "eventSource": "Administrative",
+    "eventTimestamp": "2019-03-22T13:56:31.2917159+00:00",
+    "eventDataId": "161fda7e-1cb4-4bc5-9c90-857c55a8f57b",
+    "level": "Informational",
+    "operationName": "Microsoft.Compute/virtualMachines/restart/action",
+    "operationId": "310db69b-690f-436b-b740-6103ab6b0cba",
+    "status": "Succeeded",
+    "subStatus": "",
+    "submissionTimestamp": "2019-03-22T13:56:54.067593+00:00"
+  }
+}
+```
+
+### Activity log alert with monitoringService = `Activity Log - Policy`
+
+```json
+{
+  "alertContext": {
+    "authorization": {
+      "action": "Microsoft.Resources/checkPolicyCompliance/read",
+      "scope": "/subscriptions/<GUID>"
+    },
+    "channels": "Operation",
+    "claims": "{\"aud\":\"https://management.azure.com/\",\"iss\":\"https://sts.windows.net/<GUID>/\",\"iat\":\"1566711059\",\"nbf\":\"1566711059\",\"exp\":\"1566740159\",\"aio\":\"42FgYOhynHNw0scy3T/bL71+xLyqEwA=\",\"appid\":\"<GUID>\",\"appidacr\":\"2\",\"http://schemas.microsoft.com/identity/claims/identityprovider\":\"https://sts.windows.net/<GUID>/\",\"http://schemas.microsoft.com/identity/claims/objectidentifier\":\"<GUID>\",\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\":\"<GUID>\",\"http://schemas.microsoft.com/identity/claims/tenantid\":\"<GUID>\",\"uti\":\"Miy1GzoAG0Scu_l3m1aIAA\",\"ver\":\"1.0\"}",
+    "caller": "<GUID>",
+    "correlationId": "<GUID>",
+    "eventSource": "Policy",
+    "eventTimestamp": "2019-08-25T11:11:34.2269098+00:00",
+    "eventDataId": "<GUID>",
+    "level": "Warning",
+    "operationName": "Microsoft.Authorization/policies/audit/action",
+    "operationId": "<GUID>",
+    "properties": {
+      "isComplianceCheck": "True",
+      "resourceLocation": "eastus2",
+      "ancestors": "<GUID>",
+      "policies":
"[{\"policyDefinitionId\":\"/providers/Microsoft.Authorization/policyDefinitions/<GUID>/\",\"policySetDefinitionId\":\"/providers/Microsoft.Authorization/policySetDefinitions/<GUID>/\",\"policyDefinitionReferenceId\":\"vulnerabilityAssessmentMonitoring\",\"policySetDefinitionName\":\"<GUID>\",\"policyDefinitionName\":\"<GUID>\",\"policyDefinitionEffect\":\"AuditIfNotExists\",\"policyAssignmentId\":\"/subscriptions/<GUID>/providers/Microsoft.Authorization/policyAssignments/SecurityCenterBuiltIn/\",\"policyAssignmentName\":\"SecurityCenterBuiltIn\",\"policyAssignmentScope\":\"/subscriptions/<GUID>\",\"policyAssignmentSku\":{\"name\":\"A1\",\"tier\":\"Standard\"},\"policyAssignmentParameters\":{}}]" + }, + "status": "Succeeded", + "subStatus": "", + "submissionTimestamp": "2019-08-25T11:12:46.1557298+00:00" + } +} +``` ++### Activity log alert with monitoringService = `Activity Log - Autoscale` ++```json +{ + "alertContext": { + "channels": "Admin, Operation", + "claims": "{\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/spn\":\"Microsoft.Insights/autoscaleSettings\"}", + "caller": "Microsoft.Insights/autoscaleSettings", + "correlationId": "<GUID>", + "eventSource": "Autoscale", + "eventTimestamp": "2019-08-21T16:17:47.1551167+00:00", + "eventDataId": "<GUID>", + "level": "Informational", + "operationName": "Microsoft.Insights/AutoscaleSettings/Scaleup/Action", + "operationId": "<GUID>", + "properties": { + "description": "The autoscale engine attempting to scale resource '/subscriptions/d<GUID>/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachineScaleSets/testVMSS' from 9 instances count to 10 instances count.", + "resourceName": "/subscriptions/<GUID>/resourceGroups/voiceassistancedemo/providers/Microsoft.Compute/virtualMachineScaleSets/alexademo", + "oldInstancesCount": "9", + "newInstancesCount": "10", + "activeAutoscaleProfile": "{\r\n \"Name\": \"Auto created scale condition\",\r\n \"Capacity\": {\r\n \"Minimum\": \"1\",\r\n \"Maximum\": \"10\",\r\n \"Default\": \"1\"\r\n },\r\n \"Rules\": [\r\n {\r\n \"MetricTrigger\": {\r\n \"Name\": \"Percentage CPU\",\r\n \"Namespace\": \"microsoft.compute/virtualmachinescalesets\",\r\n \"Resource\": \"/subscriptions/<GUID>/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachineScaleSets/testVMSS\",\r\n \"ResourceLocation\": \"eastus\",\r\n \"TimeGrain\": \"PT1M\",\r\n \"Statistic\": \"Average\",\r\n \"TimeWindow\": \"PT5M\",\r\n \"TimeAggregation\": \"Average\",\r\n \"Operator\": \"GreaterThan\",\r\n \"Threshold\": 0.0,\r\n \"Source\": \"/subscriptions/<GUID>/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachineScaleSets/testVMSS\",\r\n \"MetricType\": \"MDM\",\r\n \"Dimensions\": [],\r\n \"DividePerInstance\": false\r\n },\r\n \"ScaleAction\": {\r\n \"Direction\": \"Increase\",\r\n \"Type\": \"ChangeCount\",\r\n \"Value\": \"1\",\r\n \"Cooldown\": \"PT1M\"\r\n }\r\n }\r\n ]\r\n}", + "lastScaleActionTime": "Wed, 21 Aug 2019 16:17:47 GMT" + }, + "status": "Succeeded", + "submissionTimestamp": "2019-08-21T16:17:47.2410185+00:00" + } +} +``` ++### Activity log alert with monitoringService = `Activity Log - Security` ++```json +{ + "alertContext": { + "channels": "Operation", + "correlationId": "<GUID>", + "eventSource": "Security", + "eventTimestamp": "2019-08-26T08:34:14+00:00", + "eventDataId": "<GUID>", + "level": "Informational", + "operationName": "Microsoft.Security/locations/alerts/activate/action", + "operationId": "<GUID>", + "properties": { + "threatStatus": "Quarantined", + "category": "Virus", + 
"threatID": "2147519003", + "filePath": "C:\\AlertGeneration\\test.eicar", + "protectionType": "Windows Defender", + "actionTaken": "Blocked", + "resourceType": "Virtual Machine", + "severity": "Low", + "compromisedEntity": "testVM", + "remediationSteps": "[\"No user action is necessary\"]", + "attackedResourceType": "Virtual Machine" + }, + "status": "Active", + "submissionTimestamp": "2019-08-26T09:28:58.3019107+00:00" + } +} +``` ++### Activity log alert with `monitoringService = ServiceHealth` ++```json +{ + "alertContext": { + "authorization": null, + "channels": 1, + "claims": null, + "caller": null, + "correlationId": "f3cf2430-1ee3-4158-8e35-7a1d615acfc7", + "eventSource": 2, + "eventTimestamp": "2019-06-24T11:31:19.0312699+00:00", + "httpRequest": null, + "eventDataId": "<GUID>", + "level": 3, + "operationName": "Microsoft.ServiceHealth/maintenance/action", + "operationId": "<GUID>", + "properties": { + "title": "Azure Synapse Analytics Scheduled Maintenance Pending", + "service": "Azure Synapse Analytics", + "region": "East US", + "communication": "<MESSAGE>", + "incidentType": "Maintenance", + "trackingId": "<GUID>", + "impactStartTime": "2019-06-26T04:00:00Z", + "impactMitigationTime": "2019-06-26T12:00:00Z", + "impactedServices": "[{\"ImpactedRegions\":[{\"RegionName\":\"East US\"}],\"ServiceName\":\"Azure Synapse Analytics\"}]", + "impactedServicesTableRows": "<tr>\r\n<td align='center' style='padding: 5px 10px; border-right:1px solid black; border-bottom:1px solid black'>Azure Synapse Analytics</td>\r\n<td align='center' style='padding: 5px 10px; border-bottom:1px solid black'>East US<br></td>\r\n</tr>\r\n", + "defaultLanguageTitle": "Azure Synapse Analytics Scheduled Maintenance Pending", + "defaultLanguageContent": "<MESSAGE>", + "stage": "Planned", + "communicationId": "<GUID>", + "maintenanceId": "<GUID>", + "isHIR": "false", + "version": "0.1.1" + }, + "status": "Active", + "subStatus": null, + "submissionTimestamp": "2019-06-24T11:31:31.7147357+00:00", + "ResourceType": null + } +} +``` ++### Activity log alert with monitoringService = `ResourceHealth` ++```json +{ + "alertContext": { + "channels": "Admin, Operation", + "correlationId": "<GUID>", + "eventSource": "ResourceHealth", + "eventTimestamp": "2019-06-24T15:42:54.074+00:00", + "eventDataId": "<GUID>", + "level": "Informational", + "operationName": "Microsoft.Resourcehealth/healthevent/Activated/action", + "operationId": "<GUID>", + "properties": { + "title": "This virtual machine is stopping and deallocating as requested by an authorized user or process", + "details": null, + "currentHealthStatus": "Unavailable", + "previousHealthStatus": "Available", + "type": "Downtime", + "cause": "UserInitiated" + }, + "status": "Active", + "submissionTimestamp": "2019-06-24T15:45:20.4488186+00:00" + } +} +``` ++## Sample Prometheus alert ++```json +{ + "alertContext": { + "interval": "PT1M", + "expression": "sql_up > 0", + "expressionValue": "0", + "for": "PT2M", + "labels": { + "Environment": "Prod", + "cluster": "myCluster1" + }, + "annotations": { + "summary": "alert on SQL availability" + }, + "ruleGroup": "/subscriptions/<subscription ID>/resourceGroups/myResourceGroup/providers/Microsoft.AlertsManagement/prometheusRuleGroups/myRuleGroup" + } +} +``` ++## Sample payloads for test actions ++### Sample test action alert +```json +{ + "schemaId": "azureMonitorCommonAlertSchema", + "data": { + "essentials": { + "alertId": "/subscriptions/<subscription 
ID>/providers/Microsoft.AlertsManagement/alerts/b9569717-bc32-442f-add5-83a997729330", + "alertRule": "WCUS-R2-Gen2", + "severity": "Sev3", + "signalType": "Metric", + "monitorCondition": "Resolved", + "monitoringService": "Platform", + "alertTargetIDs": [ + "/subscriptions/<subscription ID>/resourcegroups/pipelinealertrg/providers/microsoft.compute/virtualmachines/wcus-r2-gen2" + ], + "configurationItems": [ + "wcus-r2-gen2" + ], + "originAlertId": "3f2d4487-b0fc-4125-8bd5-7ad17384221e_PipeLineAlertRG_microsoft.insights_metricAlerts_WCUS-R2-Gen2_-117781227", + "firedDateTime": "2019-03-22T13:58:24.3713213Z", + "resolvedDateTime": "2019-03-22T14:03:16.2246313Z", + "description": "", + "essentialsVersion": "1.0", + "alertContextVersion": "1.0" + }, + "alertContext": { + "properties": null, + "conditionType": "SingleResourceMultipleMetricCriteria", + "condition": { + "windowSize": "PT5M", + "allOf": [ + { + "metricName": "Percentage CPU", + "metricNamespace": "Microsoft.Compute/virtualMachines", + "operator": "GreaterThan", + "threshold": "25", + "timeAggregation": "Average", + "dimensions": [ + { + "name": "ResourceId", + "value": "3efad9dc-3d50-4eac-9c87-8b3fd6f97e4e" + } + ], + "metricValue": 7.727 + } + ] + } + } + } +} +``` ++### Sample test action metric alerts ++#### Test action metric alert with a static threshold and the monitoringService = `Platform` ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-metricAlertRule", + "severity":"Sev3", + "signalType":"Metric", + "monitorCondition":"Fired", + "monitoringService":"Platform", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/Microsoft.Storage/storageAccounts/test-storageAccount" + ], + "configurationItems":[ + "test-storageAccount" + ], + "originAlertId":"11111111-1111-1111-1111-111111111111_test-RG_microsoft.insights_metricAlerts_test-metricAlertRule_1234567890", + "firedDateTime":"2021-11-15T09:35:24.3468506Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "properties":{ + "customKey1":"value1", + "customKey2":"value2" + }, + "conditionType":"DynamicThresholdCriteria", + "condition":{ + "windowSize":"PT15M", + "allOf":[ + { + "alertSensitivity":"Low", + "failingPeriods":{ + "numberOfEvaluationPeriods":3, + "minFailingPeriodsToAlert":3 + }, + "ignoreDataBefore":null, + "metricName":"Transactions", + "metricNamespace":"Microsoft.Storage/storageAccounts", + "operator":"GreaterThan", + "threshold":"0.3", + "timeAggregation":"Average", + "dimensions":[ + + ], + "metricValue":78.09, + "webTestName":null + } + ], + "windowStartTime":"2021-12-15T01:04:11.719Z", + "windowEndTime":"2021-12-15T01:19:11.719Z" + } + }, + "customProperties":{ + "customKey1":"value1", + "customKey2":"value2" + } + } +} +``` ++#### Test action metric alert with dynamic threshold and monitoringService = `Platform` +++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-metricAlertRule", + "severity":"Sev3", + "signalType":"Metric", + "monitorCondition":"Fired", + "monitoringService":"Platform", + "alertTargetIDs":[ + 
"/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/Microsoft.Storage/storageAccounts/test-storageAccount" + ], + "configurationItems":[ + "test-storageAccount" + ], + "originAlertId":"11111111-1111-1111-1111-111111111111_test-RG_microsoft.insights_metricAlerts_test-metricAlertRule_1234567890", + "firedDateTime":"2021-11-15T09:35:24.3468506Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "properties":{ + "customKey1":"value1", + "customKey2":"value2" + }, + "conditionType":"DynamicThresholdCriteria", + "condition":{ + "windowSize":"PT15M", + "allOf":[ + { + "alertSensitivity":"Low", + "failingPeriods":{ + "numberOfEvaluationPeriods":3, + "minFailingPeriodsToAlert":3 + }, + "ignoreDataBefore":null, + "metricName":"Transactions", + "metricNamespace":"Microsoft.Storage/storageAccounts", + "operator":"GreaterThan", + "threshold":"0.3", + "timeAggregation":"Average", + "dimensions":[ + + ], + "metricValue":78.09, + "webTestName":null + } + ], + "windowStartTime":"2021-12-15T01:04:11.719Z", + "windowEndTime":"2021-12-15T01:19:11.719Z" + } + }, + "customProperties":{ + "customKey1":"value1", + "customKey2":"value2" + } + } +} +``` ++### Sample test action log alerts ++#### Test action log alert V1 ΓÇô Metric ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-logAlertRule-v1-metricMeasurement", + "severity":"Sev3", + "signalType":"Log", + "monitorCondition":"Fired", + "monitoringService":"Log Analytics", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.operationalinsights/workspaces/test-logAnalyticsWorkspace" + ], + "configurationItems":[ + + ], + "originAlertId":"12345678-4444-4444-4444-1234567890ab", + "firedDateTime":"2021-11-16T15:17:21.9232467Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.1" + }, + "alertContext":{ + "SearchQuery":"Heartbeat | summarize AggregatedValue=count() by bin(TimeGenerated, 5m)", + "SearchIntervalStartTimeUtc":"2021-11-15T15:16:49Z", + "SearchIntervalEndtimeUtc":"2021-11-16T15:16:49Z", + "ResultCount":2, + "LinkToSearchResults":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHi%2BWqUSguzc1NLMqsSlVwTE8vSk1PLElNCUvMKU21Tc4vzSvRaBcDeFgHiaBcDeFgHiaBcDeFgHiaBcDeFgHi/prettify/1/timespan/2021-11-15T15%3a16%3a49.0000000Z%2f2021-11-16T15%3a16%3a49.0000000Z", + 
"LinkToFilteredSearchResultsUI":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHiaBcDeFgHiaBcDeFgHiaBcDeFgHiaBcDeFgHidp%2BOPOhDKsHR%2FFeJXsTgzGJRmVui3KF3RpLyEJCX9A2iMl6jgxMn6jRevng3JmIHLdYtKP4DRI9mhc%3D/prettify/1/timespan/2021-11-15T15%3a16%3a49.0000000Z%2f2021-11-16T15%3a16%3a49.0000000Z", + "LinkToSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%20%0A%7C%20summarize%20AggregatedValue%3Dcount%28%29%20by%20bin%28TimeGenerated%2C%205m%29×pan=2021-11-15T15%3a16%3a49.0000000Z%2f2021-11-16T15%3a16%3a49.0000000Z", + "LinkToFilteredSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%20%0A%7C%20summarize%20AggregatedValue%3Dcount%28%29%20by%20bin%28TimeGenerated%2C%205m%29%7C%20where%20todouble%28AggregatedValue%29%20%3E%200×pan=2021-11-15T15%3a16%3a49.0000000Z%2f2021-11-16T15%3a16%3a49.0000000Z", + "SeverityDescription":"Informational", + "WorkspaceId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "SearchIntervalDurationMin":"1440", + "AffectedConfigurationItems":[ + + ], + "AlertType":"Metric measurement", + "IncludeSearchResults":true, + "Dimensions":[ + + ], + "SearchIntervalInMinutes":"1440", + "SearchResults":{ + "tables":[ + { + "name":"PrimaryResult", + "columns":[ + { + "name":"TimeGenerated", + "type":"datetime" + }, + { + "name":"AggregatedValue", + "type":"long" + } + ], + "rows":[ + [ + "2021-11-16T10:56:49Z", + 11 + ], + [ + "2021-11-16T11:56:49Z", + 11 + ] + ] + } + ], + "dataSources":[ + { + "resourceId":"/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.operationalinsights/workspaces/test-logAnalyticsWorkspace", + "region":"eastus", + "tables":[ + "Heartbeat" + ] + } + ] + }, + "Threshold":0, + "Operator":"Greater Than", + "IncludedSearchResults":"True" + } + } +} +``` ++#### Test action log alert V1 - Numresults ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-logAlertRule-v1-numResults", + "severity":"Sev3", + "signalType":"Log", + "monitorCondition":"Fired", + "monitoringService":"Log Analytics", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.operationalinsights/workspaces/test-logAnalyticsWorkspace" + ], + "configurationItems":[ + "test-computer" + ], + "originAlertId":"22222222-2222-2222-2222-222222222222", + "firedDateTime":"2021-11-16T15:15:58.3302205Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.1" + }, + "alertContext":{ + "SearchQuery":"Heartbeat", + "SearchIntervalStartTimeUtc":"2021-11-15T15:15:24Z", + "SearchIntervalEndtimeUtc":"2021-11-16T15:15:24Z", + "ResultCount":1, + 
"LinkToSearchResults":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHi%2ABCDE%3D%3D/prettify/1/timespan/2021-11-15T15%3a15%3a24.0000000Z%2f2021-11-16T15%3a15%3a24.0000000Z", + "LinkToFilteredSearchResultsUI":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHi%2ABCDE%3D%3D/prettify/1/timespan/2021-11-15T15%3a15%3a24.0000000Z%2f2021-11-16T15%3a15%3a24.0000000Z", + "LinkToSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%0A×pan=2021-11-15T15%3a15%3a24.0000000Z%2f2021-11-16T15%3a15%3a24.0000000Z", + "LinkToFilteredSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%0A×pan=2021-11-15T15%3a15%3a24.0000000Z%2f2021-11-16T15%3a15%3a24.0000000Z", + "SeverityDescription":"Informational", + "WorkspaceId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "SearchIntervalDurationMin":"1440", + "AffectedConfigurationItems":[ + "test-computer" + ], + "AlertType":"Number of results", + "IncludeSearchResults":true, + "SearchIntervalInMinutes":"1440", + "SearchResults":{ + "tables":[ + { + "name":"PrimaryResult", + "columns":[ + { + "name":"TenantId", + "type":"string" + }, + { + "name":"Computer", + "type":"string" + }, + { + "name":"TimeGenerated", + "type":"datetime" + } + ], + "rows":[ + [ + "bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "test-computer", + "2021-11-16T12:00:00Z" + ] + ] + } + ], + "dataSources":[ + { + "resourceId":"/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.operationalinsights/workspaces/test-logAnalyticsWorkspace", + "region":"eastus", + "tables":[ + "Heartbeat" + ] + } + ] + }, + "Threshold":0, + "Operator":"Greater Than", + "IncludedSearchResults":"True" + } + } +} +``` ++#### Test action log alert V2 ++> [!NOTE] +> Log alerts rules from API version 2020-05-01 use this payload type, which only supports common schema. Search results aren't embedded in the log alerts payload when you use this version. Use [dimensions](./alerts-unified-log.md#split-by-alert-dimensions) to provide context to fired alerts. ++You can also use `LinkToFilteredSearchResultsAPI` or `LinkToSearchResultsAPI` to access query results with the [Log Analytics API](/rest/api/loganalytics/dataaccess/query/get). If you must embed the results, use a logic app with the provided links to generate a custom payload. 
++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-logAlertRule-v2", + "severity":"Sev3", + "signalType":"Log", + "monitorCondition":"Fired", + "monitoringService":"Log Alerts V2", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.operationalinsights/workspaces/test-logAnalyticsWorkspace" + ], + "configurationItems":[ + "test-computer" + ], + "originAlertId":"22222222-2222-2222-2222-222222222222", + "firedDateTime":"2021-11-16T11:47:41.4728231Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "properties":{ + "customKey1":"value1", + "customKey2":"value2" + }, + "conditionType":"LogQueryCriteria", + "condition":{ + "windowSize":"PT1H", + "allOf":[ + { + "searchQuery":"Heartbeat", + "metricMeasureColumn":null, + "targetResourceTypes":"['Microsoft.OperationalInsights/workspaces']", + "operator":"GreaterThan", + "threshold":"0", + "timeAggregation":"Count", + "dimensions":[ + { + "name":"Computer", + "value":"test-computer" + } + ], + "metricValue":3.0, + "failingPeriods":{ + "numberOfEvaluationPeriods":1, + "minFailingPeriodsToAlert":1 + }, + "linkToSearchResultsUI":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHiJkLmNaBcDeFgHiJkLmNaBcDeFgHiJkLmNaBcDeFgHiJkLmN1234567890ZAZBZiaGBlaG5lbKlnAAFRmnp6WNUZoqvTBAA%3D/prettify/1/timespan/2021-11-16T10%3a17%3a39.0000000Z%2f2021-11-16T11%3a17%3a39.0000000Z", + "linkToFilteredSearchResultsUI":"https://portal.azure.com#@aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa/blade/Microsoft_Azure_Monitoring_Logs/LogsBlade/source/Alerts.EmailLinks/scope/%7B%22resources%22%3A%5B%7B%22resourceId%22%3A%22%2Fsubscriptions%2F11111111-1111-1111-1111-111111111111%2FresourceGroups%2Ftest-RG%2Fproviders%2FMicrosoft.OperationalInsights%2Fworkspaces%2Ftest-logAnalyticsWorkspace%22%7D%5D%7D/q/aBcDeFgHiJkLmN%2Fl35oOTZoKioEOouaBcDeFgHiJkLmN%2BaBcDeFgHiJkLmN%2BaBcDeFgHiJkLmN7HHgOCZTR0Ak%2FaBcDeFgHiJkLmN1234567890Ltcw%2FOqZS%2FuX0L5d%2Bx3iMHNzQiu3Y%2BzsjpFSWlOzgA87vAxeHW2MoAtQxe6OUvVrZR3XYZPXrd%2FIE/prettify/1/timespan/2021-11-16T10%3a17%3a39.0000000Z%2f2021-11-16T11%3a17%3a39.0000000Z", + "linkToSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%7C%20where%20TimeGenerated%20between%28datetime%282021-11-16T10%3A17%3A39.0000000Z%29..datetime%282021-11-16T11%3A17%3A39.0000000Z%29%29×pan=2021-11-16T10%3a17%3a39.0000000Z%2f2021-11-16T11%3a17%3a39.0000000Z", + "linkToFilteredSearchResultsAPI":"https://api.loganalytics.io/v1/workspaces/bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb/query?query=Heartbeat%7C%20where%20TimeGenerated%20between%28datetime%282021-11-16T10%3A17%3A39.0000000Z%29..datetime%282021-11-16T11%3A17%3A39.0000000Z%29%29%7C%20where%20tostring%28Computer%29%20%3D%3D%20%27test-computer%27×pan=2021-11-16T10%3a17%3a39.0000000Z%2f2021-11-16T11%3a17%3a39.0000000Z" + } + ], + "windowStartTime":"2021-11-16T10:17:39Z", + 
"windowEndTime":"2021-11-16T11:17:39Z" + } + } + } +} +``` ++### Sample test action activity log alerts ++#### Test action activity log alert with MonitoringService = `Administrative` ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-activityLogAlertRule", + "severity":"Sev4", + "signalType":"Activity Log", + "monitorCondition":"Fired", + "monitoringService":"Activity Log - Administrative", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.compute/virtualmachines/test-VM" + ], + "configurationItems":[ + "test-VM" + ], + "originAlertId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb_123456789012345678901234567890ab", + "firedDateTime":"2021-11-16T08:29:01.2932462Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "authorization":{ + "action":"Microsoft.Compute/virtualMachines/restart/action", + "scope":"/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/test-RG/providers/Microsoft.Compute/virtualMachines/test-VM" + }, + "channels":"Operation", + "claims":"{}", + "caller":"user-email@domain.com", + "correlationId":"aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa", + "eventSource":"Administrative", + "eventTimestamp":"2021-11-16T08:27:36.1836909+00:00", + "eventDataId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "level":"Informational", + "operationName":"Microsoft.Compute/virtualMachines/restart/action", + "operationId":"cccccccc-cccc-cccc-cccc-cccccccccccc", + "properties":{ + "eventCategory":"Administrative", + "entity":"/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/test-RG/providers/Microsoft.Compute/virtualMachines/test-VM", + "message":"Microsoft.Compute/virtualMachines/restart/action", + "hierarchy":"22222222-2222-2222-2222-222222222222/CnAIOrchestrationServicePublicCorpprod/33333333-3333-3333-3333-3333333333333/44444444-4444-4444-4444-444444444444/55555555-5555-5555-5555-555555555555/11111111-1111-1111-1111-111111111111" + }, + "status":"Succeeded", + "subStatus":"", + "submissionTimestamp":"2021-11-16T08:29:00.141807+00:00", + "Activity Log Event Description":"" + } + } +} +``` ++#### Test action activity log alert with MonitoringService = `ServiceHealth` ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/1234abcd5678efgh1234abcd5678efgh1234abcd5678efgh1234abcd5678efgh", + "alertRule":"test-ServiceHealthAlertRule", + "severity":"Sev4", + "signalType":"Activity Log", + "monitorCondition":"Fired", + "monitoringService":"ServiceHealth", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111" + ], + "originAlertId":"12345678-1234-1234-1234-1234567890ab", + "firedDateTime":"2021-11-17T05:34:48.0623172", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "authorization":null, + "channels":1, + "claims":null, + "caller":null, + "correlationId":"12345678-abcd-efgh-ijkl-abcd12345678", + "eventSource":2, + "eventTimestamp":"2021-11-17T05:34:44.5778226+00:00", + "httpRequest":null, + "eventDataId":"12345678-1234-1234-1234-1234567890ab", + "level":3, + 
"operationName":"Microsoft.ServiceHealth/incident/action", + "operationId":"12345678-abcd-efgh-ijkl-abcd12345678", + "properties":{ + "title":"Test Action Group - Test Service Health Alert", + "service":"Azure Service Name", + "region":"Global", + "communication":"<p><strong>Summary of impact</strong>: This is the impact summary.</p>\n<p><br></p>\n<p><strong>Preliminary Root Cause</strong>: This is the preliminary root cause.</p>\n<p><br></p>\n<p><strong>Mitigation</strong>: Mitigation description.</p>\n<p><br></p>\n<p><strong>Next steps</strong>: These are the next steps. </p>\n<p><br></p>\n<p>Stay informed about Azure service issues by creating custom service health alerts: <a href=\"https://aka.ms/ash-videos\" rel=\"noopener noreferrer\" target=\"_blank\">https://aka.ms/ash-videos</a> for video tutorials and <a href=\"https://aka.ms/ash-alerts%20for%20how-to%20documentation\" rel=\"noopener noreferrer\" target=\"_blank\">https://aka.ms/ash-alerts for how-to documentation</a>.</p>\n<p><br></p>", + "incidentType":"Incident", + "trackingId":"ABC1-DEF", + "impactStartTime":"2021-11-16T20:00:00Z", + "impactMitigationTime":"2021-11-17T01:00:00Z", + "impactedServices":"[{\"ImpactedRegions\":[{\"RegionName\":\"Global\"}],\"ServiceName\":\"Azure Service Name\"}]", + "impactedServicesTableRows":"<tr>\r\n<td align='center' style='padding: 5px 10px; border-right:1px solid black; border-bottom:1px solid black'>Azure Service Name</td>\r\n<td align='center' style='padding: 5px 10px; border-bottom:1px solid black'>Global<br></td>\r\n</tr>\r\n", + "defaultLanguageTitle":"Test Action Group - Test Service Health Alert", + "defaultLanguageContent":"<p><strong>Summary of impact</strong>: This is the impact summary.</p>\n<p><br></p>\n<p><strong>Preliminary Root Cause</strong>: This is the preliminary root cause.</p>\n<p><br></p>\n<p><strong>Mitigation</strong>: Mitigation description.</p>\n<p><br></p>\n<p><strong>Next steps</strong>: These are the next steps. 
</p>\n<p><br></p>\n<p>Stay informed about Azure service issues by creating custom service health alerts: <a href=\"https://aka.ms/ash-videos\" rel=\"noopener noreferrer\" target=\"_blank\">https://aka.ms/ash-videos</a> for video tutorials and <a href=\"https://aka.ms/ash-alerts%20for%20how-to%20documentation\" rel=\"noopener noreferrer\" target=\"_blank\">https://aka.ms/ash-alerts for how-to documentation</a>.</p>\n<p><br></p>", + "stage":"Resolved", + "communicationId":"11223344556677", + "isHIR":"false", + "IsSynthetic":"True", + "impactType":"SubscriptionList", + "version":"0.1.1" + }, + "status":"Resolved", + "subStatus":null, + "submissionTimestamp":"2021-11-17T01:23:45.0623172+00:00", + "ResourceType":null + } + } +} +``` ++#### Test action activity log alert with MonitoringService = `Resource Health` ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"test-ResourceHealthAlertRule", + "severity":"Sev4", + "signalType":"Activity Log", + "monitorCondition":"Fired", + "monitoringService":"Resource Health", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.compute/virtualmachines/test-VM" + ], + "configurationItems":[ + "test-VM" + ], + "originAlertId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb_123456789012345678901234567890ab", + "firedDateTime":"2021-11-16T09:54:08.9938123Z", + "description":"Alert rule description", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "channels":"Admin, Operation", + "correlationId":"aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa", + "eventSource":"ResourceHealth", + "eventTimestamp":"2021-11-16T09:50:20.406+00:00", + "eventDataId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "level":"Informational", + "operationName":"Microsoft.Resourcehealth/healthevent/Activated/action", + "operationId":"bbbbbbbb-bbbb-bbbb-bbbb-bbbbbbbbbbbb", + "properties":{ + "title":"Rebooted by user", + "details":null, + "currentHealthStatus":"Unavailable", + "previousHealthStatus":"Available", + "type":"Downtime", + "cause":"UserInitiated" + }, + "status":"Active", + "submissionTimestamp":"2021-11-16T09:54:08.5303319+00:00", + "Activity Log Event Description":null + } + } +} +``` ++#### Test action activity log alert with MonitoringService = `Budget` ++```json +{ + "schemaId":"AIP Budget Notification", + "data":{ + "SubscriptionName":"test-subscription", + "SubscriptionId":"11111111-1111-1111-1111-111111111111", + "EnrollmentNumber":"", + "DepartmentName":"test-budgetDepartmentName", + "AccountName":"test-budgetAccountName", + "BillingAccountId":"", + "BillingProfileId":"", + "InvoiceSectionId":"", + "ResourceGroup":"test-RG", + "SpendingAmount":"1111.32", + "BudgetStartDate":"11/17/2021 5:40:29 PM -08:00", + "Budget":"10000", + "Unit":"USD", + "BudgetCreator":"email@domain.com", + "BudgetName":"test-budgetName", + "BudgetType":"Cost", + "NotificationThresholdAmount":"8000.0" + } +} +``` ++#### Test action activity log alert with MonitoringService = `Actual Cost Budget` ++```json +{ + "schemaId": "azureMonitorCommonAlertSchema", + "data": { + "essentials": { + "monitoringService": "CostAlerts", + "firedDateTime": "2022-12-07T21:13:20.645Z", + "description": "Your spend for budget Test_actual_cost_budget is now $11,111.00 exceeding your specified threshold $25.00.", + "essentialsVersion": "1.0", + 
"alertContextVersion": "1.0", + "alertId": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.CostManagement/alerts/Test_Alert", + "alertRule": null, + "severity": null, + "signalType": null, + "monitorCondition": null, + "alertTargetIDs": null, + "configurationItems": [ + "budgets" + ], + "originAlertId": null + }, + "alertContext": { + "AlertCategory": "budgets", + "AlertData": { + "Scope": "/subscriptions/11111111-1111-1111-1111-111111111111/", + "ThresholdType": "Actual", + "BudgetType": "Cost", + "BudgetThreshold": "$50.00", + "NotificationThresholdAmount": "$25.00", + "BudgetName": "Test_actual_cost_budget", + "BudgetId": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.Consumption/budgets/Test_actual_cost_budget", + "BudgetStartDate": "2022-11-01", + "BudgetCreator": "test@sample.test", + "Unit": "USD", + "SpentAmount": "$11,111.00" + } + } + } +} +``` +#### Test action activity log alerts with MonitoringService = `Forecasted Budget` ++```json +{ + "schemaId": "azureMonitorCommonAlertSchema", + "data": { + "essentials": { + "monitoringService": "CostAlerts", + "firedDateTime": "2022-12-07T21:13:29.576Z", + "description": "The total spend for your budget, Test_forcasted_budget, is forecasted to reach $1111.11 before the end of the period. This amount exceeds your specified budget threshold of $50.00.", + "essentialsVersion": "1.0", + "alertContextVersion": "1.0", + "alertId": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.CostManagement/alerts/Test_Alert", + "alertRule": null, + "severity": null, + "signalType": null, + "monitorCondition": null, + "alertTargetIDs": null, + "configurationItems": [ + "budgets" + ], + "originAlertId": null + }, + "alertContext": { + "AlertCategory": "budgets", + "AlertData": { + "Scope": "/subscriptions/11111111-1111-1111-1111-111111111111/", + "ThresholdType": "Forecasted", + "BudgetType": "Cost", + "BudgetThreshold": "$50.00", + "NotificationThresholdAmount": "$50.00", + "BudgetName": "Test_forcasted_budget", + "BudgetId": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.Consumption/budgets/Test_forcasted_budget", + "BudgetStartDate": "2022-11-01", + "BudgetCreator": "test@sample.test", + "Unit": "USD", + "SpentAmount": "$999.99", + "ForecastedTotalForPeriod": "$1111.11" + } + } + } +} +``` ++#### Test action activity log alerts with MonitoringService = `Smart Alert` ++```json +{ + "schemaId":"azureMonitorCommonAlertSchema", + "data":{ + "essentials":{ + "alertId":"/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.AlertsManagement/alerts/12345678-1234-1234-1234-1234567890ab", + "alertRule":"Dependency Latency Degradation - test-applicationInsights", + "severity":"Sev3", + "signalType":"Log", + "monitorCondition":"Fired", + "monitoringService":"SmartDetector", + "alertTargetIDs":[ + "/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/test-RG/providers/microsoft.insights/components/test-applicationInsights" + ], + "configurationItems":[ + "test-applicationInsights" + ], + "originAlertId":"1234abcd5678efgh1234abcd5678efgh1234abcd5678efgh1234abcd5678efgh", + "firedDateTime":"2021-10-28T19:09:09.1115084Z", + "description":"Dependency Latency Degradation notifies you of an unusual increase in response by a dependency your app is calling (e.g. 
REST API or database)", + "essentialsVersion":"1.0", + "alertContextVersion":"1.0" + }, + "alertContext":{ + "DetectionSummary":"A degradation in the dependency duration over the last 24 hours", + "FormattedOccurrenceTime":"2021-10-27T23:59:59Z", + "DetectedValue":"0.45 sec", + "NormalValue":"0.27 sec (over the last 7 days)", + "PresentationInsightEventRequest":"/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/test-RG/providers/microsoft.insights/components/test-applicationInsights/query?query=systemEvents%0d%0a++++++++++++++++%7c+where+timestamp+%3e%3d+datetime(%272021-10-27T23%3a29%3a59.0000000Z%27)+%0d%0a++++++++++++++++%7c+where+itemType+%3d%3d+%27systemEvent%27+and+name+%3d%3d+%27ProactiveDetectionInsight%27+%0d%0a++++++++++++++++%7c+where+dimensions.InsightType+%3d%3d+3+%0d%0a++++++++++++++++%7c+where+dimensions.InsightVersion+%3d%3d+%27SmartAlert%27%0d%0a++++++++++++++++%7c+where+dimensions.InsightDocumentId+%3d%3d+%2712345678-abcd-1234-5678-abcd12345678%27+%0d%0a++++++++++++++++%7c+project+dimensions.InsightPropertiesTable%2cdimensions.InsightDegradationChart%2cdimensions.InsightCountChart%2cdimensions.InsightLinksTable%0d%0a++++++++++++++++&api-version=2018-04-20", + "SmartDetectorId":"DependencyPerformanceDegradationDetector", + "SmartDetectorName":"Dependency Performance Degradation Detector", + "AnalysisTimestamp":"2021-10-28T19:09:09.1115084Z" + } + } +} +``` ++## Next steps ++- [Learn how to create a logic app that uses the common alert schema to handle all your alerts](./alerts-common-schema-integrations.md) |
azure-monitor | Alerts Processing Rules | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-processing-rules.md | You can also define filters to narrow down which specific subset of alerts are a | Filter | Description| |:|:|-Alert context (payload) | The rule applies only to alerts that contain any of the filter's strings within the [alert context](./alerts-common-schema-definitions.md#alert-context) section of the alert. This section includes fields specific to each alert type. This filter does not apply to log alert search results. | +Alert context (payload) | The rule applies only to alerts that contain any of the filter's strings within the [alert context](./alerts-common-schema.md) section of the alert. This section includes fields specific to each alert type. This filter does not apply to log alert search results. | Alert rule ID | The rule applies only to alerts from a specific alert rule. The value should be the full resource ID, for example, `/subscriptions/SUB1/resourceGroups/RG1/providers/microsoft.insights/metricalerts/MY-API-LATENCY`. To locate the alert rule ID, open a specific alert rule in the portal, select **Properties**, and copy the **Resource ID** value. You can also locate it by listing your alert rules from PowerShell or the Azure CLI. | Alert rule name | The rule applies only to alerts with this alert rule name. It can also be useful with a **Contains** operator. | Description | The rule applies only to alerts that contain the specified string within the alert rule description field. | |
azure-monitor | Alerts Troubleshoot | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-troubleshoot.md | -This article discusses common problems in Azure Monitor alerting and notifications. +This article discusses common problems in Azure Monitor alerting and notifications. Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues before the users of your system notice them. For more information on alerting, see [Overview of alerts in Microsoft Azure](./alerts-overview.md). -Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues before the users of your system notice them. For more information on alerting, see [Overview of alerts in Microsoft Azure](./alerts-overview.md). +You can see fired alerts in the Azure portal. -If you have a problem with an alert firing or not firing when expected, refer to the articles below. You can see "fired" alerts in the Azure portal. +Refer to these articles for troubleshooting information about metric or log alerts that are not behaving as expected: -- [Troubleshooting Azure Monitor Metric Alerts in Microsoft Azure](alerts-troubleshoot-metric.md) -- [Troubleshooting Azure Monitor Log Alerts in Microsoft Azure](alerts-troubleshoot-log.md)+- [Troubleshoot Azure Monitor metric alerts](alerts-troubleshoot-metric.md) +- [Troubleshoot Azure Monitor log alerts](alerts-troubleshoot-log.md) If the alert fires as intended according to the Azure portal but the proper notifications do not occur, use the information in the rest of this article to troubleshoot that problem. If you have received the alert, but believe some of its fields are missing or in Check if the format specified at the action level is what you expect. For example, you may have developed code that responds to alerts (webhook, function, logic app, etc.), expecting one format, but later in the action you or another person specified a different format. - Also, check the payload format (JSON) for [activity log alerts](../alerts/activity-log-alerts-webhook.md), for [log search alerts](../alerts/alerts-log-webhook.md) (both Application Insights and log analytics), for [metric alerts](alerts-metric-near-real-time.md#payload-schema), for the [common alert schema](../alerts/alerts-common-schema-definitions.md), and for the deprecated [classic metric alerts](./alerts-webhooks.md). + Also, check the payload format (JSON) for [activity log alerts](../alerts/activity-log-alerts-webhook.md), for [log search alerts](../alerts/alerts-log-webhook.md) (both Application Insights and log analytics), for [metric alerts](alerts-metric-near-real-time.md#payload-schema), for the [common alert schema](../alerts/alerts-common-schema.md), and for the deprecated [classic metric alerts](./alerts-webhooks.md). 1. **Activity log alerts: Is the information available in the activity log?** |
azure-monitor | Container Insights Syslog | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-syslog.md | Container Insights offers the ability to collect Syslog events from Linux nodes ## Prerequisites -- You will need to have managed identity authentication enabled on your cluster. To enable, see [migrate your AKS cluster to managed identity authentication](container-insights-enable-existing-clusters.md?tabs=azure-cli#migrate-to-managed-identity-authentication). Note: This which will create a Data Collection Rule (DCR) named `MSCI-<WorkspaceRegion>-<ClusterName>` +- You need to have managed identity authentication enabled on your cluster. To enable, see [migrate your AKS cluster to managed identity authentication](container-insights-enable-existing-clusters.md?tabs=azure-cli#migrate-to-managed-identity-authentication). Note: Enabling Managed Identity will create a new Data Collection Rule (DCR) named `MSCI-<WorkspaceRegion>-<ClusterName>`. - Minimum versions of Azure components - **Azure CLI**: Minimum version required for Azure CLI is [2.45.0 (link to release notes)](/cli/azure/release-notes-azure-cli#february-07-2023). See [How to update the Azure CLI](/cli/azure/update-azure-cli) for upgrade instructions. - **Azure CLI AKS-Preview Extension**: Minimum version required for AKS-Preview Azure CLI extension is [ 0.5.125 (link to release notes)](https://github.com/Azure/azure-cli-extensions/blob/main/src/aks-preview/HISTORY.rst#05125). See [How to update extensions](/cli/azure/azure-cli-extensions-overview#how-to-update-extensions) for upgrade guidance. - **Linux image version**: Minimum version for AKS node linux image is 2022.11.01. See [Upgrade Azure Kubernetes Service (AKS) node images](https://learn.microsoft.com/azure/aks/node-image-upgrade) for upgrade help. ## How to enable Syslog- -Use the following command in Azure CLI to enable syslog collection when you create a new AKS cluster. ++### From the Azure portal ++Navigate to your cluster. Open the _Insights_ tab for your cluster. Open the _Monitor Settings_ panel. Click on Edit collection settings, then check the box for _Enable Syslog collection_. + ### Using Azure CLI commands +Use the following command in Azure CLI to enable syslog collection when you create a new AKS cluster. + ```azurecli az aks create -g syslog-rg -n new-cluster --enable-managed-identity --node-count 1 --enable-addons monitoring --enable-msi-auth-for-monitoring --enable-syslog --generate-ssh-key ``` provisioningState : Succeeded ``` ## How to access Syslog data- ++### Access using built-in workbooks ++To get a quick snapshot of your syslog data, customers can use our built-in Syslog workbook. There are two ways to access the built-in workbook. ++Option 1 - The Reports tab in Container Insights. +Navigate to your cluster. Open the _Insights_ tab for your cluster. Open the _Reports_ tab and look for the _Syslog_ workbook. +++Option 2 - The Workbooks tab in AKS +Navigate to your cluster. Open the _Workbooks_ tab for your cluster and look for the _Syslog_ workbook. +++### Access using log queries + Syslog data is stored in the [Syslog](/azure/azure-monitor/reference/tables/syslog) table in your Log Analytics workspace. You can create your own [log queries](../logs/log-query-overview.md) in [Log Analytics](../logs/log-analytics-overview.md) to analyze this data or use any of the [prebuilt queries](../logs/log-query-overview.md). 
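As a quick check that Syslog records are arriving, you can also run a query from the command line. The following is a minimal sketch using the Azure CLI (it assumes the `log-analytics` CLI extension is installed, and `<workspace-guid>` is a placeholder for your Log Analytics workspace's customer ID):

```azurecli
# Return the last hour of Syslog records, newest first.
az monitor log-analytics query \
  --workspace <workspace-guid> \
  --analytics-query "Syslog | where TimeGenerated > ago(1h) | project TimeGenerated, Computer, Facility, SeverityLevel, SyslogMessage | sort by TimeGenerated desc | take 10" \
  --output table
```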
:::image type="content" source="media/container-insights-syslog/azmon-3.png" lightbox="media/container-insights-syslog/azmon-3.png" alt-text="Screenshot of Syslog query loaded in the query editor in the Azure Monitor Portal UI." border="false"::: You can open Log Analytics from the **Logs** menu in the **Monitor** menu to acc :::image type="content" source="media/container-insights-syslog/aks-4.png" lightbox="media/container-insights-syslog/aks-4.png" alt-text="Screenshot of Query editor with Syslog query." border="false"::: -### Sample queries +#### Sample queries The following table provides different examples of log queries that retrieve Syslog records. Select the minimum log level for each facility that you want to collect. ## Next steps -- Read more about [Syslog record properties](/azure/azure-monitor/reference/tables/syslog)+Once set up, customers can start sending Syslog data to the tools of their choice: +- Send Syslog to Microsoft Sentinel: https://learn.microsoft.com/azure/sentinel/connect-syslog +- Export data from Log Analytics: https://learn.microsoft.com/azure/azure-monitor/logs/logs-data-export?tabs=portal ++Read more +- [Syslog record properties](/azure/azure-monitor/reference/tables/syslog) +Share your feedback for the preview here: https://forms.office.com/r/BBvCjjDLTS |
azure-monitor | Collect Custom Metrics Linux Telegraf | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/collect-custom-metrics-linux-telegraf.md | -> InfluxData Telegraf is an open source agent and not officially supported by Azure Monitor. For issues wuth the Telegraf connector, please refer to the Telegraf Github page here: [InfluxData](https://github.com/influxdata/telegraf) +> InfluxData Telegraf is an open source agent and not officially supported by Azure Monitor. For issues with the Telegraf connector, please refer to the Telegraf GitHub page here: [InfluxData](https://github.com/influxdata/telegraf) ## InfluxData Telegraf agent |
azure-monitor | Profiler Bring Your Own Storage | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/profiler/profiler-bring-your-own-storage.md | To configure BYOS for code-level diagnostics (Profiler/Debugger), there are thre For general Profiler troubleshooting, refer to the [Profiler Troubleshoot documentation](profiler-troubleshooting.md). -For general Snapshot Debugger troubleshooting, refer to the [Snapshot Debugger Troubleshoot documentation](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot). +For general Snapshot Debugger troubleshooting, refer to the [Snapshot Debugger Troubleshoot documentation](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). ## Frequently asked questions |
azure-monitor | Snapshot Collector Release Notes | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-collector-release-notes.md | A point release to address user-reported bugs. ### Bug fixes - Fix [Hide the IDMS dependency from dependency tracker.](https://github.com/microsoft/ApplicationInsights-SnapshotCollector/issues/17) - Fix [ArgumentException: telemetryProcessorTypedoes not implement ITelemetryProcessor.](https://github.com/microsoft/ApplicationInsights-SnapshotCollector/issues/19)-<br>Snapshot Collector used via SDK is not supported when Interop feature is enabled. [See more not supported scenarios.](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot#not-supported-scenarios) +<br>Snapshot Collector used via SDK is not supported when Interop feature is enabled. [See more not supported scenarios.](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md#not-supported-scenarios) ## [1.4.2](https://www.nuget.org/packages/Microsoft.ApplicationInsights.SnapshotCollector/1.4.2) A point release to address a user-reported bug. |
azure-monitor | Snapshot Debugger App Service | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-app-service.md | Below you can find scenarios where Snapshot Collector isn't supported: * Generate traffic to your application that can trigger an exception. Then, wait 10 to 15 minutes for snapshots to be sent to the Application Insights instance. * See [snapshots](snapshot-debugger.md?toc=/azure/azure-monitor/toc.json#view-snapshots-in-the-portal) in the Azure portal.-* For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot). +* For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). [Enablement UI]: ./media/snapshot-debugger/enablement-ui.png [snapshot-debugger-app-setting]:./media/snapshot-debugger/snapshot-debugger-app-setting.png |
azure-monitor | Snapshot Debugger Function App | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-function-app.md | We recommend that you have Snapshot Debugger enabled on all your apps to ease di * Generate traffic to your application that can trigger an exception. Then, wait 10 to 15 minutes for snapshots to be sent to the Application Insights instance. * [View snapshots](snapshot-debugger.md?toc=/azure/azure-monitor/toc.json#view-snapshots-in-the-portal) in the Azure portal. * Customize Snapshot Debugger configuration based on your use-case on your Function app. For more information, see [snapshot configuration in host.json](../../azure-functions/functions-host-json.md#applicationinsightssnapshotconfiguration).-* For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot). +* For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). |
azure-monitor | Snapshot Debugger Vm | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-vm.md | If your application runs in Azure Service Fabric, Cloud Service, Virtual Machine - Generate traffic to your application that can trigger an exception. Then, wait 10 to 15 minutes for snapshots to be sent to the Application Insights instance. - See [snapshots](snapshot-debugger.md?toc=/azure/azure-monitor/toc.json#view-snapshots-in-the-portal) in the Azure portal.-- For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot).+- For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). |
azure-monitor | Snapshot Debugger | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger.md | The following environments are supported: > [!NOTE] > Client applications (for example, WPF, Windows Forms or UWP) aren't supported. -If you've enabled Snapshot Debugger but aren't seeing snapshots, check our [Troubleshooting guide](https://learn.microsoft.com/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot). +If you've enabled Snapshot Debugger but aren't seeing snapshots, check our [Troubleshooting guide](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md). ## Grant permissions |
azure-monitor | Whats New | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/whats-new.md | Snapshot-Debugger|[Configure Bring Your Own Storage (BYOS) for Application Insig Snapshot-Debugger|[Release notes for Microsoft.ApplicationInsights.SnapshotCollector](snapshot-debugger/snapshot-collector-release-notes.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| Snapshot-Debugger|[Enable Snapshot Debugger for .NET apps in Azure App Service](snapshot-debugger/snapshot-debugger-app-service.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| Snapshot-Debugger|[Enable Snapshot Debugger for .NET and .NET Core apps in Azure Functions](snapshot-debugger/snapshot-debugger-function-app.md)|Removing the TSG from the AzMon TOC and adding to the support TOC|-Snapshot-Debugger|[ Troubleshoot problems enabling Application Insights Snapshot Debugger or viewing snapshots](snapshot-debugger/snapshot-debugger-troubleshoot.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| +Snapshot-Debugger|[ Troubleshoot problems enabling Application Insights Snapshot Debugger or viewing snapshots](/troubleshoot/azure/azure-monitor/app-insights/snapshot-debugger-troubleshoot.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| Snapshot-Debugger|[Enable Snapshot Debugger for .NET apps in Azure Service Fabric, Cloud Service, and Virtual Machines](snapshot-debugger/snapshot-debugger-vm.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| Snapshot-Debugger|[Debug snapshots on exceptions in .NET apps](snapshot-debugger/snapshot-debugger.md)|Removing the TSG from the AzMon TOC and adding to the support TOC| Virtual-Machines|[Monitor virtual machines with Azure Monitor: Analyze monitoring data](vm/monitor-virtual-machine-analyze.md)|New article| |
azure-netapp-files | Azure Netapp Files Create Volumes | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azure-netapp-files-create-volumes.md | This article shows you how to create an NFS volume. For SMB volumes, see [Create * **Virtual network** Specify the Azure virtual network (VNet) from which you want to access the volume. - The Vnet you specify must have a subnet delegated to Azure NetApp Files. The Azure NetApp Files service can be accessed only from the same Vnet or from a Vnet that is in the same region as the volume through Vnet peering. You can also access the volume from your on-premises network through Express Route. + The VNet you specify must have a subnet delegated to Azure NetApp Files. The Azure NetApp Files service can be accessed only from the same VNet or from a VNet that is in the same region as the volume through VNet peering. You can also access the volume from your on-premises network through ExpressRoute. * **Subnet** Specify the subnet that you want to use for the volume. The subnet you specify must be delegated to Azure NetApp Files. - If you have not delegated a subnet, you can click **Create new** on the Create a Volume page. Then in the Create Subnet page, specify the subnet information, and select **Microsoft.NetApp/volumes** to delegate the subnet for Azure NetApp Files. In each Vnet, only one subnet can be delegated to Azure NetApp Files. + If you have not delegated a subnet, you can click **Create new** on the Create a Volume page. Then in the Create Subnet page, specify the subnet information, and select **Microsoft.NetApp/volumes** to delegate the subnet for Azure NetApp Files. In each VNet, only one subnet can be delegated to Azure NetApp Files.  |
azure-netapp-files | Azure Netapp Files Network Topologies | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azure-netapp-files-network-topologies.md | Azure NetApp Files Standard network features are supported for the following reg * North Europe * Norway East * Norway West +* Qatar Central * South Africa North * South Central US * South India |
azure-netapp-files | Configure Customer Managed Keys | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/configure-customer-managed-keys.md | Azure NetApp Files customer-managed keys is supported for the following regions: * North Europe * Norway East * Norway West+* Qatar Central * South Africa North * South Central US * South India |
azure-netapp-files | Cross Zone Replication Introduction | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/cross-zone-replication-introduction.md | The preview of cross-zone replication is available in the following regions: * Korea Central * North Europe * Norway East +* Qatar Central * South Africa North * Southeast Asia * South Central US |
azure-netapp-files | Large Volumes Requirements Considerations | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/large-volumes-requirements-considerations.md | Support for Azure NetApp Files large volumes is available in the following regio * East US 2 * Germany West Central * Japan East-* North Central US * North Europe * South Central US * Switzerland North |
azure-resource-manager | Decompile | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/decompile.md | Title: Decompile ARM template JSON to Bicep description: Describes commands for decompiling Azure Resource Manager templates to Bicep files. Previously updated : 11/11/2022 Last updated : 03/03/2023+ # Decompiling ARM template JSON to Bicep This article describes how to decompile Azure Resource Manager templates (ARM te > [!NOTE] > From Visual Studio Code, you can directly create resource declarations by importing from existing resources. For more information, see [Bicep commands](./visual-studio-code.md#bicep-commands).+> +> Visual Studio Code enables you to paste JSON as Bicep. It automatically runs the decompile command. For more information, see [Paste JSON as Bicep](./visual-studio-code.md#paste-as-bicep-preview). Decompiling an ARM template helps you get started with Bicep development. If you have a library of ARM templates and want to use Bicep for future development, you can decompile them to Bicep. However, the Bicep file might need revisions to implement best practices for Bicep. |
azure-resource-manager | Migrate | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/migrate.md | description: Describes the recommended workflow when migrating Azure resources a Previously updated : 11/11/2022 Last updated : 03/03/2023 # Migrate to Bicep The convert phase consists of two steps, which you complete in sequence: > [!NOTE] > You can import a resource by opening the Visual Studio Code command palette. Use <kbd>Ctrl+Shift+P</kbd> on Windows and Linux and <kbd>Γîÿ+Shift+P</kbd> on macOS.+> +> Visual Studio Code enables you to paste JSON as Bicep. For more information, see [Paste JSON as Bicep](./visual-studio-code.md#paste-as-bicep-preview). ## Phase 2: Migrate The migrate phase consists of three steps, which you complete in sequence: 1. **Copy each resource from your decompiled template.** Copy each resource individually from the converted Bicep file to the new Bicep file. This process helps you resolve any issues on a per-resource basis and to avoid any confusion as your template grows in size. -1. **Identify and recreate any missing resources.** Not all Azure resource types can be exported through the Azure portal, Azure CLI, or Azure PowerShell. For example, virtual machine extensions such as the DependencyAgentWindows and MMAExtension (Microsoft Monitoring Agent) aren't supported resource types for export. For any resource that wasn't exported, such as virtual machine extensions, you'll need to recreate those resources in your new Bicep file. You can recreate resources using a variety of tools and approaches, including [Azure Resource Explorer](../templates/export-template-portal.md?azure-portal=true), the [Bicep and ARM template reference documentation](/azure/templates/?azure-portal=true), and the [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates?azure-portal=true) site. +1. **Identify and recreate any missing resources.** Not all Azure resource types can be exported through the Azure portal, Azure CLI, or Azure PowerShell. For example, virtual machine extensions such as the DependencyAgentWindows and MMAExtension (Microsoft Monitoring Agent) aren't supported resource types for export. For any resource that wasn't exported, such as virtual machine extensions, you need to recreate those resources in your new Bicep file. You can recreate resources using various tools and approaches, including [Azure Resource Explorer](../templates/export-template-portal.md?azure-portal=true), the [Bicep and ARM template reference documentation](/azure/templates/?azure-portal=true), and the [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates?azure-portal=true) site. ## Phase 3: Refactor The deploy phase consists of eight steps, which you complete in any order: 1. **Review the linter suggestions in your new Bicep file.** When you use the [Bicep extension for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-bicep&azure-portal=true) to create Bicep files, the [Bicep linter](linter.md) runs automatically and highlights suggestions and errors in your code. Many of the suggestions and errors include an option to apply a quick fix of the issue. Review these recommendations and adjust your Bicep file. -1. **Revise parameters, variables, and symbolic names.** It's possible the names of parameters, variables, and symbolic names generated by the decompiler won't match your standard naming convention. Review the generated names and make adjustments as necessary. 
+1. **Revise parameters, variables, and symbolic names.** It's possible the names of parameters, variables, and symbolic names generated by the decompiler don't match your standard naming convention. Review the generated names and make adjustments as necessary. 1. **Simplify expressions.** The decompile process may not always take advantage of some of Bicep's features. Review any expressions generated in the conversion and simplify them. For example, the decompiled template may include a `concat()` or `format()` function that could be simplified by using [string interpolation](bicep-functions-string.md#concat). Review any suggestions from the linter and make adjustments as necessary. In the _deploy_ phase of migrating your resources to Bicep, the goal is to deplo The deploy phase consists of four steps, which you complete in sequence: -1. **Prepare a rollback plan.** The ability to recover from a failed deployment is crucial. Create a rollback strategy in the event that any breaking changes are introduced into your environments. Take inventory of the types of resources that are deployed, such as virtual machines, web apps, and databases. Each resource's data plane should be considered as well. Do you have a way to recover a virtual machine and its data? Do you have a way to recover a database after deletion? A well-developed rollback plan will help to keep your downtime to a minimum if any issues arise from a deployment. +1. **Prepare a rollback plan.** The ability to recover from a failed deployment is crucial. Create a rollback strategy if any breaking changes are introduced into your environments. Take inventory of the types of resources that are deployed, such as virtual machines, web apps, and databases. Each resource's data plane should be considered as well. Do you have a way to recover a virtual machine and its data? Do you have a way to recover a database after deletion? A well-developed rollback plan helps to keep your downtime to a minimum if any issues arise from a deployment. 1. **Run the what-if operation against production.** Before deploying your final Bicep file to production, run the what-if operation against your production environment, making sure to use production parameter values, and consider documenting the results. 1. **Deploy manually.** If you're going to use the converted template in a pipeline, such as [Azure DevOps](add-template-to-azure-pipelines.md) or [GitHub Actions](deploy-github-actions.md), consider running the deployment from your local machine first. It's preferable to test the template's functionality before incorporating it into your production pipeline. That way, you can respond quickly if there's a problem. -1. **Run smoke tests.** After your deployment is complete, you should run a series of *smoke tests* to ensure that your application or workload is working properly. For example, test to see if your web app is accessible through normal access channels, such as the public Internet or across a corporate VPN. For databases, attempt to make a database connection and execute a series of queries. With virtual machines, log in to the virtual machine and make sure that all services are up and running. +1. **Run smoke tests.** After your deployment is complete, you should run a series of *smoke tests* to ensure that your application or workload is working properly. For example, test to see if your web app is accessible through normal access channels, such as the public Internet or across a corporate VPN. 
For databases, attempt to make a database connection and execute a series of queries. With virtual machines, sign in to the virtual machine and make sure that all services are up and running. ## Next steps |
azure-resource-manager | Visual Studio Code | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/visual-studio-code.md | Title: Create Bicep files by using Visual Studio Code description: Describes how to create Bicep files by using Visual Studio Code Previously updated : 02/21/2023 Last updated : 03/03/2023 # Create Bicep files by using Visual Studio Code To set up your environment for Bicep development, see [Install Bicep tools](inst Visual Studio Code comes with several Bicep commands. -Open or create a Bicep file in VS Code, select the **View** menu and then select **Command Palette**. You can also use the key combination **[CTRL]+[SHIFT]+P** to bring up the command palette. Type **Bicep** to list the Bicep commands. +Open or create a Bicep file in VS Code, select the **View** menu and then select **Command Palette**. You can also use **F1** or the key combination <kbd>Ctrl+Shift+P</kbd> to bring up the command palette. Type **Bicep** to list the Bicep commands. :::image type="content" source="./media/visual-studio-code/visual-studio-code-bicep-commands.png" alt-text="Screenshot of Visual Studio Code Bicep commands in the command palette."::: The [Bicep configuration file (bicepconfig.json)](./bicep-config.md) can be used To create a Bicep configuration file: 1. Open Visual Studio Code.-1. From the **View** menu, select **Command Palette** (or press **[CTRL/CMD]**+**[SHIFT]**+**P**), and then select **Bicep: Create Bicep Configuration File**. +1. From the **View** menu, select **Command Palette** (or press <kbd>Ctrl/Cmd+Shift+P</kbd>), and then select **Bicep: Create Bicep Configuration File**. 1. Select the file directory where you want to place the file. 1. Save the configuration file when you're done. From Visual Studio Code, you can easily open the template reference for the reso :::image type="content" source="./media/visual-studio-code/visual-studio-code-bicep-view-type-document.png" alt-text="Screenshot of Visual Studio Code Bicep view type document."::: +## Paste as Bicep (Preview) ++You can paste a JSON snippet from an ARM template into a Bicep file. Visual Studio Code automatically decompiles the JSON to Bicep. This feature is only available with the Bicep extension version 0.14.0 or newer. ++To enable the feature: ++1. In Visual Studio Code, select **Manage** (gear icon) in the side menu. Select **Settings**. You can also use <kbd>Ctrl+,</kbd> to open settings. +1. Expand **Extensions** and then select **Bicep**. +1. Select **Decompile on Paste**. ++ :::image type="content" source="./media/visual-studio-code/enable-paste-json.png" alt-text="Screenshot of Visual Studio Code Paste as Bicep."::: ++By using this feature, you can paste: ++- Full ARM JSON templates. +- A single resource or multiple resources. +- JSON values, such as objects, arrays, and strings. A string with double-quotes is converted to single-quotes. 
++For example, you can start with the following Bicep file: ++```bicep +@description('Storage Account type') +@allowed([ + 'Standard_LRS' + 'Standard_GRS' + 'Standard_ZRS' + 'Premium_LRS' +]) +param storageAccountsku string = 'Standard_LRS' ++@description('Location for all resources.') +param location string = resourceGroup().location ++var storageAccountName = '${uniqueString(resourceGroup().id)}storage' ++resource storageAccount 'Microsoft.Storage/storageAccounts@2021-08-01' = { + name: storageAccountName + location: location + sku: { + name: storageAccountsku + } + kind: 'StorageV2' + tags: { + ObjectName: storageAccountName + } + properties: {} +} ++output storageAccountName string = storageAccountName +``` ++And, paste the following JSON: ++```json +{ + "type": "Microsoft.Batch/batchAccounts", + "apiVersion": "2021-06-01", + "name": "[parameters('batchAccountName')]", + "location": "[parameters('location')]", + "tags": { + "ObjectName": "[parameters('batchAccountName')]" + }, + "properties": { + "autoStorage": { + "storageAccountId": "[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]" + } + } +} +``` ++Visual Studio Code automatically converts the JSON to Bicep. Notice that you also need to add the parameter named `batchAccountName`. ++You can undo the decompilation by using <kbd>Ctrl+Z</kbd>. The original JSON appears in the file. + ## Next steps To walk through a quickstart, see [Quickstart: Create Bicep files with Visual Studio Code](./quickstart-create-bicep-use-visual-studio-code.md). |
azure-resource-manager | Networking Move Limitations | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/move-limitations/networking-move-limitations.md | To move a peered virtual network, you must first disable the virtual network pee ## VPN Gateways -You cannot move VPN Gateways across subscriptions if they are of Basic SKU. Basic SKU is only meant for test environment usage and doesn't support resource move operation. +You cannot move VPN Gateways across resource groups or subscriptions if they are of Basic SKU. Basic SKU is only meant for test environment usage and doesn't support resource move operation. ## Subnet links |
azure-resource-manager | Move Support Resources | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/move-support-resources.md | Before starting your move operation, review the [checklist](./move-resource-grou > | trafficmanagerprofiles / heatmaps | No | No | No | > | trafficmanagerusermetricskeys | No | No | No | > | virtualhubs | No | No | No |-> | virtualnetworkgateways | **Yes** | **Yes** - see [Networking move guidance](./move-limitations/networking-move-limitations.md) | No | +> | virtualnetworkgateways | **Yes** except Basic SKU - see [Networking move guidance](./move-limitations/networking-move-limitations.md)| **Yes** except Basic SKU - see [Networking move guidance](./move-limitations/networking-move-limitations.md) | No | > | virtualnetworks | **Yes** | **Yes** | No | > | virtualnetworktaps | No | No | No | > | virtualrouters | **Yes** | **Yes** | No | |
azure-signalr | Signalr Quickstart Azure Functions Python | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/signalr-quickstart-azure-functions-python.md | This quickstart can be run on macOS, Windows, or Linux. { "type": "http", "direction": "out",- "name": "res" + "name": "$return" } ] } |
azure-web-pubsub | Tutorial Pub Sub Messages | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/tutorial-pub-sub-messages.md | In this tutorial, you learn how to: [!INCLUDE [azure-web-pubsub-tutorial-prerequisites](includes/cli-awps-prerequisites.md)] +You can use the Windows cmd.exe command shell instead of a Bash shell to run the commands in this tutorial. + If creating the project on a local machine, you'll need to install the dependencies for the language you're using: # [C#](#tab/csharp) If creating the project on a local machine, you'll need to install the dependenc -## Setup +## Prepare your environment # [Local Azure CLI](#tab/LocalBash) If creating the project on a local machine, you'll need to install the dependenc -### Create a resource group +## Create a resource group [!INCLUDE [Create a resource group](includes/cli-rg-creation.md)] If creating the project on a local machine, you'll need to install the dependenc ### Create a Web PubSub instance +Use the Azure CLI [az webpubsub create](/cli/azure/webpubsub#az-webpubsub-create) command to create a Web PubSub in the resource group you've created. The following command creates a _Free_ Web PubSub resource under resource group `myResourceGroup` in `EastUS`: ++Each Web PubSub resource must have a unique name. Replace `<your-unique-resource-name>` with the name of your Web PubSub instance in the following command. ++```azurecli +az webpubsub create --resource-group myResourceGroup --name <your-unique-resource-name> --location EastUS --sku Free_F1 +``` ++The output of this command shows properties of the newly created resource. Take note of the two properties listed below: ++* **name**: The Web PubSub name you provided in the `--name` parameter above. +* **hostName**: In the example, the host name is `<your-unique-resource-name>.webpubsub.azure.com/`. ++At this point, your Azure account is the only one authorized to perform any operations on this new resource. ### Get the connection string Clients connect to the Azure Web PubSub service through the standard WebSocket p 1. First, create a project directory named `subscriber` for this project and install required dependencies: - * The package [Websocket.Client](https://github.com/Marfusios/websocket-client) is a third-party package supporting WebSocket connection. You can use any API/library that supports WebSocket to do so. + * The package [Websocket.Client](https://github.com/Marfusios/websocket-client) is a third-party package supporting WebSocket connections. You can use any API/library that supports WebSocket. * The SDK package `Azure.Messaging.WebPubSub` helps to generate the JWT token. ```bash Clients connect to the Azure Web PubSub service through the standard WebSocket p After the connection is established, your client receives messages through the WebSocket connection. The client uses `client.MessageReceived.Subscribe(msg => ...);` to listen for incoming messages. -1. Run the following command: +1. To start the subscriber, run the following command replacing `<Web-PubSub-connection-string>` with the connection string you copied earlier: ```bash- dotnet run $connection_string "myHub1" + dotnet run <Web-PubSub-connection-string> "myHub1" ``` # [JavaScript](#tab/javascript) Clients connect to the Azure Web PubSub service through the standard WebSocket p After the connection is established, your client receives messages through the WebSocket connection. 
The client uses `client.MessageReceived.Subscribe(msg => ...);` to listen for incoming messages. -1. Run the following command: +1. Run the following command replacing `<Web-PubSub-connection-string>` with the connection string you copied earlier. If you are using Windows command shell, you can use `set` instead of `export`. ```bash- export WebPubSubConnectionString=$connection_string - node subscribe + export WebPubSubConnectionString=<Web-PubSub-connection-string> + node subscribe.js ``` # [Python](#tab/python) Clients connect to the Azure Web PubSub service through the standard WebSocket p After the connection is established, your client will receive messages through the WebSocket connection. Use `await ws.recv()` to listen for incoming messages. -1. Run the following command: +1. Run the following command replacing `<Web-PubSub-connection-string>` with the connection string you copied earlier: ```bash- python subscribe.py $connection_string "myHub1" + python subscribe.py <Web-PubSub-connection-string> "myHub1" ``` # [Java](#tab/java) Clients connect to the Azure Web PubSub service through the standard WebSocket p 1. Inside the `pubsub` directory, use Maven to create a new console app called `webpubsub-quickstart-subscriber`, then go to the *webpubsub-quickstart-subscriber* directory: - ```console + ```bash mvn archetype:generate --define interactiveMode=n --define groupId=com.webpubsub.quickstart --define artifactId=webpubsub-quickstart-subscriber --define archetypeArtifactId=maven-archetype-quickstart --define archetypeVersion=1.4 cd webpubsub-quickstart-subscriber ``` Clients connect to the Azure Web PubSub service through the standard WebSocket p After the connection is established, your client will receive messages through the WebSocket connection. Use `onMessage(String message)` to listen for incoming messages. -1. Navigate to the *webpubsub-quickstart-subscriber* directory and run the app with following command: +1. To start the subscriber app, go to the *webpubsub-quickstart-subscriber* directory and run the following command. Replace `<Web-PubSub-connection-string>` with the connection string you copied earlier. - ```console - mvn compile & mvn package & mvn exec:java -Dexec.mainClass="com.webpubsub.quickstart.App" -Dexec.cleanupDaemonThreads=false -Dexec.args="$connection_string 'myHub1'" + ```bash + mvn compile & mvn package & mvn exec:java -Dexec.mainClass="com.webpubsub.quickstart.App" -Dexec.cleanupDaemonThreads=false -Dexec.args="<Web-PubSub-connection-string> 'myHub1'" ``` Create a publisher using the Azure Web PubSub SDK to publish a message to the co The `SendToAllAsync()` call simply sends a message to all connected clients in the hub. -1. Send a message by running the command: +1. Send a message by running the following command. Replace `<Web-PubSub-connection-string>` with the connection string you copied earlier. ```bash- dotnet run $connection_string "myHub1" "Hello World" + dotnet run <Web-PubSub-connection-string> "myHub1" "Hello World" ``` -1. Check the command shell of the previous subscriber to see that it received the message: +1. Check the command shell of the subscriber to see that it received the message: - ```text + ```console Message received: Hello World ``` Create a publisher using the Azure Web PubSub SDK to publish a message to the co The `service.sendToAll()` call simply sends a message to all connected clients in a hub. -1. Send a message by running the command: +1. 
To send a message, run the following command replacing `<Web-PubSub-connection-string>` with the connection string you copied earlier. If you are using Windows command shell, you can use `set` instead of `export`. ```bash- export WebPubSubConnectionString=$connection_string + export WebPubSubConnectionString=<Web-PubSub-connection-string> node publish "Hello World" ``` -1. You can see that the previous subscriber received the message: +1. You can see that the subscriber received the message: - ```text + ```console Message received: Hello World ``` Create a publisher using the Azure Web PubSub SDK to publish a message to the co The `send_to_all()` sends the message to all connected clients in a hub. -1. Run the following command: +1. To send a message, run the following command replacing `<Web-PubSub-connection-string>` with the connection string you copied earlier. ```bash- python publish.py $connection_string "myHub1" "Hello World" + python publish.py <Web-PubSub-connection-string> "myHub1" "Hello World" ``` 1. Check the previous command shell to see that the subscriber received the message: - ```text + ```console Received message: Hello World ``` Create a publisher using the Azure Web PubSub SDK to publish a message to the co 1. Go to the `pubsub` directory. Use Maven to create a publisher console app `webpubsub-quickstart-publisher` and go to the *webpubsub-quickstart-publisher* directory: - ```console + ```bash mvn archetype:generate --define interactiveMode=n --define groupId=com.webpubsub.quickstart --define artifactId=webpubsub-quickstart-publisher --define archetypeArtifactId=maven-archetype-quickstart --define archetypeVersion=1.4 cd webpubsub-quickstart-publisher ``` Create a publisher using the Azure Web PubSub SDK to publish a message to the co The `sendToAll()` call sends a message to all connected clients in a hub. -1. Go to the *webpubsub-quickstart-publisher* directory and run the project using the following command: +1. To send a message, go to the *webpubsub-quickstart-publisher* directory and run the project using the following command. Replace the `<Web-PubSub-connection-string>` with the connection string you copied earlier. - ```console - mvn compile & mvn package & mvn exec:java -Dexec.mainClass="com.webpubsub.quickstart.App" -Dexec.cleanupDaemonThreads=false -Dexec.args="$connection_string 'myHub1' 'Hello World'" + ```bash + mvn compile & mvn package & mvn exec:java -Dexec.mainClass="com.webpubsub.quickstart.App" -Dexec.cleanupDaemonThreads=false -Dexec.args="<Web-PubSub-connection-string> 'myHub1' 'Hello World'" ``` -1. You can see that the previous subscriber received the message: +1. You can see that the subscriber received the message: - ```text + ```console Message received: Hello World ``` az group delete --name <CloudShellResourceGroup> --yes ## Next steps -This tutorial provides you a basic idea of how to connect to the Web PubSub service and publish messages to the connected clients. +This tutorial provides you with a basic idea of how to connect to the Web PubSub service and publish messages to the connected clients. Check other tutorials to dive further into how to use the service. |
backup | Backup Azure Integrate Microsoft Defender Using Logic Apps | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-integrate-microsoft-defender-using-logic-apps.md | To deploy Azure Logic Apps, follow these steps: - Security Reader >[!Note]- >To further tighten the security, we recommend you create a custom role and assign that to the Managed Identity instead of the above built-in roles. This ensures that all the calls run with least privileges. For more information on custom role, see the [Github article](https://github.com/Azure/Microsoft-Defender-for-Cloud/tree/main/Workflow%20automation/Protect%20Azure%20VM%20Backup%20from%20Ransomware). + >To further tighten the security, we recommend that you create a custom role and assign it to the Managed Identity instead of the above built-in roles. This ensures that all the calls run with least privileges. For more information on custom roles, see the [GitHub article](https://github.com/Azure/Microsoft-Defender-for-Cloud/tree/main/Workflow%20automation/Protect%20Azure%20VM%20Backup%20from%20Ransomware). - **Managed Identity Subscription**: Enter the name of a Subscription that the Managed Identity should reside in. - **Managed Identity Resource Group**: Enter the name of a resource group that the Managed Identity should reside in. |
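A minimal sketch of the custom-role approach with the Azure CLI; the role name and `Actions` list here are illustrative placeholders, and the linked GitHub article defines the exact least-privilege action set:

```bash
# Hypothetical least-privilege role definition; trim Actions to what the Logic App actually calls
cat > logicapp-custom-role.json <<'EOF'
{
  "Name": "Backup Defender Responder (custom)",
  "IsCustom": true,
  "Description": "Least-privilege role for the Logic App's Managed Identity.",
  "Actions": [
    "Microsoft.RecoveryServices/vaults/read",
    "Microsoft.Security/alerts/read"
  ],
  "AssignableScopes": [ "/subscriptions/<subscription-id>" ]
}
EOF
az role definition create --role-definition @logicapp-custom-role.json

# Assign the custom role to the Managed Identity instead of the built-in roles
az role assignment create \
  --assignee-object-id "<managed-identity-principal-id>" \
  --assignee-principal-type ServicePrincipal \
  --role "Backup Defender Responder (custom)" \
  --scope "/subscriptions/<subscription-id>"
```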
backup | Backup Azure Restore Files From Vm | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-restore-files-from-vm.md | To restore files or folders from the recovery point, go to the virtual machine a  > [!IMPORTANT]-> Users should note the performance limitations of this feature. As pointed out in the footnote section of the above blade, this feature should be used when the total size of recovery is not beyond 10 GB and you could get data transfer speeds of around 1 GB per hour +> Users should note the performance limitations of this feature. As pointed out in the footnote section of the above blade, this feature should be used when the total size of recovery is 10 GB or less. The expected data transfer speeds are around 1 GB per hour. 4. From the **Select recovery point** drop-down menu, select the recovery point that holds the files you want. By default, the latest recovery point is already selected. |
certification | Program Requirements Edge Secured Core | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/certification/program-requirements-edge-secured-core.md | -zone_pivot_groups: app-service-platform-windows-linux +zone_pivot_groups: app-service-platform-windows-linux-sphere-rtos # Azure Certified Device - Edge Secured-core # zone_pivot_groups: app-service-platform-windows-linux ## Edge Secured-Core certification requirements ## ### Program purpose ###-Edge Secured-core is an incremental certification in the Azure Certified Device program for IoT devices running a full operating system, such as Linux or Windows 10 IoT.This program enables device partners to differentiate their devices by meeting an additional set of security criteria. Devices meeting this criteria enable these promises: +Edge Secured-core is an incremental certification in the Azure Certified Device program for IoT devices running a full operating system, such as Linux, Windows 10 IoT, or Azure Sphere OS. This program enables device partners to differentiate their devices by meeting an additional set of security criteria. Devices meeting these criteria enable these promises: 1. Hardware-based device identity 2. Capable of enforcing system integrity Edge Secured-core is an incremental certification in the Azure Certified Device 5. Provides data in-transit protection 6. Built in security agent and hardening -## Preview Program Support -While in public preview, we are supporting a small number of partners to pre-validate devices against the Edge Secured-core program requirements. If you would like to participate in the Edge Secured-core public preview, please contact iotcert@microsoft.com -Overview content ::: zone pivot="platform-windows" ## Windows IoT OS Support Edge Secured-core for Windows IoT requires Windows 10 IoT Enterprise version 190 </br> -|Name|SecuredCore.Hardware.Identity| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate the device identity is rooted in hardware and can be the primary authentication method with Azure IoT Hub Device Provisioning Service (DPS).| -|Target Availability|2022| +|Name|SecuredCore.Hardware.Identity|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate the device identity is rooted in hardware and can be the primary authentication method with Azure IoT Hub Device Provisioning Service (DPS).| |Requirements dependency|TPM v2.0 device| |Validation Type|Manual/Tools|-|Validation|Devices will be enrolled to DPS using the TPM authentication mechanism during testing.| +|Validation|Devices are enrolled to DPS using the TPM authentication mechanism during testing.| |Resources|Azure IoT Hub Device Provisioning Service: <ul><li>[Quickstart - Provision a simulated TPM device to Microsoft Azure IoT Hub](../iot-dps/quick-create-simulated-device-tpm.md) </li><li>[TPM Attestation Concepts](../iot-dps/concepts-tpm-attestation.md)</li></ul>| </br> -|Name|SecuredCore.Hardware.MemoryProtection| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that DMA is not enabled on externally accessible ports.| -|Target Availability|2022| +|Name|SecuredCore.Hardware.MemoryProtection|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate that DMA isn't enabled on externally accessible ports.| |Requirements dependency|Only if DMA capable ports exist| |Validation Type|Manual/Tools|-|Validation|If DMA capable external
ports exist on the device, toolset to validate that the IOMMU or SMMU is enabled and configured for those ports.| -|Resources|| +|Validation|If DMA capable external ports exist on the device, toolset to validate that the IOMMU or SMMU is enabled and configured for those ports.| + </br> -|Name|SecuredCore.Firmware.Protection| -|:|:| -|Status|Required| -|Description|The purpose of the test is to ensure that device has adequate mitigations from Firmware security threats.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.Protection|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to ensure that the device has adequate mitigations from Firmware security threats.| |Requirements dependency|DRTM + UEFI| |Validation Type|Manual/Tools|-|Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to confirm it is protected from firmware security threats through one of the following approaches: <ul><li>DRTM + UEFI Management Mode mitigations</li><li>DRTM + UEFI Management Mode hardening</li></ul> | +|Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to confirm it's protected from firmware security threats through one of the following approaches: <ul><li>DRTM + UEFI Management Mode mitigations</li><li>DRTM + UEFI Management Mode hardening</li></ul> | |Resources| <ul><li>https://trustedcomputinggroup.org/</li><li>[Intel's DRTM based computing whitepaper](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/drtm-based-computing-whitepaper.pdf)</li><li>[AMD Security whitepaper](https://www.amd.com/system/files/documents/amd-security-white-paper.pdf)</li></ul> | </br> -|Name|SecuredCore.Firmware.SecureBoot| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate the boot integrity of the device.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.SecureBoot|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate the boot integrity of the device.| |Requirements dependency|UEFI| |Validation Type|Manual/Tools| |Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to ensure that firmware and kernel signatures are validated every time the device boots. 
<ul><li>UEFI: Secure boot is enabled</li></ul>|-|Resources|| + </br> -|Name|SecuredCore.Firmware.Attestation| -|:|:| -|Status|Required| -|Description|The purpose of the test is to ensure the device can remotely attest to the Microsoft Azure Attestation service.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.Attestation|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to ensure the device can remotely attest to the Microsoft Azure Attestation service.| |Requirements dependency|Azure Attestation Service| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure that platform boot logs and measurements of boot activity can be collected and remotely attested to the Microsoft Azure Attestation service.| Edge Secured-core for Windows IoT requires Windows 10 IoT Enterprise version 190 </br> -|Name|SecuredCore.Encryption.Storage| -|:|:| -|Status|Required| -|Description|The purpose of the test to validate that sensitive data can be encrypted on non-volatile storage.| -|Target Availability|2022| +|Name|SecuredCore.Encryption.Storage|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate that sensitive data can be encrypted on nonvolatile storage.| |Validation Type|Manual/Tools| |Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to ensure Secure-boot and BitLocker are enabled and bound to PCR7.|-|Resources|| + </br> -|Name|SecuredCore.Encryption.TLS| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate support for required TLS versions and cipher suites.| -|Target Availability|2022| +|Name|SecuredCore.Encryption.TLS|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate support for required TLS versions and cipher suites.| |Requirements dependency|Windows 10 IoT Enterprise Version 1903 or greater. Note: other requirements may require greater versions for other services. 
| |Validation Type|Manual/Tools| Validation|Device to be validated through toolset to ensure the device supports a minimum TLS version of 1.2 and supports the following required TLS cipher suites.<ul><li>TLS_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_RSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_DHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256</li></ul>| Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Protection.CodeIntegrity| -|:|:| -|Status|Required| -|Description|The purpose of this test is to validate that code integrity is available on this device.| -|Target Availability|2022| +|Name|SecuredCore.Protection.CodeIntegrity|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of this requirement is to validate that code integrity is available on this device.| |Requirements dependency|HVCI is enabled on the device.| |Validation Type|Manual/Tools| |Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to ensure that HVCI is enabled on the device.| Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Protection.NetworkServices| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that services listening for input from the network are not running with elevated privileges.| -|Target Availability|2022| +|Name|SecuredCore.Protection.NetworkServices|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2024| +|Description|The purpose of the requirement is to validate that services listening for input from the network aren't running with elevated privileges.| |Validation Type|Manual/Tools|-|Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to ensure that 3rd party services accepting network connections are not running with elevated LocalSystem and LocalService privileges. <ol><li>Exceptions may apply</li></ol>| -|Resources|| +|Validation|Device to be validated through [Edge Secured-core Agent](https://aka.ms/Scforwiniot) toolset to ensure that third party services accepting network connections aren't running with elevated LocalSystem and LocalService privileges. <ol><li>Exceptions may apply</li></ol>| + Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Built-in.Security| -|:|:| -|Status|Coming Soon June 2022| -|Description|The purpose of the test is to make sure devices can report security information and events by sending data to Azure Defender for IoT. <br>Note: Download and deploy security agent from GitHub| +|Name|SecuredCore.Built-in.Security|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|Future|Future| +|Description|The purpose of the requirement is to make sure devices can report security information and events by sending data to Azure Defender for IoT. <br>Note: Download and deploy security agent from GitHub| |Target Availability|2022| |Validation Type|Manual/Tools| |Validation |Device must generate security logs and alerts. 
Device logs and alerts messages to Azure Security Center.<ol><li>Device must have the Azure Defender microagent running</li><li>Configuration_Certification_Check must report TRUE in the module twin</li><li>Validate alert messages from Azure Defender for IoT.</li></ol>| Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Protection.Baselines| -|:|:| -|Status|Coming Soon June 2022| -|Description|The purpose of the test is to validate that the system conforms to a baseline security configuration.| +|Name|SecuredCore.Protection.Baselines|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|Future|Future| +|Description|The purpose of the requirement is to validate that the system conforms to a baseline security configuration.| |Target Availability|2022| |Requirements dependency|Azure Defender for IoT| |Validation Type|Manual/Tools| Validation|Device to be validated through toolset to ensure the device supports ## Windows IoT Policy Requirements -Some requirements of this program are based on a business agreement between your company and Microsoft. The following requirements are not validated through our test harness, but are required by your company in certifying the device. +Some requirements of this program are based on a business agreement between your company and Microsoft. The following requirements aren't validated through our test harness, but are required by your company in certifying the device. </br> Some requirements of this program are based on a business agreement between your |Name|SecuredCore.Policy.Protection.Debug| |:|:| |Status|Required|-|Description|The purpose of the test is to validate that debug functionality on the device is disabled.| -|Target Availability|2022| +|Description|The purpose of the requirement is to validate that debug functionality on the device is disabled.| |Requirements dependency|| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure that debug functionality requires authorization to enable.|-|Resources|| + </br> Some requirements of this program are based on a business agreement between your |Name|SecuredCore.Policy.Manageability.Reset| |:|:| |Status|Required|-|Description|The purpose of this test is to validate the device against two use cases: a) Ability to perform a reset (remove user data, remove user configs), b) Restore device to last known good in the case of an update causing issues.| -|Target Availability|2022| +|Description|The purpose of this requirement is to validate the device against two use cases: a) Ability to perform a reset (remove user data, remove user configs), b) Restore device to last known good in the case of an update causing issues.| |Requirements dependency|| |Validation Type|Manual/Tools| |Validation|Device to be validated through a combination of toolset and submitted documentation that the device supports this functionality. The device manufacturer can determine whether to implement these capabilities to support remote reset or only local reset.|-|Resources|| + </br> Some requirements of this program are based on a business agreement between your |:|:| |Status|Required| |Description|The purpose of this policy is to ensure that the device remains secure.|-|Target Availability|2022| |Validation Type|Manual|-|Validation|Commitment from submission that devices certified will be required to keep devices up to date for 60 months from date of submission. 
Specifications available to the purchaser and devices itself in some manner should indicate the duration for which their software will be updated.| -|Resources|| +|Validation|Commitment from submission that devices certified can be kept up to date for 60 months from date of submission. Specifications available to the purchaser and the device itself in some manner should indicate the duration for which its software will be updated.| + </br> Some requirements of this program are based on a business agreement between your |Name|SecuredCore.Policy.Vuln.Disclosure| |:|:| |Status|Required|-|Description|The purpose of this policy is to ensure that there is a mechanism for collecting and distributing reports of vulnerabilities in the product.| -|Target Availability|2022| +|Description|The purpose of this policy is to ensure that there's a mechanism for collecting and distributing reports of vulnerabilities in the product.| |Validation Type|Manual| |Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|-|Resources|| + </br> Some requirements of this program are based on a business agreement between your |:|:| |Status|Required| |Description|The purpose of this policy is to ensure that vulnerabilities that are high/critical (using CVSS 3.0) are addressed within 180 days of the fix being available.|-|Target Availability|2022| |Validation Type|Manual| |Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|-|Resources|| + </br> Some requirements of this program are based on a business agreement between your ## Linux OS Support OS Support is determined through underlying requirements of Azure services and our ability to validate scenarios. -The Edge Secured-core program for Linux is enabled through the IoT Edge runtime which is supported based on [Tier 1 and Tier 2 operating systems](../iot-edge/support.md). +The Edge Secured-core program for Linux is enabled through the IoT Edge runtime, which is supported based on [Tier 1 and Tier 2 operating systems](../iot-edge/support.md). ## IoT Edge Edge Secured-core validation on Linux based devices is executed through a container run on the IoT Edge runtime. For this reason, all devices that are certifying Edge Secured-core must have the IoT Edge runtime installed. Edge Secured-core validation on Linux based devices is executed through a contai > * Hardware must support TPM v2.0, SRTM, Secure-boot or UBoot. > * Firmware will be submitted to Microsoft for vulnerability and configuration evaluation. 
+ -|Name|SecuredCore.Hardware.Identity| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate the device identify is rooted in hardware.| -|Target Availability|2022| -|Requirements dependency|TPM v2.0 device| -|Validation Type|Manual/Tools| -|Validation|Device to be validated through toolset to ensure that the device has a TPM present and that it can be provisioned through DPS using TPM endorsement key.| -|Resources|[Setup auto provisioning with DPS](../iot-dps/quick-setup-auto-provision.md)| +|Name|SecuredCore.Hardware.Identity|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate the device identity is rooted in hardware.||| +|Requirements dependency||TPM v2.0 device|TPM v2.0 </br><sup>or *other supported method</sup>| +|Validation Type|Manual/Tools||| +|Validation|Device to be validated through toolset to ensure that the device has a HWRoT present and that it can be provisioned through DPS using TPM or SE.||| +|Resources|[Setup auto provisioning with DPS](../iot-dps/quick-setup-auto-provision.md)||| </br> -|Name|SecuredCore.Hardware.MemoryProtection| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that DMA is not enabled on externally accessible ports.| -|Target Availability|2022| +|Name|SecuredCore.Hardware.MemoryProtection|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to ensure that memory integrity helps protect the device from vulnerable peripherals.| |Validation Type|Manual/Tools|-|Validation|If DMA capable external ports exist on the device, toolset to validate that the IOMMU or SMMU is enabled and configured for those ports.| -|Resources|| +|Validation|Memory regions for peripherals must be gated with hardware/firmware such as memory region domain controllers or SMMU (System Memory Management Unit).| + </br> -|Name|SecuredCore.Firmware.Protection| -|:|:| -|Status|Required| -|Description|The purpose of the test is to ensure that device has adequate mitigations from Firmware security threats.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.Protection|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to ensure that the device has adequate mitigations from Firmware security threats.| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to confirm it is protected from firmware security threats through one of the following approaches: <ul><li>Approved FW that does SRTM + runtime firmware hardening</li><li>Firmware scanning and evaluation by approved Microsoft 3rd party</li></ul> | +|Validation|Device to be validated through toolset to confirm it's protected from firmware security threats through one of the following approaches: <ul><li>Approved FW that does SRTM + runtime firmware hardening</li><li>Firmware scanning and evaluation by approved Microsoft third party</li></ul> | |Resources| https://trustedcomputinggroup.org/ | </br> -|Name|SecuredCore.Firmware.SecureBoot| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate the boot integrity of the device.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.SecureBoot|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate the boot integrity of the device.| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure that firmware and kernel 
signatures are validated every time the device boots. <ul><li>UEFI: Secure boot is enabled</li><li>Uboot: Verified boot is enabled</li></ul>|-|Resources|| + </br> -|Name|SecuredCore.Firmware.Attestation| -|:|:| -|Status|Required| -|Description|The purpose of the test is to ensure the device can remotely attest to the Microsoft Azure Attestation service.| -|Target Availability|2022| +|Name|SecuredCore.Firmware.Attestation|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to ensure the device can remotely attest to the Microsoft Azure Attestation service.| +|Dependency||TPM 2.0|TPM 2.0 </br><sup>or *supported OP-TEE based application chained to a HWRoT (Secure Element or Secure Enclave)</sup>| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure that platform boot logs and measurements of boot activity can be collected and remotely attested to the Microsoft Azure Attestation service.| -|Resources| [Microsoft Azure Attestation](../attestation/index.yml) | +|Validation|Device to be validated through toolset to ensure that platform boot logs and applicable runtime measurements can be collected and remotely attested to the Microsoft Azure Attestation service.| +|Resources| [Microsoft Azure Attestation](../attestation/index.yml) </br> Certification portal test includes an attestation client that, when combined with the TPM 2.0, can validate the Microsoft Azure Attestation service.| </br> -|Name|SecuredCore.Hardware.SecureEnclave| -|:|:| -|Status|Optional| -|Description|The purpose of the test to validate the existence of a secure enclave and that the enclave is accessible from a secure agent.| -|Target Availability|2022| +|Name|SecuredCore.Hardware.SecureEnclave|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|Future|Future| +|Description|The purpose of the requirement is to validate the existence of a secure enclave and that the enclave can be used for security functions.| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure the Azure Security Agent can communicate with the secure enclave| -|Resources|https://github.com/openenclave/openenclave/blob/master/samples/BuildSamplesLinux.md| +|Validation|| + ## Linux Configuration Requirements -|Name|SecuredCore.Encryption.Storage| -|:|:| -|Status|Required| -|Description|The purpose of the test to validate that sensitive data can be encrypted on non-volatile storage.| -|Target Availability|2022| +|Name|SecuredCore.Encryption.Storage|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate that sensitive data can be encrypted on nonvolatile storage.| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure storage encryption is enabled and default algorithm is XTS-AES, with key length 128 bits or higher.|-|Resources|| + </br> -|Name|SecuredCore.Encryption.TLS| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate support for required TLS versions and cipher suites.| -|Target Availability|2022| +|Name|SecuredCore.Encryption.TLS|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate support for required TLS versions and cipher suites.| |Validation Type|Manual/Tools| Validation|Device to be validated through toolset to ensure the device supports a minimum TLS version of 1.2 and supports the following required TLS cipher 
suites.<ul><li>TLS_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_RSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_DHE_RSA_WITH_AES_128_GCM_SHA256</li><li>TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256</li><li>TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256</li></ul>| |Resources| [TLS support in IoT Hub](../iot-hub/iot-hub-tls-support.md) <br /> | Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Protection.CodeIntegrity| -|:|:| -|Status|Required| -|Description|The purpose of this test is to validate that code integrity is available on this device.| -|Target Availability|2022| +|Name|SecuredCore.Protection.CodeIntegrity|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of this requirement is to validate that authorized code runs with least privilege.| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure that code integrity is enabled by validating dm-verity and IMA|-|Resources|| + </br> -|Name|SecuredCore.Protection.NetworkServices| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that applications accepting input from the network are not running with elevated privileges.| -|Target Availability|2022| +|Name|SecuredCore.Protection.NetworkServices|x86/AMD64|Arm64| +|:|:|:|:| +|Status|<sup>*</sup>Required|2023|2023| +|Description|The purpose of the requirement is to validate that applications accepting input from the network aren't running with elevated privileges.| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure that services accepting network connections are not running with SYSTEM or root privileges.| -|Resources|| +|Validation|Device to be validated through toolset to ensure that services accepting network connections aren't running with SYSTEM or root privileges.| + ## Linux Software/Service Requirements -|Name|SecuredCore.Built-in.Security| -|:|:| -|Status|Required| -|Description|The purpose of the test is to make sure devices can report security information and events by sending data to Azure Defender for IoT. <br>Note: Download and deploy security agent from GitHub| -|Target Availability|2022| +|Name|SecuredCore.Built-in.Security|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to make sure devices can report security information and events by sending data to Microsoft Defender for IoT.| |Validation Type|Manual/Tools|-|Validation |Device must generate security logs and alerts. 
Device logs and alerts messages to Azure Security Center.<ol><li>Device must have the Azure Defender microagent running</li><li>Configuration_Certification_Check must report TRUE in the module twin</li><li>Validate alert messages from Azure Defender for IoT.</li></ol>| +|Validation |<ol><li>Device must generate security logs and alerts.</li><li>Device logs and alerts messages to Azure Security Center.</li><li>Device must have the Azure Defender for IoT microagent running</li><li>Configuration_Certification_Check must report TRUE in the module twin</li><li>Validate alert messages from Azure Defender for IoT.</li></ol>| |Resources|[Azure Docs IoT Defender for IoT](../defender-for-iot/how-to-configure-agent-based-solution.md)| </br> -|Name|SecuredCore.Manageability.Configuration| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate the devices supports Remote administration via OSConfig.| -|Target Availability|2022| +|Name|SecuredCore.Manageability.Configuration|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate that device supports auditing and setting of system configuration (and certain management actions such as reboot) through Azure.| +|Dependency|azure-osconfig| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure the device supports the ability to be remotely manageable and configured by OSConfig.| -|Resources|| +|Validation|<ol><li>Device must report, via IoT Hub, its firewall state, firewall fingerprint, ip addresses, network adapter state, host name, hosts file, TPM (absence, or presence with version) and package manager sources (see What can I manage) </li><li>Device must accept the creation, via IoT Hub, of a default firewall policy (accept vs drop), and at least one firewall rule, with positive remote acknowledgment (see configurationStatus)</li><li>Device must accept the replacement of /etc/hosts file contents via IoT Hub, with positive remote acknowledgment (see https://learn.microsoft.com/en-us/azure/osconfig/howto-hosts?tabs=portal#the-object-model )</li><li>Device must accept and implement, via IoT Hub, remote reboot</li></ol> Note: Use of other system management toolchains (for example, Ansible, etc.) 
by operators is not prohibited, but the device must include the azure-osconfig agent such that it's ready to be managed from Azure.| + </br> -|Name|SecuredCore.Update| -|:|:| -|Status|Audit| -|Description|The purpose of the test is to validate the device can receive and update its firmware and software.| -|Target Availability|2022| +|Name|SecuredCore.Update|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Audit|2023|2023| +|Description|The purpose of the requirement is to validate the device can receive and update its firmware and software.| |Validation Type|Manual/Tools| |Validation|Partner confirmation that they were able to send an update to the device through Azure Device update and other approved services.| |Resources|[Device Update for IoT Hub](../iot-hub-device-update/index.yml)| Validation|Device to be validated through toolset to ensure the device supports </br> -|Name|SecuredCore.Protection.Baselines| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that the system conforms to a baseline security configuration.| -|Target Availability|2022| +|Name|SecuredCore.Protection.Baselines|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate the extent to which the device implements the Azure Security Baseline.| +|Dependency|azure-osconfig| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure that Defender IOT system configurations benchmarks have been run.| -|Resources| https://techcommunity.microsoft.com/t5/microsoft-security-baselines/bg-p/Microsoft-Security-Baselines <br> https://www.cisecurity.org/cis-benchmarks/ | +|Validation|OSConfig is present on the device and reporting to what extent it implements the Azure Security Baseline.| +|Resources| <ul><li>https://techcommunity.microsoft.com/t5/microsoft-security-baselines/bg-p/Microsoft-Security-Baselines </li><li> https://www.cisecurity.org/cis-benchmarks/ </li><li>https://learn.microsoft.com/en-us/azure/governance/policy/samples/guest-configuration-baseline-linux</li></ul>| </br> -|Name|SecuredCore.Protection.SignedUpdates| -|:|:| -|Status|Required| -|Description|The purpose of the test is to validate that updates must be signed.| -|Target Availability|2022| +|Name|SecuredCore.Protection.SignedUpdates|x86/AMD64|Arm64| +|:|:|:|:| +|Status|Required|2023|2023| +|Description|The purpose of the requirement is to validate that updates must be signed.| |Validation Type|Manual/Tools|-|Validation|Device to be validated through toolset to ensure that updates to the operating system, drivers, application software, libraries, packages and firmware will not be applied unless properly signed and validated. -|Resources|| +|Validation|Device to be validated through toolset to ensure that updates to the operating system, drivers, application software, libraries, packages and firmware won't be applied unless properly signed and validated. 
+ Validation|Device to be validated through toolset to ensure the device supports |Name|SecuredCore.Policy.Protection.Debug| |:|:| |Status|Required|-|Description|The purpose of the test is to validate that debug functionality on the device is disabled.| -|Target Availability|2022| +|Description|The purpose of the requirement is to validate that debug functionality on the device is disabled.| |Validation Type|Manual/Tools| |Validation|Device to be validated through toolset to ensure that debug functionality requires authorization to enable.|-|Resources|| + </br> Validation|Device to be validated through toolset to ensure the device supports |Name|SecuredCore.Policy.Manageability.Reset| |:|:| |Status|Required|-|Description|The purpose of this test is to validate the device against two use cases: a) Ability to perform a reset (remove user data, remove user configs), b) Restore device to last known good in the case of an update causing issues.| -|Target Availability|2022| +|Description|The purpose of this requirement is to validate the device against two use cases: a) Ability to perform a reset (remove user data, remove user configs), b) Restore device to last known good if an update causes issues.| |Validation Type|Manual/Tools| |Validation|Device to be validated through a combination of toolset and submitted documentation that the device supports this functionality. The device manufacturer can determine whether to implement these capabilities to support remote reset or only local reset.|-|Resources|| + </br> Validation|Device to be validated through toolset to ensure the device supports |:|:| |Status|Required| |Description|The purpose of this policy is to ensure that the device remains secure.|-|Target Availability|2022| |Validation Type|Manual| |Validation|Commitment from submission that devices certified will be required to keep devices up to date for 60 months from date of submission. Specifications available to the purchaser and the device itself in some manner should indicate the duration for which its software will be updated.|-|Resources|| + </br> Validation|Device to be validated through toolset to ensure the device supports |Name|SecuredCore.Policy.Vuln.Disclosure| |:|:| |Status|Required|-|Description|The purpose of this policy is to ensure that there is a mechanism for collecting and distributing reports of vulnerabilities in the product.| -|Target Availability|2022| +|Description|The purpose of this policy is to ensure that there's a mechanism for collecting and distributing reports of vulnerabilities in the product.| |Validation Type|Manual| |Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|-|Resources|| + </br> Validation|Device to be validated through toolset to ensure the device supports |:|:| |Status|Required| |Description|The purpose of this policy is to ensure that vulnerabilities that are high/critical (using CVSS 3.0) are addressed within 180 days of the fix being available.|-|Target Availability|2022| |Validation Type|Manual| |Validation|Documentation on the process for submitting and receiving vulnerability reports for the certified devices will be reviewed.|-|Resources|| +++</br> +<!-> +<!-> +<!-> ++## Azure Sphere platform Support +The Mediatek MT3620AN must be included in your design. Additional guidance for building secured Azure Sphere applications can be found in the [Azure Sphere application notes](https://learn.microsoft.com/azure-sphere/app-notes/app-notes-overview). 
+++## Azure Sphere Hardware/Firmware Requirements +++|Name|SecuredCore.Hardware.Identity|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate the device identity is rooted in hardware.|| +|Validation Type|Prevalidated, no additional validation is required|| +|Validation|Provided by Microsoft|| +++</br> ++|Name|SecuredCore.Hardware.MemoryProtection|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to ensure that memory integrity helps protect the device from vulnerable peripherals.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++</br> +++|Name|SecuredCore.Firmware.Protection|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to ensure that device has adequate mitigations from Firmware security threats.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Firmware.SecureBoot|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate the boot integrity of the device.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Firmware.Attestation|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to ensure the device can remotely attest to a Microsoft Azure Attestation service.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Hardware.SecureEnclave|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to validate hardware security that is accessible from a secure operating system.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++## Azure Sphere OS Configuration Requirements +++|Name|SecuredCore.Encryption.Storage|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to validate that sensitive data can be encrypted on nonvolatile storage.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +|Resources|[Data at rest protection on Azure Sphere](https://learn.microsoft.com/azure-sphere/app-notes/app-notes-overview)| +++</br> ++|Name|SecuredCore.Encryption.TLS|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate support for required TLS versions and cipher suites.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +|Resources| [TLS support in IoT Hub](../iot-hub/iot-hub-tls-support.md) <br /> | +++</br> ++|Name|SecuredCore.Protection.CodeIntegrity|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to validate that authorized code runs with least privilege.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++</br> ++|Name|SecuredCore.Protection.NetworkServices|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate that applications accepting input from the network aren't running with elevated privileges.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by 
Microsoft| +++</br> ++|Name|SecuredCore.Protection.NetworkFirewall|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to validate that applications can't connect to endpoints that haven't been authorized.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++## Azure Sphere Software/Service Requirements ++|Name|SecuredCore.Built-in.Security|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to make sure devices can report security information and events by sending data to a Microsoft telemetry service.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +|Resources|[Collect and interpret error data - Azure Sphere](https://learn.microsoft.com/azure-sphere/deployment/interpret-error-data?tabs=cliv2beta)</br>[Configure crash dumps - Azure Sphere](https://learn.microsoft.com/azure-sphere/deployment/configure-crash-dumps)| +++</br> ++|Name|SecuredCore.Manageability.Configuration|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of this requirement is to validate the device supports remote administration via service-based configuration control.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++</br> ++|Name|SecuredCore.Update|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate the device can receive and update its firmware and software.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++</br> ++|Name|SecuredCore.Protection.Baselines|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate that the system conforms to a baseline security configuration| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| +++</br> ++|Name|SecuredCore.Protection.SignedUpdates|Azure Sphere| +|:|:|:| +|Status|Required|2023| +|Description|The purpose of the requirement is to validate that updates must be signed.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++## Azure Sphere Policy Requirements ++|Name|SecuredCore.Policy.Protection.Debug| +|:|:| +|Status|Required| +|Description|The purpose of the policy requires that debug functionality on the device is disabled.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Policy.Manageability.Reset| +|:|:| +|Status|Required| +|Description|The policy requires that the device can execute two use cases: a) Ability to perform a reset (remove user data, remove user configurations), b) Restore device to last known good in the case of an update causing issues.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Policy.Updates.Duration| +|:|:| +|Status|Required| +|Description|The purpose of this policy is to ensure that the device remains secure.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| ++++</br> ++|Name|SecuredCore.Policy.Vuln.Disclosure| +|:|:| +|Status|Required| +|Description|The purpose of this policy is to ensure that there's a mechanism for collecting and distributing reports of vulnerabilities 
in the product.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Azure Sphere vulnerabilities are collected by Microsoft through MSRC and are published to customers through the Tech Community Blog, Azure Sphere “What’s New” page, and through Mitre’s CVE database.| +|Resources|<ul><li>[Report an issue and submission guidelines](https://www.microsoft.com/msrc/faqs-report-an-issue)</li><li>[What's new - Azure Sphere](https://learn.microsoft.com/azure-sphere/product-overview/whats-new)</li><li> +[Azure Sphere CVEs](https://learn.microsoft.com/azure-sphere/deployment/azure-sphere-cves)</li></ul>| +++</br> ++|Name|SecuredCore.Policy.Vuln.Fixes| +|:|:| +|Status|Required| +|Description|The purpose of this policy is to ensure that vulnerabilities that are high/critical (using CVSS 3.0) are addressed within 180 days of the fix being available.| +|Validation Type|Prevalidated, no additional validation is required| +|Validation|Provided by Microsoft| + </br> ::: zone-end |
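For device builders who want a rough self-assessment before running the official validation toolset, a few of the Linux requirements can be spot-checked from a shell. This is an illustrative sketch only (command availability varies by distro and the hub name is a placeholder); it isn't a substitute for the Edge Secured-core Agent:

```bash
# TPM 2.0 present? (SecuredCore.Hardware.Identity)
ls /dev/tpmrm0 && cat /sys/class/tpm/tpm0/tpm_version_major

# UEFI Secure Boot enabled? (SecuredCore.Firmware.SecureBoot)
mokutil --sb-state

# dm-verity or IMA active? (SecuredCore.Protection.CodeIntegrity)
dmesg | grep -iE 'dm-verity|ima'

# TLS 1.2 with one of the required cipher suites (SecuredCore.Encryption.TLS);
# ECDHE-RSA-AES128-GCM-SHA256 is the OpenSSL name for TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
openssl s_client -connect <your-iot-hub>.azure-devices.net:443 -tls1_2 \
  -cipher ECDHE-RSA-AES128-GCM-SHA256 </dev/null | grep -E 'Protocol|Cipher'
```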
cognitive-services | Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/overview.md | With [text to speech](text-to-speech.md), you can convert input text into humanl ### Intent recognition -[Intent recognition](./intent-recognition.md): Use speech-to-text with [Language Understanding (LUIS)](../luis/index.yml) to derive user intents from transcribed speech and act on voice commands. +[Intent recognition](./intent-recognition.md): Use speech-to-text with conversational language understanding to derive user intents from transcribed speech and act on voice commands. ## Delivery and presence |
cognitive-services | Speech Container Howto | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/speech-container-howto.md | With Speech containers, you can build a speech application architecture that's o | Container | Features | Supported versions and locales | |--|--|--|-| Speech-to-text | Analyzes sentiment and transcribes continuous real-time speech or batch audio recordings with intermediate results. | Latest: 3.11.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/speech-to-text/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/speech-to-text/tags/list).| -| Custom speech-to-text | Using a custom model from the [Custom Speech portal](https://speech.microsoft.com/customspeech), transcribes continuous real-time speech or batch audio recordings into text with intermediate results. | Latest: 3.11.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/custom-speech-to-text/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/speech-to-text/tags/list). | +| Speech-to-text | Analyzes sentiment and transcribes continuous real-time speech or batch audio recordings with intermediate results. | Latest: 3.12.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/speech-to-text/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/speech-to-text/tags/list).| +| Custom speech-to-text | Using a custom model from the [Custom Speech portal](https://speech.microsoft.com/customspeech), transcribes continuous real-time speech or batch audio recordings into text with intermediate results. | Latest: 3.12.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/custom-speech-to-text/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/speech-to-text/tags/list). | | Speech language identification | Detects the language spoken in audio files. | Latest: 1.5.0<sup>1</sup><br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/language-detection/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/language-detection/tags/list). |-| Neural text-to-speech | Converts text to natural-sounding speech by using deep neural network technology, which allows for more natural synthesized speech. | Latest: 2.10.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/neural-text-to-speech/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/neural-text-to-speech/tags/list). | +| Neural text-to-speech | Converts text to natural-sounding speech by using deep neural network technology, which allows for more natural synthesized speech. 
| Latest: 2.11.0<br/><br/>For all supported versions and locales, see the [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/product/azure-cognitive-services/speechservices/neural-text-to-speech/tags) and [JSON tags](https://mcr.microsoft.com/v2/azure-cognitive-services/speechservices/neural-text-to-speech/tags/list). | <sup>1</sup> The container is available in public preview. Containers in preview are still under development and don't meet Microsoft's stability and support requirements. |
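As a reference for picking up one of the new versions, a representative `docker run` for the speech-to-text container follows; the tag, memory/CPU sizing, endpoint, and key are placeholders, and the exact tags are listed on the MCR pages above:

```bash
docker run --rm -it -p 5000:5000 --memory 8g --cpus 4 \
  mcr.microsoft.com/azure-cognitive-services/speechservices/speech-to-text:3.12.0-amd64-en-us \
  Eula=accept \
  Billing=<your-speech-resource-endpoint> \
  ApiKey=<your-speech-resource-key>
```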
cognitive-services | Speech Synthesis Markup Structure | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/speech-synthesis-markup-structure.md | This example uses the `en-US-JennyNeural` voice. For more examples, see [voice e </speak> ``` -## Add or prevent a break +## Add a break -Use the `break` element to override the default behavior of breaks or pauses between words. You can use it to add or prevent pauses that are otherwise automatically inserted by the Speech service. +Use the `break` element to override the default behavior of breaks or pauses between words. You can use it to add pauses in addition to those that the Speech service inserts automatically. Usage of the `break` element's attributes is described in the following table. | Attribute | Description | Required or optional | | - | - | - |-| `strength` | The relative duration of a pause by using one of the following values:<br/><ul><li>none</li><li>x-weak</li><li>weak</li><li>medium (default)</li><li>strong</li><li>x-strong</li></ul><br/><br/>Set `strength` to `none` to prevent automatic insertion of a prosodic break. | Optional | +| `strength` | The relative duration of a pause by using one of the following values:<br/><ul><li>x-weak</li><li>weak</li><li>medium (default)</li><li>strong</li><li>x-strong</li></ul>| Optional | | `time` | The absolute duration of a pause in seconds (such as `2s`) or milliseconds (such as `500ms`). Valid values range from 0 to 5000 milliseconds. If you set a value greater than the supported maximum, the service will use `5000ms`. If the `time` attribute is set, the `strength` attribute is ignored.| Optional | Here are more details about the `strength` attribute. | Strength | Relative duration | | - | - |-| None, or if no value provided | 0 ms | | X-weak | 250 ms | | Weak | 500 ms | | Medium | 750 ms | Here are more details about the `strength` attribute. ### Break examples -The supported values for attributes of the `break` element were [described previously](#add-or-prevent-a-break). +The supported values for attributes of the `break` element were [described previously](#add-a-break). The following three ways all add 750 ms breaks. ```xml <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-US"> <voice name="en-US-JennyNeural"> Welcome <break /> to text-to-speech. Welcome <break strength="medium" /> to text-to-speech.- Welcome <break time="250ms" /> to text-to-speech. + Welcome <break time="750ms" /> to text-to-speech. </voice> </speak> ``` |
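Two behaviors from the attribute table are worth illustrating: when both attributes are set, `time` wins and `strength` is ignored, and `time` values above the maximum are clamped to 5000 ms. A small sketch using the same voice as the other examples:

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-US">
  <voice name="en-US-JennyNeural">
    <!-- time overrides strength: this pause is 500 ms, not the longer x-strong default -->
    Welcome <break strength="x-strong" time="500ms" /> to text-to-speech.
    <!-- values above the supported maximum are clamped: this pause renders as 5000 ms -->
    Welcome <break time="6000ms" /> to text-to-speech.
  </voice>
</speak>
```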
cognitive-services | Managed Identity | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/openai/how-to/managed-identity.md | Assigning yourself to the Cognitive Services User role will allow you to use you Use the access token to authorize your API call by setting the `Authorization` header value. ```bash- curl ${endpoint%/}/openai/deployment/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01 \ + curl ${endpoint%/}/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2022-12-01 \ -H "Content-Type: application/json" \ -H "Authorization: Bearer $accessToken" \ -d '{ "prompt": "Once upon a time" }' Before you can use managed identities for Azure resources to authorize access to - [Azure Resource Manager template](../../../active-directory/managed-identities-azure-resources/qs-configure-template-windows-vm.md) - [Azure Resource Manager client libraries](../../../active-directory/managed-identities-azure-resources/qs-configure-sdk-windows-vm.md) -For more information about managed identities, see [Managed identities for Azure resources](../../../active-directory/managed-identities-azure-resources/overview.md). +For more information about managed identities, see [Managed identities for Azure resources](../../../active-directory/managed-identities-azure-resources/overview.md). |
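For completeness, one way to populate `$accessToken` and `$endpoint` for the call above is with the Azure CLI, assuming your signed-in identity holds the Cognitive Services User role; the resource and group names are placeholders:

```bash
# Token scoped to Cognitive Services (covers Azure OpenAI)
accessToken=$(az account get-access-token \
  --resource https://cognitiveservices.azure.com \
  --query accessToken --output tsv)

# Endpoint of the Azure OpenAI resource
endpoint=$(az cognitiveservices account show \
  --name <azure-openai-resource-name> \
  --resource-group <resource-group> \
  --query properties.endpoint --output tsv)
```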
cognitive-services | Prepare Dataset | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/openai/how-to/prepare-dataset.md | The first step of customizing your model is to prepare a high quality dataset. T - Each completion should start with a whitespace due to our tokenization, which tokenizes most words with a preceding whitespace. - Each completion should end with a fixed stop sequence to inform the model when the completion ends. A stop sequence could be `\n`, `###`, or any other token that doesn't appear in any completion. - For inference, you should format your prompts in the same way as you did when creating the training dataset, including the same separator. Also specify the same stop sequence to properly truncate the completion.+- The dataset cannot exceed 100 MB in total file size. ## Best practices |
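A small sketch of what conforming training rows look like; the `->` separator and the `\n###\n` stop sequence are arbitrary choices for illustration, not required values:

```bash
# Each JSONL row: the prompt ends with the separator, the completion starts
# with a space and ends with the fixed stop sequence
cat > training_data.jsonl <<'EOF'
{"prompt": "Ticket: My VM won't start ->", "completion": " Check boot diagnostics first.\n###\n"}
{"prompt": "Ticket: How do I reset my password? ->", "completion": " Use the self-service reset portal.\n###\n"}
EOF
```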
communication-services | Known Issues | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/known-issues.md | This issue is fixed in Azure Communication Services Calling SDK version 1.3.1-be * iOS Safari version: 15.1 +### macOS Ventura Safari (v16.3 and earlier) screen sharing +Screen sharing doesn't work in macOS Ventura Safari (v16.3 and earlier). This is a known Safari issue that will be fixed in v16.4+. + ### Refreshing a page doesn't immediately remove the user from their call If a user is in a call and decides to refresh the page, the Communication Services media service won't remove this user immediately from the call. It will wait for the user to rejoin. The user will be removed from the call after the media service times out. The environment in which this problem occurs is the following: The cause of this problem might be that acquiring your own stream from the same device will have a side effect of running into race conditions. Acquiring streams from other devices might lead the user into insufficient USB/IO bandwidth, and the `sourceUnavailableError` rate will skyrocket. -### Support for simulcast --Simulcast is a technique by which a client encodes the same video stream twice, in different resolutions and bitrates. The client then lets Communication Services decide which stream a client should receive. The Communication Services calling library SDK for Windows, Android, or iOS supports sending simulcast streams. The Communication Services Web SDK doesn't currently support sending simulcast streams out. - ## Communication Services Call Automation APIs The following are known issues in the Communication Services Call Automation APIs: |
container-instances | Container Instances Application Gateway | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-instances/container-instances-application-gateway.md | az network application-gateway create \ --vnet-name myVNet \ --subnet myAGSubnet \ --servers "$ACI_IP" \+ --priority 100 ``` |
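For context, a sketch of how the completed command might look after this change; the resource names are placeholders, and other required parameters from the original article (SKU, capacity, frontend settings, and so on) are elided:

```bash
az network application-gateway create \
  --name myAppGateway \
  --resource-group myResourceGroup \
  --vnet-name myVNet \
  --subnet myAGSubnet \
  --servers "$ACI_IP" \
  --priority 100
```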
container-registry | Container Registry Helm Repos | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-helm-repos.md | Run `helm registry login` to authenticate with the registry. You may pass [regi --scopes $(az acr show --name $ACR_NAME --query id --output tsv) \ --role acrpush \ --query "password" --output tsv)- USER_NAME=$(az ad sp list --display-name $SERVICE_PRINCIPAL_NAME --query "[].appId" --output tsv) + USER_NAME=$(az identity show -n $SERVICE_PRINCIPAL_NAME -g $RESOURCE_GROUP_NAME --subscription $SUBSCRIPTION_ID --query "clientId" -o tsv) ``` - Authenticate with your [individual Azure AD identity](container-registry-authentication.md?tabs=azure-cli#individual-login-with-azure-ad) to push and pull Helm charts using an AD token. ```azurecli |
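Once `USER_NAME` and `PASSWORD` are populated, the login itself is a single command. A minimal sketch, assuming `$ACR_NAME` holds your registry name:

```bash
# Authenticate Helm against the registry's OCI endpoint
helm registry login "$ACR_NAME.azurecr.io" \
  --username "$USER_NAME" \
  --password "$PASSWORD"
```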
cosmos-db | Role Based Access Control | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/role-based-access-control.md | This setting will prevent any changes to any Azure Cosmos DB resource from any c - Creating, deleting child resources such as databases and containers. This includes resources for other APIs such as Cassandra, MongoDB, Gremlin, and table resources. -- Updating throughput on database or container level resources.-+- Reading or updating throughput on database or container level resources. - Modifying container properties including index policy, TTL and unique keys. - Modifying stored procedures, triggers or user-defined functions. Update-AzCosmosDBAccount -ResourceGroupName [ResourceGroupName] -Name [CosmosDBA - [Azure custom roles](../role-based-access-control/custom-roles.md) - [Azure Cosmos DB resource provider operations](../role-based-access-control/resource-provider-operations.md#microsoftdocumentdb) - [Configure role-based access control for your Azure Cosmos DB for MongoDB](mongodb/how-to-setup-rbac.md)+ |
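If you prefer the Azure CLI to the `Update-AzCosmosDBAccount` cmdlet shown above, a roughly equivalent sketch (account and resource group names are placeholders) is:

```bash
# Block resource changes from clients that connect with account keys
az cosmosdb update \
  --name myCosmosAccount \
  --resource-group myResourceGroup \
  --disable-key-based-metadata-write-access true
```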
cost-management-billing | Understand Usage | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/understand/understand-usage.md | tags: billing Previously updated : 08/17/2022 Last updated : 03/13/2023 BillingPeriodStartDate | All | The start date of the billing period. BillingProfileId¹ | All | Unique identifier of the EA enrollment, PAYG subscription, MCA billing profile, or AWS consolidated account. BillingProfileName | All | Name of the EA enrollment, PAYG subscription, MCA billing profile, or AWS consolidated account. ChargeType | All | Indicates whether the charge represents usage (**Usage**), a purchase (**Purchase**), or a refund (**Refund**).-ConsumedService | All | Name of the service the charge is associated with. +ConsumedService | EA, PAYG | Name of the service the charge is associated with. For more information about choosing a method to get cost details, see [Choose a cost details solution](../automate/usage-details-best-practices.md). CostCenter¹ | EA, MCA | The cost center defined for the subscription for tracking costs (only available in open billing periods for MCA accounts). Cost | EA, PAYG | See CostInBillingCurrency. CostInBillingCurrency | MCA | Cost of the charge in the billing currency before credits or taxes. |
data-factory | Concepts Integration Runtime Performance | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/concepts-integration-runtime-performance.md | While increasing the shuffle partitions, make sure data is spread across well. A Here are the steps on how to set it in a custom integration runtime. You can't set it for an autoresolve integration runtime. -1. From ADF portal under **Manage**, select a custom itegration run time and you go to edit mode. +1. From the ADF portal under **Manage**, select a custom integration runtime and go to edit mode. 2. Under the data flow runtime tab, go to the **Compute Custom Properties** section. 3. Select **Shuffle Partitions** under Property name, and enter a value of your choice, such as 250 or 500. |
databox-online | Azure Stack Edge Gpu Manage Device Event Alert Notifications | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox-online/azure-stack-edge-gpu-manage-device-event-alert-notifications.md | Title: Use action rules to manage alert notifications on Azure Stack Edge devices | Microsoft Docs -description: Describes how to define action rules to manage alert notifications for Azure Stack Edge devices in the Azure portal. + Title: Use alert processing rules to manage alert notifications on Azure Stack Edge devices | Microsoft Docs +description: Describes how to define alert processing rules to manage alert notifications for Azure Stack Edge devices in the Azure portal. -+ Previously updated : 12/06/2021 Last updated : 03/13/2023 -# Use action rules to manage alert notifications on Azure Stack Edge devices +# Use alert processing rules to manage alert notifications on Azure Stack Edge devices [!INCLUDE [applies-to-GPU-and-pro-r-and-mini-r-skus](../../includes/azure-stack-edge-applies-to-gpu-pro-r-mini-r-sku.md)] -This article describes how to create action rules in the Azure portal to trigger or suppress alert notifications for device events that occur within a resource group, an Azure subscription, or an individual Azure Stack Edge resource. +This article describes how to create alert processing rules in the Azure portal. Alert processing rules trigger or suppress notifications for device events that occur within a resource group, an Azure subscription, or an individual Azure Stack Edge resource. -## About action rules +## About alert processing rules -An action rule can trigger or suppress alert notifications. The action rule is added to an *action group* - a set of notification preferences that's used to notify users who need to act on alerts triggered in different contexts for a resource or set of resources. +An alert processing rule can add action groups to alert notifications. Use alert notification preferences, like email or SMS messages, to notify users when alerts are triggered. -For more information about action rules, see [Configuring an action rule](../azure-monitor/alerts/alerts-action-rules.md?tabs=portal#configuring-an-action-rule). For more information about action groups, see [Create and manage action groups in the Azure portal](../azure-monitor/alerts/action-groups.md). +For more information about alert processing rules, see [Alert processing rules](../azure-monitor/alerts/alerts-processing-rules.md?tabs=portal). For more information about action groups, see [Create and manage action groups in the Azure portal](../azure-monitor/alerts/action-groups.md). -> [!NOTE] -> The action rules feature is in preview. Some screens and steps might change as the process is refined. ---## Create an action rule +## Create an alert processing rule -Take the following steps in the Azure portal to create an action rule for your Azure Stack Edge device. +Use the following steps in the Azure portal to create an alert processing rule for your Azure Stack Edge device. > [!NOTE]-> These steps create an action rule that sends notifications to an action group. For details about creating an action rule to suppress notifications, see [Configuring an action rule](../azure-monitor/alerts/alerts-action-rules.md?tabs=portal#configuring-an-action-rule). +> These steps create an alert processing rule. The alert processing rule adds action groups to alert notifications. 
For details about creating an alert processing rule to suppress notifications, see [Alert processing rules](../azure-monitor/alerts/alerts-processing-rules.md?tabs=portal). -1. Go to the Azure Stack Edge device in the [Azure portal](https://portal.azure.com), and select the **Alerts** menu item (under **Monitoring**). Then select **Action rules (preview)**. +1. Go to the Azure Stack Edge device in the [Azure portal](https://portal.azure.com), and select the **Alerts** menu item (under **Monitoring**). Then select **Alert processing rules**. -  + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-alert-processing-rules-01.png#lightbox) -2. In the **Action rules (preview)**, select **+ Create**. - [  ](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/action-rules-open-view-02-expanded.png#lightbox) + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-alert-processing-rules-create-rule-02.png#lightbox) -3. On the **Create action rule** screen, create a **Scope** to select an Azure subscription, resource group, or target resource. The action rule will act on all alerts generated within that scope. +3. On the **Scope** page, select **+ Select scope**. - 1. Select **Edit** beside **Scope** to open the **Select scope** panel. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-alert-processing-rules-select-scope-03.png#lightbox) - [  ](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/new-action-rule-scope-01-expanded.png#lightbox) -- 2. On the **Select Scope** panel, select the **Subscription** for the action rule, and optionally filter by a **Resource** type. To filter to Azure Stack Edge resources, select **Data Box Edge devices (dataBoxEdge)**. -  +1. Select a **Subscription** and optionally filter by **Resource types**. To filter by Azure Stack Edge resources, select **Resource types** for **Azure Stack Edge / Data Box Gateway** as shown in the following example. - The **Resource** area lists the available resources based on your selections. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-alert-processing-rules-select-subscription-04.png#lightbox) - 3. Select the check box by each resource you want to apply the rule to. You can select the subscription, resource groups, or individual resources. +1. The **Resource type** option lists the available resources based on your selection. Use the filter option to reduce the list of options. Select the checkbox for the scope option you want to work with and then select **Apply**. - 4. When you finish, select **Done**. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-new-action-rule-scope-selection-05a.png#lightbox) -  +1. You can also use the **Filter** control in the following example to reduce the list of options to a subset of alerts within the selected scope. - The **Create action rule** screen shows the selected scope. --  --4. Use **Filter** options to narrow the application of the rule to a subset of alerts within the selected scope. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-filter-scope-results-06a.png#lightbox) + +1. On the **Add filters** pane, under **Filters**, add each filter you want to apply.
For each filter, select the **Filter** type, **Operator**, and **Value**. - 1. Select **Add** to open the **Add filters** pane. + For a list of filter options, see [Filter criteria](../azure-monitor/alerts/alerts-processing-rules.md?tabs=portal#scope-and-filters-for-alert-processing-rules). -  + The filters in the following example apply to all alerts at Severity levels 2, 3, and 4 that the Monitor service raises for Azure Stack Edge resources. - 2. On the **Add filters** pane, under **Filters**, add each filter you want to apply. For each filter, select the filter type, **Operator**, and **Value**. - - For a list of filter options, see [Filter criteria](../azure-monitor/alerts/alerts-action-rules.md?tabs=portal#filter-criteria). + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-action-rule-filter-criteria-builder-07.png#lightbox) - The sample filters below apply to all alerts at Severity levels 2, 3, and 4 that the Monitor service raises for Azure Stack Edge resources. +1. On the **Rule Settings** page, select **Apply action group** to create a rule that sends notifications. -  + Select **+ Select action group** to use an existing group, or **+ Create action group** to create a new one. - 3. When you finish adding filters, select **Done**. + To create a new action group, select **+ Create action group** and follow the steps in [Alert processing rules](../azure-monitor/alerts/alerts-processing-rules.md#add-action-groups-to-all-alert-types). -5. On the **Create action rule** screen, select **Action group** to create a rule that sends notifications. Then, by **Actions**, choose **Select**. + >[!NOTE] + >Select the **Suppress notifications** option if you don't want notifications sent for alerts. For more information, see [Alert processing rules](../azure-monitor/alerts/alerts-processing-rules.md?tabs=portal#suppress-notifications-during-planned-maintenance). -  + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-action-rule-setting-options-08.png#lightbox) - > [!NOTE] - > To create a rule that suppresses notifications, you would choose **Suppression**. For more information, see [Configuring an action rule](../azure-monitor/alerts/alerts-action-rules.md?tabs=portal#configuring-an-action-rule). +1. On the **Select action groups** page, select up to five action groups to attach to the alert processing rule, and then choose **Select**. -6. On the **Add action groups** screen, select the action group to use with this action rule. Then choose **Select**. Your new action rule will be added to the notification preferences of the action group. + The new alert processing rule is added to the notification preferences of the action group. - If you need to create a new action group, select **+ Create action group**, and follow the steps in [Create an action group by using the Azure portal](../azure-monitor/alerts/action-groups.md#create-an-action-group-by-using-the-azure-portal). + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-select-action-group-09a.png#lightbox) -  +1. On the **Details** tab, assign the alert processing rule to a **Resource group** and then specify a **Name** and a **Description** (optional) for the new rule. -7. Give the new action rule a **Name** and **Description** (optional), and assign the rule to a resource group. + The new rule is enabled by default.
If you don't want to start using the rule immediately, leave the **Enable rule upon creation** option unchecked. -8. The new rule will be enabled by default. If you don't want to start using the rule immediately, select **No** for **Enable rule update creation**. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-create-processing-rule-10.png#lightbox) -9. When you finish your settings, select **Create**. +1. To continue, select **Review + create**. -  + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-processing-rule-details-11a.png#lightbox) - The **Action rules (Preview)** screen opens, but you might not see your new action rule immediately. The focus is **All** resource groups. +1. Review your selections and then select **Create**. -10. To see your new action rule, select the resource group for the rule. + The **Alert processing rules** page opens, but you may not see the new rule immediately. The default view is **All** resource groups. -  +1. To view your new alert processing rule, select the resource group that contains the rule. + [](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-processing-rules-12.png#lightbox) ## View notifications -Notifications go out when a new event triggers an alert for a resource that's within the scope of an action rule. +Notifications go out when an event triggers an alert for a resource within the scope of an alert processing rule. -The action group for a rule sets who receives a notification and the type of notification that's sent - email, a Short Message Service (SMS) message, or both. +The action group for an alert processing rule determines who receives a notification and the type of notification to send. Notifications can be sent via email, SMS message, or both. -It might take a few minutes to receive notifications after an alert is triggered. +It may take a few minutes to receive notifications after an alert is triggered. -The email notification will look similar to this one. +The email notification looks similar to the following example. - +[](media/azure-stack-edge-gpu-manage-device-event-alert-notifications/azure-stack-edge-sample-action-rule-email-notification-13.png#lightbox) ## Next steps |
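The walkthrough above uses the portal, but the same rule can be scripted. The following sketch assumes the `az monitor alert-processing-rule` command group available in recent Azure CLI versions, with placeholder names and IDs throughout:

```bash
# Create an alert processing rule that attaches an action group to all alerts in a resource group
az monitor alert-processing-rule create \
  --name myProcessingRule \
  --resource-group myResourceGroup \
  --scopes "/subscriptions/<subscription-id>/resourceGroups/myResourceGroup" \
  --rule-type AddActionGroups \
  --action-groups "/subscriptions/<subscription-id>/resourceGroups/myResourceGroup/providers/microsoft.insights/actionGroups/myActionGroup"
```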
defender-for-cloud | Alerts Reference | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/alerts-reference.md | description: This article lists the security alerts visible in Microsoft Defende Previously updated : 01/16/2023 Last updated : 03/05/2023 # Security alerts - a reference guide Microsoft Defender for Containers provides security alerts on the cluster level |-||:--:|-| | **Azure Resource Manager operation from suspicious IP address**<br>(ARM_OperationFromSuspiciousIP) | Microsoft Defender for Resource Manager detected an operation from an IP address that has been marked as suspicious in threat intelligence feeds. | Execution | Medium | | **Azure Resource Manager operation from suspicious proxy IP address**<br>(ARM_OperationFromSuspiciousProxyIP) | Microsoft Defender for Resource Manager detected a resource management operation from an IP address that is associated with proxy services, such as TOR. While this behavior can be legitimate, it's often seen in malicious activities, when threat actors try to hide their source IP. | Defense Evasion | Medium |-| **MicroBurst exploitation toolkit used to enumerate resources in your subscriptions**<br>(ARM_MicroBurst.AzDomainInfo) | MicroBurst's Information Gathering module was run on your subscription. This tool can be used to discover resources, permissions and network structures. This was detected by analyzing the Azure Activity logs and resource management operations in your subscription | - | High | -| **MicroBurst exploitation toolkit used to enumerate resources in your subscriptions**<br>(ARM_MicroBurst.AzureDomainInfo) | MicroBurst's Information Gathering module was run on your subscription. This tool can be used to discover resources, permissions and network structures. This was detected by analyzing the Azure Activity logs and resource management operations in your subscription | - | High | +| **MicroBurst exploitation toolkit used to enumerate resources in your subscriptions**<br>(ARM_MicroBurst.AzDomainInfo) | MicroBurst's Information Gathering module was run on your subscription. This tool can be used to discover resources, permissions and network structures. This was detected by analyzing the Azure Activity logs and resource management operations in your subscription | - | Low | +| **MicroBurst exploitation toolkit used to enumerate resources in your subscriptions**<br>(ARM_MicroBurst.AzureDomainInfo) | MicroBurst's Information Gathering module was run on your subscription. This tool can be used to discover resources, permissions and network structures. This was detected by analyzing the Azure Activity logs and resource management operations in your subscription | - | Low | | **MicroBurst exploitation toolkit used to execute code on your virtual machine**<br>(ARM_MicroBurst.AzVMBulkCMD) | MicroBurst's exploitation toolkit was used to execute code on your virtual machines. This was detected by analyzing Azure Resource Manager operations in your subscription. | Execution | High | | **MicroBurst exploitation toolkit used to execute code on your virtual machine**<br>(RM_MicroBurst.AzureRmVMBulkCMD) | MicroBurst's exploitation toolkit was used to execute code on your virtual machines. This was detected by analyzing Azure Resource Manager operations in your subscription. | - | High | | **MicroBurst exploitation toolkit used to extract keys from your Azure key vaults**<br>(ARM_MicroBurst.AzKeyVaultKeysREST) | MicroBurst's exploitation toolkit was used to extract keys from your Azure key vaults. 
This was detected by analyzing Azure Activity logs and resource management operations in your subscription. | - | High | Microsoft Defender for Containers provides security alerts on the cluster level [Further details and notes](defender-for-key-vault-introduction.md) -| Alert (alert type) | Description | MITRE tactics<br>([Learn more](#intentions)) | Severity | -||--|:--:|-| -| **Access from a suspicious IP address to a key vault**<br>(KV_SuspiciousIPAccess) | A key vault has been successfully accessed by an IP that has been identified by Microsoft Threat Intelligence as a suspicious IP address. This may indicate that your infrastructure has been compromised. We recommend further investigation. Learn more about [Microsoft's threat intelligence capabilities](https://go.microsoft.com/fwlink/?linkid=2128684). | Credential Access | Medium | -| **Access from a TOR exit node to a key vault**<br>(KV_TORAccess) | A key vault has been accessed from a known TOR exit node. This could be an indication that a threat actor has accessed the key vault and is using the TOR network to hide their source location. We recommend further investigations. | Credential Access | Medium | -| **High volume of operations in a key vault**<br>(KV_OperationVolumeAnomaly) | An anomalous number of key vault operations were performed by a user, service principal, and/or a specific key vault. This anomalous activity pattern may be legitimate, but it could be an indication that a threat actor has gained access to the key vault and the secrets contained within it. We recommend further investigations. | Credential Access | Medium | -| **Suspicious policy change and secret query in a key vault**<br>(KV_PutGetAnomaly) | A user or service principal has performed an anomalous Vault Put policy change operation followed by one or more Secret Get operations. This pattern is not normally performed by the specified user or service principal. This may be legitimate activity, but it could be an indication that a threat actor has updated the key vault policy to access previously inaccessible secrets. We recommend further investigations. | Credential Access | Medium | -| **Suspicious secret listing and query in a key vault**<br>(KV_ListGetAnomaly) | A user or service principal has performed an anomalous Secret List operation followed by one or more Secret Get operations. This pattern is not normally performed by the specified user or service principal and is typically associated with secret dumping. This may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault and is trying to discover secrets that can be used to move laterally through your network and/or gain access to sensitive resources. We recommend further investigations. | Credential Access | Medium | +| Alert (alert type) | Description | MITRE tactics<br>([Learn more](#intentions)) | Severity | +|||:-:|| +| **Access from a suspicious IP address to a key vault**<br>(KV_SuspiciousIPAccess) | A key vault has been successfully accessed by an IP that has been identified by Microsoft Threat Intelligence as a suspicious IP address. This may indicate that your infrastructure has been compromised. We recommend further investigation. Learn more about [Microsoft's threat intelligence capabilities](https://go.microsoft.com/fwlink/?linkid=2128684). | Credential Access | Medium | +| **Access from a TOR exit node to a key vault**<br>(KV_TORAccess) | A key vault has been accessed from a known TOR exit node. 
This could be an indication that a threat actor has accessed the key vault and is using the TOR network to hide their source location. We recommend further investigations. | Credential Access | Medium | +| **High volume of operations in a key vault**<br>(KV_OperationVolumeAnomaly) | An anomalous number of key vault operations were performed by a user, service principal, and/or a specific key vault. This anomalous activity pattern may be legitimate, but it could be an indication that a threat actor has gained access to the key vault and the secrets contained within it. We recommend further investigations. | Credential Access | Medium | +| **Suspicious policy change and secret query in a key vault**<br>(KV_PutGetAnomaly) | A user or service principal has performed an anomalous Vault Put policy change operation followed by one or more Secret Get operations. This pattern is not normally performed by the specified user or service principal. This may be legitimate activity, but it could be an indication that a threat actor has updated the key vault policy to access previously inaccessible secrets. We recommend further investigations. | Credential Access | Medium | +| **Suspicious secret listing and query in a key vault**<br>(KV_ListGetAnomaly) | A user or service principal has performed an anomalous Secret List operation followed by one or more Secret Get operations. This pattern is not normally performed by the specified user or service principal and is typically associated with secret dumping. This may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault and is trying to discover secrets that can be used to move laterally through your network and/or gain access to sensitive resources. We recommend further investigations. | Credential Access | Medium | | **Unusual access denied - User accessing high volume of key vaults denied**<br>(KV_AccountVolumeAccessDeniedAnomaly) | A user or service principal has attempted access to anomalously high volume of key vaults in the last 24 hours. This anomalous access pattern may be legitimate activity. Though this attempt was unsuccessful, it could be an indication of a possible attempt to gain access of key vault and the secrets contained within it. We recommend further investigations. | Discovery | Low |-| **Unusual access denied - Unusual user accessing key vault denied**<br>(KV_UserAccessDeniedAnomaly) | A key vault access was attempted by a user that does not normally access it, this anomalous access pattern may be legitimate activity. Though this attempt was unsuccessful, it could be an indication of a possible attempt to gain access of key vault and the secrets contained within it. | Initial Access, Discovery | Low | -| **Unusual application accessed a key vault**<br>(KV_AppAnomaly) | A key vault has been accessed by a service principal that does not normally access it. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | -| **Unusual operation pattern in a key vault**<br>(KV_OperationPatternAnomaly) | An anomalous pattern of key vault operations was performed by a user, service principal, and/or a specific key vault. 
This anomalous activity pattern may be legitimate, but it could be an indication that a threat actor has gained access to the key vault and the secrets contained within it. We recommend further investigations. | Credential Access | Medium | -| **Unusual user accessed a key vault**<br>(KV_UserAnomaly) | A key vault has been accessed by a user that does not normally access it. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | -| **Unusual user-application pair accessed a key vault**<br>(KV_UserAppAnomaly) | A key vault has been accessed by a user-service principal pair that does not normally access it. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | -| **User accessed high volume of key vaults**<br>(KV_AccountVolumeAnomaly) | A user or service principal has accessed an anomalously high volume of key vaults. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to multiple key vaults in an attempt to access the secrets contained within them. We recommend further investigations. | Credential Access | Medium | -+| **Unusual access denied - Unusual user accessing key vault denied**<br>(KV_UserAccessDeniedAnomaly) | A key vault access was attempted by a user that does not normally access it, this anomalous access pattern may be legitimate activity. Though this attempt was unsuccessful, it could be an indication of a possible attempt to gain access of key vault and the secrets contained within it. | Initial Access, Discovery | Low | +| **Unusual application accessed a key vault**<br>(KV_AppAnomaly) | A key vault has been accessed by a service principal that does not normally access it. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | +| **Unusual operation pattern in a key vault**<br>(KV_OperationPatternAnomaly) | An anomalous pattern of key vault operations was performed by a user, service principal, and/or a specific key vault. This anomalous activity pattern may be legitimate, but it could be an indication that a threat actor has gained access to the key vault and the secrets contained within it. We recommend further investigations. | Credential Access | Medium | +| **Unusual user accessed a key vault**<br>(KV_UserAnomaly) | A key vault has been accessed by a user that does not normally access it. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | +| **Unusual user-application pair accessed a key vault**<br>(KV_UserAppAnomaly) | A key vault has been accessed by a user-service principal pair that does not normally access it. 
This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to the key vault in an attempt to access the secrets contained within it. We recommend further investigations. | Credential Access | Medium | +| **User accessed high volume of key vaults**<br>(KV_AccountVolumeAnomaly) | A user or service principal has accessed an anomalously high volume of key vaults. This anomalous access pattern may be legitimate activity, but it could be an indication that a threat actor has gained access to multiple key vaults in an attempt to access the secrets contained within them. We recommend further investigations. | Credential Access | Medium | +| **Denied access from a suspicious IP to a key vault**<br>(KV_SuspiciousIPAccessDenied) | An unsuccessful key vault access has been attempted by an IP that has been identified by Microsoft Threat Intelligence as a suspicious IP address. Though this attempt was unsuccessful, it indicates that your infrastructure might have been compromised. We recommend further investigations. | Credential Access | Low | ## <a name="alerts-azureddos"></a>Alerts for Azure DDoS Protection |
defender-for-cloud | Custom Dashboards Azure Workbooks | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/custom-dashboards-azure-workbooks.md | This workbook provides a customizable data analysis and gives you the ability to :::image type="content" source="media/custom-dashboards-azure-workbooks/devops-workbook.png" alt-text="A screenshot that shows a sample results page once you've selected the DevOps workbook." lightbox="media/custom-dashboards-azure-workbooks/devops-workbook.png"::: > [!NOTE] -> You must have a [Github connector](quickstart-onboard-github.md) or a [DevOps connector](quickstart-onboard-devops.md), connected to your environment in order to utilize this workbook +> You must have a [GitHub connector](quickstart-onboard-github.md) or a [DevOps connector](quickstart-onboard-devops.md) connected to your environment in order to use this workbook. **To deploy the workbook**: |
defender-for-cloud | Release Notes | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/release-notes.md | To learn about *planned* changes that are coming soon to Defender for Cloud, see > [!TIP] > If you're looking for items older than six months, you can find them in the [Archive for What's new in Microsoft Defender for Cloud](release-notes-archive.md). +## March 2023 ++Updates in March include: ++- [New alert in Azure Defender for Key Vault](#new-alert-in-azure-defender-for-key-vault) ++### New alert in Azure Defender for Key Vault ++Azure Defender for Key Vault has the following new alert: ++| Alert (alert type) | Description | MITRE tactics | Severity | +|||:-:|| +| **Denied access from a suspicious IP to a key vault**<br>(KV_SuspiciousIPAccessDenied) | An unsuccessful key vault access has been attempted by an IP that has been identified by Microsoft Threat Intelligence as a suspicious IP address. Though this attempt was unsuccessful, it indicates that your infrastructure might have been compromised. We recommend further investigations. | Credential Access | Low | ++You can see a list of all of the [alerts available for Key Vault](alerts-reference.md). + ## February 2023 Updates in February include: |
defender-for-iot | Neousys Nuvo 5006Lp | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/appliance-catalog/neousys-nuvo-5006lp.md | -> Neousys Nuvo-5006LP is a Legacy appliance and it supports Defender for IoT sensor software up to version 22.2.9. -> It is recommended that these appliances be replaced with newer certified models such as the [YS-FIT2](ys-techsystems-ys-fit2.md) or [HPE DL20 (NHP 2LFF)](hpe-proliant-dl20-plus-smb.md). +> Neousys Nuvo-5006LP is a legacy appliance, and is supported for Defender for IoT software up to the latest patch for versions [22.2.x](../release-notes.md#versions-222x). We recommend that you replace these appliances with newer certified models, such as the [YS-FIT2](ys-techsystems-ys-fit2.md) or [HPE DL20 (NHP 2LFF)](hpe-proliant-dl20-plus-smb.md). | Appliance characteristic |Details | ||| |**Hardware profile** | L100 | |**Performance** | Max bandwidth: 30 Mbps<br>Max devices: 400 | |**Physical specifications** | Mounting: Mounting kit, Din Rail<br>Ports: 5x RJ45|-|**Status** | Supported up to version 22.2.9| +|**Status** | Supported up to the latest Defender for IoT software patch for versions [22.2.x](../release-notes.md#versions-222x)| :::image type="content" source="../media/ot-system-requirements/cyberx.png" alt-text="Photo of a Neousys Nuvo-5006LP." border="false"::: |
dms | Ads Sku Recommend | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/ads-sku-recommend.md | -# Get Azure recommendations to migrate your SQL Server database (preview) +# Get Azure recommendations to migrate your SQL Server database Learn how to use the unified experience in the [Azure SQL Migration extension for Azure Data Studio](/sql/azure-data-studio/extensions/azure-sql-migration-extension) to assess your database requirements, get right-sized SKU recommendations for Azure resources, and migrate your SQL Server databases to Azure. Before you migrate your SQL Server databases to Azure, it's important to assess It's equally important to identify the right-sized Azure resource to migrate to so that your database workload performance requirements are met with minimal cost. -The Azure SQL Migration extension for Azure Data Studio provides both the assessment and SKU recommendations when you're trying to choose the best option to migrate your SQL Server databases to Azure SQL Managed Instance, SQL Server on Azure Virtual Machines, or Azure SQL Database (preview). The extension has an intuitive interface to help you efficiently run the assessment and generate recommendations. +The Azure SQL Migration extension for Azure Data Studio provides both the assessment and SKU recommendations when you're trying to choose the best option to migrate your SQL Server databases to Azure SQL Managed Instance, SQL Server on Azure Virtual Machines, or Azure SQL Database. The extension has an intuitive interface to help you efficiently run the assessment and generate recommendations. > [!NOTE] > Assessment and the Azure recommendation feature in the Azure SQL Migration extension for Azure Data Studio supports source SQL Server instances running on Windows or Linux. |
dms | Migration Dms Powershell Cli | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/migration-dms-powershell-cli.md | SQL Server to Azure SQL Managed Instance (using file share)|[PowerShell](https:/ SQL Server to Azure SQL Managed Instance (using Azure storage)|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-to-sql-mi-blob.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-to-sql-mi-blob.md) SQL Server to SQL Server on Azure Virtual Machines (using file share)|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-to-sql-vm-fileshare.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-to-sql-vm-fileshare.md) SQL Server to SQL Server on Azure Virtual Machines (using Azure Storage)|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-to-sql-vm-blob.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-to-sql-vm-blob.md)-SQL Server to Azure SQL Database (Preview)|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-to-sql-db.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-to-sql-db.md) +SQL Server to Azure SQL Database |[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-to-sql-db.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-to-sql-db.md) SKU recommendations (Preview)|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/sql-server-sku-recommendation.md) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/sql-server-sku-recommendation.md) End-to-End migration automation|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/scripts/) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/scripts/) End-to-End migration automation for multiple databases|[PowerShell](https://github.com/Azure-Samples/data-migration-sql/blob/main//PowerShell/scripts/multiple%20databases/) / [Azure CLI](https://github.com/Azure-Samples/data-migration-sql/blob/main/CLI/scripts/multiple%20databases/) End-to-End migration automation for multiple databases|[PowerShell](https://gith Prerequisites that are common across all supported migration scenarios using Azure PowerShell or Azure CLI are: * Have an Azure account that is assigned to one of the built-in roles listed below:- - Contributor for the target Azure SQL Managed Instance, SQL Server on Azure Virtual Machines or Azure SQL Database (Preview) and, Storage Account to upload your database backup files from SMB network share (*Not applicable for Azure SQL Database*). - - Reader role for the Azure Resource Groups containing the target Azure SQL Managed Instance, SQL Server on Azure Virtual Machines or Azure SQL Database (Preview). + - Contributor for the target Azure SQL Managed Instance, SQL Server on Azure Virtual Machines or Azure SQL Database, and the Storage Account used to upload your database backup files from an SMB network share (*Not applicable for Azure SQL Database*). + - Reader role for the Azure Resource Groups containing the target Azure SQL Managed Instance, SQL Server on Azure Virtual Machines or Azure SQL Database. - Owner or Contributor role for the Azure subscription.
> [!IMPORTANT] > An Azure account is only required when running the migration steps; it is not required for the assessment or Azure recommendation steps.-* Create a target [Azure SQL Managed Instance](/azure/azure-sql/managed-instance/create-configure-managed-instance-powershell-quickstart), [SQL Server on Azure Virtual Machine](/azure/azure-sql/virtual-machines/windows/sql-vm-create-powershell-quickstart), or [Azure SQL Database (Preview)](/azure/azure-sql/database/single-database-create-quickstart) +* Create a target [Azure SQL Managed Instance](/azure/azure-sql/managed-instance/create-configure-managed-instance-powershell-quickstart), [SQL Server on Azure Virtual Machine](/azure/azure-sql/virtual-machines/windows/sql-vm-create-powershell-quickstart), or [Azure SQL Database](/azure/azure-sql/database/single-database-create-quickstart) > [!IMPORTANT] - > If your target is Azure SQL Database (Preview) you have to migrate database schema from source to target using [SQL Server dacpac extension](/sql/azure-data-studio/extensions/sql-server-dacpac-extension) or, [SQL Database Projects extension](/sql/azure-data-studio/extensions/sql-database-project-extension) for Azure Data Studio. + > If your target is Azure SQL Database, you have to migrate the database schema from source to target using the [SQL Server dacpac extension](/sql/azure-data-studio/extensions/sql-server-dacpac-extension) or the [SQL Database Projects extension](/sql/azure-data-studio/extensions/sql-database-project-extension) for Azure Data Studio. > > If you have an existing Azure Virtual Machine, it should be registered with [SQL IaaS Agent extension in Full management mode](/azure/azure-sql/virtual-machines/windows/sql-server-iaas-agent-extension-automate-management#management-modes). * If your target is **Azure SQL Managed Instance** or **SQL Server on Azure Virtual Machine** ensure that the logins used to connect the source SQL Server are members of the *sysadmin* server role or have `CONTROL SERVER` permission.-* If your target is **Azure SQL Database (Preview)**, ensure that the login used to connect the source SQL Server is a member, and the `db_datareader` and login for the target SQL server is `db_owner`. +* If your target is **Azure SQL Database**, ensure that the login used to connect to the source SQL Server is a member of the `db_datareader` role, and that the login for the target SQL server is a member of the `db_owner` role. * Use one of the following storage options for the full database and transaction log backup files: - SMB network share - Azure storage account file share or blob container Prerequisites that are common across all supported migration scenarios using Az * If you're using the Azure Database Migration Service for the first time, ensure that the Microsoft.DataMigration resource provider is registered in your subscription. You can follow the steps to [register the resource provider](./quickstart-create-data-migration-service-portal.md#register-the-resource-provider) > [!IMPORTANT]- > If your migration target is Azure SQL Database (Preview), you don't need backups to perform this migration. The migration to Azure SQL Database is considered a logical migration involving the database's pre-creation and data movement (performed by DMS). + > If your migration target is Azure SQL Database, you don't need backups to perform this migration. The migration to Azure SQL Database is considered a logical migration involving the database's pre-creation and data movement (performed by DMS).
## Automate database migrations Using Azure PowerShell [Az.DataMigration](/powershell/module/az.datamigration) or Azure CLI [az datamigration](/cli/azure/datamigration), you can migrate databases by automating the creation of the Azure Database Migration Service, configuring database migrations for online migration, and performing a cutover. There are several more commands and capabilities documented in [Azure Samples](https://github.com/Azure-Samples/data-migration-sql). |
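To try the `az datamigration` commands referenced above, the corresponding CLI extension must be installed first. A minimal sketch:

```bash
# Install the Azure CLI extension that provides the az datamigration command group
az extension add --name datamigration

# List the available migration commands
az datamigration --help
```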
dms | Migration Using Azure Data Studio | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/migration-using-azure-data-studio.md | For information about specific migration scenarios and Azure SQL targets, see th ||| SQL Server to Azure SQL Managed Instance| [Online](./tutorial-sql-server-managed-instance-online-ads.md) / [Offline](./tutorial-sql-server-managed-instance-offline-ads.md) SQL Server to SQL Server on an Azure virtual machine|[Online](./tutorial-sql-server-to-virtual-machine-online-ads.md) / [Offline](./tutorial-sql-server-to-virtual-machine-offline-ads.md)-SQL Server to Azure SQL Database (preview)| [Offline](./tutorial-sql-server-azure-sql-database-offline-ads.md) +SQL Server to Azure SQL Database | [Offline](./tutorial-sql-server-azure-sql-database-offline-ads.md) > [!IMPORTANT] > If your target is Azure SQL Database, make sure you deploy the database schema before you begin the migration. You can use tools like the [SQL Server dacpac extension](/sql/azure-data-studio/extensions/sql-server-dacpac-extension) or the [SQL Database Projects extension](/sql/azure-data-studio/extensions/sql-database-project-extension) for Azure Data Studio. The following sections walk through the prerequisites for each supported Azure S [!INCLUDE [dms-ads-sqlvm-prereq](../../includes/dms-ads-sqlvm-prereq.md)] -### [Azure SQL Database (preview)](#tab/azure-sql-db) +### [Azure SQL Database](#tab/azure-sql-db) [!INCLUDE [dms-ads-sqldb-prereq](../../includes/dms-ads-sqldb-prereq.md)] |
dms | Resource Custom Roles Sql Database Ads | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/resource-custom-roles-sql-database-ads.md | Title: "Custom roles for SQL Server to Azure SQL Database (preview) migrations in Azure Data Studio" + Title: "Custom roles for SQL Server to Azure SQL Database migrations in Azure Data Studio" -description: Learn how to use custom roles for SQL Server to Azure SQL Database (preview) migrations in Azure Data Studio. +description: Learn how to use custom roles for SQL Server to Azure SQL Database migrations in Azure Data Studio. Last updated 09/28/2022-# Custom roles for SQL Server to Azure SQL Database (preview) migrations in Azure Data Studio +# Custom roles for SQL Server to Azure SQL Database migrations in Azure Data Studio -This article explains how to set up a custom role in Azure for SQL Server database migrations. A custom role will have only the permissions that are required to create and run an instance of Azure Database Migration Service with Azure SQL Database (preview) as a target. +This article explains how to set up a custom role in Azure for SQL Server database migrations. A custom role will have only the permissions that are required to create and run an instance of Azure Database Migration Service with Azure SQL Database as a target. Use the AssignableScopes section of the role definition JSON string to control where the permissions appear in the **Add role assignment** UI in the Azure portal. To avoid cluttering the UI with extra roles, you might want to define the role at the level of the resource group, or even the level of the resource. The resource that the custom role applies to doesn't perform the actual role assignment. You can use either the Azure portal, Azure PowerShell, the Azure CLI, or the Azu For more information, see [Create custom roles by using the Azure portal](../role-based-access-control/custom-roles-portal.md) and [Azure custom roles](../role-based-access-control/custom-roles.md). -## Permissions required to migrate to Azure SQL Database (preview) +## Permissions required to migrate to Azure SQL Database | Permission action | Description | | - | --| |
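One way to create such a custom role from the command line is sketched below. The role name, actions, and scope are illustrative placeholders; substitute the permission actions listed in the full article's table:

```bash
# Create a custom role from an inline JSON definition, scoped to a single resource group
az role definition create --role-definition '{
  "Name": "DMS SQL DB Migration Operator (example)",
  "Description": "Placeholder custom role for DMS migrations to Azure SQL Database.",
  "Actions": [
    "Microsoft.Sql/servers/read",
    "Microsoft.Sql/servers/databases/read"
  ],
  "AssignableScopes": [
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
  ]
}'
```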
dms | Tutorial Sql Server Azure Sql Database Offline Ads | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-sql-server-azure-sql-database-offline-ads.md | Title: "Tutorial: Migrate SQL Server to Azure SQL Database (preview) offline in Azure Data Studio" + Title: "Tutorial: Migrate SQL Server to Azure SQL Database offline in Azure Data Studio" -description: Learn how to migrate on-premises SQL Server to Azure SQL Database (preview) offline by using Azure Data Studio and Azure Database Migration Service. +description: Learn how to migrate on-premises SQL Server to Azure SQL Database offline by using Azure Data Studio and Azure Database Migration Service. Last updated 01/12/2023-# Tutorial: Migrate SQL Server to Azure SQL Database (preview) offline in Azure Data Studio +# Tutorial: Migrate SQL Server to Azure SQL Database offline in Azure Data Studio -You can use Azure Database Migration Service and the Azure SQL Migration extension for Azure Data Studio to migrate databases from an on-premises instance of SQL Server to Azure SQL Database (preview) offline and with minimal downtime. +You can use Azure Database Migration Service and the Azure SQL Migration extension for Azure Data Studio to migrate databases from an on-premises instance of SQL Server to Azure SQL Database offline and with minimal downtime. In this tutorial, learn how to migrate the example AdventureWorks2019 database from an on-premises instance of SQL Server to an instance of Azure SQL Database by using the Azure SQL Migration extension for Azure Data Studio. This tutorial uses offline migration mode, which considers an acceptable downtime during the migration process. |
event-hubs | Authenticate Shared Access Signature | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/authenticate-shared-access-signature.md | Title: Authenticate access to Azure Event Hubs with shared access signatures description: This article shows you how to authenticate access to Event Hubs resources using shared access signatures. Previously updated : 09/16/2022 Last updated : 03/13/2023 ms.devlang: csharp, java, javascript, php + # Authenticate access to Event Hubs resources using shared access signatures (SAS) -Shared access signature (SAS) gives you granular control over the type of access you grant to the clients who has the shared access signature. Here are some of the controls you can set in a SAS: +Shared access signature (SAS) gives you granular control over the type of access you grant to the clients. Here are some of the controls you can set in a SAS: - The interval over which the SAS is valid, which includes the start time and expiry time. - The permissions granted by the SAS. For example, a SAS for an Event Hubs namespace might grant the listen permission, but not the send permission. This article covers authenticating access to Event Hubs resources using SAS. ## Configuring for SAS authentication-You can configure a shared access authorization rule on an Event Hubs namespace, or an entity (event hub instance or Kafka Topic in an event hub). Configuring a shared access authorization rule on a consumer group is currently not supported, but you can use rules configured on a namespace or entity to secure access to consumer group. +You can configure a SAS rule on an Event Hubs namespace, or an entity (event hub instance or Kafka Topic in an event hub). Configuring a SAS rule on a consumer group is currently not supported, but you can use rules configured on a namespace or entity to secure access to consumer groups. The following image shows how the authorization rules apply on sample entities.  -In this example, the sample Event Hubs namespace (ExampleNamespace) has two entities: eh1 and topic1. The authorization rules are defined both at the entity level and also at the namespace level. +In this example, the sample Event Hubs namespace (ExampleNamespace) has two entities: eh1 and Kafka topic1. The authorization rules are defined both at the entity level and also at the namespace level. -The manageRuleNS, sendRuleNS, and listenRuleNS authorization rules apply to both event hub instance eh1 and topic t1. The listenRule-eh and sendRule-eh authorization rules apply only to event hub instance eh1 and sendRuleT authorization rule applies only to topic topic1. +The manageRuleNS, sendRuleNS, and listenRuleNS authorization rules apply to both eh1 and topic1. The listenRule-eh and sendRule-eh authorization rules apply only to eh1 and the sendRuleT authorization rule applies only to topic1. When you use the sendRuleNS authorization rule, client applications can send to both eh1 and topic1. When the sendRuleT authorization rule is used, it enforces granular access to topic1 only, so client applications using this rule can send only to topic1, not to eh1. Any client that has access to the name of an authorization rule and one of its - `sr` – URI of the resource being accessed. - `sig` – Signature. -The signature-string is the SHA-256 hash computed over the resource URI (scope as described in the previous section) and the string representation of the token expiry instant, separated by CRLF.
--

-The hash computation looks similar to the following pseudo code and returns a 256-bit/32-byte hash value. +The signature-string is the SHA-256 hash computed over the resource URI (scope as described in the previous section) and the string representation of the token expiry instant, separated by CRLF. The hash computation looks similar to the following pseudo code and returns a 256-bit/32-byte hash value. ``` SHA-256('https://<yournamespace>.servicebus.windows.net/'+'\n'+ 1438205742) The resource URI is the full URI of the Service Bus resource to which access is The URI must be percent-encoded. -The shared access authorization rule used for signing must be configured on the entity specified by this URI, or by one of its hierarchical parents. For example, `http://contoso.servicebus.windows.net/eh1` or `http://contoso.servicebus.windows.net` in the previous example. +The SAS rule used for signing must be configured on the entity specified by this URI, or by one of its hierarchical parents. For example, `http://contoso.servicebus.windows.net/eh1` or `http://contoso.servicebus.windows.net` in the previous example. A SAS token is valid for all resources prefixed with the `<resourceURI>` used in the signature-string. For certain organizational security requirements, you may have to disable local/ ### Disabling Local/SAS Key authentication via the portal You can disable local/SAS key authentication for a given Event Hubs namespace using the Azure portal. -As shown in the following image, in the namespace overview section, click on the *Local Authentication*. +As shown in the following image, in the namespace overview section, select **Local Authentication**.  -And then select *Disabled* option and click *Ok* as shown below. +Then select the **Disabled** option and select **OK** as shown below.  ### Disabling Local/SAS Key authentication using a template |
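The pseudo code above can be turned into a working token generator in a few lines of shell. This is an illustrative sketch, not the article's own sample; it assumes `openssl` and `python3` are available for the HMAC-SHA256 signature and URL encoding:

```bash
namespace="yournamespace"            # placeholder
entity="eh1"                         # placeholder
ruleName="RootManageSharedAccessKey" # name of the authorization rule
key="<signing-key>"                  # one of the rule's keys

resourceUri="https://${namespace}.servicebus.windows.net/${entity}"
encodedUri=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$resourceUri")
expiry=$(( $(date +%s) + 3600 ))     # token valid for one hour

# HMAC-SHA256 over "<encoded-uri>\n<expiry>", base64-encoded, then URL-encoded
signature=$(printf '%s\n%s' "$encodedUri" "$expiry" \
  | openssl dgst -sha256 -hmac "$key" -binary | base64)
encodedSig=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$signature")

echo "SharedAccessSignature sr=${encodedUri}&sig=${encodedSig}&se=${expiry}&skn=${ruleName}"
```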
event-hubs | Authorize Access Event Hubs | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/authorize-access-event-hubs.md | Title: Authorize access to Azure Event Hubs description: This article provides information about different options for authorizing access to Azure Event Hubs resources. Previously updated : 09/20/2021 Last updated : 03/13/2023 # Authorize access to Azure Event Hubs -Every time you publish or consume events/data from an event hub, your client is trying to access Event Hubs resources. Every request to a secure resource must be authorized so that the service can ensure that the client has the required permissions to publish/consume the data. +Every time you publish or consume events from an event hub, your client is trying to access Event Hubs resources. Every request to a secure resource must be authorized so that the service can ensure that the client has the required permissions to publish or consume the data. Azure Event Hubs offers the following options for authorizing access to secure resources: Azure Event Hubs offers the following options for authorizing access to secure r > This article applies to both Event Hubs and [Apache Kafka](azure-event-hubs-kafka-overview.md) scenarios. ## Azure Active Directory-Azure Active Directory (Azure AD) integration for Event Hubs resources provides Azure role-based access control (Azure RBAC) for fine-grained control over a client's access to resources. You can use Azure RBAC to grant permissions to security principal, which may be a user, a group, or an application service principal. The security principal is authenticated by Azure AD to return an OAuth 2.0 token. The token can be used to authorize a request to access an Event Hubs resource. +Azure Active Directory (Azure AD) integration with Event Hubs resources provides Azure role-based access control (Azure RBAC) for fine-grained control over a client's access to resources. You can use Azure RBAC to grant permissions to security principal, which may be a user, a group, or an application service principal. Azure AD authenticates the security principal and returns an OAuth 2.0 token. The token can be used to authorize a request to access an Event Hubs resource. For more information about authenticating with Azure AD, see the following articles: -- [Authenticate requests to Azure Event Hubs using Azure Active Directory](authenticate-application.md)-- [Authorize access to Event Hubs resources using Azure Active Directory](authorize-access-azure-active-directory.md).+- [Authenticate requests to Azure Event Hubs using Azure AD](authenticate-application.md) +- [Authorize access to Event Hubs resources using Azure AD](authorize-access-azure-active-directory.md). ## Shared access signatures Shared access signatures (SAS) for Event Hubs resources provide limited delegated access to Event Hubs resources. Adding constraints on time interval for which the signature is valid or on permissions it grants provides flexibility in managing resources. For more information, see [Authenticate using shared access signatures (SAS)](authenticate-shared-access-signature.md). Authorizing users or applications using an OAuth 2.0 token returned by Azure AD provides superior security and ease of use over shared access signatures (SAS). With Azure AD, there's no need to store the access tokens with your code and risk potential security vulnerabilities. 
While you can continue to use shared access signatures (SAS) to grant fine-grained access to Event Hubs resources, Azure AD offers similar capabilities without the need to manage SAS tokens or worry about revoking a compromised SAS. -By default, all Event Hubs resources are secured, and are available only to the account owner. Although you can use any of the authorization strategies outlined above to grant clients access to Event Hub resources. Microsoft recommends using Azure AD when possible for maximum security and ease of use. +By default, all Event Hubs resources are secured, and are available only to the account owner. Although you can use any of the authorization strategies outlined above to grant clients access to Event Hubs resources, Microsoft recommends using Azure AD when possible for maximum security and ease of use. For more information about authorization using SAS, see [Authorizing access to Event Hubs resources using Shared Access Signatures](authorize-access-shared-access-signature.md). |
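To make the recommended Azure AD path concrete, here's a minimal sketch using the `azure-identity` and `azure-eventhub` Python packages; the namespace and event hub names are placeholders, and the signed-in identity is assumed to hold an Event Hubs data role (for example, Azure Event Hubs Data Sender).

```python
from azure.identity import DefaultAzureCredential
from azure.eventhub import EventHubProducerClient, EventData

# DefaultAzureCredential resolves a managed identity, environment
# credentials, or a developer sign-in; no SAS key is stored in code.
producer = EventHubProducerClient(
    fully_qualified_namespace="<yournamespace>.servicebus.windows.net",
    eventhub_name="<your-event-hub>",
    credential=DefaultAzureCredential(),
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData("event authorized via an Azure AD token"))
    producer.send_batch(batch)
```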
event-hubs | Authorize Access Shared Access Signature | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/authorize-access-shared-access-signature.md | Title: Authorize access with a shared access signature in Azure Event Hubs description: This article provides information about authorizing access to Azure Event Hubs resources by using Shared Access Signatures (SAS). Previously updated : 09/20/2021 Last updated : 03/13/2023 # Authorizing access to Event Hubs resources using Shared Access Signatures -A shared access signature (SAS) provides you with a way to grant limited access to resources in your Event Hubs namespace. SAS guards access to Event Hubs resources based on authorization rules. These rules are configured either on a namespace, or an entity (event hub or topic). This article provides an overview of the SAS model, and reviews SAS best practices. +A shared access signature (SAS) provides you with a way to grant limited access to resources in your Event Hubs namespace. SAS guards access to Event Hubs resources based on authorization rules. These rules are configured either on a namespace, or an event hub. This article provides an overview of the SAS model, and reviews SAS best practices. ++> [!NOTE] +> This article covers authorizing access to Event Hubs resources using SAS. To learn about **authenticating** access to Event Hubs resources using SAS, see [Authenticate with SAS](authenticate-shared-access-signature.md). ## What are shared access signatures? A shared access signature (SAS) provides delegated access to Event Hubs resources based on authorization rules. An authorization rule has a name, is associated with specific rights, and carries a pair of cryptographic keys. You use the rule's name and key via the Event Hubs clients or in your own code to generate SAS tokens. A client can then pass the token to Event Hubs to prove authorization for the requested operation. -SAS is a claim-based authorization mechanism using simple tokens. Using SAS, keys are never passed on the wire. Keys are used to cryptographically sign information that can later be verified by the service. SAS can be used similar to a username and password scheme where the client is in immediate possession of an authorization rule name and a matching key. SAS can be used similar to a federated security model, where the client receives a time-limited and signed access token from a security token service without ever coming into possession of the signing key. +SAS is a claim-based authorization mechanism using simple tokens. When you use SAS, keys are never passed on the wire. Keys are used to cryptographically sign information that can later be verified by the service. SAS can be used similarly to a username and password scheme where the client is in immediate possession of an authorization rule name and a matching key. SAS can be used similarly to a federated security model, where the client receives a time-limited and signed access token from a security token service without ever coming into possession of the signing key. > [!NOTE]-> Azure Event Hubs supports authorizing to Event Hubs resources using Azure Active Directory (Azure AD). Authorizing users or applications using OAuth 2.0 token returned by Azure AD provides superior security and ease of use over shared access signatures (SAS). With Azure AD, there is no need to store the tokens in your code and risk potential security vulnerabilities.
+> Azure Event Hubs also supports authorizing access to Event Hubs resources using Azure Active Directory (Azure AD). Authorizing users or applications using an OAuth 2.0 token returned by Azure AD provides superior security and ease of use over shared access signatures (SAS). With Azure AD, there is no need to store the tokens in your code and risk potential security vulnerabilities. > > Microsoft recommends using Azure AD with your Azure Event Hubs applications when possible. For more information, see [Authorize access to Azure Event Hubs resource using Azure Active Directory](authorize-access-azure-active-directory.md). SAS is a claim-based authorization mechanism using simple tokens. Using SAS, key > SAS (Shared Access Signatures) tokens are critical to protect your resources. While providing granularity, SAS grants clients access to your Event Hubs resources. They should not be shared publicly. When sharing, if required for troubleshooting reasons, consider using a reduced version of any log files or deleting the SAS tokens (if present) from the log files, and make sure the screenshots don't contain the SAS information either. ## Shared access authorization policies-Each Event Hubs namespace and each Event Hubs entity (an event hub instance or a Kafka topic) has a shared access authorization policy made up of rules. The policy at the namespace level applies to all entities inside the namespace, irrespective of their individual policy configuration. +Each Event Hubs namespace and each Event Hubs entity (an event hub or a Kafka topic) has a shared access authorization policy made up of rules. The policy at the namespace level applies to all entities inside the namespace, irrespective of their individual policy configuration. For each authorization policy rule, you decide on three pieces of information: name, scope, and rights. The name is a unique name in that scope. The scope is the URI of the resource in question. For an Event Hubs namespace, the scope is the fully qualified domain name (FQDN), such as `https://<yournamespace>.servicebus.windows.net/`. The rights provided by the policy rule can be a combination of: - **Send** – Gives the right to send messages to the entity-- **Listen** – Gives the right to listen or receive to the entity-- **Manage** – Gives the right to manage the topology of the namespace, including creation and deletion of entities--> [!NOTE] -> The **Manage** right includes the **Send** and **Listen** rights. +- **Listen** – Gives the right to listen or receive messages from the entity +- **Manage** – Gives the right to manage the topology of the namespace, including creation and deletion of entities. The **Manage** right includes the **Send** and **Listen** rights. A namespace or entity policy can hold up to 12 shared access authorization rules, providing room for the three sets of rules, each covering the basic rights, and the combination of Send and Listen. This limit underlines that the SAS policy store isn't intended to be a user or service account store. If your application needs to grant access to Event Hubs resources based on user or service identities, it should implement a security token service that issues SAS tokens after an authentication and access check.
When you use shared access signatures in your applications, you need to be aware The following recommendations for using shared access signatures can help mitigate these risks: -- **Have clients automatically renew the SAS if necessary**: Clients should renew the SAS well before expiration, to allow time for retries if the service providing the SAS is unavailable. If your SAS is meant to be used for a small number of immediate, short-lived operations that are expected to be completed within the expiration period, then it may be unnecessary as the SAS is not expected to be renewed. However, if you have client that is routinely making requests via SAS, then the possibility of expiration comes into play. The key consideration is to balance the need for the SAS to be short-lived (as previously stated) with the need to ensure that client is requesting renewal early enough (to avoid disruption due to the SAS expiring prior to a successful renewal).-- **Be careful with the SAS start time**: If you set the start time for SAS to **now**, then due to clock skew (differences in current time according to different machines), failures may be observed intermittently for the first few minutes. In general, set the start time to be at least 15 minutes in the past. Or, don't set it at all, which will make it valid immediately in all cases. The same generally applies to the expiry time as well. Remember that you may observer up to 15 minutes of clock skew in either direction on any request. +- **Have clients automatically renew the SAS if necessary**: Clients should renew the SAS well before expiration, to allow time for retries if the service providing the SAS is unavailable. If your SAS is meant to be used for a small number of immediate, short-lived operations that are expected to be completed within the expiration period, then it may be unnecessary as the SAS isn't expected to be renewed. However, if you have a client that routinely makes requests via SAS, then the possibility of expiration comes into play. The key consideration is to balance the need for the SAS to be short-lived (as previously stated) with the need to ensure that the client requests renewal early enough (to avoid disruption due to the SAS expiring prior to a successful renewal). +- **Be careful with the SAS start time**: If you set the start time for SAS to **now**, then due to clock skew (differences in current time according to different machines), failures may be observed intermittently for the first few minutes. In general, set the start time to be at least 15 minutes in the past. Or, don't set it at all, which will make it valid immediately in all cases. The same generally applies to the expiry time as well. Remember that you may observe up to 15 minutes of clock skew in either direction on any request. - **Be specific with the resource to be accessed**: A security best practice is to provide users with the minimum required privileges. If a user only needs read access to a single entity, then grant them read access to that single entity, and not read/write/delete access to all entities. It also helps lessen the damage if a SAS is compromised because the SAS has less power in the hands of an attacker. - **Don't always use SAS**: Sometimes the risks associated with a particular operation against your Event Hubs outweigh the benefits of SAS.
For such operations, create a middle-tier service that writes to your Event Hubs after business rule validation, authentication, and auditing.-- **Always use HTTPs**: Always use Https to create or distribute a SAS. If a SAS is passed over HTTP and intercepted, an attacker performing a man-in-the-middle attach is able to read the SAS and then use it just as the intended user could have, potentially compromising sensitive data or allowing for data corruption by the malicious user.+- **Always use HTTPS**: Always use HTTPS to create or distribute a SAS. If a SAS is passed over HTTP and intercepted, an attacker performing a man-in-the-middle attack is able to read the SAS and then use it just as the intended user could have, potentially compromising sensitive data or allowing for data corruption by the malicious user. ## Conclusion-Share access signatures are useful for providing limited permissions to Event Hubs resources to your clients. They are vital part of the security model for any application using Azure Event Hubs. If you follow the best practices listed in this article, you can use SAS to provide greater flexibility of access to your resources, without compromising the security of your application. +Shared access signatures are useful for providing limited permissions to Event Hubs resources to your clients. They're a vital part of the security model for any application using Azure Event Hubs. If you follow the best practices listed in this article, you can use SAS to provide greater flexibility of access to your resources, without compromising the security of your application. ## Next steps See the following related articles: |
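The renewal guidance above can be captured in a small helper that hands out a cached token and regenerates it well before expiry. This is an illustrative sketch, not SDK code; `generate_sas_token` is assumed to be a function like the one shown earlier, and the five-minute margin is an arbitrary example value.

```python
import time

RENEW_MARGIN_SECONDS = 300  # renew well before expiry to leave room for retries

class RenewingSasToken:
    """Caches a SAS token and regenerates it before it expires."""

    def __init__(self, make_token, ttl_seconds=3600):
        self._make_token = make_token  # callable: ttl_seconds -> token string
        self._ttl = ttl_seconds
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.time()
        if self._token is None or now >= self._expires_at - RENEW_MARGIN_SECONDS:
            self._token = self._make_token(self._ttl)
            self._expires_at = now + self._ttl
        return self._token

# Usage with the earlier sketch (placeholder values):
# token_source = RenewingSasToken(
#     lambda ttl: generate_sas_token(uri, key_name, key, ttl_seconds=ttl))
# token = token_source.get()
```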
event-hubs | Azure Event Hubs Kafka Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/azure-event-hubs-kafka-overview.md | Title: Introduction to Apache Kafka on Azure Event Hubs -description: Learn what Apache Kafka on Azure Event Hubs is and how to use it to stream data from Apache Kafka applications without setting up a Kafka cluster on your own. + Title: Introduction to Apache Kafka in Event Hubs on Azure Cloud +description: Learn what Apache Kafka in the Event Hubs service on Azure Cloud is and how to use it to stream data from Apache Kafka applications without setting up a Kafka cluster on your own. Last updated 02/03/2023 |
event-hubs | Event Hubs Availability And Consistency | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/event-hubs-availability-and-consistency.md | Title: Availability and consistency - Azure Event Hubs | Microsoft Docs description: How to provide the maximum amount of availability and consistency with Azure Event Hubs using partitions. Previously updated : 03/15/2021 Last updated : 03/13/2023 ms.devlang: csharp, java, javascript, python This article provides information about availability and consistency supported b ## Availability Azure Event Hubs spreads the risk of catastrophic failures of individual machines or even complete racks across clusters that span multiple failure domains within a datacenter. It implements transparent failure detection and failover mechanisms such that the service will continue to operate within the assured service-levels and typically without noticeable interruptions when such failures occur. -If an Event Hubs namespace has been created with [availability zones](../availability-zones/az-overview.md) enabled, the outage risk is further spread across three physically separated facilities, and the service has enough capacity reserves to instantly cope up with the complete, catastrophic loss of the entire facility. For more information, see [Azure Event Hubs - Geo-disaster recovery](event-hubs-geo-dr.md). +If an Event Hubs namespace is created in a region with [availability zones](../availability-zones/az-overview.md), the outage risk is further spread across three physically separated facilities, and the service has enough capacity reserves to instantly cope with the complete, catastrophic loss of the entire facility. For more information, see [Azure Event Hubs - Geo-disaster recovery](event-hubs-geo-dr.md). When a client application sends events to an event hub without specifying a partition, events are automatically distributed among partitions in your event hub. If a partition isn't available for some reason, events are distributed among the remaining partitions. This behavior allows for the greatest amount of uptime. For use cases that require the maximum uptime, this model is preferred instead of sending events to a specific partition. |
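The partition behavior described above is visible in the client SDKs: omitting a partition ID lets the service spread events across healthy partitions, while pinning a partition trades availability for ordering. A minimal sketch with the `azure-eventhub` Python package follows; the connection string, hub name, and partition ID are placeholders.

```python
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<connection-string>", eventhub_name="<event-hub-name>"
)

with producer:
    # No partition ID: the service distributes events among available
    # partitions, so one unavailable partition doesn't block sends.
    batch = producer.create_batch()
    batch.add(EventData("telemetry reading"))
    producer.send_batch(batch)

    # Pinning a partition preserves ordering, but ties these sends to
    # the availability of that single partition.
    pinned = producer.create_batch(partition_id="0")
    pinned.add(EventData("ordered reading"))
    producer.send_batch(pinned)
```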
event-hubs | Event Hubs Dedicated Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/event-hubs-dedicated-overview.md | The Event Hubs Dedicated offering is billed at a fixed monthly price, with a **m For more information about quotas and limits, see [Event Hubs quotas and limits](event-hubs-quotas.md) -## How to onboard -Event Hubs Dedicated tier is generally available (GA). The self-serve experience to create an Event Hubs cluster through the [Azure portal](event-hubs-dedicated-cluster-create-portal.md) is currently in Preview. You can also request for the cluster to be created by contacting the [Event Hubs team](mailto:askeventhubs@microsoft.com). - ## FAQs [!INCLUDE [event-hubs-dedicated-clusters-faq](./includes/event-hubs-dedicated-clusters-faq.md)] |
expressroute | Expressroute Faqs | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-faqs.md | Your existing circuit will continue advertising the prefixes for Microsoft 365. * Microsoft peering of ExpressRoute circuits that are configured on or after August 1, 2017 won't have any prefixes advertised until a route filter is attached to the circuit. You'll see no prefixes by default. -### If I have multiple Virtual Networks (Vnets) connected to the same ExpressRoute circuit, can I use ExpressRoute for Vnet-to-Vnet connectivity? -Vnet-to-Vnet connectivity over ExpressRoute isn't recommended. To achieve this, configure [Virtual Network Peering](../virtual-network/virtual-network-peering-overview.md?msclkid=b64a7b6ac19e11eca60d5e3e5d0764f5). +### If I have multiple Virtual Networks (VNets) connected to the same ExpressRoute circuit, can I use ExpressRoute for VNet-to-VNet connectivity? +VNet-to-VNet connectivity over ExpressRoute isn't recommended. To achieve this, configure [Virtual Network Peering](../virtual-network/virtual-network-peering-overview.md?msclkid=b64a7b6ac19e11eca60d5e3e5d0764f5). ## <a name="expressRouteDirect"></a>ExpressRoute Direct You can associate a single ExpressRoute Direct circuit with multiple ExpressRout ### Does the ExpressRoute service store customer data? -No. +No. |
expressroute | Expressroute For Cloud Solution Providers | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-for-cloud-solution-providers.md | ExpressRoute supports network speeds from 50 Mb/s to 10 Gb/s. This allows custom > > -ExpressRoute supports the connection of multiple vNets to a single ExpressRoute circuit for better utilization of the higher-speed connections. A single ExpressRoute circuit can be shared among multiple Azure subscriptions owned by the same customer. +ExpressRoute supports the connection of multiple VNets to a single ExpressRoute circuit for better utilization of the higher-speed connections. A single ExpressRoute circuit can be shared among multiple Azure subscriptions owned by the same customer. ## Configuring ExpressRoute ExpressRoute can be configured to support three types of traffic ([routing domains](#expressroute-routing-domains)) over a single ExpressRoute circuit. This traffic is segregated into private peering, Microsoft peering, and public peering (deprecated). You can choose one or all types of traffic to be sent over a single ExpressRoute circuit or use multiple ExpressRoute circuits depending on the size of the ExpressRoute circuit and isolation required by your customer. The security posture of your customer may not allow public traffic and private traffic to traverse over the same circuit. ### Connect-through model-In a connect-through configuration, you will be responsible for all of the networking underpinnings to connect your customer's datacenter resources to the subscriptions hosted in Azure. Each of your customers that want to use Azure capabilities will need their own ExpressRoute connection, which will be managed by you. You will use the same methods the customer would use to procure the ExpressRoute circuit. You will follow the same steps outlined in the article [ExpressRoute workflows](expressroute-workflows.md) for circuit provisioning and circuit states. You will then configure the Border Gateway Protocol (BGP) routes to control the traffic flowing between the on-premises network and Azure vNet. +In a connect-through configuration, you will be responsible for all of the networking underpinnings to connect your customer's datacenter resources to the subscriptions hosted in Azure. Each of your customers that want to use Azure capabilities will need their own ExpressRoute connection, which will be managed by you. You will use the same methods the customer would use to procure the ExpressRoute circuit. You will follow the same steps outlined in the article [ExpressRoute workflows](expressroute-workflows.md) for circuit provisioning and circuit states. You will then configure the Border Gateway Protocol (BGP) routes to control the traffic flowing between the on-premises network and Azure VNet. ### Connect-to model-In a connect-to configuration, your customer already has an existing connection to Azure or will initiate a connection to the internet service provider linking ExpressRoute from their own datacenter directly to Azure, instead of your datacenter. To begin the provisioning process, your customer will follow the steps as described in the Connect-Through model, above. Once the circuit has been established, your customer will need to configure the on-premises routers to be able to access both your network and Azure vNets. 
+In a connect-to configuration, your customer already has an existing connection to Azure or will initiate a connection to the internet service provider linking ExpressRoute from their own datacenter directly to Azure, instead of your datacenter. To begin the provisioning process, your customer will follow the steps as described in the Connect-Through model, above. Once the circuit has been established, your customer will need to configure the on-premises routers to be able to access both your network and Azure VNets. You can assist with setting up the connection and configuring the routes to allow the resources in your datacenter(s) to communicate with the client resources in your datacenter, or with the resources hosted in Azure. You can define custom route filters to allow only the route(s) you want to allo ## Routing ExpressRoute connects to the Azure networks through the Azure Virtual Network Gateway. Network gateways provide routing for Azure virtual networks. -Creating Azure Virtual Networks also creates a default routing table for the vNet to direct traffic to/from the subnets of the vNet. If the default route table is insufficient for the solution, custom routes can be created to route outgoing traffic to custom appliances or to block routes to specific subnets or external networks. +Creating Azure Virtual Networks also creates a default routing table for the VNet to direct traffic to/from the subnets of the VNet. If the default route table is insufficient for the solution, custom routes can be created to route outgoing traffic to custom appliances or to block routes to specific subnets or external networks. ### Default routing The default route table includes the following routes: User-defined routes allow the control of traffic outbound from the assigned subnet to other subnets in the virtual network or over one of the other predefined gateways (ExpressRoute, internet, or VPN). The default system routing table can be replaced with a user-defined routing table containing custom routes. With user-defined routing, customers can create specific routes to appliances such as firewalls or intrusion detection appliances, or block access to specific subnets from the subnet hosting the user-defined route. For an overview of User-Defined Routes, see [here](../virtual-network/virtual-networks-udr-overview.md). ## Security-Depending on which model is in use, Connect-To or Connect-Through, your customer defines the security policies in their vNet or provides the security policy requirements to the CSP to define to their vNets. The following security criteria can be defined: +Depending on which model is in use, Connect-To or Connect-Through, your customer defines the security policies in their VNet or provides the security policy requirements to the CSP to define to their VNets. The following security criteria can be defined: -1. **Customer Isolation** — The Azure platform provides customer isolation by storing Customer ID and vNet info in a secure database, which is used to encapsulate each customer's traffic in a GRE tunnel. -2. **Network Security Group (NSG)** rules are for defining allowed traffic into and out of the subnets within vNets in Azure. By default, the NSG contains Block rules to block traffic from the Internet to the vNet and Allow rules for traffic within a vNet. For more information about Network Security Groups, look [here](https://azure.microsoft.com/blog/network-security-groups/).
+1. **Customer Isolation** — The Azure platform provides customer isolation by storing Customer ID and VNet info in a secure database, which is used to encapsulate each customer's traffic in a GRE tunnel. +2. **Network Security Group (NSG)** rules are for defining allowed traffic into and out of the subnets within VNets in Azure. By default, the NSG contains Block rules to block traffic from the Internet to the VNet and Allow rules for traffic within a VNet. For more information about Network Security Groups, look [here](https://azure.microsoft.com/blog/network-security-groups/). 3. **Force tunneling** — This is an option to redirect internet-bound traffic originating in Azure over the ExpressRoute connection to the on-premises datacenter. For more information about forced tunneling, see [here](expressroute-routing.md#advertising-default-routes). 4. **Encryption** — Even though the ExpressRoute circuits are dedicated to a specific customer, there is the possibility that the network provider could be breached, allowing an intruder to examine packet traffic. To address this potential, a customer or CSP can encrypt traffic over the connection by defining IPSec tunnel-mode policies for all traffic flowing between the on-premises resources and Azure resources (refer to the optional Tunnel mode IPSec for Customer 1 in Figure 5: ExpressRoute Security, above). The second option would be to use a firewall appliance at each endpoint of the ExpressRoute circuit. This will require additional third-party firewall VMs/appliances to be installed on both ends to encrypt the traffic over the ExpressRoute circuit. Additional information can be found at the following links: [Azure in Cloud Solution Provider program](/azure/cloud-solution-provider). [Get ready to transact as a Cloud Solution Provider](https://partner.microsoft.com/solutions/cloud-reseller-pre-launch). -[Microsoft Cloud Solution Provider resources](https://partner.microsoft.com/solutions/cloud-reseller-resources). +[Microsoft Cloud Solution Provider resources](https://partner.microsoft.com/solutions/cloud-reseller-resources). |
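As a sketch of how a user-defined route table like the one described above might be created programmatically, the following uses the `azure-identity` and `azure-mgmt-network` Python packages; the subscription, resource group, region, address prefix, and appliance IP are all placeholder assumptions, not values from this article.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network_client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Route all traffic for 10.1.0.0/16 through a virtual appliance (for
# example, a firewall) instead of the default system route.
poller = network_client.route_tables.begin_create_or_update(
    "<resource-group>",
    "udr-example",
    {
        "location": "<region>",
        "routes": [
            {
                "name": "to-appliance",
                "address_prefix": "10.1.0.0/16",
                "next_hop_type": "VirtualAppliance",
                "next_hop_ip_address": "10.0.2.4",
            }
        ],
    },
)
route_table = poller.result()
print(route_table.name)
```

The route table would then be associated with a subnet to take effect, following the user-defined routing overview linked above.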
firewall | Overview | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/overview.md | Untrusted customer signed certificates|Customer signed certificates aren't trust |Wrong source IP address in Alerts with IDPS for HTTP (without TLS inspection).|When plain text HTTP traffic is in use, and IDPS issues a new alert, and the destination is a public IP address, the displayed source IP address is wrong (the internal IP address is displayed instead of the original IP address).|A fix is being investigated.| |Certificate Propagation|After a CA certificate is applied on the firewall, it may take between 5-10 minutes for the certificate to take effect.|A fix is being investigated.| |TLS 1.3 support|TLS 1.3 is partially supported. The TLS tunnel from client to the firewall is based on TLS 1.2, and from the firewall to the external Web server is based on TLS 1.3.|Updates are being investigated.|+|TLSi intermediate CA certification expiration|In some unique cases, your intermediate CA certificate can be expired two months ahead of its original expiration date.|Renew your intermediate CA certification two months ahead of time. <br>A fix is being investigated.| |Availability Zones for Firewall Premium in the Southeast Asia region|You can't currently deploy Azure Firewall Premium with Availability Zones in the Southeast Asia region.|Deploy the firewall in Southeast Asia without Availability Zones, or deploy in a region that supports Availability Zones.| |TLSi intermediate CA certificate expiration|In some unique cases, the intermediate CA certificate can expire two months before the original expiration date.|Renew the intermediate CA certificate two months before the original expiration date. A fix is being investigated.| |
healthcare-apis | Get Started With Azure Api Fhir | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/get-started-with-azure-api-fhir.md | +> [!Note] +> Azure Health Data Services is the evolved version of Azure API for FHIR, enabling customers to manage FHIR, DICOM, and MedTech services with integrations into other Azure services. To learn more, see [Azure Health Data Services](https://azure.microsoft.com/products/health-data-services/). + This article outlines the basic steps to get started with Azure API for FHIR. Azure API for FHIR is a managed, standards-based, compliant API for clinical health data that enables solutions for actionable analytics and machine learning. As a prerequisite, you'll need an Azure subscription and have been granted proper permissions to create Azure resource groups and deploy Azure resources. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin. This article described the basic steps to get started using Azure API for FHIR. >[!div class="nextstepaction"] >[Frequently asked questions about Azure API for FHIR](fhir-faq.yml) -FHIR® is a registered trademark of [HL7](https://hl7.org/fhir/) and is used with the permission of HL7. +FHIR® is a registered trademark of [HL7](https://hl7.org/fhir/) and is used with the permission of HL7. |
healthcare-apis | Release Notes | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/azure-api-for-fhir/release-notes.md | +> [!Note] +> Azure Health Data Services is the evolved version of Azure API for FHIR, enabling customers to manage FHIR, DICOM, and MedTech services with integrations into other Azure services. To learn more, see [Azure Health Data Services](https://azure.microsoft.com/products/health-data-services/). + ## **November 2022** **Fixed the Error generated when resource is updated using if-match header and PATCH** |
iot-develop | How To Use Reliability Features In Sdks | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-develop/how-to-use-reliability-features-in-sdks.md | The SDKs typically provide three retry policies: | SDK | SetRetryPolicy method | Policy implementations | Implementation guidance | ||||| | C | [IOTHUB_CLIENT_RESULT IoTHubDeviceClient_SetRetryPolicy](https://azure.github.io/azure-iot-sdk-c/iothub__device__client_8h.html#a53604d8d75556ded769b7947268beec8) | See: [IOTHUB_CLIENT_RETRY_POLICY](https://azure.github.io/azure-iot-sdk-c/iothub__client__core__common_8h.html#a361221e523247855ff0a05c2e2870e4a) | [C implementation](https://github.com/Azure/azure-iot-sdk-c/blob/master/doc/connection_and_messaging_reliability.md) |-| Java | [SetRetryPolicy](/jav) | +| Java | [SetRetryPolicy](/jav) | | .NET | [DeviceClient.SetRetryPolicy](/dotnet/api/microsoft.azure.devices.client.deviceclient.setretrypolicy) | **Default**: [ExponentialBackoff class](/dotnet/api/microsoft.azure.devices.client.exponentialbackoff)<BR>**Custom:** implement [IRetryPolicy interface](/dotnet/api/microsoft.azure.devices.client.iretrypolicy)<BR>**No retry:** [NoRetry class](/dotnet/api/microsoft.azure.devices.client.noretry) | [C# implementation](https://github.com/Azure/azure-iot-sdk-csharp/blob/main/iothub/device/devdoc/retrypolicy.md) | | Node | [setRetryPolicy](/javascript/api/azure-iot-device/client#azure-iot-device-client-setretrypolicy) | **Default**: [ExponentialBackoffWithJitter class](/javascript/api/azure-iot-common/exponentialbackoffwithjitter)<BR>**Custom:** implement [RetryPolicy interface](/javascript/api/azure-iot-common/retrypolicy)<BR>**No retry:** [NoRetry class](/javascript/api/azure-iot-common/noretry) | [Node implementation](https://github.com/Azure/azure-iot-sdk-node/wiki/Connectivity-and-Retries) | | Python | Not currently supported | Not currently supported | Built-in connection retries: Dropped connections will be retried with a fixed 10 second interval by default. This functionality can be disabled if desired, and the interval can be configured. | |
iot-dps | Libraries Sdks | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/libraries-sdks.md | The DPS device SDKs provide implementations of the [Register](/rest/api/iot-dps/ | --|--|--|--|--|--| | .NET|[NuGet](https://www.nuget.org/packages/Microsoft.Azure.Devices.Provisioning.Client/) |[GitHub](https://github.com/Azure/azure-iot-sdk-csharp/)|[Samples](https://github.com/Azure/azure-iot-sdk-csharp/tree/main/provisioning/device/samples)|[Quickstart](./quick-create-simulated-device-x509.md?pivots=programming-language-csharp&tabs=windows)| [Reference](/dotnet/api/microsoft.azure.devices.provisioning.client) | | C|[apt-get, MBED, Arduino IDE or iOS](https://github.com/Azure/azure-iot-sdk-c/blob/master/readme.md#packages-and-libraries)|[GitHub](https://github.com/Azure/azure-iot-sdk-c/blob/master/provisioning\_client)|[Samples](https://github.com/Azure/azure-iot-sdk-c/tree/main/provisioning_client/samples)|[Quickstart](./quick-create-simulated-device-x509.md?pivots=programming-language-ansi-c&tabs=windows)|[Reference](https://github.com/Azure/azure-iot-sdk-c/) |-| Java|[Maven](https://mvnrepository.com/artifact/com.microsoft.azure.sdk.iot.provisioning/provisioning-device-client)|[GitHub](https://github.com/Azure/azure-iot-sdk-jav?pivots=programming-language-java&tabs=windows)|[Reference](/java/api/com.microsoft.azure.sdk.iot.provisioning.device) | +| Java|[Maven](https://mvnrepository.com/artifact/com.microsoft.azure.sdk.iot.provisioning/provisioning-device-client)|[GitHub](https://github.com/Azure/azure-iot-sdk-jav?pivots=programming-language-java&tabs=windows)|[Reference](/java/api/com.microsoft.azure.sdk.iot.provisioning.device) | | Node.js|[npm](https://www.npmjs.com/package/azure-iot-provisioning-device) |[GitHub](https://github.com/Azure/azure-iot-sdk-node/tree/main/provisioning)|[Samples](https://github.com/Azure/azure-iot-sdk-node/tree/main/provisioning/device/samples)|[Quickstart](./quick-create-simulated-device-x509.md?pivots=programming-language-nodejs&tabs=windows)|[Reference](/javascript/api/azure-iot-provisioning-device) | | Python|[pip](https://pypi.org/project/azure-iot-device/) |[GitHub](https://github.com/Azure/azure-iot-sdk-python)|[Samples](https://github.com/Azure/azure-iot-sdk-python/tree/main/samples)|[Quickstart](./quick-create-simulated-device-x509.md?pivots=programming-language-python&tabs=windows)|[Reference](/python/api/azure-iot-device/azure.iot.device.provisioningdeviceclient) | The DPS service SDKs help you build backend applications to manage enrollments a | Platform | Package | Code repository | Samples | Quickstart | Reference | | --|--|--|--|--|--| | .NET|[NuGet](https://www.nuget.org/packages/Microsoft.Azure.Devices.Provisioning.Service/) |[GitHub](https://github.com/Azure/azure-iot-sdk-csharp/)|[Samples](https://github.com/Azure/azure-iot-sdk-csharp/tree/main/provisioning/service/samples)|[Quickstart](./quick-enroll-device-tpm.md?pivots=programming-language-csharp&tabs=symmetrickey)|[Reference](/dotnet/api/microsoft.azure.devices.provisioning.service) |-| Java|[Maven](https://mvnrepository.com/artifact/com.microsoft.azure.sdk.iot.provisioning/provisioning-service-client)|[GitHub](https://github.com/Azure/azure-iot-sdk-jav?pivots=programming-language-java&tabs=symmetrickey)|[Reference](/java/api/com.microsoft.azure.sdk.iot.provisioning.service) | +| 
Java|[Maven](https://mvnrepository.com/artifact/com.microsoft.azure.sdk.iot.provisioning/provisioning-service-client)|[GitHub](https://github.com/Azure/azure-iot-sdk-jav?pivots=programming-language-java&tabs=symmetrickey)|[Reference](/java/api/com.microsoft.azure.sdk.iot.provisioning.service) | | Node.js|[npm](https://www.npmjs.com/package/azure-iot-provisioning-service)|[GitHub](https://github.com/Azure/azure-iot-sdk-node/tree/main/provisioning)|[Samples](https://github.com/Azure/azure-iot-sdk-node/tree/main/provisioning/service/samples)|[Quickstart](./quick-enroll-device-tpm.md?pivots=programming-language-nodejs&tabs=symmetrickey)|[Reference](/javascript/api/azure-iot-provisioning-service) | ## Management SDKs |
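For a concrete sense of the device-side Register operation that the tables above link to, here's a minimal sketch using the `azure-iot-device` Python package with symmetric-key attestation; the ID scope, registration ID, and key are placeholders.

```python
from azure.iot.device import ProvisioningDeviceClient

# Placeholder values; the ID scope comes from your DPS instance.
client = ProvisioningDeviceClient.create_from_symmetric_key(
    provisioning_host="global.azure-devices-provisioning.net",
    registration_id="<registration-id>",
    id_scope="<id-scope>",
    symmetric_key="<device-symmetric-key>",
)

result = client.register()
print(result.status)  # "assigned" once DPS has linked the device to an IoT hub
```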
iot-hub | Iot Hub Dev Guide Azure Ad Rbac | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-dev-guide-azure-ad-rbac.md | For more information, see the [Azure IoT extension for Azure CLI release page](h ## SDK samples - [.NET Microsoft.Azure.Devices SDK sample](https://aka.ms/iothubaadcsharpsample)-- [Java SDK sample](https://aka.ms/iothubaadjavasample)+- [Java SDK sample](https://github.com/Azure/azure-iot-service-sdk-java/tree/main/service/iot-service-samples/role-based-authorization-sample) ## Next steps |
iot-hub | Iot Hub Devguide Messages Read Builtin | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-devguide-messages-read-builtin.md | You can use the Event Hubs SDKs to read from the built-in endpoint in environmen | Language | Sample | | -- | | | .NET | [ReadD2cMessages .NET](https://github.com/Azure/azure-iot-sdk-csharp/tree/main/iothub/service/samples/getting%20started/ReadD2cMessages) |-| Java | [read-d2c-messages Java](https://github.com/Azure/azure-iot-sdk-java/tree/main/service/iot-service-samples/read-d2c-messages) | | Node.js | [read-d2c-messages Node.js](https://github.com/Azure-Samples/azure-iot-samples-node/tree/master/iot-hub/Quickstarts/read-d2c-messages) | | Python | [read-d2c-messages Python](https://github.com/Azure-Samples/azure-iot-samples-python/tree/master/iot-hub/Quickstarts/read-d2c-messages) | |
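Reading device-to-cloud messages from the built-in endpoint looks like reading from any event hub. A minimal sketch with the `azure-eventhub` Python package follows; the Event Hubs-compatible connection string and name (from the IoT hub's Built-in endpoints page) are placeholders.

```python
from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    "<event-hubs-compatible-connection-string>",
    consumer_group="$Default",
    eventhub_name="<event-hubs-compatible-name>",
)

def on_event(partition_context, event):
    # Print the partition and message body for each received event.
    print(partition_context.partition_id, event.body_as_str())

with client:
    # starting_position "-1" reads from the beginning of each partition.
    client.receive(on_event=on_event, starting_position="-1")
```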
iot-hub | Iot Hub Java Java Device Management Getstarted | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-java-java-device-management-getstarted.md | In this section, you create a Java console app that simulates a device. The app > [!NOTE] > You can check for the latest version of **iot-device-client** using [Maven search](https://search.maven.org/#search%7Cga%7C1%7Ca%3A%22iot-device-client%22%20g%3A%22com.microsoft.azure.sdk.iot%22). -4. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but, if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-jav#logging) in the *Samples for the Azure IoT device SDK for Java* readme file. +4. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but, if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-java/tree/main/iothub/device/iot-device-samples#logging) in the *Samples for the Azure IoT device SDK for Java* readme file. ```xml <dependency> |
iot-hub | Iot Hub Java Java File Upload | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-java-java-file-upload.md | The [Send telemetry from a device to an IoT hub](../iot-develop/quickstart-send- * Some form of preprocessed data. These files are typically batch processed in the cloud, using tools such as [Azure Data Factory](../data-factory/introduction.md) or the [Hadoop](../hdinsight/index.yml) stack. When you need to upload files from a device, you can still use the security and reliability of IoT Hub. This article shows you how. View two samples from [azure-iot-sdk-java-](https://github.com/Azure/azure-iot-sdk-java/tree/main/device/iot-device-samples/file-upload-sample/src/main/java/samples/com/microsoft/azure/sdk/iot) in GitHub. +](https://github.com/Azure/azure-iot-sdk-java/tree/main/iothub/device/iot-device-samples/file-upload-sample/src/main/java/samples/com/microsoft/azure/sdk/iot) in GitHub. > [!NOTE] > IoT Hub supports many device platforms and languages (including C, .NET, and JavaScript) through Azure IoT device SDKs. Refer to the [Azure IoT Developer Center](https://azure.microsoft.com/develop/iot) to learn how to connect your device to Azure IoT Hub. In this article, you learned how to use the file upload feature of IoT Hub to si To further explore the capabilities of IoT Hub, see: -* [Simulating a device with IoT Edge](../iot-edge/quickstart-linux.md) +* [Simulating a device with IoT Edge](../iot-edge/quickstart-linux.md) |
iot-hub | Iot Hub Java Java Schedule Jobs | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-java-java-schedule-jobs.md | In this section, you create a Java console app that handles the desired properti > [!NOTE] > You can check for the latest version of **iot-device-client** using [Maven search](https://search.maven.org/#search%7Cga%7C1%7Ca%3A%22iot-device-client%22%20g%3A%22com.microsoft.azure.sdk.iot%22). -4. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-jav#logging)in the *Samples for the Azure IoT device SDK for Java* readme file. +4. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-jav#logging)in the *Samples for the Azure IoT device SDK for Java* readme file. ```xml <dependency> You are now ready to run the console apps. In this article, you scheduled jobs to run a direct method and update the device twin's properties. -To continue exploring IoT Hub and device management patterns, update an image in [Device Update for Azure IoT Hub tutorial using the Raspberry Pi 3 B+ Reference Image](../iot-hub-device-update/device-update-raspberry-pi.md). +To continue exploring IoT Hub and device management patterns, update an image in [Device Update for Azure IoT Hub tutorial using the Raspberry Pi 3 B+ Reference Image](../iot-hub-device-update/device-update-raspberry-pi.md). |
iot-hub | Iot Hub Java Java Twin Getstarted | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-java-java-twin-getstarted.md | In this section, you create a Java console app that connects to your hub as **my > [!NOTE] > You can check for the latest version of **iot-device-client** using [Maven search](https://search.maven.org/#search%7Cga%7C1%7Ca%3A%22iot-device-client%22%20g%3A%22com.microsoft.azure.sdk.iot%22). -1. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but, if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-jav#logging) in the *Samples for the Azure IoT device SDK for Java* readme file. +1. Add the following dependency to the **dependencies** node. This dependency configures a NOP for the Apache [SLF4J](https://www.slf4j.org/) logging facade, which is used by the device client SDK to implement logging. This configuration is optional, but, if you omit it, you may see a warning in the console when you run the app. For more information about logging in the device client SDK, see [Logging](https://github.com/Azure/azure-iot-sdk-jav#logging) in the *Samples for the Azure IoT device SDK for Java* readme file. ```xml <dependency> To learn how to: * Configure devices using device twin's desired properties, see [Tutorial: Configure your devices from a back-end service](tutorial-device-twins.md) -* Control devices interactively, such as turning on a fan from a user-controlled app, see [Quickstart: Control a device connected to an IoT hub](./quickstart-control-device.md?pivots=programming-language-java) +* Control devices interactively, such as turning on a fan from a user-controlled app, see [Quickstart: Control a device connected to an IoT hub](./quickstart-control-device.md?pivots=programming-language-java) |
iot-hub | Iot Hub Mqtt Support | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/iot-hub-mqtt-support.md | The following table contains links to code samples for each supported language a | Language | MQTT protocol parameter | MQTT over WebSockets protocol parameter | | | | | [Node.js](https://github.com/Azure/azure-iot-sdk-node/blob/main/device/samples/javascript/simple_sample_device.js) | azure-iot-device-mqtt.Mqtt | azure-iot-device-mqtt.MqttWs |-| [Java](https://github.com/Azure/azure-iot-sdk-java/blob/main/device/iot-device-samples/send-receive-sample/src/main/java/samples/com/microsoft/azure/sdk/iot/SendReceive.java) |[IotHubClientProtocol](/java/api/com.microsoft.azure.sdk.iot.device.iothubclientprotocol).MQTT | IotHubClientProtocol.MQTT_WS | +| [Java](https://github.com/Azure/azure-iot-sdk-java/blob/main/iothub/device/iot-device-samples/send-receive-sample/src/main/java/samples/com/microsoft/azure/sdk/iot/SendReceive.java) |[IotHubClientProtocol](/java/api/com.microsoft.azure.sdk.iot.device.iothubclientprotocol).MQTT | IotHubClientProtocol.MQTT_WS | | [C](https://github.com/Azure/azure-iot-sdk-c/tree/master/iothub_client/samples/iothub_client_sample_mqtt_dm) | [MQTT_Protocol](https://github.com/Azure/azure-iot-sdk-c/blob/main/iothub_client/inc/iothubtransportmqtt.h) | [MQTT_WebSocket_Protocol](https://github.com/Azure/azure-iot-sdk-c/blob/main/iothub_client/inc/iothubtransportmqtt_websockets.h) | | [C#](https://github.com/Azure/azure-iot-sdk-csharp/tree/main/iothub/device/samples) | [TransportType](/dotnet/api/microsoft.azure.devices.client.transporttype).Mqtt | TransportType.Mqtt falls back to MQTT over WebSockets if MQTT fails. To specify MQTT over WebSockets only, use TransportType.Mqtt_WebSocket_Only | | [Python](https://github.com/Azure/azure-iot-sdk-python/tree/main/samples) | Supports MQTT by default | Add `websockets=True` in the call to create the client | In order to ensure a client/IoT Hub connection stays alive, both the service and |Language |Default keep-alive interval |Configurable | |||| |Node.js | 180 seconds | No |-|Java | 230 seconds | [Yes](https://github.com/Azure/azure-iot-sdk-java/blob/main/device/iot-device-client/src/main/java/com/microsoft/azure/sdk/iot/device/ClientOptions.java#L64) | +|Java | 230 seconds | [Yes](https://github.com/Azure/azure-iot-sdk-java/blob/main/iothub/device/iot-device-client/src/main/java/com/microsoft/azure/sdk/iot/device/ClientOptions.java#L64) | |C | 240 seconds | [Yes](https://github.com/Azure/azure-iot-sdk-c/blob/master/doc/Iothub_sdk_options.md#mqtt-transport) | |C# | 300 seconds* | [Yes](/dotnet/api/microsoft.azure.devices.client.transport.mqtt.mqtttransportsettings.keepaliveinseconds) | |Python | 60 seconds | [Yes](https://github.com/Azure/azure-iot-sdk-python/blob/main/azure-iot-device/azure/iot/device/iothub/abstract_clients.py#L343) | To learn more about planning your IoT Hub deployment, see: To further explore the capabilities of IoT Hub, see: * [Azure IoT Hub concepts overview](iot-hub-devguide.md)-* [Quickstart: Deploy your first IoT Edge module to a virtual Linux device](../iot-edge/quickstart-linux.md) +* [Quickstart: Deploy your first IoT Edge module to a virtual Linux device](../iot-edge/quickstart-linux.md) |
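For the Python row in the tables above, both options are keyword arguments on the client factory method: `websockets=True` switches to MQTT over WebSockets, and `keep_alive` overrides the 60-second default ping interval. A minimal sketch follows; the connection string and the 120-second value are placeholders.

```python
from azure.iot.device import IoTHubDeviceClient

client = IoTHubDeviceClient.create_from_connection_string(
    "<device-connection-string>",
    websockets=True,  # MQTT over WebSockets (outbound port 443)
    keep_alive=120,   # seconds; overrides the 60-second default
)
client.connect()
```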
iot-hub | Tutorial X509 Introduction | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub/tutorial-x509-introduction.md | Before starting any of the articles in this tutorial, you should be familiar wit Using a CA-signed certificate chain backed by a PKI to authenticate a device provides the best level of security for your devices: -- In production, we recommend you get your X.509 CA certificates from a public root certificate authority. Purchasing a CA certificate has the benefit of the root CA acting as a trusted third party to vouch for the legitimacy of your devices. If you already have an X.509 CA certificate, and you know how to create and sign device certificates into a certificate chain, follow the instructions in [Tutorial: Upload and verify a CA certificate to IoT Hub](/tutorial-x509-prove-possession.md) to upload your CA certificate to your IoT hub. Then, follow the instructions in [Tutorial: Test certificate authentication](tutorial-x509-test-certificate.md) to authenticate a device with your IoT hub.+- In production, we recommend you get your X.509 CA certificates from a public root certificate authority. Purchasing a CA certificate has the benefit of the root CA acting as a trusted third party to vouch for the legitimacy of your devices. If you already have an X.509 CA certificate, and you know how to create and sign device certificates into a certificate chain, follow the instructions in [Tutorial: Upload and verify a CA certificate to IoT Hub](tutorial-x509-prove-possession.md) to upload your CA certificate to your IoT hub. Then, follow the instructions in [Tutorial: Test certificate authentication](tutorial-x509-test-certificate.md) to authenticate a device with your IoT hub. - For testing purposes, we recommend using OpenSSL to create an X.509 certificate chain. OpenSSL is used widely across the industry to work with X.509 certificates. You can follow the steps in [Tutorial: Use OpenSSL to create test certificates](tutorial-x509-openssl.md) to create a root CA and intermediate CA certificate with which to create and sign device certificates. The tutorial also shows how to upload and verify a CA certificate. Then, follow the instructions in [Tutorial: Test certificate authentication](tutorial-x509-test-certificate.md) to authenticate a device with your IoT hub. If you're already familiar with X.509 certificates, and you want to generate tes >[!IMPORTANT] >We recommend that you use certificates signed by an issuing Certificate Authority (CA), even for testing purposes. Never use self-signed certificates in production. -If you have a root CA certificate or subordinate CA certificate and you want to upload it to your IoT hub, you must verify that you own that certificate. For more information, see [Tutorial: Upload and verify a CA certificate to IoT Hub](tutorial-x509-prove-possession.md). +If you have a root CA certificate or subordinate CA certificate and you want to upload it to your IoT hub, you must verify that you own that certificate. For more information, see [Tutorial: Upload and verify a CA certificate to IoT Hub](tutorial-x509-prove-possession.md). |
lab-services | Concept Migrating Physical Labs | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/lab-services/concept-migrating-physical-labs.md | + + Title: Migrating from physical labs to the cloud ++description: Learn about the benefits and considerations for migrating from physical labs to Azure Lab Services. Understand how to configure your labs to optimize costs. +++++ Last updated : 01/31/2023+++# Considerations for migrating from physical labs to Azure Lab Services ++Azure Lab Services enables you to provide lab environments that users can access from anywhere, any time of the day. When you migrate from physical labs to Azure Lab Services, you should reassess your lab structure to minimize costs and optimize the experience for lab creators and users. In this article, you learn about the considerations and benefits of migrating from physical labs to Azure Lab Services. ++## Considerations for moving to Azure Lab Services ++When you migrate physical labs to Azure Lab Services, you should consider the following aspects: ++- What is the lab structure? Are labs used for different purposes (shared lab), such as multiple classes, or are they dedicated (single-purpose lab)? +- What are the software requirements for the lab? +- What are the lab hardware requirements? A shared lab has to accommodate the needs for all usage scenarios and therefore has higher requirements. ++To optimally benefit, you need to reassess the lab and image contents as a whole. It's not recommended to reuse the same lab image from your physical lab as-is. ++## Lab structure ++Usually a physical lab is shared by students from multiple classes. As a result, all of the classes' software applications are installed together at once on each lab computer. When a class uses the lab, students only run a subset of the applications that are relevant to their class. ++This type of physical computer lab often leads to increased hardware requirements: ++- A large disk size may be required to install the combined set of applications that are needed by the classes that are sharing the lab. +- Some applications require more processing power compared to others, or require specialized processors, such as a GPU. By installing multiple applications on the same lab computer, each computer must have sufficient hardware to run the most compute-intensive applications. ++This level of hardware is wasted for classes that only use the lab to run applications that require less memory, compute power, or disk space. ++Azure Lab Services is designed to use hardware more efficiently, so that you only pay for what your users actually need and use. With Azure Lab Services, labs are structured to be more granular: ++- One lab is created for each class (or session of a class). +- On the lab's image, only the software applications that are needed by that specific class are installed. ++This structure helps to identify the optimal VM size for each class based on the specific workload, and helps to reduce the disk size requirements (Azure Lab Services currently supports a disk size of 127 GB). ++When you use Azure Lab Services, it's recommended that you use single-purpose labs. ++Learn more about [how to structure labs](./administrator-guide.md#lab) in the Azure Lab Services administrator guide. ++## Benefits ++There are multiple benefits of using single-purpose labs (for example, one class per lab): ++- Optimize costs by selecting the right VM size for each lab. 
See the below [example use case and cost analysis](#example-use-case). ++- Lab VMs only contain the software that is needed for their purpose. This simplifies the set-up and maintenance of labs by lab creators, and provides more clarity for lab users. ++- Access to each individual lab is controlled. Lab users are only granted access to labs and software they need. Learn how to [add and manage lab users](./how-to-configure-student-usage.md). ++- Further optimize costs by taking advantage of the following features: ++ - [Schedules](./how-to-create-schedules.md) are used to automatically start and stop all VMs within a lab according to each class's schedule. + - [Quotas](./how-to-configure-student-usage.md#set-quotas-for-users) allow you to control the amount of time that each class's students can access VMs outside of their scheduled hours. ++## Example use case ++Consider the following physical lab configuration, where the lab is shared by multiple classes: ++- An engineering class that uses [SolidWorks](./class-type-solidworks.md) with 100 students enrolled. +- A math class that uses [MATLAB](./class-type-matlab.md) that also has 100 students enrolled. ++Since our physical lab is shared by these two classes, each lab computer has both SolidWorks and MATLAB installed, along with various other common applications, such as Word or Excel. Also, it's important to note that SolidWorks is more compute-intensive since it typically requires a GPU. ++To move this physical lab to Azure Lab Services: ++- Create two labs: one for the engineering class and another for the math class. +- Create two VM images: one with SolidWorks installed and another with MATLAB. ++Because SolidWorks requires a GPU, the engineering lab uses the **Small GPU (Visualization)** VM size. The lab for math class only requires a **Medium** VM size. ++The following image shows how the lab structure changes when moving this physical lab to Azure Lab Services. +++### Cost analysis ++In this example, the cost per usage hour for the two VM sizes is substantially different: ++- Small GPU (Visualization): provides high compute power, and as a result, the cost is 160 lab units per hour. +- Medium: provides less compute power but is suitable for many types of classes. The cost is only 55 lab units per hour. ++By using separate labs and assigning the smallest appropriate VM size for each lab, you can save on the total cost for running the labs. ++Consider a usage scenario where a student uses their VM for a total of 10 hours: ++- A single lab using the Small GPU (Visualization) size that is shared by students from both the engineering and math classes is estimated to have the following usage: ++ 10 hours * 200 students * 160 lab units/hour = 320000 lab units ++- Separate labs that use the Small GPU (Visualization) size for engineering and Medium size for math are estimated to have the following usage: ++ - Engineering class lab: 10 hours * 100 students * 160 lab units/hour = 160000 ++ - Math class lab: 10 hours * 100 students * 55 lab units/hour = 55000 ++ The total of both the engineering and math labs is 215000. ++By using a more granular lab structure, the total savings for running the labs are 33%. Also, keep in mind that you only pay for the number of hours that your students actually use their VMs. If students use their VMs less, the actual costs are lower. ++>[!IMPORTANT] +> The cost estimate is for example purposes only. 
++>[!IMPORTANT]
> The cost estimate is for example purposes only. For current details on pricing, see [Azure Lab Services Pricing](https://azure.microsoft.com/pricing/details/lab-services/). ++## Prepare for migrating to Azure Lab Services ++When you start using Azure Lab Services, IT and faculty should coordinate early in the planning process to: ++- Identify the specific software applications that each class requires. Learn more about [lab software requirements](./setup-guide.md#what-software-requirements-does-the-class-have).
- Understand the workloads that students perform using the lab. ++This information is needed to choose the appropriate VM size when you create a lab, and to set up the image on the template VM. Learn more about [VM sizing in Azure Lab Services](./administrator-guide.md#vm-sizing). ++To ensure that you choose the appropriate VM size, we recommend starting with the minimum VM size that meets the hardware requirements of your applications. Then, have faculty connect to a lab VM to validate the common workloads that students perform, and to confirm that the performance and experience are sufficient. It's helpful to refer to the [class types](./class-types.md), which show real-world examples of how to set up applications for classes, along with the recommended VM size. ++Also, [Azure Compute Gallery](./how-to-use-shared-image-gallery.md) is useful for creating and storing custom images. A compute gallery enables you to create an image once and reuse it to create multiple labs. ++## Conclusion ++Azure Lab Services provides many benefits: it helps you optimize the cost of running your labs, simplifies lab setup and maintenance, and gives you fine-grained access control over each lab. To benefit fully, structure your labs in Azure Lab Services to have a single purpose. For example, create a separate lab for each classroom training. ++## Next steps ++- Get started by [creating a lab plan](./quick-create-lab-plan-portal.md).
- Understand [cost estimation and analysis](./cost-management-guide.md).
- Understand the [lab requirements of your class](./setup-guide.md#understand-the-lab-requirements-of-your-class).
- Learn more about [VM sizing in Azure Lab Services](./administrator-guide.md#vm-sizing). |
load-testing | Quickstart Create And Run Load Test | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-testing/quickstart-create-and-run-load-test.md | If you already have a Load Testing resource, skip this section and continue to [ To create a Load Testing resource: ## Create a load test |
load-testing | Tutorial Identify Bottlenecks Azure Portal | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-testing/tutorial-identify-bottlenecks-azure-portal.md | The sample application's source repo includes an Apache JMeter script named *Sam > [!NOTE] > The sample Apache JMeter script requires two plugins: `Custom Thread Groups` and `Throughput Shaping Timer`. To open the script on your local Apache JMeter instance, you need to install both plugins. You can use the [Apache JMeter Plugins Manager](https://jmeter-plugins.org/install/Install/) to do this. -### Create the Azure Load Testing resource +### Create the Azure load testing resource -The Load Testing resource is a top-level resource for your load-testing activities. This resource provides a centralized place to view and manage load tests, test results, and related artifacts. +The Azure load testing resource is a top-level resource for your load-testing activities. This resource provides a centralized place to view and manage load tests, test results, and related artifacts. -If you already have a Load Testing resource, skip this section and continue to [Create a load test](#create_test). +If you already have a load testing resource, skip this section and continue to [Create a load test](#create-a-load-test). -If you don't yet have a Load Testing resource, create one now: +If you don't yet have an Azure load testing resource, create one now: -### <a name="create_test"></a> Create a load test +### Create a load test -To create a load test in the Load Testing resource for the sample app: +Next, you create a load test in your load testing resource for the sample app. You create the load test by using an existing JMeter script in the sample app repository. -1. Go to the Load Testing resource and select **Create new test** on the command bar. +1. Go to your load testing resource, and select **Create** on the **Overview** page. :::image type="content" source="./media/tutorial-identify-bottlenecks-azure-portal/create-test.png" alt-text="Screenshot that shows the button for creating a new test." ::: |
logic-apps | Logic Apps Azure Functions | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-azure-functions.md | This how-to guide shows how to call an Azure function from a logic app workflow. ## Limitations -* You can create a function directly from inside a Consumption logic app workflow, but from not a Standard logic app workflow. However, you can create functions in other ways. For more information, see [Create functions from inside logic app workflows](#create-function-designer). +* You can create a function directly from inside a Consumption logic app workflow, but not from a Standard logic app workflow. However, you can create functions in other ways. For more information, see [Create functions from inside logic app workflows](#create-function-designer). * Only Consumption workflows support authenticating Azure function calls by using a managed identity with Azure Active Directory (Azure AD) authentication. Standard workflows aren't currently supported for the scenario described in the section about [how to enable authentication for function calls](#enable-authentication-functions). |
machine-learning | Concept Workspace | https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-workspace.md | -The workspace is the top-level resource for Azure Machine Learning, providing a centralized place to work with all the artifacts you create when you use Azure Machine Learning. The workspace keeps a history of all training runs, including logs, metrics, output, and a snapshot of your scripts. You use this information to determine which training run produces the best model. +Workspaces are places to collaborate with colleagues and to group related work, such as experiments, jobs, datasets, components, and inference endpoints. -Once you have a model you like, you register it with the workspace. You then use the registered model and scoring scripts to deploy to an [online endpoint](concept-endpoints.md) as a REST-based HTTP endpoint. ++We recommend creating a workspace _per project_. While a workspace can be used for multiple projects, limiting it to one project per workspace allows costs to be reported at the project level. It also allows you to manage configurations like datastores in the scope of each project. +++## Working with a workspace ++Machine learning tasks read and/or write artifacts to your workspace. +++ Run an experiment to train a model - writes job run results to the workspace.++ Use automated ML to train a model - writes training results to the workspace.++ Register a model in the workspace.++ Deploy a model - uses the registered model to create a deployment.++ Create and run reusable workflows.++ View machine learning artifacts such as jobs, pipelines, models, deployments.++ Track and monitor models.++ You can share assets between workspaces using [Azure Machine Learning registries (preview)](how-to-share-models-pipelines-across-workspaces-with-registries.md). ## Taxonomy You can interact with your workspace in the following ways: + On the command line using the Azure Machine Learning [CLI extension](how-to-configure-cli.md) + [Azure Machine Learning VS Code Extension](how-to-manage-resources-vscode.md#workspaces) -## Machine learning with a workspace --Machine learning tasks read and/or write artifacts to your workspace. --+ Run an experiment to train a model - writes job run results to the workspace. -+ Use automated ML to train a model - writes training results to the workspace. -+ Register a model in the workspace. -+ Deploy a model - uses the registered model to create a deployment. -+ Create and run reusable workflows. -+ View machine learning artifacts such as jobs, pipelines, models, deployments. -+ Track and monitor models. - ## Workspace management You can also perform the following workspace management tasks: When you create a new workspace, it automatically creates several Azure resource > By default, the storage account is a general-purpose v1 account. You can [upgrade this to general-purpose v2](../storage/common/storage-account-upgrade.md) after the workspace has been created. > Do not enable hierarchical namespace on the storage account after upgrading to general-purpose v2. - To use an existing Azure Storage account, it cannot be of type BlobStorage or a premium account (Premium_LRS and Premium_GRS). It also cannot have a hierarchical namespace (used with Azure Data Lake Storage Gen2). Neither premium storage nor hierarchical namespaces are supported with the _default_ storage account of the workspace. You can use premium storage or hierarchical namespace with _non-default_ storage accounts. 
+ To use an existing Azure Storage account, it can't be of type BlobStorage or a premium account (Premium_LRS and Premium_GRS). It also can't have a hierarchical namespace (used with Azure Data Lake Storage Gen2). |