Updates from: 08/23/2022 01:08:47
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Configure Authentication In Azure Static App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/configure-authentication-in-azure-static-app.md
Previously updated : 06/28/2022 Last updated : 08/22/2022
OpenID Connect (OIDC) is an authentication protocol that's built on OAuth 2.0. U
When the access token expires or the app session is invalidated, Azure Static Web App initiates a new authentication request and redirects users to Azure AD B2C. If the Azure AD B2C [SSO session](session-behavior.md) is active, Azure AD B2C issues an access token without prompting users to sign in again. If the Azure AD B2C session expires or becomes invalid, users are prompted to sign in again. ## Prerequisites
+- A premium Azure subscription.
- If you haven't created an app yet, follow the guidance on how to create an [Azure Static Web App](../static-web-apps/overview.md). - Familiarize yourself with the Azure Static Web App [staticwebapp.config.json](../static-web-apps/configuration.md) configuration file. - Familiarize yourself with the Azure Static Web App [App Settings](../static-web-apps/application-settings.md).
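To make the configuration concrete, here's a minimal sketch of the `auth` section you might add to `staticwebapp.config.json` to register Azure AD B2C as a custom OIDC provider. The provider key (`aadb2c`), the two application-setting names, and the tenant/policy placeholders are illustrative assumptions; confirm the exact schema against the Static Web Apps configuration reference linked above.

```json
{
  "auth": {
    "identityProviders": {
      "customOpenIdConnectProviders": {
        "aadb2c": {
          "registration": {
            "clientIdSettingName": "AADB2C_PROVIDER_CLIENT_ID",
            "clientCredential": {
              "clientSecretSettingName": "AADB2C_PROVIDER_CLIENT_SECRET"
            },
            "openIdConnectConfiguration": {
              "wellKnownOpenIdConfiguration": "https://<TENANT_NAME>.b2clogin.com/<TENANT_NAME>.onmicrosoft.com/<POLICY_NAME>/v2.0/.well-known/openid-configuration"
            }
          },
          "login": {
            "nameClaimType": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name",
            "scopes": [],
            "loginParameterNames": []
          }
        }
      }
    }
  }
}
```

The two setting names refer to app settings you would create separately, so the client secret stays out of the config file.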
active-directory Howto Registration Mfa Sspr Combined https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/howto-registration-mfa-sspr-combined.md
Complete the following steps to create a policy that applies to all selected use
1. In the **Azure portal**, browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **+ New policy**. 1. Enter a name for this policy, such as *Combined Security Info Registration on Trusted Networks*.
-1. Under **Assignments**, select **Users and groups**. Choose the users and groups you want this policy to apply to, then select **Done**.
+1. Under **Assignments**, select **Users or workload identities**. Choose the users and groups you want this policy to apply to, then select **Done**.
> [!WARNING] > Users must be enabled for combined registration.
active-directory Block Legacy Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/block-legacy-authentication.md
Title: Block legacy authentication - Azure Active Directory description: Learn how to improve your security posture by blocking legacy authentication using Azure AD Conditional Access. Previously updated : 06/21/2022 Last updated : 08/22/2022
-# How to: Block legacy authentication access to Azure AD with Conditional Access
+# Block legacy authentication with Azure AD Conditional Access
-To give your users easy access to your cloud apps, Azure Active Directory (Azure AD) supports a broad variety of authentication protocols including legacy authentication. However, legacy authentication doesn't support multifactor authentication (MFA). MFA is in many environments a common requirement to address identity theft.
+To give your users easy access to your cloud apps, Azure Active Directory (Azure AD) supports a broad variety of authentication protocols, including legacy authentication. However, legacy authentication doesn't support capabilities like multifactor authentication (MFA). MFA is a common requirement to improve security posture in organizations.
> [!NOTE] > Effective October 1, 2022, we will begin to permanently disable Basic Authentication for Exchange Online in all Microsoft 365 tenants regardless of usage, except for SMTP Authentication. Read more [here](/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic-authentication-exchange-online)
Alex Weinert, Director of Identity Security at Microsoft, in his March 12, 2020
> - Azure AD accounts in organizations that have disabled legacy authentication experience 67 percent fewer compromises than those where legacy authentication is enabled >
-If your environment is ready to block legacy authentication to improve your tenant's protection, you can accomplish this goal with Conditional Access. This article explains how you can configure Conditional Access policies that block legacy authentication for all workloads within your tenant.
+If you're ready to block legacy authentication to improve your tenant's protection, you can accomplish this goal with Conditional Access. This article explains how you can configure Conditional Access policies that block legacy authentication for all workloads within your tenant.
While rolling out blocking of legacy authentication, we recommend a phased approach, rather than disabling it for all users all at once. Customers may choose to first begin disabling basic authentication on a per-protocol basis, by applying Exchange Online authentication policies, then (optionally) also blocking legacy authentication via Conditional Access policies when ready.
Many clients that previously only supported legacy authentication now support mo
> > When implementing Exchange Active Sync (EAS) with CBA, configure clients to use modern authentication. Clients not using modern authentication for EAS with CBA **are not blocked** with [Deprecation of Basic authentication in Exchange Online](/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic-authentication-exchange-online). However, these clients **are blocked** by Conditional Access policies configured to block legacy authentication. >
->For more Information on implementing support for CBA with Azure AD and modern authentication See: [How to configure Azure AD certificate-based authentication (Preview)](../authentication/how-to-certificate-based-authentication.md). As another option, CBA performed at a federation server can be used with modern authentication.
+> For more information on implementing support for CBA with Azure AD and modern authentication, see [How to configure Azure AD certificate-based authentication (Preview)](../authentication/how-to-certificate-based-authentication.md). As another option, CBA performed at a federation server can be used with modern authentication.
If you're using Microsoft Intune, you might be able to change the authentication type using the email profile you push or deploy to your devices. If you're using iOS devices (iPhones and iPads), you should take a look at [Add e-mail settings for iOS and iPadOS devices in Microsoft Intune](/mem/intune/configuration/email-settings-ios).
The easiest way to block legacy authentication across your entire organization i
### Indirectly blocking legacy authentication
-Even if your organization isn't ready to block legacy authentication across the entire organization, you should ensure that sign-ins using legacy authentication aren't bypassing policies that require grant controls such as requiring multifactor authentication or compliant/hybrid Azure AD joined devices. During authentication, legacy authentication clients don't support sending MFA, device compliance, or join state information to Azure AD. Therefore, apply policies with grant controls to all client applications so that legacy authentication based sign-ins that can't satisfy the grant controls are blocked. With the general availability of the client apps condition in August 2020, newly created Conditional Access policies apply to all client apps by default.
+If your organization isn't ready to block legacy authentication across the entire organization, you should ensure that sign-ins using legacy authentication aren't bypassing policies that require grant controls such as requiring multifactor authentication or compliant/hybrid Azure AD joined devices. During authentication, legacy authentication clients don't support sending MFA, device compliance, or join state information to Azure AD. Therefore, apply policies with grant controls to all client applications so that legacy authentication based sign-ins that can't satisfy the grant controls are blocked. With the general availability of the client apps condition in August 2020, newly created Conditional Access policies apply to all client apps by default.
![Client apps condition default configuration](./media/block-legacy-authentication/client-apps-condition-configured-no.png)
You can select all available grant controls for the **Other clients** condition;
- [Determine impact using Conditional Access report-only mode](howto-conditional-access-insights-reporting.md) - If you aren't familiar with configuring Conditional Access policies yet, see [require MFA for specific apps with Azure Active Directory Conditional Access](../authentication/tutorial-enable-azure-mfa.md) for an example. - For more information about modern authentication support, see [How modern authentication works for Office client apps](/office365/enterprise/modern-auth-for-office-2013-and-2016) -- [How to set up a multifunction device or application to send email using Microsoft 365](/exchange/mail-flow-best-practices/how-to-set-up-a-multifunction-device-or-application-to-send-email-using-microsoft-365-or-office-365)
+- [How to set up a multifunction device or application to send email using Microsoft 365](/exchange/mail-flow-best-practices/how-to-set-up-a-multifunction-device-or-application-to-send-email-using-microsoft-365-or-office-365)
+- [Enable modern authentication in Exchange Online](/exchange/clients-and-mobile-in-exchange-online/enable-or-disable-modern-authentication-in-exchange-online)
+- [Enable Modern Authentication for Office 2013 on Windows devices](/office365/admin/security-and-compliance/enable-modern-authentication)
+- [How to configure Exchange Server on-premises to use Hybrid Modern Authentication](/office365/enterprise/configure-exchange-server-for-hybrid-modern-authentication)
+- [How to use Modern Authentication with Skype for Business](/skypeforbusiness/manage/authentication/use-adal)
active-directory Concept Condition Filters For Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-condition-filters-for-devices.md
# Conditional Access: Filter for devices
-When creating Conditional Access policies, administrators have asked for the ability to target or exclude specific devices in their environment. The condition filter for devices give administrators this capability. Now you can target specific devices using [supported operators and properties for device filters](#supported-operators-and-device-properties-for-filters) and the other available assignment conditions in your Conditional Access policies.
+When creating Conditional Access policies, administrators have asked for the ability to target or exclude specific devices in their environment. The condition filter for devices gives administrators this capability. Now you can target specific devices using [supported operators and properties for device filters](#supported-operators-and-device-properties-for-filters) and the other available assignment conditions in your Conditional Access policies.
:::image type="content" source="media/concept-condition-filters-for-devices/create-filter-for-devices-condition.png" alt-text="Creating a filter for device in Conditional Access policy conditions":::
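For reference, the portal condition shown above maps to a `deviceFilter` object in the Microsoft Graph `conditionalAccessPolicy` resource. A hedged fragment follows; the rule string is a made-up example, and the supported operators and properties are those in the reference linked above:

```json
"conditions": {
  "devices": {
    "deviceFilter": {
      "mode": "exclude",
      "rule": "device.model -eq \"Example-Model\" -or device.extensionAttribute1 -eq \"SAW\""
    }
  }
}
```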
Policy 1: All users with the directory role of Global administrator, accessing t
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **Directory roles** and choose **Global administrator**. > [!WARNING]
Policy 2: All users with the directory role of Global administrator, accessing t
1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **Directory roles** and choose **Global administrator**. > [!WARNING]
active-directory Concept Conditional Access Policy Common https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-conditional-access-policy-common.md
Previously updated : 11/05/2021 Last updated : 08/22/2022
active-directory Controls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/controls.md
# Custom controls (preview)
-Custom controls is a preview capability of the Azure Active Directory. When using custom controls, your users are redirected to a compatible service to satisfy authentication requirements outside of Azure Active Directory. To satisfy this control, a user's browser is redirected to the external service, performs any required authentication, and is then redirected back to Azure Active Directory. Azure Active Directory verifies the response and, if the user was successfully authenticated or validated, the user continues in the Conditional Access flow.
+Custom controls are a preview capability of Azure Active Directory. When using custom controls, your users are redirected to a compatible service to satisfy authentication requirements outside of Azure Active Directory. To satisfy this control, a user's browser is redirected to the external service, performs any required authentication, and is then redirected back to Azure Active Directory. Azure Active Directory verifies the response and, if the user was successfully authenticated or validated, the user continues in the Conditional Access flow.
> [!NOTE] > For more information about changes we are planning to the Custom Control capability, see the February 2020 [Archive for What's new](../fundamentals/whats-new-archive.md#upcoming-changes-to-custom-controls).
Custom controls is a preview capability of the Azure Active Directory. When usin
## Creating custom controls > [!IMPORTANT]
-> Custom controls cannot be used with Identity Protection's automation requiring Azure AD Multi-Factor Authentication, Azure AD self-service password reset (SSPR), satisfying multi-factor authentication claim requirements, to elevate roles in Privileged Identity Manager (PIM), as part of Intune device enrollment, or when joining devices to Azure AD.
+> Custom controls cannot be used with Identity Protection's automation requiring Azure AD Multifactor Authentication, Azure AD self-service password reset (SSPR), satisfying multifactor authentication claim requirements, to elevate roles in Privileged Identity Manager (PIM), as part of Intune device enrollment, or when joining devices to Azure AD.
Custom controls work with a limited set of approved authentication providers. To create a custom control, you should first contact the provider that you wish to utilize. Each non-Microsoft provider has its own process and requirements to sign up, subscribe, or otherwise become a part of the service, and to indicate that you wish to integrate with Conditional Access. At that point, the provider will provide you with a block of data in JSON format. This data allows the provider and Conditional Access to work together for your tenant, creates the new control, and defines how Conditional Access can tell if your users have successfully performed verification with the provider.
-Copy the JSON data and then paste it into the related textbox. Do not make any changes to the JSON unless you explicitly understand the change you're making. Making any change could break the connection between the provider and Microsoft and potentially lock you and your users out of your accounts.
+Copy the JSON data and then paste it into the related textbox. Don't make any changes to the JSON unless you explicitly understand the change you're making. Making any change could break the connection between the provider and Microsoft and potentially lock you and your users out of your accounts.
The option to create a custom control is in the **Manage** section of the **Conditional Access** page.
Clicking **New custom control** opens a blade with a textbox for the JSON data
To delete a custom control, you must first ensure that it isn't being used in any Conditional Access policy. Once complete: 1. Go to the Custom controls list
-1. Click …
+1. Select …
1. Select **Delete**. ## Editing custom controls
To edit a custom control, you must delete the current control and create a new c
## Known limitations
-Custom controls cannot be used with Identity Protection's automation requiring Azure AD Multi-Factor Authentication, Azure AD self-service password reset (SSPR), satisfying multi-factor authentication claim requirements, to elevate roles in Privileged Identity Manager (PIM), as part of Intune device enrollment, or when joining devices to Azure AD.
+Custom controls can't be used with Identity Protection's automation requiring Azure AD Multifactor Authentication, Azure AD self-service password reset (SSPR), satisfying multifactor authentication claim requirements, to elevate roles in Privileged Identity Manager (PIM), as part of Intune device enrollment, or when joining devices to Azure AD.
## Next steps
active-directory Faqs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/faqs.md
Title: Azure Active Directory Conditional Access FAQs | Microsoft Docs
+ Title: Azure Active Directory Conditional Access FAQs
description: Get answers to frequently asked questions about Conditional Access in Azure Active Directory. Previously updated : 10/16/2020 Last updated : 08/22/2022
For information about applications that work with Conditional Access policies, s
## Are Conditional Access policies enforced for B2B collaboration and guest users?
-Policies are enforced for business-to-business (B2B) collaboration users. However, in some cases, a user might not be able to satisfy the policy requirements. For example, a guest user's organization might not support multi-factor authentication.
+Policies are enforced for business-to-business (B2B) collaboration users. However, in some cases, a user might not be able to satisfy the policy requirements. For example, a guest user's organization might not support multifactor authentication.
## Does a SharePoint Online policy also apply to OneDrive for Business?
Yes. A SharePoint Online policy also applies to OneDrive for Business. For more
## Why can't I set a policy directly on client apps, like Word or Outlook?
-A Conditional Access policy sets requirements for accessing a service. It's enforced when authentication to that service occurs. The policy is not set directly on a client application. Instead, it is applied when a client calls a service. For example, a policy set on SharePoint applies to clients calling SharePoint. A policy set on Exchange applies to Outlook. For more information, see the article, [Conditional Access service dependencies](service-dependencies.md) and consider targeting policies to the [Office 365 app](concept-conditional-access-cloud-apps.md#office-365) instead.
+A Conditional Access policy sets requirements for accessing a service. It's enforced when authentication to that service occurs. The policy isn't set directly on a client application. Instead, it's applied when a client calls a service. For example, a policy set on SharePoint applies to clients calling SharePoint. A policy set on Exchange applies to Outlook. For more information, see the article, [Conditional Access service dependencies](service-dependencies.md) and consider targeting policies to the [Office 365 app](concept-conditional-access-cloud-apps.md#office-365) instead.
## Does a Conditional Access policy apply to service accounts?
-Conditional Access policies apply to all user accounts. This includes user accounts that are used as service accounts. Often, a service account that runs unattended can't satisfy the requirements of a Conditional Access policy. For example, multi-factor authentication might be required. Service accounts can be excluded from a policy by using a [user or group exclusion](concept-conditional-access-users-groups.md#exclude-users).
+Conditional Access policies apply to all user accounts. This includes user accounts that are used as service accounts. Often, a service account that runs unattended can't satisfy the requirements of a Conditional Access policy. For example, multifactor authentication might be required. Service accounts can be excluded from a policy by using a [user or group exclusion](concept-conditional-access-users-groups.md#exclude-users).
## What is the default exclusion policy for unsupported device platforms?
-Currently, Conditional Access policies are selectively enforced on users of iOS and Android devices. Applications on other device platforms are, by default, not affected by the Conditional Access policy for iOS and Android devices. A tenant admin can choose to override the global policy to disallow access to users on platforms that are not supported.
+Currently, Conditional Access policies are selectively enforced on users of iOS and Android devices. Applications on other device platforms are, by default, not affected by the Conditional Access policy for iOS and Android devices. A tenant admin can choose to override the global policy to disallow access to users on platforms that aren't supported.
## How do Conditional Access policies work for Microsoft Teams?
For more information, see the article, [Conditional Access service dependencies]
After enabling some Conditional Access policies on the tenant in Microsoft Teams, certain tabs may no longer function in the desktop client as expected. However, the affected tabs function when using the Microsoft Teams web client. The tabs affected may include Power BI, Forms, VSTS, Power Apps, and SharePoint List.
-To see the affected tabs you must use the Teams web client in Edge, Internet Explorer, or Chrome with the Windows 10 Accounts extension installed. Some tabs depend on web authentication, which doesn't work in the Microsoft Teams desktop client when Conditional Access is enabled. Microsoft is working with partners to enable these scenarios. To date, we have enabled scenarios involving Planner, OneNote, and Stream.
+To see the affected tabs you must use the Teams web client in Microsoft Edge, Internet Explorer, or Chrome with the Windows 10 Accounts extension installed. Some tabs depend on web authentication, which doesn't work in the Microsoft Teams desktop client when Conditional Access is enabled. Microsoft is working with partners to enable these scenarios. To date, we have enabled scenarios involving Planner, OneNote, and Stream.
## Next steps
active-directory Howto Conditional Access Policy Admin Mfa https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-admin-mfa.md
Title: Conditional Access - Require MFA for administrators - Azure Active Directory
-description: Create a custom Conditional Access policy to require administrators to perform multi-factor authentication
+description: Create a custom Conditional Access policy to require administrators to perform multifactor authentication
Previously updated : 11/05/2021 Last updated : 08/22/2022
# Conditional Access: Require MFA for administrators
-Accounts that are assigned administrative rights are targeted by attackers. Requiring multi-factor authentication (MFA) on those accounts is an easy way to reduce the risk of those accounts being compromised.
+Accounts that are assigned administrative rights are targeted by attackers. Requiring multifactor authentication (MFA) on those accounts is an easy way to reduce the risk of those accounts being compromised.
Microsoft recommends you require MFA on the following roles at a minimum, based on [identity score recommendations](../fundamentals/identity-secure-score.md):
Conditional Access policies are powerful tools; we recommend excluding the follo
- **Emergency access** or **break-glass** accounts to prevent tenant-wide account lockout. In the unlikely scenario all administrators are locked out of your tenant, your emergency-access administrative account can be used to log into the tenant to take steps to recover access. - More information can be found in the article, [Manage emergency access accounts in Azure AD](../roles/security-emergency-access.md).-- **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that are not tied to any particular user. They are normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals are not blocked by Conditional Access.
+- **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals aren't blocked by Conditional Access.
- If your organization has these accounts in use in scripts or code, consider replacing them with [managed identities](../managed-identities-azure-resources/overview.md). As a temporary workaround, you can exclude these specific accounts from the baseline policy. ## Template deployment
Organizations can choose to deploy this policy using the steps outlined below or
## Create a Conditional Access policy
-The following steps will help create a Conditional Access policy to require those assigned administrative roles to perform multi-factor authentication.
+The following steps will help create a Conditional Access policy to require those assigned administrative roles to perform multifactor authentication.
1. Sign in to the **Azure portal** as a global administrator, security administrator, or Conditional Access administrator. 1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **Directory roles** and choose built-in roles like: - Global administrator - Application administrator
The following steps will help create a Conditional Access policy to require thos
> Conditional Access policies support built-in roles. Conditional Access policies are not enforced for other role types including [administrative unit-scoped](../roles/admin-units-assign-roles.md) or [custom roles](../roles/custom-create.md). 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
-1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**, and select **Done**.
-1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and select **Select**.
+1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**.
+1. Under **Access controls** > **Grant**, select **Grant access**, **Require multifactor authentication**, and select **Select**.
1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy.
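If you automate policy creation, a rough sketch of the request body you might POST to the Microsoft Graph endpoint `https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies` follows. The display name is arbitrary, the excluded object ID is a placeholder for your break-glass account, and `62e90394-69f5-4237-9190-012177145e10` is the well-known Global Administrator role template ID; `enabledForReportingButNotEnforced` corresponds to **Report-only**. Verify the property names against the Graph `conditionalAccessPolicy` schema before relying on this.

```json
{
  "displayName": "Require MFA for administrators",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["all"],
    "users": {
      "includeRoles": ["62e90394-69f5-4237-9190-012177145e10"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": { "includeApplications": ["All"] }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["mfa"] }
}
```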
active-directory Howto Conditional Access Policy All Users Mfa https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-all-users-mfa.md
Title: Conditional Access - Require MFA for all users - Azure Active Directory
-description: Create a custom Conditional Access policy to require all users do multi-factor authentication
+description: Create a custom Conditional Access policy to require all users do multifactor authentication
Previously updated : 03/28/2022 Last updated : 08/22/2022
Organizations may have many cloud applications in use. Not all of those applicat
### Subscription activation
-Organizations that use the [Subscription Activation](/windows/deployment/windows-10-subscription-activation) feature to enable users to "step-up" from one version of Windows to another, may want to exclude the Universal Store Service APIs and Web Application, AppID 45a330b1-b1ec-4cc1-9161-9f03992aa49f from their all users all cloud apps MFA policy.
+Organizations that use [Subscription Activation](/windows/deployment/windows-10-subscription-activation) to enable users to "step-up" from one version of Windows to another, may want to exclude the Universal Store Service APIs and Web Application, AppID 45a330b1-b1ec-4cc1-9161-9f03992aa49f from their all users all cloud apps MFA policy.
## Template deployment
Organizations can choose to deploy this policy using the steps outlined below or
## Create a Conditional Access policy
-The following steps will help create a Conditional Access policy to require all users do multi-factor authentication.
+The following steps will help create a Conditional Access policy to require all users do multifactor authentication.
1. Sign in to the **Azure portal** as a global administrator, security administrator, or Conditional Access administrator. 1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users** 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**.
- 1. Under **Exclude**, select any applications that don't require multi-factor authentication.
-1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and select **Select**.
+ 1. Under **Exclude**, select any applications that don't require multifactor authentication.
+1. Under **Access controls** > **Grant**, select **Grant access**, **Require multifactor authentication**, and select **Select**.
1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy.
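As with the administrator policy above, a hedged Graph request-body sketch for this policy might look like the following, with `All` as the users and applications keyword and a placeholder break-glass exclusion:

```json
{
  "displayName": "Require MFA for all users",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["all"],
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": { "includeApplications": ["All"] }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["mfa"] }
}
```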
After confirming your settings using [report-only mode](howto-conditional-access
Organizations may choose to incorporate known network locations, called **Named locations**, in their Conditional Access policies. These named locations may include trusted IPv4 networks like those for a main office location. For more information about configuring named locations, see the article [What is the location condition in Azure Active Directory Conditional Access?](location-condition.md)
-In the example policy above, an organization may choose to not require multi-factor authentication if accessing a cloud app from their corporate network. In this case they could add the following configuration to the policy:
+In the example policy above, an organization may choose to not require multifactor authentication if accessing a cloud app from their corporate network. In this case they could add the following configuration to the policy:
1. Under **Assignments**, select **Conditions** > **Locations**. 1. Set **Configure** to **Yes**.
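In Graph terms, this trusted-location carve-out corresponds roughly to the following `locations` fragment, where `All` and `AllTrusted` are the built-in location keywords (a sketch; verify against the `conditionalAccessLocations` schema):

```json
"conditions": {
  "locations": {
    "includeLocations": ["All"],
    "excludeLocations": ["AllTrusted"]
  }
}
```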
active-directory Howto Conditional Access Policy Azure Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-azure-management.md
Title: Conditional Access - Require MFA for Azure management - Azure Active Directory
-description: Create a custom Conditional Access policy to require multi-factor authentication for Azure management tasks
+description: Create a custom Conditional Access policy to require multifactor authentication for Azure management tasks
Previously updated : 02/03/2022 Last updated : 08/22/2022
Organizations use many Azure services and manage them from Azure Resource Manage
* Azure PowerShell * Azure CLI
-These tools can provide highly privileged access to resources, that can alter subscription-wide configurations, service settings, and subscription billing. To protect these privileged resources, Microsoft recommends requiring multi-factor authentication for any user accessing these resources. In Azure AD, these tools are grouped together in a suite called [Microsoft Azure Management](concept-conditional-access-cloud-apps.md#microsoft-azure-management). For Azure Government, this suite should be the Azure Government Cloud Management API app.
+These tools can provide highly privileged access to resources that can make the following changes:
+
+- Alter subscription-wide configurations
+- Change service settings
+- Modify subscription billing
+
+To protect these privileged resources, Microsoft recommends requiring multifactor authentication for any user accessing these resources. In Azure AD, these tools are grouped together in a suite called [Microsoft Azure Management](concept-conditional-access-cloud-apps.md#microsoft-azure-management). For Azure Government, this suite should be the Azure Government Cloud Management API app.
## User exclusions
Conditional Access policies are powerful tools; we recommend excluding the follo
* **Emergency access** or **break-glass** accounts to prevent tenant-wide account lockout. In the unlikely scenario all administrators are locked out of your tenant, your emergency-access administrative account can be used to log into the tenant to take steps to recover access. * More information can be found in the article, [Manage emergency access accounts in Azure AD](../roles/security-emergency-access.md).
-* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals are not blocked by Conditional Access.
+* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals aren't blocked by Conditional Access.
* If your organization has these accounts in use in scripts or code, consider replacing them with [managed identities](../managed-identities-azure-resources/overview.md). As a temporary workaround, you can exclude these specific accounts from the baseline policy. ## Template deployment
Organizations can choose to deploy this policy using the steps outlined below or
## Create a Conditional Access policy
-The following steps will help create a Conditional Access policy to require users who access the [Microsoft Azure Management](concept-conditional-access-cloud-apps.md#microsoft-azure-management) suite do multi-factor authentication.
+The following steps will help create a Conditional Access policy to require users who access the [Microsoft Azure Management](concept-conditional-access-cloud-apps.md#microsoft-azure-management) suite do multifactor authentication.
> [!CAUTION] > Make sure you understand how Conditional Access works before setting up a policy to manage access to Microsoft Azure Management. Make sure you don't create conditions that could block your own access to the portal.
The following steps will help create a Conditional Access policy to require user
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
-1. Under **Cloud apps or actions** > **Include**, select **Select apps**, choose **Microsoft Azure Management**, and select **Select** then **Done**.
-1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and select **Select**.
+1. Under **Cloud apps or actions** > **Include**, select **Select apps**, choose **Microsoft Azure Management**, and select **Select**.
+1. Under **Access controls** > **Grant**, select **Grant access**, **Require multifactor authentication**, and select **Select**.
1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy.
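Scripted, the policy above might be expressed with a Graph request body along these lines; the GUID is the commonly documented app ID for Microsoft Azure Management, but verify it, the placeholder exclusion, and the property names in your own tenant before use:

```json
{
  "displayName": "Require MFA for Azure management",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["all"],
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": { "includeApplications": ["797f4846-ba00-4fd7-ba43-dac1f8f63013"] }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["mfa"] }
}
```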
active-directory Howto Conditional Access Policy Block Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-block-access.md
Title: Conditional Access - Block access - Azure Active Directory
-description: Create a custom Conditional Access policy to
+description: Create a custom Conditional Access policy to block access
Previously updated : 02/14/2022 Last updated : 08/22/2022
Conditional Access policies are powerful tools; we recommend excluding the follo
* **Emergency access** or **break-glass** accounts to prevent tenant-wide account lockout. In the unlikely scenario all administrators are locked out of your tenant, your emergency-access administrative account can be used to log into the tenant to take steps to recover access. * More information can be found in the article, [Manage emergency access accounts in Azure AD](../roles/security-emergency-access.md).
-* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that are not tied to any particular user. They are normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals are not blocked by Conditional Access.
+* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals aren't blocked by Conditional Access.
* If your organization has these accounts in use in scripts or code, consider replacing them with [managed identities](../managed-identities-azure-resources/overview.md). As a temporary workaround, you can exclude these specific accounts from the baseline policy. ## Create a Conditional Access policy
-The following steps will help create Conditional Access policies to block access to all apps except for [Office 365](concept-conditional-access-cloud-apps.md#office-365) if users are not on a trusted network. These policies are put in to [Report-only mode](howto-conditional-access-insights-reporting.md) to start so administrators can determine the impact they will have on existing users. When administrators are comfortable that the policies apply as they intend, they can switch them to **On**.
+The following steps will help create Conditional Access policies to block access to all apps except for [Office 365](concept-conditional-access-cloud-apps.md#office-365) if users aren't on a trusted network. These policies are put into [Report-only mode](howto-conditional-access-insights-reporting.md) to start so administrators can determine the impact they'll have on existing users. When administrators are comfortable that the policies apply as they intend, they can switch them to **On**.
The first policy blocks access to all apps except for Microsoft 365 applications if not on a trusted location.
The first policy blocks access to all apps except for Microsoft 365 applications
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions**, select the following options: 1. Under **Include**, select **All cloud apps**.
- 1. Under **Exclude**, select **Office 365**, select **Select**, then select **Done**.
+ 1. Under **Exclude**, select **Office 365**, then select **Select**.
1. Under **Conditions**: 1. Under **Conditions** > **Location**. 1. Set **Configure** to **Yes** 1. Under **Include**, select **Any location**. 1. Under **Exclude**, select **All trusted locations**.
- 1. Select **Done**.
- 1. Under **Client apps (Preview)**, set **Configure** to **Yes**, and select **Done**, then **Done**.
+ 1. Under **Client apps**, set **Configure** to **Yes**, and select **Done**.
1. Under **Access controls** > **Grant**, select **Block access**, then select **Select**. 1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy. After confirming your settings using [report-only mode](howto-conditional-access-insights-reporting.md), an administrator can move the **Enable policy** toggle from **Report-only** to **On**.
-A second policy is created below to require multi-factor authentication or a compliant device for users of Microsoft 365.
+A second policy is created below to require multifactor authentication or a compliant device for users of Microsoft 365.
1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
-1. Under **Cloud apps or actions** > **Include**, select **Select apps**, choose **Office 365**, and select **Select**, then **Done**.
+1. Under **Cloud apps or actions** > **Include**, select **Select apps**, choose **Office 365**, and select **Select**.
1. Under **Access controls** > **Grant**, select **Grant access**.
- 1. Select **Require multi-factor authentication** and **Require device to be marked as compliant** select **Select**.
+ 1. Select **Require multifactor authentication** and **Require device to be marked as compliant**.
1. Ensure **Require one of the selected controls** is selected. 1. Select **Select**. 1. Confirm your settings and set **Enable policy** to **Report-only**.
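A hedged Graph sketch of the first of these two policies (the blocking one) could look like this; `Office365` is the app-group keyword Graph accepts in application lists, and the remaining values are placeholders or report-only defaults:

```json
{
  "displayName": "Block access outside trusted locations except Office 365",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["all"],
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": {
      "includeApplications": ["All"],
      "excludeApplications": ["Office365"]
    },
    "locations": {
      "includeLocations": ["All"],
      "excludeLocations": ["AllTrusted"]
    }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["block"] }
}
```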
active-directory Howto Conditional Access Policy Block Legacy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-block-legacy.md
Previously updated : 11/05/2021 Last updated : 08/22/2022
Organizations can choose to deploy this policy using the steps outlined below or
## Create a Conditional Access policy
-The following steps will help create a Conditional Access policy to block legacy authentication requests. This policy is put in to [Report-only mode](howto-conditional-access-insights-reporting.md) to start so administrators can determine the impact they will have on existing users. When administrators are comfortable that the policy applies as they intend, they can switch to **On** or stage the deployment by adding specific groups and excluding others.
+The following steps will help create a Conditional Access policy to block legacy authentication requests. This policy is put into [Report-only mode](howto-conditional-access-insights-reporting.md) to start so administrators can determine the impact it will have on existing users. When administrators are comfortable that the policy applies as they intend, they can switch to **On** or stage the deployment by adding specific groups and excluding others.
1. Sign in to the **Azure portal** as a global administrator, security administrator, or Conditional Access administrator. 1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**.
- 1. Under **Exclude**, select **Users and groups** and choose any accounts that must maintain the ability to use legacy authentication. Exclude at least one account to prevent yourself from being locked out. If you do not exclude any account, you will not be able to create this policy.
- 1. Select **Done**.
+ 1. Under **Exclude**, select **Users and groups** and choose any accounts that must maintain the ability to use legacy authentication. Exclude at least one account to prevent yourself from being locked out. If you don't exclude any account, you won't be able to create this policy.
1. Under **Cloud apps or actions**, select **All cloud apps**.
- 1. Select **Done**.
1. Under **Conditions** > **Client apps**, set **Configure** to **Yes**. 1. Check only the boxes **Exchange ActiveSync clients** and **Other clients**. 1. Select **Done**.
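Expressed as a Graph request body, the policy above might look roughly like this sketch, where `exchangeActiveSync` and `other` are the client app types that represent legacy authentication (placeholders in angle brackets):

```json
{
  "displayName": "Block legacy authentication",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["exchangeActiveSync", "other"],
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<excluded-account-object-id>"]
    },
    "applications": { "includeApplications": ["All"] }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["block"] }
}
```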
active-directory Howto Conditional Access Policy Compliant Device https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-compliant-device.md
Previously updated : 03/28/2022 Last updated : 08/22/2022
The following steps will help create a Conditional Access policy to require devi
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**. 1. If you must exclude specific applications from your policy, you can choose them from the **Exclude** tab under **Select excluded cloud apps** and choose **Select**.
- 1. Select **Done**.
1. Under **Access controls** > **Grant**. 1. Select **Require device to be marked as compliant** and **Require Hybrid Azure AD joined device**. 1. Under **For multiple controls**, select **Require one of the selected controls**.
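The two checkboxes with **Require one of the selected controls** correspond, as a rough sketch, to this `grantControls` fragment in the Graph schema, where `domainJoinedDevice` is the control behind **Require Hybrid Azure AD joined device**:

```json
"grantControls": {
  "operator": "OR",
  "builtInControls": ["compliantDevice", "domainJoinedDevice"]
}
```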
active-directory Howto Conditional Access Policy Location https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-location.md
Previously updated : 11/05/2021 Last updated : 08/22/2022
# Conditional Access: Block access by location
-With the location condition in Conditional Access, you can control access to your cloud apps based on the network location of a user. The location condition is commonly used to block access from countries/regions where your organization knows traffic should not come from.
+With the location condition in Conditional Access, you can control access to your cloud apps based on the network location of a user. The location condition is commonly used to block access from countries/regions where your organization knows traffic shouldn't come from.
> [!NOTE] > Conditional Access policies are enforced after first-factor authentication is completed. Conditional Access isn't intended to be an organization's first line of defense for scenarios like denial-of-service (DoS) attacks, but it can use signals from these events to determine access.
With the location condition in Conditional Access, you can control access to you
1. Choose **New location**. 1. Give your location a name. 1. Choose **IP ranges** if you know the specific externally accessible IPv4 address ranges that make up that location or **Countries/Regions**.
- 1. Provide the **IP ranges** or select the **Countries/Regions** for the location you are specifying.
+ 1. Provide the **IP ranges** or select the **Countries/Regions** for the location you're specifying.
* If you choose Countries/Regions, you can optionally choose to include unknown areas. 1. Choose **Save**
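If you create named locations programmatically, a tentative sketch of a body you might POST to `https://graph.microsoft.com/v1.0/identity/conditionalAccess/namedLocations` follows; the display name and country code are placeholders, and an IP-based location would instead use the `ipNamedLocation` type with an `ipRanges` collection:

```json
{
  "@odata.type": "#microsoft.graph.countryNamedLocation",
  "displayName": "Blocked countries and regions",
  "countriesAndRegions": ["<ISO-3166-country-code>"],
  "includeUnknownCountriesAndRegions": true
}
```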
More information about the location condition in Conditional Access can be found
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions** > **Include**, and select **All cloud apps**. 1. Under **Conditions** > **Location**. 1. Set **Configure** to **Yes**
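A policy that blocks the named location could then reference its ID in a `locations` condition, roughly like this fragment (the ID is a placeholder):

```json
"conditions": {
  "locations": { "includeLocations": ["<named-location-id>"] }
},
"grantControls": { "operator": "OR", "builtInControls": ["block"] }
```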
active-directory Howto Conditional Access Policy Registration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-registration.md
Previously updated : 11/15/2021 Last updated : 08/22/2022
# Conditional Access: Securing security info registration
-Securing when and how users register for Azure AD Multi-Factor Authentication and self-service password reset is possible with user actions in a Conditional Access policy. This feature is available to organizations who have enabled the [combined registration](../authentication/concept-registration-mfa-sspr-combined.md). This functionality allows organizations to treat the registration process like any application in a Conditional Access policy and use the full power of Conditional Access to secure the experience. Users signing in to the Microsoft Authenticator app or enabling passwordless phone sign-in are subject to this policy.
+Securing when and how users register for Azure AD Multifactor Authentication and self-service password reset is possible with user actions in a Conditional Access policy. This feature is available to organizations who have enabled [combined registration](../authentication/concept-registration-mfa-sspr-combined.md). This functionality allows organizations to treat the registration process like any application in a Conditional Access policy and use the full power of Conditional Access to secure the experience. Users signing in to the Microsoft Authenticator app or enabling passwordless phone sign-in are subject to this policy.
-Some organizations in the past may have used trusted network location or device compliance as a means to secure the registration experience. With the addition of [Temporary Access Pass](../authentication/howto-authentication-temporary-access-pass.md) in Azure AD, administrators can provide time-limited credentials to their users that allow them to register from any device or location. Temporary Access Pass credentials satisfy Conditional Access requirements for multi-factor authentication.
+Some organizations in the past may have used trusted network location or device compliance as a means to secure the registration experience. With the addition of [Temporary Access Pass](../authentication/howto-authentication-temporary-access-pass.md) in Azure AD, administrators can provide time-limited credentials to their users that allow them to register from any device or location. Temporary Access Pass credentials satisfy Conditional Access requirements for multifactor authentication.
## Template deployment
Organizations can choose to deploy this policy using the steps outlined below or
## Create a policy to secure registration
-The following policy applies to the selected users, who attempt to register using the combined registration experience. The policy requires users to be in a trusted network location, do multi-factor authentication or use Temporary Access Pass credentials.
+The following policy applies to the selected users, who attempt to register using the combined registration experience. The policy requires users to be in a trusted network location, do multifactor authentication or use Temporary Access Pass credentials.
1. In the **Azure portal**, browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. In **Name**, enter a name for this policy. For example, **Combined Security Info Registration with TAP**.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. > [!WARNING]
The following policy applies to the selected users, who attempt to register usin
1. Include **Any location**. 1. Exclude **All trusted locations**. 1. Under **Access controls** > **Grant**.
- 1. Select **Grant access**, **Require multi-factor authentication**.
+ 1. Select **Grant access**, **Require multifactor authentication**.
1. Select **Select**. 1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to create and enable your policy. After confirming your settings using [report-only mode](howto-conditional-access-insights-reporting.md), an administrator can move the **Enable policy** toggle from **Report-only** to **On**.
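For automation, a rough Graph request-body sketch of this policy follows; `urn:user:registersecurityinfo` is the user action for registering security information, and the other values mirror the steps above (report-only state, trusted locations excluded):

```json
{
  "displayName": "Combined Security Info Registration with TAP",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "clientAppTypes": ["all"],
    "users": { "includeUsers": ["All"] },
    "applications": { "includeUserActions": ["urn:user:registersecurityinfo"] },
    "locations": {
      "includeLocations": ["All"],
      "excludeLocations": ["AllTrusted"]
    }
  },
  "grantControls": { "operator": "OR", "builtInControls": ["mfa"] }
}
```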
-Administrators will now have to issue Temporary Access Pass credentials to new users so they can satisfy the requirements for multi-factor authentication to register. Steps to accomplish this task, are found in the section [Create a Temporary Access Pass in the Azure AD Portal](../authentication/howto-authentication-temporary-access-pass.md#create-a-temporary-access-pass).
+Administrators will now have to issue Temporary Access Pass credentials to new users so they can satisfy the requirements for multifactor authentication to register. Steps to accomplish this task are found in the section [Create a Temporary Access Pass in the Azure AD Portal](../authentication/howto-authentication-temporary-access-pass.md#create-a-temporary-access-pass).
-Organizations may choose to require other grant controls with or in place of **Require multi-factor authentication** at step 6b. When selecting multiple controls, be sure to select the appropriate radio button toggle to require **all** or **one** of the selected controls when making this change.
+Organizations may choose to require other grant controls with or in place of **Require multifactor authentication** at step 6b. When selecting multiple controls, be sure to select the appropriate radio button toggle to require **all** or **one** of the selected controls when making this change.
### Guest user registration
-For [guest users](../external-identities/what-is-b2b.md) who need to register for multi-factor authentication in your directory you may choose to block registration from outside of [trusted network locations](concept-conditional-access-conditions.md#locations) using the following guide.
+For [guest users](../external-identities/what-is-b2b.md) who need to register for multifactor authentication in your directory, you may choose to block registration from outside of [trusted network locations](concept-conditional-access-conditions.md#locations) using the following guide.
1. In the **Azure portal**, browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. In **Name**, enter a name for this policy. For example, **Combined Security Info Registration on Trusted Networks**.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All guest and external users**. 1. Under **Cloud apps or actions**, select **User actions**, check **Register security information**. 1. Under **Conditions** > **Locations**.
active-directory Howto Conditional Access Policy Risk User https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-risk-user.md
Previously updated : 08/16/2022 Last updated : 08/22/2022
Organizations can choose to deploy this policy using the steps outlined below or
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**. 1. Under **Conditions** > **User risk**, set **Configure** to **Yes**. 1. Under **Configure user risk levels needed for policy to be enforced**, select **High**.
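The equivalent policy can also be sketched in Microsoft Graph PowerShell. This is illustrative only: the user risk condition maps to `userRiskLevels`, and the usual remediation pairs MFA with a password change under the `AND` operator.

```powershell
# Sketch: user risk-based policy. High-risk users must satisfy MFA and change
# their password. Assumes an existing Connect-MgGraph session with
# Policy.ReadWrite.ConditionalAccess; exclude break-glass accounts via
# excludeUsers in production.
$policy = @{
    displayName = "User risk remediation"                 # example name
    state       = "enabledForReportingButNotEnforced"
    conditions  = @{
        users          = @{ includeUsers = @("All") }
        applications   = @{ includeApplications = @("All") }
        userRiskLevels = @("high")
    }
    grantControls = @{
        operator        = "AND"
        builtInControls = @("mfa", "passwordChange")
    }
}

New-MgIdentityConditionalAccessPolicy -BodyParameter $policy
```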
active-directory Howto Conditional Access Policy Risk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-risk.md
Previously updated : 08/16/2022 Last updated : 08/22/2022
# Conditional Access: Sign-in risk-based Conditional Access
-Most users have a normal behavior that can be tracked, when they fall outside of this norm it could be risky to allow them to just sign in. You may want to block that user or maybe just ask them to perform multi-factor authentication to prove that they are really who they say they are.
+Most users have a normal behavior that can be tracked. When they fall outside of this norm, it could be risky to allow them to just sign in. You may want to block that user, or maybe just ask them to perform multifactor authentication to prove that they're really who they say they are.
A sign-in risk represents the probability that a given authentication request isn't authorized by the identity owner. Organizations with Azure AD Premium P2 licenses can create Conditional Access policies incorporating [Azure AD Identity Protection sign-in risk detections](../identity-protection/concept-identity-protection-risks.md#sign-in-risk). There are two locations where this policy may be configured, Conditional Access and Identity Protection. Configuration using a Conditional Access policy is the preferred method providing more context including enhanced diagnostic data, report-only mode integration, Graph API support, and the ability to utilize other Conditional Access attributes in the policy.
-The Sign-in risk-based policy protects users from registering MFA in risky sessions. For example. If the users are not registered for MFA, their risky sign-ins will get blocked and presented with the AADSTS53004 error.
+The sign-in risk-based policy protects users from registering for MFA in risky sessions. If users aren't registered for MFA, their risky sign-ins are blocked, and they see an AADSTS53004 error.
## Template deployment
Organizations can choose to deploy this policy using the steps outlined below or
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts.
- 1. Select **Done**.
1. Under **Cloud apps or actions** > **Include**, select **All cloud apps**. 1. Under **Conditions** > **Sign-in risk**, set **Configure** to **Yes**. Under **Select the sign-in risk level this policy will apply to**. 1. Select **High** and **Medium**. 1. Select **Done**. 1. Under **Access controls** > **Grant**.
- 1. Select **Grant access**, **Require multi-factor authentication**.
+ 1. Select **Grant access**, **Require multifactor authentication**.
1. Select **Select**. 1. Confirm your settings and set **Enable policy** to **Report-only**. 1. Select **Create** to enable your policy.
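For reference, the sign-in risk condition and MFA grant described in these steps translate to the following hedged Graph PowerShell sketch; property names follow the Graph `conditionalAccessPolicy` schema, and the display name is an example:

```powershell
# Sketch: sign-in risk-based policy. High- and medium-risk sign-ins must
# complete MFA. Assumes an existing Connect-MgGraph session with
# Policy.ReadWrite.ConditionalAccess.
$policy = @{
    displayName = "Sign-in risk remediation"              # example name
    state       = "enabledForReportingButNotEnforced"
    conditions  = @{
        users            = @{ includeUsers = @("All") }
        applications     = @{ includeApplications = @("All") }
        signInRiskLevels = @("high", "medium")
    }
    grantControls = @{
        operator        = "OR"
        builtInControls = @("mfa")
    }
}

New-MgIdentityConditionalAccessPolicy -BodyParameter $policy
```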
active-directory Howto Conditional Access Session Lifetime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-session-lifetime.md
Previously updated : 07/06/2022 Last updated : 08/22/2022
On Azure AD registered Windows devices, sign in to the device is considered a pr
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's [emergency access or break-glass accounts](../roles/security-emergency-access.md). 1. Select **Done**.
After administrators confirm your settings using [report-only mode](howto-condit
### Validation
-Use the What-If tool to simulate a sign in from the user to the target application and other conditions based on how you configured your policy. The authentication session management controls show up in the result of the tool.
+Use the What-If tool to simulate a sign-in from the user to the target application and other conditions based on how you configured your policy. The authentication session management controls show up in the result of the tool.
![Conditional Access What If tool results](media/howto-conditional-access-session-lifetime/conditional-access-what-if-tool-result.png)
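Session management settings can also be expressed through the Graph API. As a rough sketch (assuming the Microsoft Graph PowerShell SDK with an existing `Connect-MgGraph` session), a one-hour sign-in frequency for all users looks like the following; the value and display name are example choices:

```powershell
# Sketch: policy whose session controls enforce a one-hour sign-in frequency.
$policy = @{
    displayName = "Sign-in frequency - 1 hour"            # example name
    state       = "enabledForReportingButNotEnforced"
    conditions  = @{
        users        = @{ includeUsers = @("All") }
        applications = @{ includeApplications = @("All") }
    }
    sessionControls = @{
        signInFrequency = @{ value = 1; type = "hours"; isEnabled = $true }
    }
}

New-MgIdentityConditionalAccessPolicy -BodyParameter $policy
```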
active-directory Howto Policy Approved App Or App Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-policy-approved-app-or-app-protection.md
Previously updated : 11/08/2021 Last updated : 08/22/2022
Organizations can choose to deploy this policy using the steps outlined below or
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and exclude at least one account to prevent yourself from being locked out. If you don't exclude any accounts, you can't create the policy.
- 1. Select **Done**.
1. Under **Cloud apps or actions**, select **All cloud apps**. 1. Under **Conditions** > **Device platforms**, set **Configure** to **Yes**. 1. Under **Include**, **Select device platforms**.
This policy will block all Exchange ActiveSync clients using basic authenticatio
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and exclude at least one account to prevent yourself from being locked out. If you don't exclude any accounts, you can't create the policy. 1. Select **Done**.
active-directory Policy Migration Mfa https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/policy-migration-mfa.md
Title: Migrate Conditional Access policies with multi-factor authentication - Azure Active Directory
-description: This article shows how to migrate a classic policy that requires multi-factor authentication in the Azure portal.
+ Title: Migrate a classic Conditional Access policy - Azure Active Directory
+description: This article shows how to migrate a classic Conditional Access policy in the Azure portal.
Previously updated : 05/26/2020 Last updated : 08/22/2022
# Migrate a classic policy in the Azure portal
-This article shows how to migrate a classic policy that requires **multi-factor authentication** for a cloud app. Although it is not a prerequisite, we recommend that you read [Migrate classic policies in the Azure portal](policy-migration.md) before you start migrating your classic policies.
+This article shows how to migrate a classic policy that requires **multifactor authentication** for a cloud app. Although it isn't a prerequisite, we recommend that you read [Migrate classic policies in the Azure portal](policy-migration.md) before you start migrating your classic policies.
![Classic policy details requiring MFA for Salesforce app](./media/policy-migration/33.png)
The migration process consists of the following steps:
1. In the list of classic policies, select the policy you wish to migrate. Document the configuration settings so that you can re-create with a new Conditional Access policy.
-## Create a new Conditional Access policy
-
-1. In the [Azure portal](https://portal.azure.com), navigate to **Azure Active Directory** > **Security** > **Conditional Access**.
-1. To create a new Conditional Access policy, select **New policy**.
-1. On the **New** page, in the **Name** textbox, type a name for your policy.
-1. In the **Assignments** section, click **Users and groups**.
- 1. If you have all users selected in your classic policy, click **All users**.
- 1. If you have groups selected in your classic policy, click **Select users and groups**, and then select the required users and groups.
- 1. If you have the excluded groups, click the **Exclude** tab, and then select the required users and groups.
- 1. Select **Done**
-1. In the **Assignment** section, click **Cloud apps or actions**.
-1. On the **Cloud apps or actions** page, perform the following steps:
- 1. Click **Select apps**.
- 1. Click **Select**.
- 1. On the **Select** page, select your cloud app, and then click **Select**.
- 1. On the **Cloud apps** page, click **Done**.
-1. If you have **Require multi-factor authentication** selected:
- 1. In the **Access controls** section, click **Grant**.
- 1. On the **Grant** page, click **Grant access**, and then click **Require multi-factor authentication**.
- 1. Click **Select**.
-1. Click **On** to enable your policy then select **Save**.
-
- ![Conditional Access policy creation](./media/policy-migration-mfa/conditional-access-policy-migration.png)
+For examples of common policies and their configuration in the Azure portal, see the article [Common Conditional Access policies](concept-conditional-access-policy-common.md).
## Disable the classic policy
-To disable your classic policy, click **Disable** in the **Details** view.
+To disable your classic policy, select **Disable** in the **Details** view.
![Disable classic policies](./media/policy-migration-mfa/14.png)
active-directory Policy Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/policy-migration.md
Previously updated : 12/04/2019 Last updated : 08/22/2022
Conditional Access is the tool used by Azure Active Directory to bring signals together, to make decisions, and enforce organizational policies. Conditional Access is at the heart of the new identity driven control plane. While the purpose is still the same, the release of the new Azure portal has introduced significant improvements to how Conditional Access works.
-Consider migrating the policies you have not created in the Azure portal because:
+Consider migrating the policies you haven't created in the Azure portal because:
-- You can now address scenarios you could not handle before.
+- You can now address scenarios you couldn't handle before.
- You can reduce the number of policies you have to manage by consolidating them. - You can manage all your Conditional Access policies in one central location. - The Azure classic portal will be retired.
This article explains what you need to know to migrate your existing Conditional
## Classic policies
-In the [Azure portal](https://portal.azure.com), Conditional Access policies can be found under **Azure Active Directory** > **Security** > **Conditional Access**. Your organization might also have older Conditional Access policies not created using this page. These policies are known as *classic policies*. Classic policies are Conditional Access policies, you have created in:
+In the [Azure portal](https://portal.azure.com), Conditional Access policies can be found under **Azure Active Directory** > **Security** > **Conditional Access**. Your organization might also have older Conditional Access policies not created using this page. These policies are known as *classic policies*. Classic policies are Conditional Access policies that you've created in:
- The Azure classic portal - The Intune classic portal
This is, for example, the case if you want to support all client app types. In a
![Conditional Access selecting client apps](./media/policy-migration/64.png)
-A consolidation into one new policy is also not possible if your classic policies contain several conditions. A new policy that has **Exchange Active Sync** as client apps condition configured does not support other conditions:
+A consolidation into one new policy is also not possible if your classic policies contain several conditions. A new policy that has **Exchange Active Sync** as client apps condition configured doesn't support other conditions:
![Exchange ActiveSync does not support the selected conditions](./media/policy-migration/08.png)
-If you have a new policy that has **Exchange Active Sync** as client apps condition configured, you need to make sure that all other conditions are not configured.
+If you have a new policy that has **Exchange Active Sync** as client apps condition configured, you need to make sure that no other conditions are configured.
![Conditional Access conditions](./media/policy-migration/16.png)
App-based classic policies for Exchange Online that include **Exchange Active Sy
You can consolidate multiple classic policies that include **Exchange Active Sync** as client apps condition if they have: - Only **Exchange Active Sync** as condition -- Several requirements for granting access configured
+- Several requirements for granting access configured
One common scenario is the consolidation of:
In a new policy, you need to select the [device platforms](concept-conditional-a
- [Use report-only mode for Conditional Access to determine the impact of new policy decisions.](concept-conditional-access-report-only.md) - If you want to know how to configure a Conditional Access policy, see [Conditional Access common policies](concept-conditional-access-policy-common.md).-- If you are ready to configure Conditional Access policies for your environment, see the article [How To: Plan your Conditional Access deployment in Azure Active Directory](plan-conditional-access.md).
+- If you're ready to configure Conditional Access policies for your environment, see the article [How To: Plan your Conditional Access deployment in Azure Active Directory](plan-conditional-access.md).
active-directory Troubleshoot Policy Changes Audit Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/troubleshoot-policy-changes-audit-log.md
Title: Troubleshooting Conditional Access policy changes - Azure Active Directory
+ Title: Troubleshoot Conditional Access policy changes - Azure Active Directory
description: Diagnose changes to Conditional Access policy with the Azure AD audit logs. Previously updated : 08/09/2021 Last updated : 08/22/2022
Audit log data is only kept for 30 days by default, which may not be long enough
- Send data to a Log Analytics workspace - Archive data to a storage account-- Stream data to an Event Hub
+- Stream data to Event Hubs
- Send data to a partner solution Find these options in the **Azure portal** > **Azure Active Directory**, **Diagnostic settings** > **Edit setting**. If you don't have a diagnostic setting, follow the instructions in the article [Create diagnostic settings to send platform logs and metrics to different destinations](../../azure-monitor/essentials/diagnostic-settings.md) to create one.
active-directory 7 Secure Access Conditional Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/7-secure-access-conditional-access.md
To create a policy that blocks access for external users to a set of application
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies, for example ExternalAccess_Block_FinanceApps.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All guests and external users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's [emergency access or break-glass accounts](../roles/security-emergency-access.md). 1. Select **Done**.
There may be times you want to block external users except a specific group. For
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies, for example ExternalAccess_Block_AllButFinance.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All guests and external users**. 1. Under **Exclude**, select **Users and groups**, 1. Choose your organization's [emergency access or break-glass accounts](../roles/security-emergency-access.md).
active-directory Concept Fundamentals Block Legacy Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/concept-fundamentals-block-legacy-authentication.md
- Title: Blocking legacy authentication protocols in Azure AD
-description: Learn how and why organizations should block legacy authentication protocols
----- Previously updated : 01/26/2021---------
-# Blocking legacy authentication
-
-To give your users easy access to your cloud apps, Azure Active Directory (Azure AD) supports a broad variety of authentication protocols including legacy authentication. Legacy authentication is a term that refers to an authentication request made by:
--- Older Office clients that do not use modern authentication (for example, Office 2010 client)-- Any client that uses legacy mail protocols such as IMAP/SMTP/POP3-
-Today, the majority of all compromising sign-in attempts come from legacy authentication. Legacy authentication does not support multi-factor authentication (MFA). Even if you have an MFA policy enabled on your directory, a bad actor can authenticate using a legacy protocol and bypass MFA. The best way to protect your account from malicious authentication requests made by legacy protocols is to block these attempts altogether.
-
-## Identify legacy authentication use
-
-Before you can block legacy authentication in your directory, you need to first understand if your users have apps that use legacy authentication and how it affects your overall directory. Azure AD sign-in logs can be used to understand if you're using legacy authentication.
-
-1. Navigate to the **Azure portal** > **Azure Active Directory** > **Sign-in logs**.
-1. Add the **Client App** column if it is not shown by clicking on **Columns** > **Client App**.
-1. Filter by **Client App** > check all the **Legacy Authentication Clients** options presented.
-1. Filter by **Status** > **Success**.
-1. Expand your date range if necessary using the **Date** filter.
-1. If you have activated the [new sign-in activity reports preview](../reports-monitoring/concept-all-sign-ins.md), repeat the above steps also on the **User sign-ins (non-interactive)** tab.
-
-Filtering will only show you successful sign-in attempts that were made by the selected legacy authentication protocols. Clicking on each individual sign-in attempt will show you additional details. The Client App column or the Client App field under the Basic Info tab after selecting an individual row of data will indicate which legacy authentication protocol was used.
-These logs will indicate which users are still depending on legacy authentication and which applications are using legacy protocols to make authentication requests. For users that do not appear in these logs and are confirmed to not be using legacy authentication, implement a Conditional Access policy or enable the Baseline policy: block legacy authentication for these users only.
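If you'd rather query the logs than filter in the portal, a hedged Microsoft Graph PowerShell sketch follows. The list of legacy client names is an illustrative subset of the possible `clientAppUsed` values, not an exhaustive one.

```powershell
# Sketch: list successful sign-ins made with legacy protocols.
# Assumes the Microsoft Graph PowerShell SDK and the AuditLog.Read.All scope.
Connect-MgGraph -Scopes "AuditLog.Read.All"

$legacyClients = @(
    "Exchange ActiveSync", "IMAP4", "POP3", "SMTP",
    "Authenticated SMTP", "MAPI Over HTTP", "Other clients"
)

# -All pages through every record; for large tenants, prefer a server-side
# -Filter on clientAppUsed and createdDateTime instead.
Get-MgAuditLogSignIn -All |
    Where-Object { $_.ClientAppUsed -in $legacyClients -and $_.Status.ErrorCode -eq 0 } |
    Select-Object UserPrincipalName, AppDisplayName, ClientAppUsed, CreatedDateTime
```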
-
-## Moving away from legacy authentication
-
-Once you have a better idea of who is using legacy authentication in your directory and which applications depend on it, the next step is upgrading your users to use modern authentication. Modern authentication is a method of identity management that offers more secure user authentication and authorization. If you have an MFA policy in place on your directory, modern authentication ensures that the user is prompted for MFA when required. It is the more secure alternative to legacy authentication protocols.
-
-This section gives a step-by-step overview on how to update your environment to modern authentication. Read through the steps below before enabling a legacy authentication blocking policy in your organization.
-
-### Step 1: Enable modern authentication in your directory
-
-The first step in enabling modern authentication is making sure your directory supports modern authentication. Modern authentication is enabled by default for directories created on or after August 1, 2017. If your directory was created prior to this date, you'll need to manually enable modern authentication for your directory using the following steps:
-
-1. Check to see if your directory already supports modern authentication by running `Get-CsOAuthConfiguration` from the [Skype for Business Online PowerShell module](/office365/enterprise/powershell/manage-skype-for-business-online-with-office-365-powershell).
-1. If your command returns an empty `OAuthServers` property, then Modern Authentication is disabled. Update the setting to enable modern authentication using `Set-CsOAuthConfiguration`. If your `OAuthServers` property contains an entry, you're good to go.
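Condensed into commands, the check and the fix look like the following sketch. It runs in the Skype for Business Online PowerShell module; `-ClientAdalAuthOverride` is the documented switch to the best of my knowledge, but verify it against your module version before running.

```powershell
# Sketch: check whether modern authentication is on, and enable it if not.
$config = Get-CsOAuthConfiguration

if (-not $config.OAuthServers) {
    # An empty OAuthServers property means modern authentication is disabled.
    Set-CsOAuthConfiguration -ClientAdalAuthOverride Allowed
}
```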
-
-Be sure to complete this step before moving forward. It's critical that your directory configurations are changed first because they dictate which protocol will be used by all Office clients. Even if you're using Office clients that support modern authentication, they will default to using legacy protocols if modern authentication is disabled on your directory.
-
-### Step 2: Office applications
-
-Once you have enabled modern authentication in your directory, you can start updating applications by enabling modern authentication for Office clients. Office 2016 or later clients support modern authentication by default. No extra steps are required.
-
-If you are using Office 2013 Windows clients or older, we recommend upgrading to Office 2016 or later. Even after completing the prior step of enabling modern authentication in your directory, the older Office applications will continue to use legacy authentication protocols. If you are using Office 2013 clients and are unable to immediately upgrade to Office 2016 or later, follow the steps in the following article to [Enable Modern Authentication for Office 2013 on Windows devices](/office365/admin/security-and-compliance/enable-modern-authentication). To help protect your account while you're using legacy authentication, we recommend using strong passwords across your directory. Check out [Azure AD password protection](../authentication/concept-password-ban-bad.md) to ban weak passwords across your directory.
-
-Office 2010 does not support modern authentication. You will need to upgrade any users with Office 2010 to a more recent version of Office. We recommend upgrading to Office 2016 or later, as it blocks legacy authentication by default.
-
-If you are using macOS, we recommend upgrading to Office for Mac 2016 or later. If you are using the native mail client, you will need to have macOS version 10.14 or later on all devices.
-
-### Step 3: Exchange and SharePoint
-
-For Windows-based Outlook clients to use modern authentication, Exchange Online must be modern authentication enabled as well. If modern authentication is disabled for Exchange Online, Windows-based Outlook clients that support modern authentication (Outlook 2013 or later) will use basic authentication to connect to Exchange Online mailboxes.
-
-SharePoint Online is enabled for modern authentication by default. For directories created after August 1, 2017, modern authentication is enabled by default in Exchange Online. However, if you had previously disabled modern authentication or are using a directory created prior to this date, follow the steps in the following article to [Enable modern authentication in Exchange Online](/exchange/clients-and-mobile-in-exchange-online/enable-or-disable-modern-authentication-in-exchange-online).
-
-### Step 4: Skype for Business
-
-To prevent legacy authentication requests made by Skype for Business, it is necessary to enable modern authentication for Skype for Business Online. For directories created after August 1, 2017, modern authentication for Skype for Business is enabled by default.
-
-We suggest you transition to Microsoft Teams, which supports modern authentication by default. However, if you are unable to migrate at this time, you will need to enable modern authentication for Skype for Business Online so that Skype for Business clients start using modern authentication. Follow the steps in the article [Skype for Business topologies supported with Modern Authentication](/skypeforbusiness/plan-your-deployment/modern-authentication/topologies-supported) to enable Modern Authentication for Skype for Business.
-
-In addition to enabling modern authentication for Skype for Business Online, we recommend enabling modern authentication for Exchange Online when enabling modern authentication for Skype for Business. This process will help synchronize the state of modern authentication in Exchange Online and Skype for Business online and will prevent multiple sign-in prompts for Skype for Business clients.
-
-### Step 5: Using mobile devices
-
-Applications on your mobile device need to block legacy authentication as well. We recommend using Outlook for Mobile. Outlook for Mobile supports modern authentication by default and will satisfy other MFA baseline protection policies.
-
-In order to use the native iOS mail client, you will need to be running iOS version 11.0 or later to ensure the mail client has been updated to block legacy authentication.
-
-### Step 6: On-premises clients
-
-If you are a hybrid customer using Exchange Server on-premises and Skype for Business on-premises, both services will need to be updated to enable modern authentication. When using modern authentication in a hybrid environment, you're still authenticating users on-premises. The story of authorizing their access to resources (files or emails) changes.
-
-Before you can begin enabling modern authentication on-premises, please be sure that you have met the pre-requisites. You're now ready to enable modern authentication on-premises.
-
-Steps for enabling modern authentication can be found in the following articles:
-
-* [How to configure Exchange Server on-premises to use Hybrid Modern Authentication](/office365/enterprise/configure-exchange-server-for-hybrid-modern-authentication)
-* [How to use Modern Authentication with Skype for Business](/skypeforbusiness/manage/authentication/use-adal)
-
-## Next steps
--- [How to configure Exchange Server on-premises to use Hybrid Modern Authentication](/office365/enterprise/configure-exchange-server-for-hybrid-modern-authentication)-- [How to use Modern Authentication with Skype for Business](/skypeforbusiness/manage/authentication/use-adal)-- [Block legacy authentication](../conditional-access/block-legacy-authentication.md)
active-directory Concept Fundamentals Security Defaults https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/concept-fundamentals-security-defaults.md
Today, most compromising sign-in attempts come from legacy authentication. Legac
After security defaults are enabled in your tenant, all authentication requests made by an older protocol will be blocked. Security defaults also block Exchange ActiveSync basic authentication. > [!WARNING]
-> Before you enable security defaults, make sure your administrators aren't using older authentication protocols. For more information, see [How to move away from legacy authentication](concept-fundamentals-block-legacy-authentication.md).
+> Before you enable security defaults, make sure your administrators aren't using older authentication protocols. For more information, see [How to move away from legacy authentication](../conditional-access/block-legacy-authentication.md).
- [How to set up a multifunction device or application to send email using Microsoft 365](/exchange/mail-flow-best-practices/how-to-set-up-a-multifunction-device-or-application-to-send-email-using-microsoft-365-or-office-365)
active-directory Perform Access Review https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/perform-access-review.md
# Review access to groups and applications in Azure AD access reviews
-Azure Active Directory (Azure AD) simplifies how enterprises manage access to groups and applications in Azure AD and other Microsoft Online Services with a feature called Azure AD access reviews. This article will go over how a designated reviewer performs an access review for members of a group or users with access to an application. If you would like to review access to an access package read [Review access of an access package in Azure AD entitlement management](entitlement-management-access-reviews-review-access.md)
+Azure Active Directory (Azure AD) simplifies how enterprises manage access to groups and applications in Azure AD and other Microsoft web services with a feature called Azure AD access reviews. This article will cover how a designated reviewer performs an access review for members of a group or users with access to an application. If you want to review access to an access package, read [Review access of an access package in Azure AD entitlement management](entitlement-management-access-reviews-review-access.md).
-## Perform access review using My Access
-You can review access to groups and applications via My Access, an end-user friendly portal for granting, approving, and reviewing access needs.
+## Perform access review by using My Access
+You can review access to groups and applications via My Access. My Access is a user-friendly portal for granting, approving, and reviewing access needs.
-### Use email to navigate to My Access
+### Use email to go to My Access
>[!IMPORTANT]
-> There could be delays in receiving email and it some cases it could take up to 24 hours. Add azure-noreply@microsoft.com to your safe recipients list to make sure that you are receiving all emails.
+> There could be delays in receiving email. In some cases, it could take up to 24 hours. Add azure-noreply@microsoft.com to your safe recipients list to make sure that you're receiving all emails.
-1. Look for an email from Microsoft asking you to review access. You can see an example email message below:
+1. Look for an email from Microsoft asking you to review access. Here's an example email message:
- ![Example email from Microsoft to review access to a group](./media/perform-access-review/access-review-email-preview.png)
+ ![Screenshot of example email from Microsoft to review access to a group.](./media/perform-access-review/access-review-email-preview.png)
-1. Click the **Start review** link to open the access review.git pu
+1. Select the **Start review** link to open the access review.
-### Navigate directly to My Access
+### Go directly to My Access
You can also view your pending access reviews by using your browser to open My Access.
-1. Sign in to the My Access at https://myaccess.microsoft.com/
+1. Sign in to My Access at https://myaccess.microsoft.com/.
-2. Select **Access reviews** from the menu on the left side bar to see a list of pending access reviews assigned to you.
+2. Select **Access reviews** from the left menu to see a list of pending access reviews assigned to you.
## Review access for one or more users
-After you open My Access under Groups and Apps you can see:
+After you open My Access under **Groups and Apps**, you can see:
-- **Name** The name of the access review.-- **Due** The due date for the review. After this date denied users could be removed from the group or app being reviewed.-- **Resource** The name of the resource under review.-- **Progress** The number of users reviewed over the total number of users part of this access review.
+- **Name**: The name of the access review.
+- **Due**: The due date for the review. After this date, denied users could be removed from the group or app being reviewed.
+- **Resource**: The name of the resource under review.
+- **Progress**: The number of users reviewed over the total number of users part of this access review.
-Click on the name of an access review to get started.
+Select the name of an access review to get started.
-![Pending access reviews list for apps and groups](./media/perform-access-review/access-reviews-list-preview.png)
+![Screenshot of pending access reviews list for apps and groups.](./media/perform-access-review/access-reviews-list-preview.png)
-Once that it opens, you will see the list of users in scope for the access review.
+After it opens, you'll see the list of users in scope for the access review.
-> [!NOTE]
+> [!NOTE]
> If the request is to review your own access, the page will look different. For more information, see [Review access for yourself to groups or applications](review-your-access.md). There are two ways that you can approve or deny access:
There are two ways that you can approve or deny access:
1. Review the list of users and decide whether to approve or deny their continued access.
-1. Select one or more users by clicking the circle next to their names.
+1. Select one or more users by selecting the circle next to their names.
+
+1. Select **Approve** or **Deny** on the bar.
+
+ If you're unsure if a user should continue to have access, you can select **Don't know**. The user gets to keep their access, and your choice is recorded in the audit logs. Keep in mind that any information you provide will be available to other reviewers. They can read your comments and take them into account when they review the request.
-1. Select **Approve** or **Deny** on the bar above.
- - If you are unsure if a user should continue to have access or not, you can click **Don't know**. The user gets to keep their access and your choice is recorded in the audit logs. It is important that you keep in mind that any information you provide will be available to other reviewers. They can read your comments and take them into account when they review the request.
+ ![Screenshot of open access review listing the users who need review.](./media/perform-access-review/user-list-preview.png)
- ![Open access review listing the users who need review](./media/perform-access-review/user-list-preview.png)
+1. The administrator of the access review might require you to supply a reason for your decision in the **Reason** box. Even when a reason isn't required, you can still provide one. The information that you include will be available to other approvers for review.
-1. The administrator of the access review may require that you supply a reason in the **Reason** box for your decision. Even when a reason is not required. You can still provide a reason for your decision and the information that you include will be available to other approvers for review.
+1. Select **Submit**.
-1. Click **Submit**.
- - You can change your response at any time until the access review has ended. If you want to change your response, select the row and update the response. For example, you can approve a previously denied user or deny a previously approved user.
+ You can change your response at any time until the access review has ended. If you want to change your response, select the row and update the response. For example, you can approve a previously denied user or deny a previously approved user.
> [!IMPORTANT]
- > - If a user is denied access, they aren't removed immediately. They are removed when the review period has ended or when an administrator stops the review.
- > - If there are multiple reviewers, the last submitted response is recorded. Consider an example where an administrator designates two reviewers ΓÇô Alice and Bob. Alice opens the access review first and approves a user's access request. Before the review period ends, Bob opens the access review and denies access on the same request previously approved by Alice. The last decision denying the access is the response that gets recorded.
+ > - If a user is denied access, they aren't removed immediately. The user is removed when the review period has ended or when an administrator stops the review.
+ > - If there are multiple reviewers, the last submitted response is recorded. Consider an example where an administrator designates two reviewers: Alice and Bob. Alice opens the access review first and approves a user's access request. Before the review period ends, Bob opens the access review and denies access on the same request previously approved by Alice. The last decision denying the access is the response that gets recorded.
### Review access based on recommendations
-To make access reviews easier and faster for you, we also provide recommendations that you can accept with a single click. There are two ways recommendations are generated for the reviewer. One method the system uses to create recommendations is by the user's sign-in activity. If a user has been inactive for 30 days or more, the reviewer will be recommended to deny access. The other method is based on the access the user's peers have. If the user doesn't have the same access as their peers, the reviewer will be recommended to deny that user access.
+To make access reviews easier and faster for you, we also provide recommendations that you can accept with a single selection. There are two ways that the system generates recommendations for the reviewer. One method is by the user's sign-in activity. If a user has been inactive for 30 days or more, the system will recommend that the reviewer deny access.
-If you have **No sign-in within 30 days** or **Peer outlier** enabled, follow the steps below to accept recommendations:
+The other method is based on the access that the user's peers have. If the user doesn't have the same access as their peers, the system will recommend that the reviewer deny that user access.
-1. Select one or more users and then Click **Accept recommendations**.
+If you have **No sign-in within 30 days** or **Peer outlier** enabled, follow these steps to accept recommendations:
- ![Open access review listing showing the Accept recommendations button](./media/perform-access-review/accept-recommendations-preview.png)
+1. Select one or more users, and then select **Accept recommendations**.
-1. Or to accept recommendations for all unreviewed users, make sure that no users are selected and click on the **Accept recommendations** button on the top bar.
+ ![Screenshot of open access review listing that shows the Accept recommendations button.](./media/perform-access-review/accept-recommendations-preview.png)
-1. Click **Submit** to accept the recommendations.
+ Or to accept recommendations for all unreviewed users, make sure that no users are selected and then select the **Accept recommendations** button on the top bar.
+1. Select **Submit** to accept the recommendations.
> [!NOTE]
-> When you accept recommendations previous decisions will not be changed.
+> When you accept recommendations, previous decisions won't be changed.
### Review access for one or more users in a multi-stage access review (preview)
-If multi-stage access reviews have been enabled by the administrator, there will be 2 or 3 total stages of review. Each stage of review will have a specified reviewer.
+If the administrator has enabled multi-stage access reviews, there will be two or three total stages of review. Each stage of review will have a specified reviewer.
-You will review access either manually or accept the recommendations based on sign-in activity for the stage you are assigned as the reviewer.
+You will either review access manually or accept the recommendations based on sign-in activity for the stage you're assigned as the reviewer.
-If you are the 2nd stage or 3rd stage reviewer, you will also see the decisions made by the reviewers in the prior stage(s) if the administrator enabled this setting when creating the access review. The decision made by a 2nd or 3rd stage reviewer will overwrite the previous stage. So, the decision the 2nd stage reviewer makes will overwrite the first stage, and the 3rd stage reviewer's decision will overwrite the second stage.
+If you're the second-stage or third-stage reviewer, you'll also see the decisions made by the reviewers in the prior stages, if the administrator enabled this setting when creating the access review. The decision made by a second-stage or third-stage reviewer will overwrite the previous stage. So, the decision that the second-stage reviewer makes will overwrite the first stage. And the third-stage reviewer's decision will overwrite the second stage.
- ![Select user to show the multi-stage access review results](./media/perform-access-review/multi-stage-access-review.png)
+ ![Screenshot showing selection of a user to show the multi-stage access review results.](./media/perform-access-review/multi-stage-access-review.png)
Approve or deny access as outlined in [Review access for one or more users](#review-access-for-one-or-more-users). > [!NOTE]
-> The next stage of the review won't become active until the duration specified during the access review setup has passed. If the administrator believes a stage is done but the review duration for this stage has not expired yet, they can use the **Stop current stage** button in the overview of the access review in the Azure AD portal. This will close the active stage and start the next stage.
+> The next stage of the review won't become active until the duration specified during the access review setup has passed. If the administrator believes a stage is done but the review duration for this stage has not expired yet, they can use the **Stop current stage** button in the overview of the access review in the Azure AD portal. This action will close the active stage and start the next stage.
-### Review access for B2B direct connect users in Teams Shared Channels and Microsoft 365 groups (preview)
+### Review access for B2B direct connect users in Teams shared channels and Microsoft 365 groups (preview)
To review access of B2B direct connect users, use the following instructions:
-1. As the reviewer, you should receive an email that requests you to review access for the team or group. Click the link in the email, or navigate directly to https://myaccess.microsoft.com/.
+1. As the reviewer, you should receive an email that requests you to review access for the team or group. Select the link in the email, or go directly to https://myaccess.microsoft.com/.
-1. Follow the instructions in [Review access for one or more users](#review-access-for-one-or-more-users) to make decisions to approve or deny the users access to the Teams.
+1. Follow the instructions in [Review access for one or more users](#review-access-for-one-or-more-users) to make decisions to approve or deny the users access to the teams.
> [!NOTE]
-> Unlike internal users and B2B Collaboration users, B2B direct connect users and Teams **don't** have recommendations based on last sign-in activity to make decisions when you perform the review.
+> Unlike internal users and B2B collaboration users, B2B direct connect users and teams _don't_ have recommendations based on last sign-in activity to make decisions when you perform the review.
-If a Team you review has shared channels, all B2B direct connect users and teams that access those shared channels are part of the review. This includes B2B collaboration users and internal users. When a B2B direct connect user or team is denied access in an access review, the user will lose access to every shared channel in the Team. To learn more about B2B direct connect users, read [B2B direct connect](../external-identities/b2b-direct-connect-overview.md).
+If a team you review has shared channels, all B2B direct connect users and teams that access those shared channels are part of the review. This includes B2B collaboration users and internal users. When a B2B direct connect user or team is denied access in an access review, the user will lose access to every shared channel in the team. To learn more about B2B direct connect users, read [B2B direct connect](../external-identities/b2b-direct-connect-overview.md).
-## If no action is taken on access review
-When the access review is setup, the administrator has the option to use advanced settings to determine what will happen in the event a reviewer doesn't respond to an access review request.
+## Set up what will happen if no action is taken on access review
+When the access review is set up, the administrator has the option to use advanced settings to determine what will happen if a reviewer doesn't respond to an access review request.
-The administrator can set up the review so that if reviewers do not respond at the end of the review period, all unreviewed users can have an automatic decision made on their access. This includes the loss of access to the group or application under review.
+The administrator can set up the review so that if reviewers don't respond at the end of the review period, all unreviewed users can have an automatic decision made on their access. This includes the loss of access to the group or application under review.
## Next steps
active-directory How To Connect Fed Group Claims https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-fed-group-claims.md
You can also configure group claims in the [optional claims](../../active-direct
By default, group `ObjectID` attributes will be emitted in the group claim value. To modify the claim value to contain on-premises group attributes, or to change the claim type to a role, use the `optionalClaims` configuration described in the next step.
-3. Set optional clams for group name configuration.
+3. Set optional claims for group name configuration.
If you want the groups in the token to contain the on-premises Active Directory group attributes, specify which token-type optional claim should be applied in the `optionalClaims` section. You can list multiple token types:
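The token-type list itself isn't reproduced here, but as an illustrative sketch the same configuration can be applied with the Microsoft Graph PowerShell SDK. The `-OptionalClaims` parameter name and the `$appObjectId` placeholder are assumptions to verify against your SDK version; `sam_account_name` and `emit_as_roles` are the documented optional-claim properties for on-premises group attributes.

```powershell
# Sketch: emit the on-premises sAMAccountName for groups in SAML tokens, as roles.
# $appObjectId is a placeholder for your application's object ID.
$optionalClaims = @{
    saml2Token = @(
        @{
            name                 = "groups"
            additionalProperties = @("sam_account_name", "emit_as_roles")
        }
    )
}

Update-MgApplication -ApplicationId $appObjectId -OptionalClaims $optionalClaims
```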
active-directory Migrate From Federation To Cloud Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/migrate-from-federation-to-cloud-authentication.md
Proactively communicate with your users how their experience will change, when i
### Plan the maintenance window
-After the domain conversion, Azure AD might continue to send some legacy authentication requests from Exchange Online to your AD FS servers for up to four hours. The delay is because the Exchange Online cache for [legacy applications authentication](../fundamentals/concept-fundamentals-block-legacy-authentication.md) can take up to 4 hours to be aware of the cutover from federation to cloud authentication.
+After the domain conversion, Azure AD might continue to send some legacy authentication requests from Exchange Online to your AD FS servers for up to four hours. The delay is because the Exchange Online cache for legacy application authentication can take up to four hours to become aware of the cutover from federation to cloud authentication.
During this four-hour window, users may be prompted for credentials repeatedly when they reauthenticate to applications that use legacy authentication. Although the user can still successfully authenticate against AD FS, Azure AD no longer accepts the user's issued token because that federation trust is now removed.
active-directory Howto Identity Protection Configure Risk Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/howto-identity-protection-configure-risk-policies.md
Before organizations enable remediation policies, they may want to [investigate]
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts. 1. Select **Done**.
Before organizations enable remediation policies, they may want to [investigate]
1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**. 1. Select **New policy**. 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
-1. Under **Assignments**, select **Users and groups**.
+1. Under **Assignments**, select **Users or workload identities**.
1. Under **Include**, select **All users**. 1. Under **Exclude**, select **Users and groups** and choose your organization's emergency access or break-glass accounts. 1. Select **Done**.
active-directory Tutorial Manage Access Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/tutorial-manage-access-security.md
It's easier for an administrator to manage access to the application by assignin
1. In the left menu of the tenant overview, select **Security**. 1. Select **Conditional Access**, select **+ New policy**, and then select **Create new policy**. 1. Enter a name for the policy, such as *MFA Pilot*.
-1. Under **Assignments**, select **Users and groups**
+1. Under **Assignments**, select **Users or workload identities**.
1. On the **Include** tab, choose **Select users and groups**, and then select **Users and groups**. 1. Browse for and select the *MFA-Test-Group* that you previously created, and then choose **Select**. 1. Don't select **Create** yet, you add MFA to the policy in the next section.
active-directory Tutorial Azure Monitor Stream Logs To Event Hub https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/tutorial-azure-monitor-stream-logs-to-event-hub.md
To use this feature, you need:
1. Select **Azure Active Directory** > **Audit logs**. 1. Select **Export Data Settings**.
-
+ 1. In the **Diagnostics settings** pane, do either of the following: * To change existing settings, select **Edit setting**. * To add new settings, select **Add diagnostics setting**.
To use this feature, you need:
1. Select the **Stream to an event hub** check box, and then select **Event Hub/Configure**.
- [ ![Export settings](./media/tutorial-azure-monitor-stream-logs-to-event-hub/diagnostic-setting-stream-to-event-hub.png) ](./media/tutorial-azure-monitor-stream-logs-to-event-hub/diagnostic-setting-stream-to-event-hub.png)
-
- 1. Select the Azure subscription and Event Hubs namespace that you want to route the logs to.
+ [ ![Export settings](./media/tutorial-azure-monitor-stream-logs-to-event-hub/diagnostic-setting-stream-to-event-hub.png) ](./media/tutorial-azure-monitor-stream-logs-to-event-hub/diagnostic-setting-stream-to-event-hub.png#lightbox)
+
+ 1. Select the Azure subscription and Event Hubs namespace that you want to route the logs to.
The subscription and Event Hubs namespace must both be associated with the Azure AD tenant that the logs stream from. You can also specify an event hub within the Event Hubs namespace to which logs should be sent. If no event hub is specified, an event hub is created in the namespace with the default name **insights-logs-audit**. 1. Select any combination of the following items:
To use this feature, you need:
1. After about 15 minutes, verify that events are displayed in your event hub. To do so, go to the event hub from the portal and verify that the **incoming messages** count is greater than zero.
- [ ![Audit logs](./media/tutorial-azure-monitor-stream-logs-to-event-hub/azure-monitor-event-hub-instance.png)](./media/tutorial-azure-monitor-stream-logs-to-event-hub/azure-monitor-event-hub-instance.png)
+ [ ![Audit logs](./media/tutorial-azure-monitor-stream-logs-to-event-hub/azure-monitor-event-hub-instance.png)](./media/tutorial-azure-monitor-stream-logs-to-event-hub/azure-monitor-event-hub-instance.png#lightbox)
## Access data from your event hub
active-directory 4Dx Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/4dx-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with 4DX'
+description: Learn how to configure single sign-on between Azure Active Directory and 4DX.
++++++++ Last updated : 08/09/2022++++
+# Tutorial: Azure AD SSO integration with 4DX
+
+In this tutorial, you'll learn how to integrate 4DX with Azure Active Directory (Azure AD). When you integrate 4DX with Azure AD, you can:
+
+* Control in Azure AD who has access to 4DX.
+* Enable your users to be automatically signed-in to 4DX with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* 4DX single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* 4DX supports **IDP** initiated SSO.
+
+## Add 4DX from the gallery
+
+To configure the integration of 4DX into Azure AD, you need to add 4DX from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **4DX** in the search box.
+1. Select **4DX** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for 4DX
+
+Configure and test Azure AD SSO with 4DX using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in 4DX.
+
+To configure and test Azure AD SSO with 4DX, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure 4DX SSO](#configure-4dx-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create 4DX test user](#create-4dx-test-user)** - to have a counterpart of B.Simon in 4DX that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **4DX** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, the application is pre-configured and the necessary URLs are already pre-populated with Azure. You only need to save the configuration by clicking the **Save** button.
+
+1. The 4DX application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![Screenshot shows the image of attributes configuration.](common/default-attributes.png "Image")
+
+1. In addition to the above, the 4DX application expects a few more attributes to be passed back in the SAML response; these are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | | |
+ | companykey | `<unique ID>` |
+
+ > [!Note]
+ > To get the `<unique ID>` value for the customer assertion, reach out to the [4DX support team](mailto:support@bahrcode.com). A sketch for inspecting the emitted claims follows these steps.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/metadataxml.png "Certificate")
+
+1. On the **Set up 4DX** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows how to copy a configuration appropriate URL.](common/copy-configuration-urls.png "Attributes")
+
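If you want to confirm that the `companykey` claim is present in what Azure AD actually issues, one low-tech option is to capture the base64 `SAMLResponse` form field from the browser's developer tools during a test sign-in and decode it locally. This is a rough troubleshooting sketch, not an official step; the captured value below is a placeholder.

```python
import base64
import xml.etree.ElementTree as ET

# Paste the SAMLResponse form field captured from the browser's
# network trace during an IDP-initiated sign-in (placeholder value).
saml_response_b64 = "<captured SAMLResponse value>"

root = ET.fromstring(base64.b64decode(saml_response_b64))

ASSERTION_NS = "urn:oasis:names:tc:SAML:2.0:assertion"
for attribute in root.iter(f"{{{ASSERTION_NS}}}Attribute"):
    values = [v.text for v in attribute.findall(f"{{{ASSERTION_NS}}}AttributeValue")]
    print(attribute.get("Name"), "=", values)
    # The 4DX-specific companykey claim should appear here with the
    # <unique ID> value supplied by the 4DX support team.
```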
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to 4DX.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **4DX**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure 4DX SSO
+
+To configure single sign-on on the **4DX** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [4DX support team](mailto:support@bahrcode.com). They use them to configure the SAML SSO connection properly on both sides.
+
+### Create 4DX test user
+
+In this section, you create a user called Britta Simon in 4DX. Work with [4DX support team](mailto:support@bahrcode.com) to add the users in the 4DX platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the 4DX for which you set up the SSO.
+
+* You can use Microsoft My Apps. When you click the 4DX tile in the My Apps, you should be automatically signed in to the 4DX for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure 4DX you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Adra By Trintech Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/adra-by-trintech-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Adra by Trintech'
+description: Learn how to configure single sign-on between Azure Active Directory and Adra by Trintech.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with Adra by Trintech
+
+In this tutorial, you'll learn how to integrate Adra by Trintech with Azure Active Directory (Azure AD). When you integrate Adra by Trintech with Azure AD, you can:
+
+* Control in Azure AD who has access to Adra by Trintech.
+* Enable your users to be automatically signed-in to Adra by Trintech with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Adra by Trintech single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Adra by Trintech supports **SP** and **IDP** initiated SSO.
+
+## Add Adra by Trintech from the gallery
+
+To configure the integration of Adra by Trintech into Azure AD, you need to add Adra by Trintech from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Adra by Trintech** in the search box.
+1. Select **Adra by Trintech** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Adra by Trintech
+
+Configure and test Azure AD SSO with Adra by Trintech using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Adra by Trintech.
+
+To configure and test Azure AD SSO with Adra by Trintech, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Adra by Trintech SSO](#configure-adra-by-trintech-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create Adra by Trintech test user](#create-adra-by-trintech-test-user)** - to have a counterpart of B.Simon in Adra by Trintech that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Adra by Trintech** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, if you have a **Service Provider metadata file** and wish to configure the application in **IDP** initiated mode, perform the following steps:
+
+ a. Click **Upload metadata file**.
+
+ ![Screenshot shows to upload metadata file.](common/upload-metadata.png "File")
+
+ b. Click on **folder logo** to select the metadata file and click **Upload**.
+
+ ![Screenshot shows to choose metadata file.](common/browse-upload-metadata.png "Folder")
+
+ c. After the metadata file is successfully uploaded, the **Identifier** and **Reply URL** values get auto-populated in the **Basic SAML Configuration** section.
+
+ d. In the **Sign-on URL** text box, type the URL:
+ `https://login.adra.com`
+
+ e. In the **Relay state** text box, type the URL:
+ `https://setup.adra.com`
+
+ f. In the **Logout URL** text box, type the URL:
+ `https://login.adra.com/Saml/SLOServiceSP`
+
+ > [!Note]
+ > You will get the **Service Provider metadata file** from the **Configure Adra by Trintech SSO** section, which is explained later in the tutorial. If the **Identifier** and **Reply URL** values do not get auto-populated, fill in the values manually according to your requirement.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/copy-metadataurl.png "Certificate")
+
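Before handing the **App Federation Metadata Url** to the Adra support team, you may want to confirm it resolves and exposes a signing certificate. Below is a minimal sketch using only the Python standard library; the URL is a placeholder for the value you copied.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder - paste the App Federation Metadata Url copied from the portal.
METADATA_URL = "https://login.microsoftonline.com/<tenant-id>/federationmetadata/2007-06/federationmetadata.xml?appid=<app-id>"

with urllib.request.urlopen(METADATA_URL) as response:
    root = ET.fromstring(response.read())

DSIG_NS = "http://www.w3.org/2000/09/xmldsig#"

# entityID identifies the Azure AD tenant acting as the SAML identity provider.
print("Entity ID:", root.get("entityID"))
for cert in root.iter(f"{{{DSIG_NS}}}X509Certificate"):
    print("Signing certificate found, length:", len(cert.text or ""))
```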
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Adra by Trintech.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Adra by Trintech**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Adra by Trintech SSO
+
+1. Log in to your Adra by Trintech company site as an administrator.
+
+1. Go to the **Engagement** > **Security** tab > **Security Policy**, and select the **Use a federated identity provider** button.
+
+1. Download the **Service Provider metadata file** by clicking **here** on the Adra page, and upload this metadata file in the Azure portal.
+
+ [ ![Screenshot that shows the Configuration Settings.](./media/adra-by-trintech-tutorial/settings.png "Configuration") ](./media/adra-by-trintech-tutorial/settings.png#lightbox)
+
+1. Click on the **Add a new federated identity provider** button and perform the following steps:
+
+ [ ![Screenshot that shows the Organization Algorithm.](./media/adra-by-trintech-tutorial/certificate.png "Organization") ](./media/adra-by-trintech-tutorial/certificate.png#lightbox)
+
+ a. Enter valid **Name** and **Description** values in the textboxes.
+
+ b. In the **Metadata URL** textbox, paste the **App Federation Metadata Url** that you copied from the Azure portal, and click the **Test URL** button.
+
+ c. Click **Save** to save the SAML configuration.
+
+### Create Adra by Trintech test user
+
+In this section, you create a user called Britta Simon at Adra by Trintech. Work with [Adra by Trintech support team](mailto:support@adra.com) to add the users in the Adra by Trintech platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to Adra by Trintech Sign-on URL where you can initiate the login flow.
+
+* Go to Adra by Trintech Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Adra by Trintech for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Adra by Trintech tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Adra by Trintech for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Adra by Trintech you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
active-directory Lattice Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/lattice-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Lattice'
+description: Learn how to configure single sign-on between Azure Active Directory and Lattice.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with Lattice
+
+In this tutorial, you'll learn how to integrate Lattice with Azure Active Directory (Azure AD). When you integrate Lattice with Azure AD, you can:
+
+* Control in Azure AD who has access to Lattice.
+* Enable your users to be automatically signed-in to Lattice with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Lattice single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Lattice supports **SP** and **IDP** initiated SSO.
+
+## Add Lattice from the gallery
+
+To configure the integration of Lattice into Azure AD, you need to add Lattice from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Lattice** in the search box.
+1. Select **Lattice** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Lattice
+
+Configure and test Azure AD SSO with Lattice using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Lattice.
+
+To configure and test Azure AD SSO with Lattice, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Lattice SSO](#configure-lattice-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create Lattice test user](#create-lattice-test-user)** - to have a counterpart of B.Simon in Lattice that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Lattice** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type a URL using the following pattern:
+ `https://router.latticehq.com/sso/<subdomain>/metadata`
+
+ b. In the **Reply URL** textbox, type a URL using the following pattern:
+ `https://router.latticehq.com/sso/<subdomain>/acs`
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type the URL:
+ `https://router.latticehq.com/sso/lattice/sp-login-redirect`
+
+ > [!Note]
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Lattice support team](mailto:customercare@lattice.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/metadataxml.png "Certificate")
+
+1. On the **Set up Lattice** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows how to copy configuration appropriate URL.](common/copy-configuration-urls.png "Attributes")
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Lattice.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Lattice**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Lattice SSO
+
+1. Log in to your Lattice company site as an administrator.
+
+1. Go to **Admin** > **Platform** > **Settings** > **Single sign-on settings** and perform the following steps:
+
+ ![Screenshot that shows the Configuration Settings.](./media/lattice-tutorial/settings.png "Configuration")
+
+ a. In the **XML Metadata** textbox, paste the contents of the **Federation Metadata XML** file that you downloaded from the Azure portal (a sketch for checking the file follows these steps).
+
+ b. Click **Save**.
+
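Because Lattice takes the raw metadata in a textbox rather than as a file upload, it can be worth confirming the downloaded XML is intact before pasting it. A minimal sketch, assuming the file was saved locally as `federationmetadata.xml` (an assumed filename):

```python
import xml.etree.ElementTree as ET

# Assumed local filename for the Federation Metadata XML downloaded earlier.
PATH = "federationmetadata.xml"

tree = ET.parse(PATH)  # raises ParseError if the file is truncated or was edited
print("Entity ID:", tree.getroot().get("entityID"))

# Print the full contents to paste into the Lattice "XML Metadata" textbox.
with open(PATH, encoding="utf-8") as f:
    print(f.read())
```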
+### Create Lattice test user
+
+In this section, you create a user called Britta Simon in Lattice. Work with [Lattice support team](mailto:customercare@lattice.com) to add the users in the Lattice platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to Lattice Sign-on URL where you can initiate the login flow.
+
+* Go to Lattice Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Lattice for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Lattice tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Lattice for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Lattice you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sketch Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sketch-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Sketch'
+description: Learn how to configure single sign-on between Azure Active Directory and Sketch.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with Sketch
+
+In this tutorial, you'll learn how to integrate Sketch with Azure Active Directory (Azure AD). When you integrate Sketch with Azure AD, you can:
+
+* Control in Azure AD who has access to Sketch.
+* Enable your users to be automatically signed-in to Sketch with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Sketch single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Sketch supports **SP** initiated SSO.
+* Sketch supports **Just In Time** user provisioning.
+
+## Add Sketch from the gallery
+
+To configure the integration of Sketch into Azure AD, you need to add Sketch from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Sketch** in the search box.
+1. Select **Sketch** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Sketch
+
+Configure and test Azure AD SSO with Sketch using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Sketch.
+
+To configure and test Azure AD SSO with Sketch, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Sketch SSO](#configure-sketch-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create Sketch test user](#create-sketch-test-user)** - to have a counterpart of B.Simon in Sketch that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Sketch** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type a value using the following pattern:
+ `sketch-<uuid_v4>`
+
+ b. In the **Reply URL** textbox, type a URL using the following pattern:
+ `https://sso.sketch.com/saml/acs?id=<uuid_v4>`
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type the URL:
+ `https://www.sketch.com`
+
+ > [!Note]
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Sketch support team](mailto:sso-support@sketch.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal. A small consistency check for these values is sketched after these steps.
+
+1. The Sketch application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![Screenshot shows the image of attribute mappings.](common/default-attributes.png "Attributes")
+
+1. In addition to the above, the Sketch application expects a few more attributes to be passed back in the SAML response; these are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | | |
+ | email | user.mail |
+ | first_name | user.givenname |
+ | surname | user.surname |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/metadataxml.png "Certificate")
+
+1. On the **Set up Sketch** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows how to copy configuration appropriate URL.](common/copy-configuration-urls.png "Metadata")
+
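As the note above mentions, the Identifier and Reply URL both embed the same `<uuid_v4>` issued by Sketch. The sketch below checks that consistency; both values shown are hypothetical placeholders.

```python
import re
import uuid

# Hypothetical placeholder values - use the ones supplied by Sketch support.
identifier = "sketch-123e4567-e89b-42d3-a456-426614174000"
reply_url = "https://sso.sketch.com/saml/acs?id=123e4567-e89b-42d3-a456-426614174000"

match = re.fullmatch(r"sketch-([0-9a-f-]{36})", identifier)
assert match, "Identifier should look like sketch-<uuid_v4>"

uid = uuid.UUID(match.group(1))
assert uid.version == 4, "expected a version 4 UUID"
assert reply_url.endswith(f"?id={uid}"), "Reply URL id should match the Identifier"
print("Identifier and Reply URL are consistent:", uid)
```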
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Sketch.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Sketch**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Sketch SSO
+
+To configure single sign-on on the **Sketch** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [Sketch support team](mailto:sso-support@sketch.com). They use them to configure the SAML SSO connection properly on both sides.
+
+### Create Sketch test user
+
+In this section, a user called B.Simon is created in Sketch. Sketch supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Sketch, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in Azure portal. This will redirect to Sketch Sign-on URL where you can initiate the login flow.
+
+* Go to Sketch Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Sketch tile in the My Apps, this will redirect to Sketch Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Sketch you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Skybreathe Analytics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/skybreathe-analytics-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Skybreathe® Analytics'
+description: Learn how to configure single sign-on between Azure Active Directory and Skybreathe® Analytics.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with Skybreathe® Analytics
+
+In this tutorial, you'll learn how to integrate Skybreathe® Analytics with Azure Active Directory (Azure AD). When you integrate Skybreathe® Analytics with Azure AD, you can:
+
+* Control in Azure AD who has access to Skybreathe® Analytics.
+* Enable your users to be automatically signed-in to Skybreathe® Analytics with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Skybreathe® Analytics single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Skybreathe® Analytics supports **SP** and **IDP** initiated SSO.
+
+## Add Skybreathe® Analytics from the gallery
+
+To configure the integration of Skybreathe® Analytics into Azure AD, you need to add Skybreathe® Analytics from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Skybreathe® Analytics** in the search box.
+1. Select **Skybreathe® Analytics** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Skybreathe® Analytics
+
+Configure and test Azure AD SSO with Skybreathe® Analytics using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Skybreathe® Analytics.
+
+To configure and test Azure AD SSO with Skybreathe® Analytics, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Skybreathe Analytics SSO](#configure-skybreathe-analytics-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create Skybreathe Analytics test user](#create-skybreathe-analytics-test-user)** - to have a counterpart of B.Simon in Skybreathe® Analytics that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Skybreathe® Analytics** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows to edit Basic S A M L Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
+
+ 1. In the **Identifier** text box, type a URL using the following pattern:
+ `https://auth.skybreathe.com/auth/realms/<ICAO>`
+`
+ 1. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://auth.skybreathe.com/auth/realms/<ICAO>/broker/sbfe-<icao>-idp/endpoint/client/sso`
+
+1. Click **Set additional URLs** and perform the following steps if you wish to configure the application in **SP** initiated mode:
+
+ 1. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://auth.skybreathe.com/auth/realms/<ICAO>/broker/sbfe-<icao>-idp/endpoint`
+
+ 1. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<domain>.skybreathe.com/saml/login`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Skybreathe® Analytics Client support team](mailto:support@openairlines.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. The Skybreathe® Analytics application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![Screenshot shows the image of attribute mappings.](common/default-attributes.png "Attributes")
+
+1. In addition to the above, the Skybreathe® Analytics application expects a few more attributes to be passed back in the SAML response; these are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | | |
+ | firstname | user.givenname |
+ | initials | user.employeeid |
+ | lastname | user.surname |
+ | groups | user.groups |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/copy-metadataurl.png "Certificate")
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Skybreathe® Analytics.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Skybreathe® Analytics**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Skybreathe Analytics SSO
+
+To configure single sign-on on the **Skybreathe® Analytics** side, you need to send the **App Federation Metadata Url** to the [Skybreathe® Analytics support team](mailto:support@openairlines.com). They use it to configure the SAML SSO connection properly on both sides.
+
+### Create Skybreathe Analytics test user
+
+In this section, you create a user called Britta Simon in Skybreathe® Analytics. Work with [Skybreathe® Analytics support team](mailto:support@openairlines.com) to add the users in the Skybreathe® Analytics platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to Skybreathe® Analytics Sign-on URL where you can initiate the login flow.
+
+* Go to Skybreathe® Analytics Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Skybreathe® Analytics for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Skybreathe® Analytics tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Skybreathe® Analytics for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Skybreathe® Analytics you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
active-directory Tigergraph Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/tigergraph-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with TigerGraph'
+description: Learn how to configure single sign-on between Azure Active Directory and TigerGraph.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with TigerGraph
+
+In this tutorial, you'll learn how to integrate TigerGraph with Azure Active Directory (Azure AD). When you integrate TigerGraph with Azure AD, you can:
+
+* Control in Azure AD who has access to TigerGraph.
+* Enable your users to be automatically signed-in to TigerGraph with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* TigerGraph single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* TigerGraph supports **SP** and **IDP** initiated SSO.
+
+## Add TigerGraph from the gallery
+
+To configure the integration of TigerGraph into Azure AD, you need to add TigerGraph from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **TigerGraph** in the search box.
+1. Select **TigerGraph** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for TigerGraph
+
+Configure and test Azure AD SSO with TigerGraph using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user at TigerGraph.
+
+To configure and test Azure AD SSO with TigerGraph, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure TigerGraph SSO](#configure-tigergraph-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create TigerGraph test user](#create-tigergraph-test-user)** - to have a counterpart of B.Simon in TigerGraph that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **TigerGraph** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit a Basic SAML Configuration.](common/edit-urls.png "Basic Configuration")
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type a URL using the following pattern:
+ `https://<your-tigergraph-hostname>:14240/gsqlserver/gsql/saml/meta`
+
+ b. In the **Reply URL** textbox, type a URL using the following pattern:
+ `https://<your-tigergraph-hostname>:14240/api/auth/saml/acs`
+
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<your-tigergraph-hostname>:14240/#/login`
+
+ > [!Note]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [TigerGraph support team](mailto:support@tigergraph.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal. A quick reachability check for these endpoints is sketched after these steps.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/certificatebase64.png "Certificate")
+
+1. On the **Set up TigerGraph** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows how to copy a configuration appropriate URL.](common/copy-configuration-urls.png "Metadata")
+
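Because the Identifier, Reply URL, and Sign-on URL all point at the TigerGraph instance itself (port 14240), a quick reachability test of the SP metadata endpoint can surface firewall or TLS problems before you contact support. Below is a rough sketch with a placeholder hostname; TLS verification is disabled only because test instances often use self-signed certificates.

```python
import ssl
import urllib.request

HOST = "<your-tigergraph-hostname>"  # placeholder
META_URL = f"https://{HOST}:14240/gsqlserver/gsql/saml/meta"

# Lab-only: skip TLS verification to tolerate self-signed certificates.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

try:
    with urllib.request.urlopen(META_URL, timeout=10, context=ctx) as resp:
        print(META_URL, "->", resp.status)
except OSError as exc:
    print("Endpoint not reachable:", exc)
```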
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to TigerGraph.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **TigerGraph**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure TigerGraph SSO
+
+To configure single sign-on on the **TigerGraph** side, you need to send the downloaded **Certificate (Base64)** and the appropriate copied URLs from the Azure portal to the [TigerGraph support team](mailto:support@tigergraph.com). They use them to configure the SAML SSO connection properly on both sides.
+
+### Create TigerGraph test user
+
+In this section, you create a user called Britta Simon at TigerGraph. Work with [TigerGraph support team](mailto:support@tigergraph.com) to add the users in the TigerGraph platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to TigerGraph Sign-on URL where you can initiate the login flow.
+
+* Go to TigerGraph Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the TigerGraph for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the TigerGraph tile in the My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the TigerGraph for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure TigerGraph you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Workhub Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/workhub-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with workhub'
+description: Learn how to configure single sign-on between Azure Active Directory and workhub.
+ Last updated : 08/22/2022
+# Tutorial: Azure AD SSO integration with workhub
+
+In this tutorial, you'll learn how to integrate workhub with Azure Active Directory (Azure AD). When you integrate workhub with Azure AD, you can:
+
+* Control in Azure AD who has access to workhub.
+* Enable your users to be automatically signed-in to workhub with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* workhub single sign-on (SSO) enabled subscription.
+* Along with the Cloud Application Administrator role, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* workhub supports **SP** initiated SSO.
+
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
+
+## Add workhub from the gallery
+
+To configure the integration of workhub into Azure AD, you need to add workhub from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **workhub** in the search box.
+1. Select **workhub** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for workhub
+
+Configure and test Azure AD SSO with workhub using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user at workhub.
+
+To configure and test Azure AD SSO with workhub, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure workhub SSO](#configure-workhub-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create workhub test user](#create-workhub-test-user)** - to have a counterpart of B.Simon in workhub that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **workhub** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Screenshot shows how to edit basic SAML Configuration.](common/edit-urls.png "Configuration")
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type the URL:
+ `https://ainz-okal-gown.firebaseapp.com/__/auth/handler`
+
+ b. In the **Reply URL** textbox, type the URL:
+ `https://ainz-okal-gown.firebaseapp.com/__/auth/handler`
+
+ c. In the **Sign-on URL** text box, type the URL:
+ `https://admin.workhub.site/sso`
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![Screenshot shows the Certificate download link.](common/certificatebase64.png "Certificate")
+
+1. On the **Set up workhub** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Screenshot shows to copy configuration appropriate URL.](common/copy-configuration-urls.png "Metadata")
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to workhub.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **workhub**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you expect a role to be assigned to the users, select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure workhub SSO
+
+To configure single sign-on on the **workhub** side, send the downloaded **Certificate (Base64)** and the appropriate copied URLs from the Azure portal to the [workhub support team](mailto:team_bkp@bitkey.jp). The support team uses these values to configure the SAML SSO connection on both sides.
+
+### Create workhub test user
+
+In this section, you create a user called Britta Simon at workhub. Work with the [workhub support team](mailto:team_bkp@bitkey.jp) to add the users to the workhub platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. You're redirected to the workhub sign-on URL, where you can initiate the login flow.
+
+* Go to the workhub sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the workhub tile in My Apps, you're redirected to the workhub sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure workhub, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
advisor Advisor Cost Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-cost-recommendations.md
Advisor recommends resizing virtual machines when it's possible to fit the curre
### Burstable recommendations
-We evaluate is workloads are eligible to run on specialized SKUs called **Burstable SKUs** that support variable workload performance requirements and are less expensive than general purpose SKUs. Learn more about burstable SKUs here: [B-series burstable - Azure Virtual Machines](../virtual-machines/sizes-b-series-burstable.md).
+We evaluate if workloads are eligible to run on specialized SKUs called **Burstable SKUs** that support variable workload performance requirements and are less expensive than general purpose SKUs. Learn more about burstable SKUs here: [B-series burstable - Azure Virtual Machines](../virtual-machines/sizes-b-series-burstable.md).
- A burstable SKU recommendation is made if: - The average **CPU utilization** is less than a burstable SKU's baseline performance
aks Operator Best Practices Cluster Isolation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/operator-best-practices-cluster-isolation.md
For more information about these features, see [Best practices for authenticatio
### Containers *Containers* include: * The Azure Policy Add-on for AKS to enforce pod security.
-* The use of pod security contexts.
+* The use of pod security admission.
* Scanning both images and the runtime for vulnerabilities. * Using App Armor or Seccomp (Secure Computing) to restrict container access to the underlying node.
aks Operator Best Practices Cluster Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/operator-best-practices-cluster-security.md
For even more granular control of container actions, you can also use built-in L
Built-in Linux security features are only available on Linux nodes and pods. > [!NOTE]
-> Currently, Kubernetes environments aren't completely safe for hostile multi-tenant usage. Additional security features, like *AppArmor*, *seccomp*,*Pod Security Policies*, or Kubernetes RBAC for nodes, efficiently block exploits.
+> Currently, Kubernetes environments aren't completely safe for hostile multi-tenant usage. Additional security features, like *Microsoft Defender for Containers*, *AppArmor*, *seccomp*, *Pod Security Admission*, or Kubernetes RBAC for nodes, efficiently block exploits.
> >For true security when running hostile multi-tenant workloads, only trust a hypervisor. The security domain for Kubernetes becomes the entire cluster, not an individual node. >
api-management Gateway Log Schema Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/gateway-log-schema-reference.md
The following properties are logged for each API request.
| Property | Type | Description | | - | - | - |
-| method | string | HTTP method of the incoming request |
-| url | string | URL of the incoming request |
-| responseCode | integer | Status code of the HTTP response sent to a client |
-| responseSize | integer | Number of bytes sent to a client during request processing |
-| cache | string | Status of API Management cache involvement in request processing (hit, miss, none) |
-| apiId | string | API entity identifier for current request |
-| operationId | string | Operation entity identifier for current request |
-| clientProtocol | string | HTTP protocol version of the incoming request |
-| clientTime | integer | Number of milliseconds spent on overall client I/O (connecting, sending, and receiving bytes) |
-| apiRevision | string | API revision for current request |
-| clientTlsVersion| string | TLS version used by client sending request |
-| lastError | object | For an unsuccessful request, details about the last request processing error |
-| backendMethod | string | HTTP method of the request sent to a backend |
-| backendUrl | string | URL of the request sent to a backend |
-| backendResponseCode | integer | Code of the HTTP response received from a backend |
-| backedProtocol | string | HTTP protocol version of the request sent to a backend |
-| backendTime | integer | Number of milliseconds spent on overall backend IO (connecting, sending, and receiving bytes) |
+| ApiId | string | API entity identifier for current request |
+| ApimSubscriptionId | string | Subscription entity identifier for current request |
+| ApiRevision | string | API revision for current request |
+| BackendId | string | Backend entity identifier for current request |
+| BackendMethod | string | HTTP method of the request sent to a backend |
+| BackendProtocol | string | HTTP protocol version of the request sent to a backend |
+| BackendRequestBody | string | Backend request body |
+| BackendRequestHeaders | dynamic | Collection of HTTP headers sent to a backend |
+| BackendResponseBody | string | Backend response body |
+| BackendResponseCode | int | Code of the HTTP response received from a backend |
+| BackendResponseHeaders | dynamic | Collection of HTTP headers received from a backend |
+| BackendTime | long | Number of milliseconds spent on overall backend I/O (connecting, sending, and receiving bytes) |
+| BackendUrl | string | URL of the request sent to a backend |
+| Cache | string | Status of API Management cache involvement in request processing (hit, miss, none) |
+| CacheTime | long | Number of milliseconds spent on overall API Management cache I/O (connecting, sending, and receiving bytes) |
+| ClientProtocol | string | HTTP protocol version of the incoming request |
+| ClientTime | long | Number of milliseconds spent on overall client I/O (connecting, sending, and receiving bytes) |
+| ClientTlsVersion | string | TLS version used by client sending request |
+| Errors | dynamic | Collection of errors that occurred during request processing |
+| IsRequestSuccess | bool | HTTP request completed with response status code within 2xx or 3xx range |
+| LastErrorElapsed | long | Number of milliseconds elapsed from when the gateway received the request until the error occurred |
+| LastErrorMessage | string | Error message |
+| LastErrorReason | string | Error reason |
+| LastErrorScope | string | Scope of the policy document containing the policy that caused the error |
+| LastErrorSection | string | Section of the policy document containing the policy that caused the error |
+| LastErrorSource | string | Name of the policy or internal processing handler that caused the error |
+| Method | string | HTTP method of the incoming request |
+| OperationId | string | Operation entity identifier for current request |
+| ProductId | string | Product entity identifier for current request |
+| RequestBody | string | Client request body |
+| RequestHeaders | dynamic | Collection of HTTP headers sent by a client |
+| RequestSize | int | Number of bytes received from a client during request processing |
+| ResponseBody | string | Gateway response body |
+| ResponseCode | int | Status code of the HTTP response sent to a client |
+| ResponseHeaders | dynamic | Collection of HTTP headers sent to a client |
+| ResponseSize | int | Number of bytes sent to a client during request processing |
+| TotalTime | long | Number of milliseconds spent on the overall HTTP request (from the first byte received by API Management to the last byte sent back to the client) |
+| TraceRecords | dynamic | Records emitted by trace policies |
+| Url | string | URL of the incoming request |
+| UserId | string | User entity identifier for current request |
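+
+For illustration, here's a minimal Python sketch that queries a few of these properties, assuming the gateway logs are routed to a Log Analytics workspace (where they surface in the `ApiManagementGatewayLogs` table) and that the `azure-monitor-query` and `azure-identity` packages are installed; the workspace ID is a placeholder:
+
+```python
+from datetime import timedelta
+
+from azure.identity import DefaultAzureCredential
+from azure.monitor.query import LogsQueryClient, LogsQueryStatus
+
+# Authenticate and create a Logs query client.
+client = LogsQueryClient(DefaultAzureCredential())
+
+# Surface recent failed requests using a few of the properties described above.
+query = """
+ApiManagementGatewayLogs
+| where IsRequestSuccess == false
+| project TimeGenerated, Method, Url, ResponseCode, LastErrorMessage, TotalTime
+| top 20 by TimeGenerated desc
+"""
+
+response = client.query_workspace(
+    workspace_id="<log-analytics-workspace-id>",  # placeholder
+    query=query,
+    timespan=timedelta(hours=1),
+)
+
+if response.status == LogsQueryStatus.SUCCESS:
+    for table in response.tables:
+        for row in table.rows:
+            print(row)
+```
+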
## Next steps
api-management Import Container App With Oas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/import-container-app-with-oas.md
This article shows how to import an Azure Container App to Azure API Management
> * Import a Container App that exposes a Web API > * Test the API in the Azure portal
-> [!NOTE]
-> Azure Container Apps are currently in preview.
- ## Expose Container App with API Management [Azure Container Apps](../container-apps/overview.md) allows you to deploy containerized apps without managing complex infrastructure. API developers can write code using their preferred programming language or framework, build microservices with full support for Distributed Application Runtime (Dapr), and scale based on HTTP traffic or other events.
app-service Tutorial Python Postgresql App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/tutorial-python-postgresql-app.md
Title: 'Tutorial: Deploy a Python Django or Flask web app with PostgreSQL' description: Create a Python Django or Flask web app with a PostgreSQL database and deploy it to Azure. The tutorial uses either the Django or Flask framework and the app is hosted on Azure App Service on Linux.-+ ms.devlang: python
Having issues? [Let us know](https://aka.ms/DjangoCLITutorialHelp).
## 4 - Allow web app to access the database
-After the Azure Database for PostgreSQL server is created, configure access to the server from the web app by adding a firewall rule. This can be done through the Azure portal or the Azure CLI.
+After the Azure Database for PostgreSQL server is created, configure access to the server from the web app by adding a firewall rule. This can be done through the Azure portal or the Azure CLI.
If you're working in VS Code, right-click the database server and select **Open in Portal** to go to the Azure portal. Or, go to the [Azure Cloud Shell](https://shell.azure.com) and run the Azure CLI commands. ### [Azure portal](#tab/azure-portal-access)
Follow these steps while signed-in to the Azure portal to delete a resource grou
| [!INCLUDE [Remove resource group Azure portal 2](<./includes/tutorial-python-postgresql-app/remove-resource-group-azure-portal-2.md>)] | :::image type="content" source="./media/tutorial-python-postgresql-app/remove-resource-group-azure-portal-2-240px.png" lightbox="./media/tutorial-python-postgresql-app/remove-resource-group-azure-portal-2.png" alt-text="A screenshot showing how to delete a resource group in the Azure portal." ::: | | [!INCLUDE [Remove resource group Azure portal 3](<./includes/tutorial-python-postgresql-app/remove-resource-group-azure-portal-3.md>)] | | - ### [VS Code](#tab/vscode-aztools) | Instructions | Screenshot |
Follow these steps while signed-in to the Azure portal to delete a resource grou
[!INCLUDE [Stream logs CLI](<./includes/tutorial-python-postgresql-app/clean-up-resources-cli.md>)] --+ Having issues? [Let us know](https://aka.ms/DjangoCLITutorialHelp).
applied-ai-services Compose Custom Models V2 1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/compose-custom-models-v2-1.md
+
+ Title: "How to guide: create and compose custom models with Form Recognizer v2.1"
+
+description: Learn how to create, compose, use, and manage custom models with Form Recognizer v2.1
+++++ Last updated : 08/22/2022+
+recommendations: false
++
+# Compose custom models v2.1
+
+> [!NOTE]
+> This how-to guide references Form Recognizer v2.1. To try Form Recognizer v3.0, see [Compose custom models v3.0](compose-custom-models-v3.md).
+
+Form Recognizer uses advanced machine-learning technology to detect and extract information from document images and return the extracted data in a structured JSON output. With Form Recognizer, you can train standalone custom models or combine custom models to create composed models.
+
+* **Custom models**. Form Recognizer custom models enable you to analyze and extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases.
+
+* **Composed models**. A composed model is created by taking a collection of custom models and assigning them to a single model that encompasses your form types. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis.
+
+In this article, you'll learn how to create Form Recognizer custom and composed models using our [Form Recognizer Sample Labeling tool](label-tool.md), [REST APIs](quickstarts/client-library.md?branch=main&pivots=programming-language-rest-api#train-a-custom-model), or [client-library SDKs](quickstarts/client-library.md?branch=main&pivots=programming-language-csharp#train-a-custom-model).
+
+## Sample Labeling tool
+
+Try extracting data from custom forms using our Sample Labeling tool. You'll need the following resources:
+
+* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/).
+
+* A [Form Recognizer instance](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your key and endpoint.
+
+ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
+
+> [!div class="nextstepaction"]
+> [Try it](https://fott-2-1.azurewebsites.net/projects/create)
+
+In the Form Recognizer UI:
+
+1. Select **Use Custom to train a model with labels and get key value pairs**.
+
+ :::image type="content" source="media/label-tool/fott-use-custom.png" alt-text="Screenshot of the FOTT tool select custom model option.":::
+
+1. In the next window, select **New project**:
+
+ :::image type="content" source="media/label-tool/fott-new-project.png" alt-text="Screenshot of the FOTT tool select new project option.":::
+
+## Create your models
+
+The steps for building, training, and using custom and composed models are as follows:
+
+* [**Assemble your training dataset**](#assemble-your-training-dataset)
+* [**Upload your training set to Azure blob storage**](#upload-your-training-dataset)
+* [**Train your custom model**](#train-your-custom-model)
+* [**Compose custom models**](#create-a-composed-model)
+* [**Analyze documents**](#analyze-documents-with-your-custom-or-composed-model)
+* [**Manage your custom models**](#manage-your-custom-models)
+
+## Assemble your training dataset
+
+Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](build-training-data-set.md#custom-model-input-requirements) for Form Recognizer.
+
+## Upload your training dataset
+
+You'll need to [upload your training data](build-training-data-set.md#upload-your-training-data)
+to an Azure blob storage container. If you don't know how to create an Azure storage account with a container, *see* [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
+
+## Train your custom model
+
+You [train your model](./quickstarts/try-sdk-rest-api.md#train-a-custom-model) with labeled data sets. Labeled datasets rely on the prebuilt-layout API, but supplementary human input is included, such as your specific labels and field locations. Start with at least five completed forms of the same type for your labeled training data.
+
+When you train with labeled data, the model uses supervised learning to extract values of interest, using the labeled forms you provide. Labeled data results in better-performing models and can produce models that work with complex forms or forms containing values without keys.
+
+Form Recognizer uses the [Layout](concept-layout.md) API to learn the expected sizes and positions of typeface and handwritten text elements and extract tables. Then it uses user-specified labels to learn the key/value associations and tables in the documents. We recommend that you use five manually labeled forms of the same type (same structure) to get started when training a new model. Add more labeled data as needed to improve the model accuracy. Form Recognizer enables training a model to extract key-value pairs and tables using supervised learning capabilities.
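+
+As a quick illustration, here's a minimal Python sketch of labeled training, assuming the `azure-ai-formrecognizer` 3.x SDK (which targets the v2.1 API); the endpoint, key, and SAS URL are placeholders:
+
+```python
+from azure.ai.formrecognizer import FormTrainingClient
+from azure.core.credentials import AzureKeyCredential
+
+client = FormTrainingClient(
+    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
+    credential=AzureKeyCredential("<your-key>"),  # placeholder
+)
+
+# Train on a blob container of labeled forms (SAS URL is a placeholder).
+poller = client.begin_training("<container-sas-url>", use_training_labels=True)
+model = poller.result()
+
+print(f"Model ID: {model.model_id}, status: {model.status}")
+for submodel in model.submodels:
+    for name, field in submodel.fields.items():
+        print(f"Field '{name}' estimated accuracy: {field.accuracy}")
+```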
+
+[Get started with Train with labels](label-tool.md)
+
+> [!VIDEO https://docs.microsoft.com/Shows/Docs-Azure/Azure-Form-Recognizer/player]
+
+## Create a composed model
+
+> [!NOTE]
+> **Model Compose is only available for custom models trained _with_ labels.** Attempting to compose unlabeled models will produce an error.
+
+With the Model Compose operation, you can assign up to 100 trained custom models to a single model ID. When you call Analyze with the composed model ID, Form Recognizer will first classify the form you submitted, choose the best matching assigned model, and then return results for that model. This operation is useful when incoming forms may belong to one of several templates.
+
+Using the Form Recognizer Sample Labeling tool, the REST API, or the Client-library SDKs, follow the steps below to set up a composed model:
+
+1. [**Gather your custom model IDs**](#gather-your-custom-model-ids)
+1. [**Compose your custom models**](#compose-your-custom-models)
+
+#### Gather your custom model IDs
+
+Once the training process has successfully completed, your custom model will be assigned a model ID. You can retrieve a model ID as follows:
+
+### [**Form Recognizer Sample Labeling tool**](#tab/fott)
+
+When you train models using the [**Form Recognizer Sample Labeling tool**](https://fott-2-1.azurewebsites.net/), the model ID is located in the Train Result window:
++
+### [**REST API**](#tab/rest-api)
+
+The [**REST API**](./quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#train-a-custom-model) will return a `201 (Success)` response with a **Location** header. The last segment of this header's value is the model ID for the newly trained model:
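+
+For illustration, a minimal Python sketch of that flow, assuming the v2.1 Train Custom Model endpoint; the endpoint, key, and SAS URL are placeholders:
+
+```python
+import requests
+
+endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
+key = "<your-key>"  # placeholder
+
+# Start training; a successful response carries the model URL in the Location header.
+response = requests.post(
+    f"{endpoint}/formrecognizer/v2.1/custom/models",
+    headers={"Ocp-Apim-Subscription-Key": key},
+    json={"source": "<container-sas-url>", "useLabelFile": True},
+)
+response.raise_for_status()
+
+# The model ID is the last segment of the Location header.
+model_id = response.headers["Location"].rstrip("/").split("/")[-1]
+print(model_id)
+```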
++
+### [**Client-library SDKs**](#tab/sdks)
+
+ The [**client-library SDKs**](./quickstarts/try-sdk-rest-api.md?pivots=programming-language-csharp#train-a-custom-model) return a model object that can be queried to return the trained model ID:
+
+* C\# | [CustomFormModel Class](/dotnet/api/azure.ai.formrecognizer.training.customformmodel?view=azure-dotnet&preserve-view=true#properties "Azure SDK for .NET")
+
+* Java | [CustomFormModelInfo Class](/java/api/com.azure.ai.formrecognizer.training.models.customformmodelinfo?view=azure-java-stable&preserve-view=true#methods "Azure SDK for Java")
+
+* JavaScript | [CustomFormModelInfo interface](/javascript/api/@azure/ai-form-recognizer/customformmodelinfo?view=azure-node-latest&preserve-view=true&branch=main#properties "Azure SDK for JavaScript")
+
+* Python | [CustomFormModelInfo Class](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.customformmodelinfo?view=azure-python&preserve-view=true&branch=main#variables "Azure SDK for Python")
+++
+#### Compose your custom models
+
+After you've gathered your custom models corresponding to a single form type, you can compose them into a single model.
+
+### [**Form Recognizer Sample Labeling tool**](#tab/fott)
+
+The **Sample Labeling tool** enables you to quickly get started training models and composing them to a single model ID.
+
+After you have completed training, compose your models as follows:
+
+1. On the left rail menu, select the **Model Compose** icon (merging arrow).
+
+1. In the main window, select the models you wish to assign to a single model ID. Models with the arrows icon are already composed models.
+
+1. Choose the **Compose** button in the upper-left corner.
+
+1. In the pop-up window, name your newly composed model and select **Compose**.
+
+When the operation completes, your newly composed model will appear in the list.
+
+ :::image type="content" source="media/custom-model-compose.png" alt-text="Screenshot of the model compose window." lightbox="media/custom-model-compose-expanded.png":::
+
+### [**REST API**](#tab/rest-api)
+
+Using the **REST API**, you can make a [**Compose Custom Model**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/ComposeDocumentModel) request to create a single composed model from existing models. The request body requires a string array of the `modelIds` to compose, and you can optionally define the `modelName`.
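+
+A minimal Python sketch of that request, assuming the v2.1 compose endpoint; the key, model IDs, and name are placeholders:
+
+```python
+import requests
+
+endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
+key = "<your-key>"  # placeholder
+
+response = requests.post(
+    f"{endpoint}/formrecognizer/v2.1/custom/models/compose",
+    headers={"Ocp-Apim-Subscription-Key": key},
+    json={
+        "modelIds": ["<model-id-1>", "<model-id-2>"],  # models to compose
+        "modelName": "my-composed-model",  # optional display name
+    },
+)
+response.raise_for_status()
+print(response.headers.get("Location"))  # URL of the new composed model
+```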
+
+### [**Client-library SDKs**](#tab/sdks)
+
+Use the programming language of your choice to create a composed model that will be called with a single model ID. Below are links to code samples that demonstrate how to create a composed model from existing custom models; a minimal sketch follows the list:
+
+* [**C#/.NET**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md).
+
+* [**Java**](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/src/samples/java/com/azure/ai/formrecognizer/administration/ComposeModel.java).
+
+* [**JavaScript**](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v3/javascript/createComposedModel.js).
+
+* [**Python**](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_compose_model.py)
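+
+As a quick sketch of what those samples do, assuming the `azure-ai-formrecognizer` 3.x Python SDK (endpoint, key, and model IDs are placeholders):
+
+```python
+from azure.ai.formrecognizer import FormTrainingClient
+from azure.core.credentials import AzureKeyCredential
+
+client = FormTrainingClient(
+    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
+    credential=AzureKeyCredential("<your-key>"),  # placeholder
+)
+
+# Compose previously trained labeled models into a single model ID.
+poller = client.begin_create_composed_model(
+    ["<model-id-1>", "<model-id-2>"], model_name="my-composed-model"
+)
+composed = poller.result()
+print(composed.model_id)
+```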
+++
+## Analyze documents with your custom or composed model
+
+ The custom form **Analyze** operation requires you to provide the `modelID` in the call to Form Recognizer. You can provide a single custom model ID or a composed model ID for the `modelID` parameter.
+
+### [**Form Recognizer Sample Labeling tool**](#tab/fott)
+
+1. On the tool's left-pane menu, select the **Analyze icon** (light bulb).
+
+1. Choose a local file or image URL to analyze.
+
+1. Select the **Run Analysis** button.
+
+1. The tool will apply tags in bounding boxes and report the confidence percentage for each tag.
++
+### [**REST API**](#tab/rest-api)
+
+Using the REST API, you can make an [Analyze Document](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) request to analyze a document and extract key-value pairs and table data.
+
+### [**Client-library SDKs**](#tab/sdks)
+
+Use the programming language of your choice to analyze a form or document with a custom or composed model. You'll need your Form Recognizer endpoint, key, and model ID; a minimal sketch follows the links below.
+
+* [**C#/.NET**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md)
+
+* [**Java**](https://github.com/Azure/azure-sdk-for-javocumentFromUrl.java)
+
+* [**JavaScript**](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v3/javascript/recognizeCustomForm.js)
+
+* [**Python**](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.1/sample_recognize_custom_forms.py)
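+
+As a quick sketch, assuming the `azure-ai-formrecognizer` 3.x Python SDK (the model ID and document URL are placeholders):
+
+```python
+from azure.ai.formrecognizer import FormRecognizerClient
+from azure.core.credentials import AzureKeyCredential
+
+client = FormRecognizerClient(
+    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
+    credential=AzureKeyCredential("<your-key>"),  # placeholder
+)
+
+# Pass either a single custom model ID or a composed model ID.
+poller = client.begin_recognize_custom_forms_from_url(
+    model_id="<custom-or-composed-model-id>",
+    form_url="https://<your-storage>/form.pdf",  # placeholder
+)
+
+for form in poller.result():
+    print(f"Form type: {form.form_type}")  # shows which model matched
+    for name, field in form.fields.items():
+        print(f"{name}: {field.value} (confidence {field.confidence})")
+```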
+++
+Test your newly trained models by [analyzing forms](./quickstarts/try-sdk-rest-api.md#analyze-forms-with-a-custom-model) that weren't part of the training dataset. Depending on the reported accuracy, you may want to do further training to improve the model. You can continue further training to [improve results](label-tool.md#improve-results).
+
+## Manage your custom models
+
+You can [manage your custom models](./quickstarts/try-sdk-rest-api.md#manage-custom-models) throughout their lifecycle by viewing a [list of all custom models](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/GetModels) under your subscription, retrieving information about [a specific custom model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/GetModel), and [deleting custom models](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/DeleteModel) from your account.
+
+Great! You've learned the steps to create custom and composed models and use them in your Form Recognizer projects and applications.
+
+## Next steps
+
+Learn more about the Form Recognizer client library by exploring our API reference documentation.
+
+> [!div class="nextstepaction"]
+> [Form Recognizer API reference](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
+>
applied-ai-services Compose Custom Models V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/compose-custom-models-v3.md
+
+ Title: "How to guide: create and compose custom models with Form Recognizer v2.0"
+
+description: Learn how to create, use, and manage Form Recognizer v2.0 custom and composed models
+++++ Last updated : 08/22/2022+
+recommendations: false
++
+# Compose custom models v3.0
+
+> [!NOTE]
+> This how-to guide references Form Recognizer v3.0. To use Form Recognizer v2.1, see [Compose custom models v2.1](compose-custom-models-v2-1.md).
+
+A composed model is created by taking a collection of custom models and assigning them to a single model ID. You can assign up to 100 trained custom models to a single composed model ID. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis. Composed models are useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction.
+
+To learn more, see [Composed custom models](concept-composed-models.md).
+
+In this article, you'll learn how to create and use composed custom models to analyze your forms and documents.
+
+## Prerequisites
+
+To get started, you'll need the following resources:
+
+* **An Azure subscription**. You can [create a free Azure subscription](https://azure.microsoft.com/free/cognitive-services/).
+
+* **A Form Recognizer instance**. Once you have your Azure subscription, [create a Form Recognizer resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal to get your key and endpoint. If you have an existing Form Recognizer resource, navigate directly to your resource page. You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
+
+ 1. After the resource deploys, select **Go to resource**.
+
+ 1. Copy the **Keys and Endpoint** values from the Azure portal and paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API.
+
+ :::image border="true" type="content" source="media/containers/keys-and-endpoint.png" alt-text="Still photo showing how to access resource key and endpoint URL.":::
+
+ > [!TIP]
+ > For more information, see [**create a Form Recognizer resource**](create-a-form-recognizer-resource.md).
+
+* **An Azure storage account.** If you don't know how to create an Azure storage account, follow the [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
+
+## Create your custom models
+
+First, you'll need a set of custom models to compose. You can use the Form Recognizer Studio, REST API, or client-library SDKs. The steps are as follows:
+
+* [**Assemble your training dataset**](#assemble-your-training-dataset)
+* [**Upload your training set to Azure blob storage**](#upload-your-training-dataset)
+* [**Train your custom models**](#train-your-custom-model)
+
+## Assemble your training dataset
+
+Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](build-training-data-set.md#custom-model-input-requirements) for Form Recognizer.
+
+>[!TIP]
+> Follow these tips to optimize your data set for training:
+>
+> * If possible, use text-based PDF documents instead of image-based documents. Scanned PDFs are handled as images.
+> * For filled-in forms, use examples that have all of their fields filled in.
+> * Use forms with different values in each field.
+> * If your form images are of lower quality, use a larger data set (10-15 images, for example).
+
+See [Build a training data set](./build-training-data-set.md) for tips on how to collect your training documents.
+
+## Upload your training dataset
+
+When you've gathered a set of training documents, you'll need to [upload your training data](build-training-data-set.md#upload-your-training-data) to an Azure blob storage container.
+
+If you want to use manually labeled data, you'll also have to upload the *.labels.json* and *.ocr.json* files that correspond to your training documents.
+
+## Train your custom model
+
+When you [train your model](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) with labeled data, the model uses supervised learning to extract values of interest, using the labeled forms you provide. Labeled data results in better-performing models and can produce models that work with complex forms or forms containing values without keys.
+
+Form Recognizer uses the [prebuilt-layout model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) API to learn the expected sizes and positions of typeface and handwritten text elements and extract tables. Then it uses user-specified labels to learn the key/value associations and tables in the documents. We recommend that you use five manually labeled forms of the same type (same structure) to get started with training a new model. Then, add more labeled data, as needed, to improve the model accuracy. Form Recognizer enables training a model to extract key-value pairs and tables using supervised learning capabilities.
+
+### [Form Recognizer Studio](#tab/studio)
+
+To create custom models, start with configuring your project:
+
+1. From the Studio homepage, select [**Create new**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) from the Custom model card.
+
+1. Use the ➕ **Create a project** command to start the new project configuration wizard.
+
+1. Enter project details, select the Azure subscription and resource, and the Azure Blob storage container that contains your data.
+
+1. Review and submit your settings to create the project.
++
+While creating your custom models, you may need to extract data collections from your documents. The collections may appear in one of two formats, using tables as the visual pattern:
+
+* Dynamic or variable count of values (rows) for a given set of fields (columns)
+
+* Specific collection of values for a given set of fields (columns and/or rows)
+
+See [Form Recognizer Studio: labeling as tables](quickstarts/try-v3-form-recognizer-studio.md#labeling-as-tables)
+
+### [REST API](#tab/rest)
+
+Training with labels leads to better performance in some scenarios. To train with labels, you need to have special label information files (*\<filename\>.pdf.labels.json*) in your blob storage container alongside the training documents.
+
+Label files contain key-value associations that a user has entered manually. They're needed for labeled data training, but not every source file needs to have a corresponding label file. Source files without labels will be treated as ordinary training documents. We recommend five or more labeled files for reliable training. You can use a UI tool like [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to generate these files.
+
+Once you have your label files, you can include them by calling the training method with the *useLabelFile* parameter set to `true`.
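+
+For illustration, a minimal Python sketch of a v3.0 build request, assuming the `2022-08-31` REST API; the model ID and SAS URL are placeholders:
+
+```python
+import requests
+
+endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
+key = "<your-key>"  # placeholder
+
+# Start a model build; poll the Operation-Location header URL for completion.
+response = requests.post(
+    f"{endpoint}/formrecognizer/documentModels:build?api-version=2022-08-31",
+    headers={"Ocp-Apim-Subscription-Key": key},
+    json={
+        "modelId": "my-custom-model",  # placeholder
+        "buildMode": "template",
+        "azureBlobSource": {"containerUrl": "<container-sas-url>"},
+    },
+)
+response.raise_for_status()
+print(response.headers["Operation-Location"])
+```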
++
+### [Client-libraries](#tab/sdks)
+
+Training with labels leads to better performance in some scenarios. To train with labels, you need to have special label information files (*\<filename\>.pdf.labels.json*) in your blob storage container alongside the training documents. Once you have them, you can call the training method with the *useTrainingLabels* parameter set to `true`.
+
+|Language |Method|
+|--|--|
+|**C#**|[**StartBuildModel**](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentmodeladministrationclient.startbuildmodel?view=azure-dotnet#azure-ai-formrecognizer-documentanalysis-documentmodeladministrationclient-startbuildmodel&preserve-view=true)|
+|**Java**| [**beginBuildModel**](/java/api/com.azure.ai.formrecognizer.administration.documentmodeladministrationclient.beginbuildmodel?view=azure-java-preview&preserve-view=true)|
+|**JavaScript** | [**beginBuildModel**](/javascript/api/@azure/ai-form-recognizer/documentmodeladministrationclient?view=azure-node-latest#@azure-ai-form-recognizer-documentmodeladministrationclient-beginbuildmodel&preserve-view=true)|
+| **Python** | [**begin_build_model**](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.aio.documentmodeladministrationclient?view=azure-python#azure-ai-formrecognizer-aio-documentmodeladministrationclient-begin-build-model&preserve-view=true)
+++
+## Create a composed model
+
+> [!NOTE]
+> **The `create compose model` operation is only available for custom models trained _with_ labels.** Attempting to compose unlabeled models will produce an error.
+
+With the [**create compose model**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/ComposeDocumentModel) operation, you can assign up to 100 trained custom models to a single model ID. When you analyze documents with a composed model, Form Recognizer first classifies the form you submitted, then chooses the best matching assigned model, and returns results for that model. This operation is useful when incoming forms may belong to one of several templates.
+
+### [Form Recognizer Studio](#tab/studio)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Gather your custom model IDs**](#gather-your-model-ids)
+* [**Compose your custom models**](#compose-your-custom-models)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
+
+#### Gather your model IDs
+
+When you train models using the [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/), the model ID is located in the models menu under a project:
++
+#### Compose your custom models
+
+1. Select a custom models project.
+
+1. In the project, select the ```Models``` menu item.
+
+1. From the resulting list of models, select the models you wish to compose.
+
+1. Choose the **Compose** button in the upper-left corner.
+
+1. In the pop-up window, name your newly composed model and select **Compose**.
+
+1. When the operation completes, your newly composed model will appear in the list.
+
+1. Once the model is ready, use the **Test** command to validate it with your test documents and observe the results.
+
+#### Analyze documents
+
+The custom model **Analyze** operation requires you to provide the `modelID` in the call to Form Recognizer. You should provide the composed model ID for the `modelID` parameter in your applications.
++
+#### Manage your composed models
+
+You can manage your custom models throughout their life cycles:
+
+* Test and validate new documents.
+* Download your model to use in your applications.
+* Delete your model when its lifecycle is complete.
++
+### [REST API](#tab/rest)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Compose your custom models**](#compose-your-custom-models)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
++
+#### Compose your custom models
+
+The [compose model API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/ComposeDocumentModel) accepts a list of model IDs to be composed.
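+
+A minimal Python sketch of that call, assuming the `2022-08-31` REST API; the key and model IDs are placeholders:
+
+```python
+import requests
+
+endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
+key = "<your-key>"  # placeholder
+
+response = requests.post(
+    f"{endpoint}/formrecognizer/documentModels:compose?api-version=2022-08-31",
+    headers={"Ocp-Apim-Subscription-Key": key},
+    json={
+        "modelId": "my-composed-model",  # ID for the new composed model (placeholder)
+        "componentModels": [
+            {"modelId": "<model-id-1>"},
+            {"modelId": "<model-id-2>"},
+        ],
+    },
+)
+response.raise_for_status()
+print(response.headers["Operation-Location"])  # poll this URL for completion
+```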
++
+#### Analyze documents
+
+To make an [**Analyze document**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) request, use your composed model ID as the `modelId` in the request path.
++
+#### Manage your composed models
+
+You can manage custom models throughout your development needs including [**copying**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/CopyDocumentModelTo), [**listing**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/GetModels), and [**deleting**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/DeleteModel) your models.
+
+### [Client-libraries](#tab/sdks)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Create a composed model**](#create-a-composed-model)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
+
+#### Create a composed model
+
+You can use the programming language of your choice to create a composed model:
+
+| Programming language| Code sample |
+|--|--|
+|**C#** | [Model compose](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md#create-a-composed-model)
+|**Java** | [Model compose](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md#create-a-composed-model)
+|**JavaScript** | [Compose model](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/composeModel.js)
+|**Python** | [Create composed model](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_create_composed_model.py)
+
+#### Analyze documents
+
+Once you've built your composed model, you can use it to analyze forms and documents. Use your composed `model ID` and let the service decide which of your aggregated custom models fits best according to the document provided.
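+
+Before the language-specific samples below, here's a minimal Python sketch, assuming the `azure-ai-formrecognizer` 3.2.x SDK (method names vary slightly across preview versions); the model ID and document URL are placeholders:
+
+```python
+from azure.ai.formrecognizer import DocumentAnalysisClient
+from azure.core.credentials import AzureKeyCredential
+
+client = DocumentAnalysisClient(
+    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
+    credential=AzureKeyCredential("<your-key>"),  # placeholder
+)
+
+# Analyze with the composed model ID; the service picks the best component model.
+poller = client.begin_analyze_document_from_url(
+    "<composed-model-id>", "https://<your-storage>/form.pdf"  # placeholders
+)
+result = poller.result()
+
+for document in result.documents:
+    print(f"Matched doc type: {document.doc_type}")
+    for name, field in document.fields.items():
+        print(f"{name}: {field.content} (confidence {field.confidence})")
+```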
+
+|Programming language| Code sample |
+|--|--|
+|**C#** | [Analyze a document with a custom/composed model](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_AnalyzeWithCustomModel.md)
+|**Java** | [Analyze forms with your custom/composed model ](https://github.com/Azure/azure-sdk-for-javocumentFromUrl.java)
+|**JavaScript** | [Analyze documents by model ID](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/analyzeReceiptByModelId.js)
+|**Python** | [Analyze custom documents](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_analyze_custom_documents.py)
+
+## Manage your composed models
+
+You can manage a custom model at each stage in its life cycle. You can view a list of all custom models under your subscription, retrieve information about a specific custom model, and delete custom models from your account.
+
+|Programming language| Code sample |
+|--|--|
+|**C#** | [Analyze a document with a custom/composed model](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_AnalyzeWithCustomModel.md)|
+|**Java** | [Custom model management operations](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/src/samples/java/com/azure/ai/formrecognizer/administration/ManageCustomModels.java)|
+|**JavaScript** | [Get model types and schema](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/getModel.js)|
+|**Python** | [Manage models](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_manage_models.py)|
+++
+## Next steps
+
+Try one of our Form Recognizer quickstarts:
+
+> [!div class="nextstepaction"]
+> [Form Recognizer Studio](quickstarts/try-v3-form-recognizer-studio.md)
+
+> [!div class="nextstepaction"]
+> [REST API](quickstarts/get-started-v3-sdk-rest-api.md)
+
+> [!div class="nextstepaction"]
+> [C#](quickstarts/get-started-v3-sdk-rest-api.md#prerequisites)
+
+> [!div class="nextstepaction"]
+> [Java](quickstarts/get-started-v3-sdk-rest-api.md)
+
+> [!div class="nextstepaction"]
+> [JavaScript](quickstarts/get-started-v3-sdk-rest-api.md)
+
+> [!div class="nextstepaction"]
+> [Python](quickstarts/get-started-v3-sdk-rest-api.md)
applied-ai-services Concept Accuracy Confidence https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-accuracy-confidence.md
Form Recognizer analysis results return an estimated confidence for predicted wo
Field confidence indicates an estimated probability between 0 and 1 that the prediction is correct. For example, a confidence value of 0.95 (95%) indicates that the prediction is likely correct 19 out of 20 times. For scenarios where accuracy is critical, confidence may be used to determine whether to automatically accept the prediction or flag it for human review.
-Confidence scores comprise of 2 components, the field level confidence score and the text extraction confidence score. In addition to the field confidence of position and span, the text extraction confidence in the ```pages``` section of the response is the model's confidence in the text extraction (OCR) process. The two confidence scores should be combined to generate a overall confidence score.
+Confidence scores have two data points: the field level confidence score and the text extraction confidence score. In addition to the field confidence of position and span, the text extraction confidence in the ```pages``` section of the response is the model's confidence in the text extraction (OCR) process. The two confidence scores should be combined to generate one overall confidence score.
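+
+For example, one simple combination policy (an illustration, not a prescribed formula) is to multiply the field confidence by the average OCR confidence of the words that make up the field:
+
+```python
+def overall_confidence(field_confidence, word_confidences):
+    """Multiply field confidence by the mean OCR (word) confidence."""
+    if not word_confidences:
+        return field_confidence
+    ocr_confidence = sum(word_confidences) / len(word_confidences)
+    return field_confidence * ocr_confidence
+
+# A field predicted at 0.95 whose words were read at 0.99 and 0.97:
+print(round(overall_confidence(0.95, [0.99, 0.97]), 3))  # 0.931
+```
+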
**Form Recognizer Studio** </br> **Analyzed invoice prebuilt-invoice model**
The accuracy of your model is affected by variances in the visual structure of y
* Separate visually distinct document types to train different models. * As a general rule, if you remove all user entered values and the documents look similar, you need to add more training data to the existing model.
- * If the documents are dissimilar, split your training data into different folders and train a model for each variation. You can then [compose](compose-custom-models.md#create-a-composed-model) the different variations into a single model.
+ * If the documents are dissimilar, split your training data into different folders and train a model for each variation. You can then [compose](compose-custom-models-v2-1.md#create-a-composed-model) the different variations into a single model.
* Make sure that you don't have any extraneous labels.
applied-ai-services Concept Business Card https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-business-card.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
The business card model combines powerful Optical Character Recognition (OCR) ca
## Development options
+The following tools are supported by Form Recognizer v3.0:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**Business card model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|**prebuilt-businessCard**|
+ The following tools are supported by Form Recognizer v2.1: | Feature | Resources | |-|-| |**Business card model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-business-cards)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following tools are supported by Form Recognizer v3.0:
-
-| Feature | Resources | Model ID |
-|-|-|--|
-|**Business card model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-businessCard**|
- ### Try Form Recognizer See how data, including name, job title, address, email, and company name, is extracted from business cards using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
See how data, including name, job title, address, email, and company name, is ex
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer studio is available with the preview (v3.0) API.
+> Form Recognizer studio is available with the v3.0 API.
1. On the Form Recognizer Studio home page, select **Business cards**
See how data, including name, job title, address, email, and company name, is ex
> [!div class="nextstepaction"] > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)
-#### Sample Labeling tool (API v2.1)
-
-You'll need a business card document. You can use our [sample business card document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/businessCard.png).
-
- 1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
-
- 1. Select **Business card** from the **Form Type** dropdown menu:
-
- :::image type="content" source="media/try-business-card.png" alt-text="Screenshot: Sample Labeling tool dropdown prebuilt model selection menu.":::
-
- > [!div class="nextstepaction"]
- > [Try Sample Labeling tool](https://fott-2-1.azurewebsites.net/prebuilts-analyze)
- ## Input requirements * For best results, provide one clear photo or high-quality scan per document.
You'll need a business card document. You can use our [sample business card docu
|--|:-|:| |Business card| <ul><li>English (United States)—en-US</li><li> English (Australia)—en-AU</li><li>English (Canada)—en-CA</li><li>English (United Kingdom)—en-GB</li><li>English (India)—en-IN</li><li>English (Japan)—en-JP</li><li>Japanese (Japan)—ja-JP</li></ul> | Autodetected (en-US or ja-JP) |
-## Field extraction
+## Field extractions
|Name| Type | Description |Standardized output | |:--|:-|:-|:-:|
You'll need a business card document. You can use our [sample business card docu
| WorkPhones | Array of phone numbers | Work phone number(s) from business card | +1 xxx xxx xxxx | | OtherPhones | Array of phone numbers | Other phone number(s) from business card | +1 xxx xxx xxxx |
-## Form Recognizer preview v3.0
+## Form Recognizer v3.0
- The Form Recognizer preview introduces several new features and capabilities.
+ Form Recognizer v3.0 introduces several new features and capabilities.
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
## Next steps
You'll need a business card document. You can use our [sample business card docu
* Explore our REST API: > [!div class="nextstepaction"]
- > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept Composed Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-composed-models.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
recommendations: false
With composed models, you can assign multiple custom models to a composed model called with a single model ID. It's useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction.
-* ```Custom form```and ```Custom document``` models can be composed together into a single composed model when they're trained with the same API version or an API version later than ```2021-06-30-preview```. For more information on composing custom template and custom neural models, see [compose model limits](#compose-model-limits).
+* ```Custom form``` and ```Custom document``` models can be composed together into a single composed model when they're trained with the same API version or an API version later than ```2022-08-31```. For more information on composing custom template and custom neural models, see [compose model limits](#compose-model-limits).
* With the model compose operation, you can assign up to 100 trained custom models to a single composed model. To analyze a document with a composed model, Form Recognizer first classifies the submitted form, chooses the best-matching assigned model, and returns results. * For **_custom template models_**, the composed model can be created using variations of a custom template or different form types. This operation is useful when incoming forms may belong to one of several templates. * The response will include a ```docType``` property to indicate which of the composed models was used to analyze the document.
With composed models, you can assign multiple custom models to a composed model
### Composed model compatibility
- |Custom model type | API Version |Custom form 2021-06-30-preview (v3.0)| Custom document 2021-06-30-preview(v3.0) | Custom form GA version (v2.1) or earlier|
+ |Custom model type | API Version |Custom form `2022-08-31` (v3.0)| Custom document `2022-08-31` (v3.0) | Custom form GA version (v2.1) or earlier|
|--|--|--|--|--|
-|**Custom template** (updated custom form)| 2021-06-30-preview | &#10033;| Γ£ô | X |
-|**Custom neural**| trained with current API version (2021-06-30-preview) |Γ£ô |Γ£ô | X |
+|**Custom template** (updated custom form)| v3.0 | &#10033;| ✓ | X |
+|**Custom neural**| trained with current API version (`2022-08-31`) |✓ |✓ | X |
|**Custom form**| Custom form GA version (v2.1) or earlier | X | X| ✓| **Table symbols**: ✔—supported; X—not supported; ✱—unsupported for this API version, but will be supported in a future API version.
With composed models, you can assign multiple custom models to a composed model
## Development options
-The following resources are supported by Form Recognizer **v3.0** (preview):
+The following resources are supported by Form Recognizer **v3.0**:
| Feature | Resources | |-|-|
-|_**Custom model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[C# SDK](quickstarts/try-v3-csharp-sdk.md)</li><li>[Java SDK](quickstarts/try-v3-java-sdk.md)</li><li>[JavaScript SDK](quickstarts/try-v3-javascript-sdk.md)</li><li>[Python SDK](quickstarts/try-v3-python-sdk.md)</li></ul>|
-| _**Composed model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/ComposeDocumentModel)</li><li>[C# SDK](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentmodeladministrationclient.startcreatecomposedmodel?view=azure-dotnet-preview&preserve-view=true)</li><li>[Java SDK](/java/api/com.azure.ai.formrecognizer.administration.documentmodeladministrationclient.begincreatecomposedmodel?view=azure-java-preview&preserve-view=true)</li><li>[JavaScript SDK](/javascript/api/@azure/ai-form-recognizer/documentmodeladministrationclient?view=azure-node-preview#@azure-ai-form-recognizer-documentmodeladministrationclient-begincomposemodel&preserve-view=true)</li><li>[Python SDK](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formtrainingclient?view=azure-python-preview#azure-ai-formrecognizer-formtrainingclient-begin-create-composed-model&preserve-view=true)</li></ul>|
+|_**Custom model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[C# SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[Java SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[JavaScript SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[Python SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|
+| _**Composed model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/ComposeDocumentModel)</li><li>[C# SDK](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentmodeladministrationclient.startcreatecomposedmodel?view=azure-dotnet&preserve-view=true)</li><li>[Java SDK](/java/api/com.azure.ai.formrecognizer.administration.documentmodeladministrationclient.begincreatecomposedmodel?view=azure-java-stable&preserve-view=true)</li><li>[JavaScript SDK](/javascript/api/@azure/ai-form-recognizer/documentmodeladministrationclient?view=azure-node-latest#@azure-ai-form-recognizer-documentmodeladministrationclient-begincomposemodel&preserve-view=true)</li><li>[Python SDK](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formtrainingclient?view=azure-python#azure-ai-formrecognizer-formtrainingclient-begin-create-composed-model&preserve-view=true)</li></ul>|
The following resources are supported by Form Recognizer v2.1:
Learn to create and compose custom models:

> [!div class="nextstepaction"]
-> [**Form Recognizer v2.1 (GA)**](compose-custom-models.md)
+> [**Form Recognizer v2.1**](compose-custom-models-v2-1.md)
applied-ai-services Concept Custom Neural https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-neural.md
Custom neural models currently only support key-value pairs and selection marks,
## Tabular fields
-With the release of API version **2022-06-30-preview**, custom neural models will support tabular fields (tables):
+With the release of API versions **2022-06-30-preview** and later, custom neural models will support tabular fields (tables):
-* Models trained with API version 2022-06-30-preview or later will accept tabular field labels.
+* Models trained with API version 2022-08-31 or later will accept tabular field labels.
* Documents analyzed with custom neural models using API version 2022-06-30-preview or later will produce tabular fields aggregated across the tables.
* The results can be found in the ```analyzeResult``` object's ```documents``` array that is returned following an analysis operation, as the sketch below shows.
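For illustration, a sketch of walking an aggregated tabular field in that array; the response fragment and the field name `Items` are hypothetical stand-ins for whatever you labeled during training:

```python
# Illustrative fragment of the analyze JSON (shape per the v3 result; the
# field name "Items" and all values are made up for this example).
analyze_result = {
    "documents": [
        {
            "docType": "my-neural-model",
            "fields": {
                "Items": {
                    "type": "array",
                    "valueArray": [
                        {
                            "type": "object",
                            "valueObject": {
                                "Description": {"type": "string", "content": "Widget"},
                                "Amount": {"type": "string", "content": "12.00"},
                            },
                        }
                    ],
                }
            },
        }
    ]
}

# Rows labeled across multiple tables or pages arrive aggregated in one array.
for row in analyze_result["documents"][0]["fields"]["Items"]["valueArray"]:
    cells = row["valueObject"]
    print({name: cell.get("content") for name, cell in cells.items()})
```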
Tabular fields are also useful when extracting repeating information within a do
## Supported regions
-Starting August 01, 2022, Form Recognizer custom neural model training will only be available in the following Azure regions until further notice:
+As of August 01, 2022, Form Recognizer custom neural model training will only be available in the following Azure regions until further notice:
* Brazil South
* Canada Central
Starting August 01, 2022, Form Recognizer custom neural model training will only
> [!TIP]
> You can [copy a model](disaster-recovery.md#copy-api-overview) trained in one of the select regions listed above to **any other region** and use it accordingly.
>
-> Use the [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/CopyDocumentModelTo) or [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) to copy a model to another region.
+> Use the [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/CopyDocumentModelTo) or [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects) to copy a model to another region.
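The copy itself is a two-step exchange: the target resource issues a copy authorization, and the source resource performs the copy with it. A hedged sketch against the REST endpoints named above, where both endpoints and keys are placeholders:

```python
import requests

# Placeholders for two Form Recognizer resources in different regions.
source_endpoint = "https://<source-resource>.cognitiveservices.azure.com"
target_endpoint = "https://<target-resource>.cognitiveservices.azure.com"
source_key, target_key = "<source-key>", "<target-key>"
params = {"api-version": "2022-08-31"}

# Step 1: ask the target resource to authorize receiving the model.
auth = requests.post(
    f"{target_endpoint}/formrecognizer/documentModels:authorizeCopy",
    params=params,
    headers={"Ocp-Apim-Subscription-Key": target_key},
    json={"modelId": "my-neural-model"},
)
auth.raise_for_status()

# Step 2: tell the source resource to copy the model using that authorization.
copy = requests.post(
    f"{source_endpoint}/formrecognizer/documentModels/my-neural-model:copyTo",
    params=params,
    headers={"Ocp-Apim-Subscription-Key": source_key},
    json=auth.json(),
)
copy.raise_for_status()
print(copy.headers["Operation-Location"])  # poll this URL for completion
```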
## Best practices
Custom neural models are only available in the [v3 API](v3-migration-guide.md).
| Document Type | REST API | SDK | Label and Test Models|
|--|--|--|--|
-| Custom document | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
+| Custom document | [Form Recognizer 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)| [Form Recognizer SDK](quickstarts/get-started-v3-sdk-rest-api.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
The build operation to train a model supports a new ```buildMode``` property. To train a custom neural model, set the ```buildMode``` to ```neural```.

```REST
-https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-06-30
+https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-08-31
{ "modelId": "string",
```
https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-06-30
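A minimal sketch of submitting that build request with Python's `requests`; the resource endpoint, key, and the SAS URL of the container holding your labeled training data are placeholders:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"

resp = requests.post(
    f"{endpoint}/formrecognizer/documentModels:build",
    params={"api-version": "2022-08-31"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={
        "modelId": "my-neural-model",
        "buildMode": "neural",  # switch to "template" for a template model
        "azureBlobSource": {"containerUrl": "<sas-url-to-training-data>"},
    },
)
resp.raise_for_status()
print(resp.headers["Operation-Location"])  # training runs asynchronously
```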
* View the REST API:

> [!div class="nextstepaction"]
- > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept Custom Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-template.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false

# Form Recognizer custom template model
-Custom template (formerly custom form) are easy-to-train models that accurately extract labeled key-value pairs, selection marks, tables, regions, and signatures from documents. Template models use layout cues to extract values from documents and are suitable to extract fields from highly structured documents with defined visual templates.
+Custom template (formerly custom form) is an easy-to-train model that accurately extracts labeled key-value pairs, selection marks, tables, regions, and signatures from documents. Template models use layout cues to extract values from documents and are suitable to extract fields from highly structured documents with defined visual templates.
Custom template models share the same labeling format and strategy as custom neural models, with support for more field types and languages.
Custom template models support key-value pairs, selection marks, tables, signatu
| Form fields | Selection marks | Tabular fields (Tables) | Signature | Selected regions |
|:--:|:--:|:--:|:--:|:--:|
-| Supported| Supported | Supported | Preview | Supported |
+| Supported | Supported | Supported | Supported | Supported |
## Tabular fields
-With the release of API version **2022-06-30-preview**, custom template models will add support for **cross page** tabular fields (tables):
+With the release of API versions **2022-06-30-preview** and later, custom template models will add support for **cross page** tabular fields (tables):
* To label a table that spans multiple pages, label each row of the table across the different pages in a single table.
* As a best practice, ensure that your dataset contains a few samples of the expected variations. For example, include samples where the entire table is on a single page and where tables span two or more pages if you expect to see those variations in documents.
Template models rely on a defined visual template, changes to the template will
## Training a model
-Template models are available generally [v2.1 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) and in preview [v3 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/BuildDocumentModel). If you're starting with a new project or have an existing labeled dataset, work with the v3 API and Form Recognizer Studio to train a custom template model.
+Template models are generally available in the [v3.0 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/BuildDocumentModel) and the [v2.1 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm). If you're starting with a new project or have an existing labeled dataset, work with the v3 API and Form Recognizer Studio to train a custom template model.
| Model | REST API | SDK | Label and Test Models|
|--|--|--|--|
-| Custom template (preview) | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
-| Custom template | [Form Recognizer 2.1 (GA)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)| [Form Recognizer SDK](quickstarts/get-started-sdk-rest-api.md?pivots=programming-language-python)| [Form Recognizer Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
+| Custom template | [Form Recognizer 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)| [Form Recognizer SDK](quickstarts/get-started-v3-sdk-rest-api.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
+| Custom template | [Form Recognizer 2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)| [Form Recognizer SDK](quickstarts/get-started-v2-1-sdk-rest-api.md?pivots=programming-language-python)| [Form Recognizer Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
On the v3 API, the build operation to train a model supports a new ```buildMode``` property. To train a custom template model, set the ```buildMode``` to ```template```.

```REST
-https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-06-30
+https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-08-31
{ "modelId": "string",
```
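Training runs asynchronously for both build modes. A sketch of polling the `Operation-Location` URL returned by the build call until the model is ready; the URL and key are placeholders, and the polling interval is an arbitrary choice:

```python
import time
import requests

key = "<your-key>"
operation_url = "<operation-location-from-the-build-response>"

# Poll the asynchronous build operation until it settles.
while True:
    op = requests.get(
        operation_url, headers={"Ocp-Apim-Subscription-Key": key}
    ).json()
    if op["status"] in ("succeeded", "failed"):
        break
    time.sleep(5)

if op["status"] == "succeeded":
    # The trained model ID can now be used in analyze requests.
    print(op["result"]["modelId"])
```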
applied-ai-services Concept Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
This table provides links to the build mode programming language SDK references
|Programming language | SDK reference | Code sample |
|---|---|---|
-| C#/.NET | [DocumentBuildMode Struct](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentbuildmode?view=azure-dotnet-preview&preserve-view=true#properties) | [Sample_BuildCustomModelAsync.cs](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/tests/samples/Sample_BuildCustomModelAsync.cs)
+| C#/.NET | [DocumentBuildMode Struct](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentbuildmode?view=azure-dotnet&preserve-view=true#properties) | [Sample_BuildCustomModelAsync.cs](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/tests/samples/Sample_BuildCustomModelAsync.cs)
|Java| [DocumentBuildMode Class](/java/api/com.azure.ai.formrecognizer.administration.models.documentbuildmode?view=azure-java-preview&preserve-view=true#fields) | [BuildModel.java](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/src/samples/java/com/azure/ai/formrecognizer/administration/BuildModel.java)|
-|JavaScript | [DocumentBuildMode type](/javascript/api/@azure/ai-form-recognizer/documentbuildmode?view=azure-node-preview&preserve-view=true)| [buildModel.js](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/buildModel.js)|
-|Python | [DocumentBuildMode Enum](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.documentbuildmode?view=azure-python-preview&preserve-view=true#fields)| [sample_build_model.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_build_model.py)|
+|JavaScript | [DocumentBuildMode type](/javascript/api/@azure/ai-form-recognizer/documentbuildmode?view=azure-node-latest&preserve-view=true)| [buildModel.js](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/buildModel.js)|
+|Python | [DocumentBuildMode Enum](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.documentbuildmode?view=azure-python&preserve-view=true#fields)| [sample_build_model.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_build_model.py)|
## Compare model features
The table below compares custom template and custom neural features:
## Custom model tools
-The following tools are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID|
|---|---|:---|
-|Custom model| <ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net)</li><li>[REST API](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-forms-with-a-custom-model)</li><li>[Client library SDK](quickstarts/try-sdk-rest-api.md)</li><li>[Form Recognizer Docker container](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|***custom-model-id***|
+|Custom model| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[C# SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[Python SDK](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|***custom-model-id***|
-The following tools are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources | Model ID|
|---|---|:---|
-|Custom model| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[C# SDK](quickstarts/try-v3-csharp-sdk.md)</li><li>[Python SDK](quickstarts/try-v3-python-sdk.md)</li></ul>|***custom-model-id***|
+|Custom model| <ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net)</li><li>[REST API](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-forms-with-a-custom-model)</li><li>[Client library SDK](quickstarts/try-sdk-rest-api.md)</li><li>[Form Recognizer Docker container](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|***custom-model-id***|
+ ### Try Form Recognizer
Try extracting data from your specific or unique documents using custom models.
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot that shows the keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer Studio is available with the preview (v3.0) API.
+> Form Recognizer Studio is available with the v3.0 API.
1. On the **Form Recognizer Studio** home page, select **Custom form**.
Try extracting data from your specific or unique documents using custom models.
> [!div class="nextstepaction"]
> [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)
-#### Sample Labeling tool (API v2.1)
--
-|Feature |Custom Template | Custom Neural |
-|--|--|--|
-|Document structure |Template, fixed form, and structured documents.| Structured, semi-structured, and unstructured documents.|
-|Training time | 1 - 5 minutes | 20 - 60 minutes |
-|Data extraction| Key-value pairs, tables, selection marks, signatures, and regions| Key-value pairs and selections marks.|
-|Models per Document type | Requires one model per each document-type variation| Supports a single model for all document-type variations.|
-|Language support| See [custom template model language support](language-support.md)| The custom neural model currently supports English-language documents only.|
## Model capabilities

This table compares the supported data extraction areas:

|Model| Form fields | Selection marks | Structured fields (Tables) | Signature | Region labeling |
|--|:--:|:--:|:--:|:--:|:--:|
-|Custom template| ✔ | ✔ | ✔ | ✱ | ✔ |
+|Custom template| ✔ | ✔ | ✔ | ✔ | ✔ |
|Custom neural| ✔ | ✔ |**n/a**| **n/a** | **n/a** |
-**Table symbols**: ✔ = supported; ✱ = preview; n/a = currently unavailable
+**Table symbols**: ✔ = supported; n/a = currently unavailable
> [!TIP]
> When choosing between the two model types, start with a custom neural model if it meets your functional needs. See [custom neural](concept-custom-neural.md) to learn more about custom neural models.
The following table describes the features available with the associated tools a
| Document type | REST API | SDK | Label and Test Models| |--|--|--|--|
-| Custom form 2.1 | [Form Recognizer 2.1 GA API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) | [Form Recognizer SDK](quickstarts/get-started-sdk-rest-api.md?pivots=programming-language-python)| [Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
-| Custom template 3.0 | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
-| Custom neural | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
-
+| Custom form 2.1 | [Form Recognizer 2.1 GA API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) | [Form Recognizer SDK](quickstarts/get-started-v2-1-sdk-rest-api.md?pivots=programming-language-python)| [Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
+| Custom template 3.0 | [Form Recognizer 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)| [Form Recognizer SDK](quickstarts/get-started-v3-sdk-rest-api.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
+| Custom neural | [Form Recognizer 3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)| [Form Recognizer SDK](quickstarts/get-started-v3-sdk-rest-api.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
> [!NOTE]
> Custom template models trained with the 3.0 API will have a few improvements over the 2.1 API stemming from improvements to the OCR engine. Datasets used to train a custom template model using the 2.1 API can still be used to train a new model using the 3.0 API.
The [Sample Labeling tool](https://fott-2-1.azurewebsites.net/) doesn't support
## Supported languages and locales
- The Form Recognizer preview version introduces more language support for custom models. For a list of supported handwritten and printed text, see [Language support](language-support.md).
+ Form Recognizer v3.0 introduces more language support for custom models. For a list of supported handwritten and printed text, see [Language support](language-support.md).
-## Form Recognizer v3.0 (preview)
+## Form Recognizer v3.0
- Form Recognizer v3.0 (preview) introduces several new features and capabilities:
+ Form Recognizer v3.0 introduces several new features and capabilities:
* **Custom model API (v3.0)**: This version supports signature detection for custom forms. When you train custom models, you can specify certain fields as signatures. When a document is analyzed with your custom model, it indicates whether a signature was detected or not.
-* [Form Recognizer v3.0 migration guide](v3-migration-guide.md): This guide shows you how to use the preview version in your applications and workflows.
-* [REST API (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument): This API shows you more about the preview version and new capabilities.
+* [Form Recognizer v3.0 migration guide](v3-migration-guide.md): This guide shows you how to use v3.0 in your applications and workflows.
+* [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument): This reference describes v3.0 and its new capabilities.
### Try signature detection
Explore Form Recognizer quickstarts and REST APIs:
| Quickstart | REST API|
|--|--|
-|[v3.0 Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) |[Form Recognizer v3.0 API 2022-06-30](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)|
-| [v2.1 quickstart](quickstarts/get-started-sdk-rest-api.md) | [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/BuildDocumentModel) |
+|[v3.0 Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) |[Form Recognizer v3.0 API 2022-08-31](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)|
+| [v2.1 quickstart](quickstarts/get-started-v2-1-sdk-rest-api.md) | [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/BuildDocumentModel) |
+
applied-ai-services Concept Form Recognizer Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-form-recognizer-studio.md
Title: "Form Recognizer Studio | Preview"
+ Title: "Form Recognizer Studio"
-description: "Concept: Form and document processing, data extraction, and analysis using Form Recognizer Studio (preview)"
+description: "Concept: Form and document processing, data extraction, and analysis using Form Recognizer Studio"
Previously updated : 11/02/2021 Last updated : 08/22/2022 -
-# Form Recognizer Studio (preview)
+# Form Recognizer Studio
->[!NOTE]
-> Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
-
-[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pre-trained models. Build custom template models and reference the models in your applications using the [Python SDK preview](quickstarts/try-v3-python-sdk.md) and other quickstarts.
+[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pre-trained models. Build custom template models and reference the models in your applications using the [Python SDK v3.0](quickstarts/get-started-v3-sdk-rest-api.md) and other quickstarts.
The following image shows the Invoice prebuilt model feature at work.
The following Form Recognizer service features are available in the Studio.
-* **Read**: Try out Form Recognizer's Read feature to extract text lines, words, detected languages, and handwritten style if detected. Start with the [Studio Read feature](https://formrecognizer.appliedai.azure.com/studio/read). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Read overview](concept-read.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/try-v3-python-sdk.md).
+* **Read**: Try out Form Recognizer's Read feature to extract text lines, words, detected languages, and handwritten style if detected. Start with the [Studio Read feature](https://formrecognizer.appliedai.azure.com/studio/read). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Read overview](concept-read.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/get-started-v3-sdk-rest-api.md).
-* **Layout**: Try out Form Recognizer's Layout feature to extract text, tables, selection marks, and structure information. Start with the [Studio Layout feature](https://formrecognizer.appliedai.azure.com/studio/layout). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Layout overview](concept-layout.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/try-v3-python-sdk.md#layout-model).
+* **Layout**: Try out Form Recognizer's Layout feature to extract text, tables, selection marks, and structure information. Start with the [Studio Layout feature](https://formrecognizer.appliedai.azure.com/studio/layout). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Layout overview](concept-layout.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/get-started-v3-sdk-rest-api.md#layout-model).
-* **General Documents**: Try out Form Recognizer's General Documents feature to extract key-value pairs and entities. Start with the [Studio General Documents feature](https://formrecognizer.appliedai.azure.com/studio/document). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [General Documents overview](concept-general-document.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/try-v3-python-sdk.md#general-document-model).
+* **General Documents**: Try out Form Recognizer's General Documents feature to extract key-value pairs and entities. Start with the [Studio General Documents feature](https://formrecognizer.appliedai.azure.com/studio/document). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [General Documents overview](concept-general-document.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model).
-* **Prebuilt models**: Form Recognizer's pre-built models enable you to add intelligent document processing to your apps and flows without having to train and build your own models. As an example, start with the [Studio Invoice feature](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice). Explore with sample documents and your documents. Use the interactive visualization, extracted fields list, and JSON output to understand how the feature works. See the [Models overview](concept-model-overview.md) to learn more and get started with the [Python SDK quickstart for Prebuilt Invoice](quickstarts/try-v3-python-sdk.md#prebuilt-model).
+* **Prebuilt models**: Form Recognizer's pre-built models enable you to add intelligent document processing to your apps and flows without having to train and build your own models. As an example, start with the [Studio Invoice feature](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice). Explore with sample documents and your documents. Use the interactive visualization, extracted fields list, and JSON output to understand how the feature works. See the [Models overview](concept-model-overview.md) to learn more and get started with the [Python SDK quickstart for Prebuilt Invoice](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model).
-* **Custom models**: Form Recognizer's custom models enable you to extract fields and values from models trained with your data, tailored to your forms and documents. Create standalone custom models or combine two or more custom models to create a composed model to extract data from multiple form types. Start with the [Studio Custom models feature](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects). Use the online wizard, labeling interface, training step, and visualizations to understand how the feature works. Test the custom model with your sample documents and iterate to improve the model. See the [Custom models overview](concept-custom.md) to learn more and use the [Form Recognizer v3.0 preview migration guide](v3-migration-guide.md) to start integrating the new models with your applications.
+* **Custom models**: Form Recognizer's custom models enable you to extract fields and values from models trained with your data, tailored to your forms and documents. Create standalone custom models or combine two or more custom models to create a composed model to extract data from multiple form types. Start with the [Studio Custom models feature](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects). Use the online wizard, labeling interface, training step, and visualizations to understand how the feature works. Test the custom model with your sample documents and iterate to improve the model. See the [Custom models overview](concept-custom.md) to learn more and use the [Form Recognizer v3.0 migration guide](v3-migration-guide.md) to start integrating the new models with your applications.
## Next steps

* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn the differences from the previous version of the REST API.
-* Explore our [**preview SDK quickstarts**](quickstarts/try-v3-python-sdk.md) to try the preview features in your applications using the new SDKs.
-* Refer to our [**preview REST API quickstarts**](quickstarts/try-v3-rest-api.md) to try the preview features using the new REST API.
+* Explore our [**v3.0 SDK quickstarts**](quickstarts/get-started-v3-sdk-rest-api.md) to try the v3.0 features in your applications using the new SDKs.
+* Refer to our [**v3.0 REST API quickstarts**](quickstarts/get-started-v3-sdk-rest-api.md) to try the v3.0 features using the new REST API.
> [!div class="nextstepaction"]
-> [Form Recognizer Studio (preview) quickstart](quickstarts/try-v3-form-recognizer-studio.md)
+> [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md)
applied-ai-services Concept General Document https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-general-document.md
Title: Form Recognizer general document model | Preview
+ Title: Form Recognizer general document model
-description: Concepts related to data extraction and analysis using prebuilt general document preview model
+description: Concepts related to data extraction and analysis using prebuilt general document v3.0 model
Previously updated : 07/20/2022 Last updated : 08/22/2022 recommendations: false <!-- markdownlint-disable MD033 -->
-# Form Recognizer general document model (preview)
+# Form Recognizer general document model
-The General document preview model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to extract key-value pairs, tables, and selection marks from documents. General document is only available with the preview (v3.0) API. For more information on using the preview (v3.0) API, see our [migration guide](v3-migration-guide.md).
+The General document v3.0 model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to extract key-value pairs, tables, and selection marks from documents. General document is only available with the v3.0 API. For more information on using the v3.0 API, see our [migration guide](v3-migration-guide.md).
The general document API supports most form types and will analyze your documents and extract keys and associated values. It's ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to training a custom model without labels.

> [!NOTE]
-> The ```2022-06-30``` update to the general document model adds support for selection marks.
+> The ```2022-06-30``` and later versions of the general document model add support for selection marks.
## General document features
The following tools are supported by Form Recognizer v3.0:
| Feature | Resources |
|-|-|
-|🆕 **General document model**|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|
+| **General document model**|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|
### Try Form Recognizer
You'll need the following resources:
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer studio and the general document model are available with the preview (v3.0) API.
+> Form Recognizer Studio and the general document model are available with the v3.0 API.
1. On the Form Recognizer Studio home page, select **General documents**
You'll need the following resources:
Key-value pairs are specific spans within the document that identify a label or key and its associated response or value. In a structured form, these pairs could be the label and the value the user entered for that field. In an unstructured document, they could be the date a contract was executed on based on the text in a paragraph. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures.
-Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are spans of text contained in the document. If you have documents where the same value is described in different ways, for example, customer and user, the associated key will be either customer or user based on context.
+Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are spans of text contained in the document. If you have documents where the same value is described in different ways, for example, customer and user, the associated key will be either customer or user, based on context.
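To see those pairs in practice, here's a hedged sketch of calling the general document model through the v3.0 REST API and printing every extracted pair; the endpoint, key, and document URL are placeholders:

```python
import time
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"
headers = {"Ocp-Apim-Subscription-Key": key}

# Submit a publicly reachable document URL to the prebuilt general document model.
resp = requests.post(
    f"{endpoint}/formrecognizer/documentModels/prebuilt-document:analyze",
    params={"api-version": "2022-08-31"},
    headers=headers,
    json={"urlSource": "<url-to-your-document>"},
)
resp.raise_for_status()
result_url = resp.headers["Operation-Location"]

# Analysis is asynchronous; poll until it completes.
while True:
    analysis = requests.get(result_url, headers=headers).json()
    if analysis["status"] in ("succeeded", "failed"):
        break
    time.sleep(2)

# Each pair carries a key span, an optional value span, and a confidence score.
for pair in analysis["analyzeResult"].get("keyValuePairs", []):
    value = pair.get("value", {}).get("content")  # keys can exist in isolation
    print(pair["key"]["content"], "->", value, pair["confidence"])
```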
## Data extraction
Keys can also exist in isolation when the model detects that a key exists, with
## Next steps
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
> [!div class="nextstepaction"]
> [Try the Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
applied-ai-services Concept Id Document https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-id-document.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
recommendations: false
# Form Recognizer ID document model
-The ID document model combines Optical Character Recognition (OCR) with deep learning models to analyze and extracts key information from US Drivers Licenses (all 50 states and District of Columbia) and international passport biographical pages (excludes visa and other travel documents). The API analyzes identity documents, extracts key information, and returns a structured JSON data representation.
+The ID document model combines Optical Character Recognition (OCR) with deep learning models to analyze and extract key information from US driver's licenses (all 50 states and District of Columbia), international passport biographical pages, US state IDs, Social Security cards, green cards, and more. The API analyzes identity documents, extracts key information, and returns a structured JSON data representation.
***Sample U.S. Driver's License processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)***
The ID document model combines Optical Character Recognition (OCR) with deep lea
## Development options
+The following tools are supported by Form Recognizer v3.0:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**ID document model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|**prebuilt-idDocument**|
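As a quick illustration of the v3.0 output shape, a sketch of reading the typed fields from an already fetched analysis result; the JSON fragment and its values are made up for this example:

```python
# Illustrative fragment of a prebuilt-idDocument analyze result (field names
# follow the extraction tables below; the values are fabricated).
analysis = {
    "analyzeResult": {
        "documents": [
            {
                "docType": "idDocument.driverLicense",
                "fields": {
                    "FirstName": {"type": "string", "valueString": "JENNIFER"},
                    "DocumentNumber": {"type": "string", "valueString": "034568"},
                    "DateOfExpiration": {"type": "date", "valueDate": "2027-01-01"},
                },
            }
        ]
    }
}

doc = analysis["analyzeResult"]["documents"][0]
print(doc["docType"])
for name, field in doc["fields"].items():
    # Typed values live under type-specific keys such as valueString/valueDate.
    print(name, "=", field.get(f"value{field['type'].capitalize()}"))
```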
+ The following tools are supported by Form Recognizer v2.1:

| Feature | Resources |
|-|-|
|**ID document model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-identity-id-documents)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following tools are supported by Form Recognizer v3.0:
-
-| Feature | Resources | Model ID |
-|-|-|--|
-|**ID document model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-idDocument**|
- ### Try Form Recognizer
-Extract data, including name, birth date, machine-readable zone, and expiration date, from ID documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
+Extract data, including name, birth date, machine-readable zone, and expiration date, from ID documents using the Form Recognizer Studio. You'll need the following resources:
* An Azure subscriptionΓÇöyou can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
Extract data, including name, birth date, machine-readable zone, and expiration
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer studio is available with the preview (v3.0) API.
+> Form Recognizer Studio is available with the v3.0 API (API version 2022-08-31, generally available (GA) release).
1. On the Form Recognizer Studio home page, select **Identity documents**
Extract data, including name, birth date, machine-readable zone, and expiration
> [!div class="nextstepaction"]
> [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)
-#### Sample Labeling tool (API v2.1)
-
-You'll need an ID document. You can use our [sample ID document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/DriverLicense.png).
-
-1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
-
-1. Select **Identity documents** from the **Form Type** dropdown menu:
-
- :::image type="content" source="media/try-id-document.png" alt-text="Screenshot: Sample Labeling tool dropdown prebuilt model selection menu.":::
-
- > [!div class="nextstepaction"]
- > [Try Sample Labeling tool](https://fott-2-1.azurewebsites.net/prebuilts-analyze)
## Input requirements

[!INCLUDE [input requirements](./includes/input-requirements.md)]
-> [!NOTE]
-> The [Sample Labeling tool](https://fott-2-1.azurewebsites.net/) does not support the BMP file format. This is a limitation of the tool not the Form Recognizer Service.
-
-## Supported languages and locales v2.1
+## Supported languages and locales
| Model | Language–Locale code | Default |
|--|:-|:-|
-|ID document| <ul><li>English (United States)–en-US (driver's license)</li><li>Biographical pages from international passports</br> (excluding visa and other travel documents)</li></ul></br>|English (United States)–en-US|
+|ID document| <ul><li>English (United States)–en-US (driver's license)</li><li>Biographical pages from international passports</br> (excluding visa and other travel documents)</li><li>English (United States)–en-US (state ID)</li><li>English (United States)–en-US (Social Security card)</li><li>English (United States)–en-US (green card)</li></ul></br>|English (United States)–en-US|
-## Field extraction
+## Field extractions
|Name| Type | Description | Standardized output|
|:--|:-|:-|:-|
You'll need an ID document. You can use our [sample ID document](https://raw.git
| Address | String | Extracted address (Driver's License only) ||
| Region | String | Extracted region, state, province, etc. (Driver's License only) | |
-## Form Recognizer preview v3.0
+## Form Recognizer v3.0
- The Form Recognizer preview v3.0 introduces several new features and capabilities:
+ Form Recognizer v3.0 introduces several new features and capabilities:
* **ID document (v3.0)** prebuilt model supports extraction of endorsement, restriction, and vehicle class codes from US driver's licenses.
-* The ID Document **2022-06-30-preview** release supports the following data extraction from US driver's licenses:
+* The ID Document **2022-06-30** and later releases support the following data extraction from US driver's licenses:
* Date issued
* Height
You'll need an ID document. You can use our [sample ID document](https://raw.git
* Hair color
* Document discriminator security code
-### ID document preview field extraction
+### ID document field extractions
|Name| Type | Description | Standardized output|
|:--|:-|:-|:-|
-| 🆕 DateOfIssue | Date | Issue date | yyyy-mm-dd |
-| 🆕 Height | String | Height of the holder. | |
-| 🆕 Weight | String | Weight of the holder. | |
-| 🆕 EyeColor | String | Eye color of the holder. | |
-| 🆕 HairColor | String | Hair color of the holder. | |
-| 🆕 DocumentDiscriminator | String | Document discriminator is a security code that identifies where and when the license was issued. | |
+| DateOfIssue | Date | Issue date | yyyy-mm-dd |
+| Height | String | Height of the holder. | |
+| Weight | String | Weight of the holder. | |
+| EyeColor | String | Eye color of the holder. | |
+| HairColor | String | Hair color of the holder. | |
+| DocumentDiscriminator | String | Document discriminator is a security code that identifies where and when the license was issued. | |
| Endorsements | String | More driving privileges granted to a driver such as Motorcycle or School bus. | |
| Restrictions | String | Restricted driving privileges applicable to suspended or revoked licenses.| |
| VehicleClassification | String | Types of vehicles that can be driven by a driver. ||
You'll need an ID document. You can use our [sample ID document](https://raw.git
| Nationality | countryRegion | Country or region code compliant with ISO 3166 standard (Passport only) | |
| Sex | String | Possible extracted values include "M", "F" and "X" | |
| MachineReadableZone | Object | Extracted Passport MRZ including two lines of 44 characters each | "P<USABROOKS<<JENNIFER<<<<<<<<<<<<<<<<<<<<<<< 3400200135USA8001014F1905054710000307<715816" |
-| DocumentType | String | Document type, for example, Passport, Driver's License | "passport" |
-| Address | String | Extracted address (Driver's License only) ||
+| DocumentType | String | Document type, for example, Passport, Driver's License, Social security card and more | "passport" |
+| Address | String | Extracted address; the address is also parsed into its components: address, city, state, country, zip code ||
| Region | String | Extracted region, state, province, etc. (Driver's License only) | |

### Migration guide and REST API v3.0
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
## Next steps
You'll need an ID document. You can use our [sample ID document](https://raw.git
* Explore our REST API:

> [!div class="nextstepaction"]
- > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept Invoice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-invoice.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
recommendations: false
## Development options
+The following tools are supported by Form Recognizer v3.0:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**Invoice model** | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|**prebuilt-invoice**|
+ The following tools are supported by Form Recognizer v2.1:

| Feature | Resources |
|-|-|
|**Invoice model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-invoices)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following tools are supported by Form Recognizer v3.0:
-
-| Feature | Resources | Model ID |
-|-|-|--|
-|**Invoice model** | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-invoice**|
### Try Form Recognizer
-See how data, including customer information, vendor details, and line items, is extracted from invoices using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
+See how data, including customer information, vendor details, and line items, is extracted from invoices using the Form Recognizer Studio. You'll need the following resources:
* An Azure subscriptionΓÇöyou can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
See how data, including customer information, vendor details, and line items, is
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
1. On the Form Recognizer Studio home page, select **Invoices**
See how data, including customer information, vendor details, and line items, is
> [!div class="nextstepaction"]
> [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)
-#### Sample Labeling tool (API v2.1)
-> [!NOTE]
-> Unless you must use API v2.1, it is strongly suggested that you use the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com) for testing purposes instead of the sample labeling tool.
-
-You'll need an invoice document. You can use our [sample invoice document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-invoice.pdf).
-
-1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
-
-1. Select **Invoice** from the **Form Type** dropdown menu:
-
- :::image type="content" source="media/try-invoice.png" alt-text="Screenshot: Sample Labeling tool dropdown prebuilt model selection menu.":::
-
- > [!div class="nextstepaction"]
- > [Try Sample Labeling tool](https://fott-2-1.azurewebsites.net/prebuilts-analyze)
## Input requirements

[!INCLUDE [input requirements](./includes/input-requirements.md)]
You'll need an invoice document. You can use our [sample invoice document](https
|--|:-|:-|
|Invoice| <ul><li>English (United States)–en-US</li></ul>| English (United States)–en-US|
|Invoice| <ul><li>Spanish–es</li></ul>| Spanish (United States)–es|
-|Invoice (preview)| <ul><li>German–de</li></ul>| German (Germany)-de|
-|Invoice (preview)| <ul><li>French–fr</li></ul>| French (France)–fr|
-|Invoice (preview)| <ul><li>Italian–it</li></ul>| Italian (Italy)–it|
-|Invoice (preview)| <ul><li>Portuguese–pt</li></ul>| Portuguese (Portugal)–pt|
-|Invoice (preview)| <ul><li>Dutch–nl</li></ul>| Dutch (Netherlands)–nl|
+|Invoice | <ul><li>German–de</li></ul>| German (Germany)-de|
+|Invoice | <ul><li>French–fr</li></ul>| French (France)–fr|
+|Invoice | <ul><li>Italian–it</li></ul>| Italian (Italy)–it|
+|Invoice | <ul><li>Portuguese–pt</li></ul>| Portuguese (Portugal)–pt|
+|Invoice | <ul><li>Dutch–nl</li></ul>| Dutch (Netherlands)–nl|
## Field extraction
Following are the line items extracted from an invoice in the JSON output respon
The invoice key-value pairs and line items extracted are in the `documentResults` section of the JSON output.
-### Key-value pairs (Preview)
+### Key-value pairs
-The prebuilt invoice **2022-06-30-preview** release returns key-value pairs at no extra cost. Key-value pairs are specific spans within the invoice that identify a label or key and its associated response or value. In an invoice, these pairs could be the label and the value the user entered for that field or telephone number. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures.
+The prebuilt invoice **2022-06-30** and later releases return key-value pairs at no extra cost. Key-value pairs are specific spans within the invoice that identify a label or key and its associated response or value. In an invoice, these pairs could be the label and the value the user entered for that field or telephone number. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures.
Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are always spans of text contained in the document. If you have documents where the same value is described in different ways, for example, a customer or a user, the associated key will be either customer or user based on context.
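A brief sketch of submitting a local invoice PDF to the v3.0 endpoint as a binary body rather than a URL; the file name, endpoint, and key are placeholders, and the final JSON is fetched by polling `Operation-Location` as with the other models:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"

# Post the raw PDF bytes; the Content-Type tells the service the file format.
with open("sample-invoice.pdf", "rb") as f:
    resp = requests.post(
        f"{endpoint}/formrecognizer/documentModels/prebuilt-invoice:analyze",
        params={"api-version": "2022-08-31"},
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/pdf",
        },
        data=f.read(),
    )
resp.raise_for_status()
print(resp.headers["Operation-Location"])  # poll here for fields and line items
```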
-## Form Recognizer preview v3.0
+## Form Recognizer v3.0
- The Form Recognizer preview introduces several new features, capabilities, and AI quality improvements to underlying technologies.
+ Form Recognizer v3.0 introduces several new features, capabilities, and AI quality improvements to underlying technologies.
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
## Next steps
Keys can also exist in isolation when the model detects that a key exists, with
* Explore our REST API:

> [!div class="nextstepaction"]
- > [Form Recognizer API v3.0 (Preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
> [!div class="nextstepaction"] > [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/5ed8c9843c2794cbb1a96291)
applied-ai-services Concept Layout https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-layout.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false- # Form Recognizer layout model
The paragraph roles are best used with unstructured documents. Paragraph roles
## Development options
-The following tools are supported by Form Recognizer v2.1:
-
-| Feature | Resources |
-|-|-|
-|**Layout API**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/layout-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-layout)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
- The following tools are supported by Form Recognizer v3.0: | Feature | Resources | Model ID | |-|||
-|**Layout model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-layout**|
+|**Layout model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|**prebuilt-layout**|
+
+The following tools are supported by Form Recognizer v2.1:
+
+| Feature | Resources |
+|-|-|
+|**Layout API**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/layout-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-layout)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
## Try Form Recognizer
Try extracting data from forms and documents using the Form Recognizer Studio. Y
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-### Form Recognizer Studio (preview)
+### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer studio is available with the preview (v3.0) API.
+> Form Recognizer Studio is available with the v3.0 API.
***Sample form processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/layout)***
The layout model extracts text, selection marks, tables, paragraphs, and paragra
Layout API extracts print and handwritten style text as `lines` and `words`. The model outputs bounding `polygon` coordinates and `confidence` for the extracted words. The `styles` collection includes any handwritten style for lines, if detected, along with the spans pointing to the associated text. This feature applies to [supported handwritten languages](language-support.md).
+```json
+{
+ "words": [
+ {
+ "content": "CONTOSO",
+ "polygon": [
+ 76,
+ 30,
+ 118,
+ 32,
+ 118,
+ 43,
+ 76,
+ 43
+ ],
+ "confidence": 1,
+ "span": {
+ "offset": 0,
+ "length": 7
+ }
+ }
+ ]
+}
+
+```
+ ### Selection marks Layout API also extracts selection marks from documents. Extracted selection marks appear within the `pages` collection for each page. They include the bounding `polygon`, `confidence`, and selection `state` (`selected/unselected`). Any associated text if extracted is also included as the starting index (`offset`) and `length` that references the top level `content` property that contains the full text from the document.
+```json
+{
+ "selectionMarks": [
+ {
+ "state": "unselected",
+ "polygon": [
+ 217,
+ 862,
+ 254,
+ 862,
+ 254,
+ 899,
+ 217,
+ 899
+ ],
+ "confidence": 0.995,
+ "span": {
+ "offset": 1421,
+ "length": 12
+ }
+ }
+ ]
+}
+
+```
+ ### Tables and table headers Layout API extracts tables in the `pageResults` section of the JSON output. Documents can be scanned, photographed, or digitized. Extracted table information includes the number of columns and rows, row span, and column span. Each cell with its bounding `polygon` is output along with information whether it's recognized as a `columnHeader` or not. The API also works with rotated tables. Each table cell contains the row and column index and bounding polygon coordinates. For the cell text, the model outputs the `span` information containing the starting index (`offset`). The model also outputs the `length` within the top level `content` that contains the full text from the document.
+```json
+{
+ "tables": [
+ {
+ "rowCount": 9,
+ "columnCount": 4,
+ "cells": [
+ {
+ "kind": "columnHeader",
+ "rowIndex": 0,
+ "columnIndex": 0,
+ "columnSpan": 4,
+ "content": "(In millions, except earnings per share)",
+ "boundingRegions": [
+ {
+ "pageNumber": 1,
+ "polygon": [
+ 36,
+ 184,
+ 843,
+ 183,
+ 843,
+ 209,
+ 36,
+ 207
+ ]
+ }
+ ],
+ "spans": [
+ {
+ "offset": 511,
+ "length": 40
+ }
+ ]
+ },
+ ]
+ }
+ .
+ .
+ .
+ ]
+}
+
+```
+ ### Paragraphs The Layout model extracts all identified blocks of text in the `paragraphs` collection as a top level object under `analyzeResults`. Each entry in this collection represents a text block and includes the extracted text as`content`and the bounding `polygon` coordinates. The `span` information points to the text fragment within the top level `content` property that contains the full text from the document.
+```json
+{
+ "paragraphs": [
+ {
+ "spans": [
+ {
+ "offset": 0,
+ "length": 21
+ }
+ ],
+ "boundingRegions": [
+ {
+ "pageNumber": 1,
+ "polygon": [
+ 75,
+ 30,
+ 118,
+ 31,
+ 117,
+ 68,
+ 74,
+ 67
+ ]
+ }
+ ],
+ "content": "Tuesday, Sep 20, YYYY"
+ }
+ ]
+}
+
+```
+ ### Paragraph roles The Layout model may flag certain paragraphs with their specialized type or `role` as predicted by the model. They're best used with unstructured documents to help understand the layout of the extracted content for a richer semantic analysis. The following paragraph roles are supported:
The Layout model may flag certain paragraphs with their specialized type or `rol
| `pageFooter` | Text near the bottom edge of the page | | `pageNumber` | Page number |
+```json
+{
+ "paragraphs": [
+ {
+ "spans": [
+ {
+ "offset": 22,
+ "length": 10
+ }
+ ],
+ "boundingRegions": [
+ {
+ "pageNumber": 1,
+ "polygon": [
+ 139,
+ 10,
+ 605,
+ 8,
+ 605,
+ 56,
+ 139,
+ 58
+ ]
+ }
+ ],
+ "role": "title",
+ "content": "NEWS TODAY"
+ }
+ ]
+}
+
+```
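The JSON fragments above map directly onto the SDK object model. As a minimal sketch (the Python SDK `azure-ai-formrecognizer` 3.2.x is assumed; the endpoint, key, and file name are placeholders), the same words, selection marks, tables, and paragraph roles can be read like this:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

with open("sample-form.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-layout", document=f)
result = poller.result()

for page in result.pages:
    for word in page.words:
        print(f"word '{word.content}' (confidence {word.confidence})")
    for mark in page.selection_marks:
        print(f"selection mark: {mark.state} (confidence {mark.confidence})")

for table in result.tables:
    print(f"table with {table.row_count} rows and {table.column_count} columns")
    for cell in table.cells:
        print(f"  cell[{cell.row_index}][{cell.column_index}]: {cell.content}")

# Paragraphs carry an optional role such as "title" or "pageHeader".
for paragraph in result.paragraphs:
    print(f"{paragraph.role or 'paragraph'}: {paragraph.content}")
```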
+ ### Select page numbers or ranges for text extraction For large multi-page documents, use the `pages` query parameter to indicate specific page numbers or page ranges for text extraction.
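With the REST API, `pages` is a query-string parameter (for example, `?pages=1-3,5`). In the Python SDK it surfaces as the `pages` keyword argument; a minimal sketch, with placeholder endpoint, key, and file name:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

# Analyze only pages 1 through 3 and page 5 of a large document.
with open("large-document.pdf", "rb") as f:
    poller = client.begin_analyze_document(
        "prebuilt-layout", document=f, pages="1-3,5"
    )
result = poller.result()
print(f"pages analyzed: {len(result.pages)}")
```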
For large multi-page documents, use the `pages` query parameter to indicate spec
* Explore our REST API:
- > [!div class="nextstepaction"]
- > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept Model Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-model-overview.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
| **Model** | **Description** | | | | |**Document analysis**||
-| 🆕[Read (preview)](#read-preview) | Extract typeface and handwritten text lines, words, locations, and detected languages.|
-| 🆕[General document (preview)](#general-document-preview) | Extract text, tables, structure, key-value pairs, and named entities.|
+| [Read](#read) | Extract typeface and handwritten text lines, words, locations, and detected languages.|
+| [General document](#general-document) | Extract text, tables, structure, key-value pairs, and named entities.|
| [Layout](#layout) | Extract text and layout information from documents.| |**Prebuilt**||
-| 🆕[W-2 (preview)](#w-2-preview) | Extract employee, employer, wage information, etc. from US W-2 forms. |
+| [W-2](#w-2) | Extract employee, employer, wage information, etc. from US W-2 forms. |
| [Invoice](#invoice) | Extract key information from English and Spanish invoices. | | [Receipt](#receipt) | Extract key information from English receipts. | | [ID document](#id-document) | Extract key information from US driver licenses and international passports. |
| [Custom](#custom) | Extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases. | | [Composed](#composed-custom-model) | Compose a collection of custom models and assign them to a single model built from your form types.
-### Read (preview)
+### Read
[:::image type="icon" source="media/studio/read-card.png" :::](https://formrecognizer.appliedai.azure.com/studio/read)
The Read API analyzes and extracts text lines, words, their locations, detected l
> [!div class="nextstepaction"] > [Learn more: read model](concept-read.md)
-### W-2 (preview)
+### W-2
[:::image type="icon" source="media/studio/w2.png":::](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)
The W-2 model analyzes and extracts key information reported in each box on a W-
> [!div class="nextstepaction"] > [Learn more: W-2 model](concept-w2.md)
-### General document (preview)
+### General document
[:::image type="icon" source="media/studio/general-document.png":::](https://formrecognizer.appliedai.azure.com/studio/document)
The invoice model analyzes and extracts key information from sales invoices. The
* The receipt model analyzes and extracts key information from printed and handwritten sales receipts.
-* The preview version v3.0 also supports single-page hotel receipt processing.
+* The v3.0 version also supports single-page hotel receipt processing.
***Sample receipt processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)***:
The business card model analyzes and extracts key information from business card
* Custom models analyze and extract data from forms and documents specific to your business. The API is a machine-learning program trained to recognize form fields within your distinct content and extract key-value pairs and table data. You only need five examples of the same form type to get started and your custom model can be trained with or without labeled datasets.
-* The preview version v3.0 custom model supports signature detection in custom forms (template model) and cross-page tables in both template and neural models.
+* The v3.0 custom model supports signature detection in custom forms (template model) and cross-page tables in both template and neural models.
***Sample custom template processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)***:
A composed model is created by taking a collection of custom models and assignin
| **Model ID** | **Text extraction** | **Language detection** | **Selection Marks** | **Tables** | **Paragraphs** | **Paragraph roles** | **Key-Value pairs** | **Fields** |
|:--|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
-|🆕 [prebuilt-read](concept-read.md#data-extraction) | ✓ | ✓ | | | ✓ | | | |
-|🆕 [prebuilt-tax.us.w2](concept-w2.md#field-extraction) | ✓ | | ✓ | | ✓ | | | ✓ |
-|🆕 [prebuilt-document](concept-general-document.md#data-extraction)| ✓ | | ✓ | ✓ | ✓ | | ✓ | |
+| [prebuilt-read](concept-read.md#data-extraction) | ✓ | ✓ | | | ✓ | | | |
+| [prebuilt-tax.us.w2](concept-w2.md#field-extraction) | ✓ | | ✓ | | ✓ | | | ✓ |
+| [prebuilt-document](concept-general-document.md#data-extraction)| ✓ | | ✓ | ✓ | ✓ | | ✓ | |
| [prebuilt-layout](concept-layout.md#data-extraction) | ✓ | | ✓ | ✓ | ✓ | ✓ | | |
| [prebuilt-invoice](concept-invoice.md#field-extraction) | ✓ | | ✓ | ✓ | ✓ | | ✓ | ✓ |
| [prebuilt-receipt](concept-receipt.md#field-extraction) | ✓ | | | | ✓ | | | ✓ |
-| [prebuilt-idDocument](concept-id-document.md#field-extraction) | ✓ | | | | ✓ | | | ✓ |
-| [prebuilt-businessCard](concept-business-card.md#field-extraction) | ✓ | | | | ✓ | | | ✓ |
+| [prebuilt-idDocument](concept-id-document.md#field-extractions) | ✓ | | | | ✓ | | | ✓ |
+| [prebuilt-businessCard](concept-business-card.md#field-extractions) | ✓ | | | | ✓ | | | ✓ |
| [Custom](concept-custom.md#compare-model-features) | ✓ | | ✓ | ✓ | ✓ | | | ✓ |

## Input requirements
applied-ai-services Concept Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
# Form Recognizer Read OCR model
-Form Recognizer v3.0 preview includes the new Read Optical Character Recognition (OCR) model. The Read OCR model extracts typeface and handwritten text including mixed languages in documents. The Read OCR model can detect lines, words, locations, and languages and is the core of all other Form Recognizer models. Layout, general document, custom, and prebuilt models all use the Read OCR model as a foundation for extracting texts from documents.
+Form Recognizer v3.0 includes the new Read Optical Character Recognition (OCR) model. The Read OCR model extracts typeface and handwritten text including mixed languages in documents. The Read OCR model can detect lines, words, locations, and languages and is the core of all other Form Recognizer models. Layout, general document, custom, and prebuilt models all use the Read OCR model as a foundation for extracting texts from documents.
+
+> [!NOTE]
+>
+> * Only API Version 2022-06-30-preview supports Microsoft Word, Excel, PowerPoint, and HTML file formats in addition to all other document types supported by the GA versions.
+> * For these file formats, the Read API ignores the pages parameter and extracts all pages by default. Each embedded image counts as 1 page unit, and each worksheet, slide, and page (up to 3,000 characters) counts as 1 page.
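As a sketch of the model in code (the Python SDK `azure-ai-formrecognizer` 3.2.x is assumed; the endpoint, key, and file name are placeholders), the Read model is invoked with the `prebuilt-read` model ID and returns the full text plus detected languages:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

with open("sample-letter.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-read", document=f)
result = poller.result()

print(result.content)  # full extracted text
for language in result.languages:
    print(f"detected language {language.locale} (confidence {language.confidence})")
```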
## Supported document types
Try extracting text from forms and documents using the Form Recognizer Studio. Y
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-### Form Recognizer Studio (preview)
+### Form Recognizer Studio
> [!NOTE]
-> Currently, Form Recognizer Studio doesn't support Microsoft Word, Excel, PowerPoint, and HTML file formats in the Read preview.
+> Currently, Form Recognizer Studio doesn't support Microsoft Word, Excel, PowerPoint, and HTML file formats with the Read v3.0 model.
***Sample form processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/read)***
Try extracting text from forms and documents using the Form Recognizer Studio. Y
## Supported languages and locales
-Form Recognizer preview version supports several languages for the read model. *See* our [Language Support](language-support.md) for a complete list of supported handwritten and printed languages.
+Form Recognizer v3.0 supports several languages for the read model. *See* our [Language Support](language-support.md) for a complete list of supported handwritten and printed languages.
## Data detection and extraction
Complete a Form Recognizer quickstart:
Explore our REST API: > [!div class="nextstepaction"]
-> [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+> [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept Receipt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-receipt.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
# Form Recognizer receipt model
-The receipt model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key information from sales receipts. Receipts can be of various formats and quality including printed and handwritten receipts. The API extracts key information such as merchant name, merchant phone number, transaction date, total tax, and transaction total and returns a structured JSON data representation.
+The receipt model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key information from sales receipts. Receipts can be of various formats and quality including printed and handwritten receipts. The API extracts key information such as merchant name, merchant phone number, transaction date, tax, and transaction total and returns structured JSON data.
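A minimal sketch of calling the receipt model with the Python SDK (`azure-ai-formrecognizer` 3.2.x is assumed; the endpoint, key, and file name are placeholders, and the fields read at the end come from the field table later in this article):

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

with open("sample-receipt.png", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-receipt", document=f)
receipt = poller.result().documents[0]

for name in ("MerchantName", "TransactionDate", "Total"):
    field = receipt.fields.get(name)
    if field:
        print(f"{name}: {field.value} (confidence {field.confidence})")
```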
***Sample receipt processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)***:
The receipt model combines powerful Optical Character Recognition (OCR) capabili
## Development options
+The following tools are supported by Form Recognizer v3.0:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**Receipt model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|**prebuilt-receipt**|
+ The following tools are supported by Form Recognizer v2.1: | Feature | Resources | |-|-| |**Receipt model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-receipts)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following tools are supported by Form Recognizer v3.0:
-
-| Feature | Resources | Model ID |
-|-|-|--|
-|**Receipt model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li></ul>|**prebuilt-receipt**|
- ### Try Form Recognizer
-See how data, including time and date of transactions, merchant information, and amount totals, is extracted from receipts using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
+See how data, including time and date of transactions, merchant information, and amount totals, is extracted from receipts using the Form Recognizer Studio. You'll need the following resources:
* An Azure subscription; you can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
See how data, including time and date of transactions, merchant information, and
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-#### Form Recognizer Studio (preview)
+#### Form Recognizer Studio
> [!NOTE]
-> Form Recognizer studio is available with the preview (v3.0) API.
+> Form Recognizer Studio is available with the v3.0 API.
1. On the Form Recognizer Studio home page, select **Receipts**
See how data, including time and date of transactions, merchant information, and
> [!div class="nextstepaction"] > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)
-#### Sample Labeling tool (API v2.1)
-
-You'll need a receipt document. You can use our [sample receipt document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/contoso-receipt.png).
-
-1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
-
-1. Select **Receipt** from the **Form Type** dropdown menu:
-
- :::image type="content" source="media/try-receipt.png" alt-text="Screenshot: Sample Labeling tool dropdown prebuilt model selection menu.":::
-
- > [!div class="nextstepaction"]
- > [Try Sample Labeling tool](https://fott-2-1.azurewebsites.net/prebuilts-analyze)
- ## Input requirements [!INCLUDE [input requirements](./includes/input-requirements.md)]
You'll need a receipt document. You can use our [sample receipt document](https:
| TransactionTime | Time | Time the receipt was issued | hh-mm-ss (24-hour) | | Total | Number (USD)| Full transaction total of receipt | Two-decimal float| | Subtotal | Number (USD) | Subtotal of receipt, often before taxes are applied | Two-decimal float|
- | Tax | Number (USD) | Total tax on receipt (often sales tax or equivalent). **Renamed to "TotalTax" in 2022-06-30-preview version**. | Two-decimal float |
+ | Tax | Number (USD) | Total tax on receipt (often sales tax or equivalent). **Renamed to "TotalTax" in the 2022-06-30 and later versions**. | Two-decimal float |
| Tip | Number (USD) | Tip included by buyer | Two-decimal float| | Items | Array of objects | Extracted line items, with name, quantity, unit price, and total price extracted | |
-| Name | String | Item description. **Renamed to "Description" in 2022-06-30-preview version**. | |
+| Name | String | Item description. **Renamed to "Description" in the 2022-06-30 and later versions**. | |
| Quantity | Number | Quantity of each item | Two-decimal float | | Price | Number | Individual price of each item unit| Two-decimal float | | TotalPrice | Number | Total price of line item | Two-decimal float |
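The `Items` field in the table above arrives as a list of object fields. Continuing the earlier receipt sketch (so `receipt` is `poller.result().documents[0]`), a hedged sketch of walking the line items:

```python
# `receipt` comes from the earlier sketch: poller.result().documents[0]
items = receipt.fields.get("Items")
for item in (items.value if items else []):
    line = item.value  # dict of subfields for one line item
    description = line.get("Description")
    total_price = line.get("TotalPrice")
    print(
        f"{description.value if description else '?'}: "
        f"{total_price.value if total_price else '?'}"
    )
```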
-## Form Recognizer preview v3.0
+## Form Recognizer v3.0
- The Form Recognizer preview introduces several new features and capabilities. The **Receipt** model supports single-page hotel receipt processing.
+ Form Recognizer v3.0 introduces several new features and capabilities. The **Receipt** model supports single-page hotel receipt processing.
### Hotel receipt field extraction
You'll need a receipt document. You can use our [sample receipt document](https:
### Migration guide and REST API v3.0
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
## Next steps
You'll need a receipt document. You can use our [sample receipt document](https:
* Explore our REST API: > [!div class="nextstepaction"]
- > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Concept W2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-w2.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false
-# Form Recognizer W-2 model | Preview
+# Form Recognizer W-2 model | v3.0
The Form W-2, Wage and Tax Statement, is a [US Internal Revenue Service (IRS) tax form](https://www.irs.gov/forms-pubs/about-form-w-2). It's used to report employees' salary, wages, compensation, and taxes withheld. Employers send a W-2 form to each employee on or before January 31 each year, and employees use the form to prepare their tax returns. The W-2 is a key document in employees' federal and state tax filing, as well as in other processes like mortgage loan applications and Social Security Administration (SSA) claims.
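A sketch of calling the W-2 model from the Python SDK (`azure-ai-formrecognizer` 3.2.x is assumed). The endpoint, key, file name, and the two field names printed at the end are placeholders; see the field extraction section for the model's actual field list:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

with open("sample-w2.png", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-tax.us.w2", document=f)
w2 = poller.result().documents[0]

# Field names here are illustrative; check the field extraction list for the model.
for name in ("TaxYear", "WagesTipsAndOtherCompensation"):
    field = w2.fields.get(name)
    if field:
        print(f"{name}: {field.value} (confidence {field.confidence})")
```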
The prebuilt W-2 model is supported by Form Recognizer v3.0 with the following t
| Feature | Resources | Model ID | |-|-|--|
-|**W-2 model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|**prebuilt-tax.us.w2**|
+|**W-2 model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|**prebuilt-tax.us.w2**|
### Try Form Recognizer
Try extracting data from W-2 forms using the Form Recognizer Studio. You'll need
#### Form Recognizer Studio > [!NOTE]
-> Form Recognizer studio is available with v3.0 preview API.
+> Form Recognizer Studio is available with the v3.0 API.
1. On the [Form Recognizer Studio home page](https://formrecognizer.appliedai.azure.com/studio), select **W-2**.
Try extracting data from W-2 forms using the Form Recognizer Studio. You'll need
### Migration guide and REST API v3.0
-* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use v3.0 in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more about v3.0 and its new capabilities.
## Next steps * Complete a Form Recognizer quickstart: > [!div class="checklist"] >
-> * [**REST API**](quickstarts/try-v3-rest-api.md)
-> * [**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)
-> * [**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)
-> * [**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)
-> * [**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>
+> * [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)
+> * [**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)
+> * [**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)
+> * [**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)
+> * [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>
applied-ai-services Form Recognizer Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/containers/form-recognizer-container-configuration.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 # Configure Form Recognizer containers
applied-ai-services Form Recognizer Container Install Run https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/containers/form-recognizer-container-install-run.md
Last updated 12/16/2021 keywords: on-premises, Docker, container, identify- # Install and run Form Recognizer v2.1-preview containers
> > * The online request form requires that you provide a valid email address that belongs to the organization that owns the Azure subscription ID and that you have or have been granted access to that subscription.
-Azure Form Recognizer is an Azure Applied AI Service that lets you build automated data processing software using machine-learning technology. Form Recognizer enables you to identify and extract text, key/value pairs, selection marks, table data, and more from your form documents and output structured data that includes the relationships in the original file.
+Azure Form Recognizer is an Azure Applied AI Service that lets you build automated data processing software using machine-learning technology. Form Recognizer enables you to identify and extract text, key/value pairs, selection marks, table data, and more from your form documents. The results are delivered as structured data that includes the relationships in the original file.
In this article, you'll learn how to download, install, and run Form Recognizer containers. Containers enable you to run the Form Recognizer service in your own environment and are a good fit for specific security and data governance requirements. Form Recognizer features are supported by six feature containers: **Layout**, **Business Card**, **ID Document**, **Receipt**, **Invoice**, and **Custom** (for the Receipt, Business Card, and ID Document containers, you'll also need the **Read** OCR container).
The following table lists the supporting container(s) for each Form Recognizer c
#### Recommended CPU cores and memory
-> [!Note]
+> [!NOTE]
+>
> The minimum and recommended values are based on Docker limits and *not* the host machine resources. ##### Read, Layout, and Prebuilt containers
Azure Cognitive Services containers aren't licensed to run without being connect
### Connect to Azure
-The container needs the billing argument values to run. These values allow the container to connect to the billing endpoint. The container reports usage about every 10 to 15 minutes. If the container doesn't connect to Azure within the allowed time window, the container continues to run but doesn't serve queries until the billing endpoint is restored. The connection is attempted 10 times at the same time interval of 10 to 15 minutes. If it can't connect to the billing endpoint within the 10 tries, the container stops serving requests. See the [Cognitive Services container FAQ](../../../cognitive-services/containers/container-faq.yml#how-does-billing-work) for an example of the information sent to Microsoft for billing.
+The container needs the billing argument values to run. These values allow the container to connect to the billing endpoint. The container reports usage about every 10 to 15 minutes. If the container doesn't connect to Azure within the allowed time window, the container continues to run, but doesn't serve queries until the billing endpoint is restored. The connection is attempted 10 times at the same time interval of 10 to 15 minutes. If it can't connect to the billing endpoint within the 10 tries, the container stops serving requests. See the [Cognitive Services container FAQ](../../../cognitive-services/containers/container-faq.yml#how-does-billing-work) for an example of the information sent to Microsoft for billing.
### Billing arguments
applied-ai-services Create A Form Recognizer Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/create-a-form-recognizer-resource.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false #Customer intent: I want to learn how to use create a Form Recognizer service in the Azure portal.
Let's get started:
1. Copy the key and endpoint values from your Form Recognizer resource paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API.
-1. If your overview page doesn't have the keys and endpoint visible, you can select the **Keys and Endpoint** button on the left navigation bar and retrieve them there.
+1. If your overview page doesn't have the keys and endpoint visible, select the **Keys and Endpoint** button on the left navigation bar to retrieve them.
:::image border="true" type="content" source="media/containers/keys-and-endpoint.png" alt-text="Still photo showing how to access resource key and endpoint URL":::
That's it! You're now ready to start automating data extraction using Azure Form
* Complete a Form Recognizer quickstart and get started creating a document processing app in the development language of your choice:
- * [C#](quickstarts/try-v3-csharp-sdk.md)
- * [Python](quickstarts/try-v3-python-sdk.md)
- * [Java](quickstarts/try-v3-java-sdk.md)
- * [JavaScript](quickstarts/try-v3-javascript-sdk.md)
+ * [C#](quickstarts/get-started-v3-sdk-rest-api.md)
+ * [Python](quickstarts/get-started-v3-sdk-rest-api.md)
+ * [Java](quickstarts/get-started-v3-sdk-rest-api.md)
+ * [JavaScript](quickstarts/get-started-v3-sdk-rest-api.md)
applied-ai-services Create Sas Tokens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/create-sas-tokens.md
The SAS URL includes a special set of [query parameters](/rest/api/storageservic
### REST API
-To use your SAS URL with the [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/BuildDocumentModel), add the SAS URL to the request body:
+To use your SAS URL with the [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/BuildDocumentModel), add the SAS URL to the request body:
```json {
To use your SAS URL with the [REST API](https://westus.dev.cognitive.microsoft.c
} ```
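The same SAS URL is what the Python SDK expects when building a custom model. A minimal sketch (method names per `azure-ai-formrecognizer` 3.2.x, which is an assumption to verify against your SDK version; the endpoint, key, and SAS URL are placeholders):

```python
from azure.ai.formrecognizer import DocumentModelAdministrationClient, ModelBuildMode
from azure.core.credentials import AzureKeyCredential

admin_client = DocumentModelAdministrationClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

# The container SAS URL created above authorizes read/list access to training data.
poller = admin_client.begin_build_document_model(
    build_mode=ModelBuildMode.TEMPLATE,
    blob_container_url="<your-container-SAS-URL>",  # placeholder
)
model = poller.result()
print(f"built model: {model.model_id}")
```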
-### Sample Labeling Tool
-
-To use your SAS URL with the [Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net/connections/create), add the SAS URL to the **Connection Settings** → **Azure blob container** → **SAS URI** field:
-
- :::image type="content" source="media/sas-tokens/fott-add-sas-uri.png" alt-text="Screenshot that shows the SAS URI field.":::
- That's it! You've learned how to create SAS tokens to authorize how clients access your data. ## Next step
applied-ai-services Deploy Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/deploy-label-tool.md
>[!TIP] >
-> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio (preview)](https://formrecognizer.appliedai.azure.com/studio).
+> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data. > * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
-> * *See* our [**REST API**](quickstarts/try-v3-rest-api.md) or [**C#**](quickstarts/try-v3-csharp-sdk.md), [**Java**](quickstarts/try-v3-java-sdk.md), [**JavaScript**](quickstarts/try-v3-javascript-sdk.md), or [Python](quickstarts/try-v3-python-sdk.md) SDK quickstarts to get started with the V3.0 preview.
+> * *See* our [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md) or [**C#**](quickstarts/get-started-v3-sdk-rest-api.md), [**Java**](quickstarts/get-started-v3-sdk-rest-api.md), [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md), or [Python](quickstarts/get-started-v3-sdk-rest-api.md) SDK quickstarts to get started with v3.0.
> [!NOTE] > The [cloud hosted](https://fott-2-1.azurewebsites.net/) labeling tool is available at [https://fott-2-1.azurewebsites.net/](https://fott-2-1.azurewebsites.net/). Follow the steps in this document only if you want to deploy the sample labeling tool for yourself.
applied-ai-services Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/disaster-recovery.md
The process for copying a custom model consists of the following steps:
1. Next you send the copy request to the source resource&mdash;the resource that contains the model to be copied with the payload (copy authorization) returned from the previous call. You'll get back a URL that you can query to track the progress of the operation. 1. You'll use your source resource credentials to query the progress URL until the operation is a success. You can also query the new model ID in the target resource to get the status of the new model.
-### [Form Recognizer REST API v3.0 (Preview)](#tab/v30)
+### [Form Recognizer REST API v3.0](#tab/v30)
## Generate Copy authorization request The following HTTP request gets copy authorization from your target resource. You'll need to enter the endpoint and key of your target resource as headers. ```http
-POST https://{TARGET_FORM_RECOGNIZER_RESOURCE_ENDPOINT}/formrecognizer/documentModels:authorizeCopy?api-version=2022-06-30-preview
+POST https://{TARGET_FORM_RECOGNIZER_RESOURCE_ENDPOINT}/formrecognizer/documentModels:authorizeCopy?api-version=2022-08-31
Ocp-Apim-Subscription-Key: {TARGET_FORM_RECOGNIZER_RESOURCE_KEY} ```
You'll get a `200` response code with response body that contains the JSON paylo
The following HTTP request starts the copy operation on the source resource. You'll need to enter the endpoint and key of your source resource as the url and header. Notice that the request URL contains the model ID of the source model you want to copy. ```http
-POST {{source-endpoint}}formrecognizer/documentModels/{model-to-be-copied}:copyTo?api-version=2022-06-30-preview
+POST {{source-endpoint}}formrecognizer/documentModels/{model-to-be-copied}:copyTo?api-version=2022-08-31
Ocp-Apim-Subscription-Key: {SOURCE_FORM_RECOGNIZER_RESOURCE_KEY} ```
You'll get a `202\Accepted` response with an Operation-Location header. This val
```http HTTP/1.1 202 Accepted
-Operation-Location: https://{source-resource}.cognitiveservices.azure.com/formrecognizer/operations/{operation-id}?api-version=2022-06-30-preview
+Operation-Location: https://{source-resource}.cognitiveservices.azure.com/formrecognizer/operations/{operation-id}?api-version=2022-08-31
```
-### [Form Recognizer REST API v2.1 (GA)](#tab/v21)
+### [Form Recognizer REST API v2.1](#tab/v21)
## Generate Copy authorization request
Operation-Location: https://{SOURCE_FORM_RECOGNIZER_RESOURCE_ENDPOINT}/formrecog
## Track Copy progress
-### [Form Recognizer v3.0 (Preview)](#tab/v30)
+### [Form Recognizer v3.0](#tab/v30)
```
-GET https://{source-resource}.cognitiveservices.azure.com/formrecognizer/operations/{operation-id}?api-version=2022-06-30-preview
+GET https://{source-resource}.cognitiveservices.azure.com/formrecognizer/operations/{operation-id}?api-version=2022-08-31
Ocp-Apim-Subscription-Key: {SOURCE_FORM_RECOGNIZER_RESOURCE_KEY} ```
-### [Form Recognizer v2.1 (GA)](#tab/v21)
+### [Form Recognizer v2.1](#tab/v21)
Track your progress by querying the **Get Copy Model Result** API against the source resource endpoint.
curl -i GET "https://<SOURCE_FORM_RECOGNIZER_RESOURCE_ENDPOINT>/formrecognizer/v
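The same copy flow is available in the Python SDK. A minimal sketch (method names per `azure-ai-formrecognizer` 3.2.x, which is an assumption to verify against your SDK version; endpoints, keys, and the model ID are placeholders):

```python
from azure.ai.formrecognizer import DocumentModelAdministrationClient
from azure.core.credentials import AzureKeyCredential

# Target resource: generates the copy authorization (placeholder endpoint and key).
target = DocumentModelAdministrationClient(
    "https://<target-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<target-key>"),
)
copy_auth = target.get_copy_authorization()

# Source resource: starts the copy and waits for completion.
source = DocumentModelAdministrationClient(
    "https://<source-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<source-key>"),
)
poller = source.begin_copy_document_model_to("<model-to-copy>", target=copy_auth)
copied_model = poller.result()
print(f"copied model id: {copied_model.model_id}")
```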
## Next steps In this guide, you learned how to use the Copy API to back up your custom models to a secondary Form Recognizer resource. Next, explore the API reference docs to see what else you can do with Form Recognizer.
-* [REST API reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+* [REST API reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Form Recognizer Studio Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/form-recognizer-studio-overview.md
Previously updated : 07/18/2022 Last updated : 08/22/2022 recommendations: false
recommendations: false
<!-- markdownlint-disable MD033 --> # What is Form Recognizer Studio?
->[!NOTE]
-> Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
- Form Recognizer Studio is an online tool to visually explore, understand, train, and integrate features from the Form Recognizer service into your applications. The studio provides a platform for you to experiment with the different Form Recognizer models and sample their returned data in an interactive manner without the need to write code.
-The studio supports all Form Recognizer v3.0 models and v2.1 models with labeled data. Refer to the [REST API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
+The studio supports Form Recognizer v3.0 models and v3.0 model training. Previously trained v2.1 models with labeled data are supported, but not v2.1 model training. Refer to the [REST API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
## Get started using Form Recognizer Studio
The studio supports all Form Recognizer v3.0 models and v2.1 models with labeled
:::image type="content" source="media/studio/form-recognizer-studio-front.png" alt-text="Screenshot of Form Recognizer Studio front page.":::
-1. After you've tried Form Recognizer Studio, use the [**C#**](quickstarts/try-v3-csharp-sdk.md), [**Java**](quickstarts/try-v3-java-sdk.md), [**JavaScript**](quickstarts/try-v3-javascript-sdk.md) or [**Python**](quickstarts/try-v3-python-sdk.md) client libraries or the [**REST API**](quickstarts/try-v3-rest-api.md) to get started incorporating Form Recognizer models into your own applications.
+1. After you've tried Form Recognizer Studio, use the [**C#**](quickstarts/get-started-v3-sdk-rest-api.md), [**Java**](quickstarts/get-started-v3-sdk-rest-api.md), [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md) or [**Python**](quickstarts/get-started-v3-sdk-rest-api.md) client libraries or the [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md) to get started incorporating Form Recognizer models into your own applications.
To learn more about each model, *see* concepts pages.
applied-ai-services Try V2 1 Sdk Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/try-v2-1-sdk-rest-api.md
+
+ Title: "Use Form Recognizer client library SDKs or REST API"
+
+description: How to use the Form Recognizer client libraries or REST API to create apps that extract key-value pairs and table data from your custom documents.
+Last updated : 02/01/2022
+zone_pivot_groups: programming-languages-set-formre
+recommendations: false
+++
+# Use Form Recognizer SDKs or REST API
+
+ In this how-to guide, you'll learn how to add Form Recognizer to your applications and workflows using an SDK, in a programming language of your choice, or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+
+You'll use the following APIs to extract structured data from forms and documents:
+
+* [Authenticate the client](#authenticate-the-client)
+* [Analyze Layout](#analyze-layout)
+* [Analyze receipts](#analyze-receipts)
+* [Analyze business cards](#analyze-business-cards)
+* [Analyze invoices](#analyze-invoices)
+* [Analyze ID documents](#analyze-id-documents)
+* [Train a custom model](#train-a-custom-model)
+* [Analyze forms with a custom model](#analyze-forms-with-a-custom-model)
+* [Manage custom models](#manage-custom-models)
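As a hedged illustration of the first two items in the list above, this how-to targets the v2.1 API, so a sketch would use the v2.1-style `FormRecognizerClient` (from `azure-ai-formrecognizer` 3.1.x, an assumption to verify against your installed version; the endpoint, key, and file name are placeholders):

```python
from azure.ai.formrecognizer import FormRecognizerClient
from azure.core.credentials import AzureKeyCredential

# Authenticate the client with your resource endpoint and key (placeholders).
client = FormRecognizerClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

# Analyze a receipt; the poller completes when the service finishes processing.
with open("contoso-receipt.png", "rb") as f:
    poller = client.begin_recognize_receipts(f)
for receipt in poller.result():
    for name, field in receipt.fields.items():
        print(f"{name}: {field.value} (confidence {field.confidence})")
```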
applied-ai-services Use Prebuilt Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/use-prebuilt-read.md
recommendations: false
The read model is the core of all the other Form Recognizer models. Layout, general document, custom, and prebuilt models all use the read model as a foundation for extracting texts from documents.
->[!NOTE]
-> Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-The current API version is ```2022-06-30```.
+The current API version is ```2022-08-31```.
::: zone pivot="programming-language-csharp"
applied-ai-services Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/label-tool.md
keywords: document processing
>[!TIP] >
-> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio (preview)](https://formrecognizer.appliedai.azure.com/studio).
+> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data. > * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
-> * *See* our [**REST API**](quickstarts/try-v3-rest-api.md) or [**C#**](quickstarts/try-v3-csharp-sdk.md), [**Java**](quickstarts/try-v3-java-sdk.md), [**JavaScript**](quickstarts/try-v3-javascript-sdk.md), or [Python](quickstarts/try-v3-python-sdk.md) SDK quickstarts to get started with the V3.0 preview.
+> * *See* our [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md) or [**C#**](quickstarts/get-started-v3-sdk-rest-api.md), [**Java**](quickstarts/get-started-v3-sdk-rest-api.md), [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md), or [Python](quickstarts/get-started-v3-sdk-rest-api.md) SDK quickstarts to get started with v3.0.
In this article, you'll use the Form Recognizer REST API with the Sample Labeling tool to train a custom model with manually labeled data.
In the Sample Labeling tool, projects store your configurations and settings. Cr
When you create or open a project, the main tag editor window opens. The tag editor consists of three parts:
-* A resizable preview pane that contains a scrollable list of forms from the source connection.
+* A resizable preview pane that contains a scrollable list of forms from the source connection.
* The main editor pane that allows you to apply tags. * The tags editor pane that allows users to modify, lock, reorder, and delete tags.
applied-ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/language-support.md
This article covers the supported languages for text and field **extraction (by
## Read, layout, and custom form (template) model
-The following lists include the currently GA languages in for the v2.1 version and the most recent v3.0 preview. These languages are supported by Read, Layout, and Custom form (template) model features.
+The following lists include the currently supported languages for the GA v2.1 version and the most recent v3.0 version. These languages are supported by the Read, Layout, and Custom form (template) model features.
> [!NOTE] > **Language code optional** > > Form Recognizer's deep learning based universal models extract all multi-lingual text in your documents, including text lines with mixed languages, and do not require specifying a language code. Do not provide the language code as the parameter unless you are sure about the language and want to force the service to apply only the relevant model. Otherwise, the service may return incomplete and incorrect text.
-To use the preview languages, refer to the [v3.0 REST API migration guide](/rest/api/medi).
+To use the v3.0-supported languages, refer to the [v3.0 REST API migration guide](/rest/api/medi).
-### Handwritten text (preview and GA)
+### Handwritten text (v3.0 and v2.1)
The following table lists the supported languages for extracting handwritten texts. |Language| Language code (optional) | Language| Language code (optional) | |:--|:-:|:--|:-:|
-|English|`en`|Japanese (preview) |`ja`|
-|Chinese Simplified (preview) |`zh-Hans`|Korean (preview)|`ko`|
-|French (preview) |`fr`|Portuguese (preview)|`pt`|
-|German (preview) |`de`|Spanish (preview) |`es`|
-|Italian (preview) |`it`|
+|English|`en`|Japanese |`ja`|
+|Chinese Simplified |`zh-Hans`|Korean |`ko`|
+|French |`fr`|Portuguese |`pt`|
+|German |`de`|Spanish |`es`|
+|Italian |`it`|
-### Print text (preview)
+### Print text (v3.0)
-This section lists the supported languages for extracting printed texts in the latest preview.
+This section lists the supported languages for extracting printed text with the v3.0 version.
|Language| Code (optional) |Language| Code (optional) | |:--|:-:|:--|:-:|
This section lists the supported languages for extracting printed texts in the l
|Kurukh (Devanagari) | `kru`|Welsh | `cy` |Kyrgyz (Cyrillic) | `ky`
-### Print text (GA)
+### Print text (v2.1)
This section lists the supported languages for extracting printed text in the v2.1 GA version.
Business Card supports all English business cards with the following locales:
|English (India|`en-in`| |English (United States)| `en-us`|
-The **2022-06-30-preview** release includes Japanese language support:
+The **2022-06-30** and later releases include Japanese language support:
|Language| Locale code | |:--|:-:|
Language| Locale code |
|:--|:-:| |English (United States) |en-US| |Spanish| es|
-|German (**2022-06-30-preview**)| de|
-|French (**2022-06-30-preview**)| fr|
-|Italian (**2022-06-30-preview**)|it|
-|Portuguese (**2022-06-30-preview**)|pt|
-|Dutch (**2022-06-30-preview**)| nl|
+|German (**2022-06-30** and later)| de|
+|French (**2022-06-30** and later)| fr|
+|Italian (**2022-06-30** and later)|it|
+|Portuguese (**2022-06-30** and later)|pt|
+|Dutch (**2022-06-30** and later)| nl|
## ID document model
applied-ai-services Managed Identities Secured Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/managed-identities-secured-access.md
This how-to guide will walk you through the process of enabling secure connectio
* Communication between a client application within a Virtual Network (VNET) and your Form Recognizer Resource.
-* Communication between Form Recognizer Studio or the sample labeling tool (FOTT) and your Form Recognizer resource.
+* Communication between Form Recognizer Studio and your Form Recognizer resource.
* Communication between your Form Recognizer resource and a storage account (needed when training a custom model).
applied-ai-services Overview Experiment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/overview-experiment.md
Previously updated : 07/20/2022 Last updated : 08/22/2022 recommendations: false
This section will help you decide which Form Recognizer v3.0 supported model you
| Type of document | Data to extract |Document format | Your best solution | | --|-| -|-|
-|**A text-based document** like a contract or letter.|You want to extract primarily text lines, words, locations, and detected languages.|</li></ul>The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).| [**Read (preview) model**](concept-read.md)|
+|**A text-based document** like a contract or letter.|You want to extract primarily text lines, words, locations, and detected languages.|The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).| [**Read model**](concept-read.md)|
|**A document that includes structural information** like a report or study.|In addition to text, you need to extract structural information like tables, selection marks, paragraphs, titles, headings, and subheadings.|The document is written or printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model)| [**Layout model**](concept-layout.md)
-|**A structured or semi-structured document that includes content formatted as fields and values**, like a credit application or survey form.|You want to extract fields and values including ones not covered by the scenario-specific prebuilt models **without having to train a custom model**.| The form or document is a standardized format commonly used in your business or industry and printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).|[**General document (preview) model**](concept-general-document.md)
+|**A structured or semi-structured document that includes content formatted as fields and values**, like a credit application or survey form.|You want to extract fields and values including ones not covered by the scenario-specific prebuilt models **without having to train a custom model**.| The form or document is a standardized format commonly used in your business or industry and printed in a [supported language](language-support.md#read-layout-and-custom-form-template-model).|[**General document model**](concept-general-document.md)
|**U.S. W-2 form**|You want to extract key information such as salary, wages, and taxes withheld from US W-2 tax forms.|The W-2 document is in United States English (en-US) text.|[**W-2 model**](concept-w2.md)|
|**Invoice**|You want to extract key information such as customer name, billing address, and amount due from invoices.|The invoice document is written or printed in a [supported language](language-support.md#invoice-model).|[**Invoice model**](concept-invoice.md)|
|**Receipt**|You want to extract key information such as merchant name, transaction date, and transaction total from a sales or single-page hotel receipt.|The receipt is written or printed in a [supported language](language-support.md#receipt-model).|[**Receipt model**](concept-receipt.md)|
This section will help you decide which Form Recognizer v3.0 supported model you
## Form Recognizer models and development options
-### [Form Recognizer preview (v3.0)](#tab/v3-0)
+### [Form Recognizer v3.0](#tab/v3-0)
The following models and development options are supported by the Form Recognizer service v3.0. You can Use Form Recognizer to automate your data processing in applications and workflows, enhance data-driven strategies, and enrich document search capabilities. Use the links in the table to learn more about each model and browse the API references. | Model | Description |Automation use cases | Development options | |-|--|-|--|
-|[🆕 **Read**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.| <ul><li>Contract processing. </li><li>Financial or medical report processing.</li></ul>|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</li><li>[**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</li><li>[**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</li><li>[**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</li><li>[**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript)</li></ul> |
-|[🆕 **General document model**](concept-general-document.md)|Extract text, tables, structure, and key-value pairs.|<ul><li>Key-value pair extraction.</li><li>Form processing.</li><li>Survey data collection and analysis.</li></ul>|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#reference-table)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#general-document-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#general-document-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#general-document-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#general-document-model)</li></ul> |
-|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. |<ul><li>Document indexing and retrieval by structure.</li><li>Preprocessing prior to OCR analysis.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#reference-table)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#layout-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#layout-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#layout-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#layout-model)</li></ul>|
-|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.</br></br>Custom model API v3.0 supports **signature detection for custom template (custom form) models**.</br></br>Custom model API v3.0 now supports two model types:ul><li>[**Custom Template model**](concept-custom-template.md) (custom form) is used to analyze structured and semi-structured documents.</li><li> [**Custom Neural model**](concept-custom-neural.md) (custom document) is used to analyze unstructured documents.</li></ul>|<ul><li>Identification and compilation of data, unique to your business, impacted by a regulatory change or market event.</li><li>Identification and analysis of previously overlooked unique data.</li></ul> |[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|
-|[🆕 **W-2 Form**](concept-w2.md) | Extract information reported in each box on a W-2 form.|<ul><li>Automated tax document management.</li><li>Mortgage loan application processing.</li></ul> |<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)<li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul> |
-|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. |<ul><li>Accounts payable processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li></ul>|
-|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.|<ul><li>Expense management.</li><li>Consumer behavior data analysis.</li><li>Customer loyalty program.</li><li>Merchandise return processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li>Know your customer (KYC) financial services guidelines compliance.</li><li>Medical account management.</li><li>Identity checkpoints and gateways.</li><li>Hotel registration.</li></ul> |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.|<ul><li>Sales lead and marketing management.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
+|[**Read**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.| <ul><li>Contract processing.</li><li>Financial or medical report processing.</li></ul>|<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</li><li>[**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</li><li>[**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</li><li>[**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</li><li>[**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript)</li></ul> |
+|[**General document model**](concept-general-document.md)|Extract text, tables, structure, and key-value pairs.|<ul><li>Key-value pair extraction.</li><li>Form processing.</li><li>Survey data collection and analysis.</li></ul>|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li></ul> |
+|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. |<ul><li>Document indexing and retrieval by structure.</li><li>Preprocessing prior to OCR analysis.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li></ul>|
+|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.</br></br>Custom model API v3.0 supports **signature detection for custom template (custom form) models**.</br></br>Custom model API v3.0 now supports two model types:<ul><li>[**Custom Template model**](concept-custom-template.md) (custom form) is used to analyze structured and semi-structured documents.</li><li>[**Custom Neural model**](concept-custom-neural.md) (custom document) is used to analyze unstructured documents.</li></ul>|<ul><li>Identification and compilation of data, unique to your business, impacted by a regulatory change or market event.</li><li>Identification and analysis of previously overlooked unique data.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|
+|[**W-2 Form**](concept-w2.md) | Extract information reported in each box on a W-2 form.|<ul><li>Automated tax document management.</li><li>Mortgage loan application processing.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul> |
+|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. |<ul><li>Accounts payable processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.|<ul><li>Expense management.</li><li>Consumer behavior data analysis.</li><li>Customer loyalty program.</li><li>Merchandise return processing.</li><li>Automated tax recording and reporting.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li>Know your customer (KYC) financial services guidelines compliance.</li><li>Medical account management.</li><li>Identity checkpoints and gateways.</li><li>Hotel registration.</li></ul> |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.|<ul><li>Sales lead and marketing management.</li></ul> |<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
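As a quick orientation to the development options above, here's a minimal Python sketch of calling a v3.0 prebuilt model with the `azure-ai-formrecognizer` package (version 3.2.x or later). The endpoint, key, and document URL are placeholders, and the field names shown (`VendorName`, `InvoiceTotal`) are documented invoice model fields; treat this as an illustrative sketch rather than a full quickstart.

```python
# Minimal sketch: analyzing an invoice with the v3.0 prebuilt invoice model.
# Assumes azure-ai-formrecognizer >= 3.2.0; endpoint, key, and URL are placeholders.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<your-key>"),
)

# Start the long-running analyze operation; the poller handles polling for you.
poller = client.begin_analyze_document_from_url(
    "prebuilt-invoice", "<public-url-to-an-invoice-pdf>"
)
result = poller.result()

for invoice in result.documents:
    vendor = invoice.fields.get("VendorName")
    total = invoice.fields.get("InvoiceTotal")
    if vendor:
        print(f"Vendor: {vendor.value} (confidence {vendor.confidence:.2f})")
    if total:
        print(f"Total: {total.value} (confidence {total.confidence:.2f})")
```

The same pattern applies to the other prebuilt models; only the model ID and the field names in the result change.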
### [Form Recognizer GA (v2.1)](#tab/v2-1)

>[!TIP]
>
- > * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio (preview)](https://formrecognizer.appliedai.azure.com/studio).
+ > * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the API migration guide for detailed information about migrating from v2.1 to v3.0.
The following models are supported by Form Recognizer v2.1. Use the links in the table to learn more about each model and browse the API references.
| Model | Description | Development options |
|-|--|-|
-|[**Layout API**](concept-layout.md) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-layout-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Layout API**](concept-layout.md) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-layout-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
|[**Custom model**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Receipt model**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**ID document model**](concept-id-document.md) | Automated data processing and extraction of key information from US driver's licenses and international passports.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Business card model**](concept-business-card.md) | Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Receipt model**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**ID document model**](concept-id-document.md) | Automated data processing and extraction of key information from US driver's licenses and international passports.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Business card model**](concept-business-card.md) | Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
This documentation contains the following article types:
> [!div class="checklist"]
>
> * Try our [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)
-> * Explore the [**REST API reference documentation**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more.
+> * Explore the [**REST API reference documentation**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more.
> * If you're familiar with a previous version of the API, see the [**What's new**](./whats-new.md) article to learn of recent changes.

### [Form Recognizer v2.1](#tab/v2-1)
applied-ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/overview.md
Previously updated : 06/06/2022 Last updated : 08/22/2022 recommendations: false adobe-target: true
This section helps you decide which Form Recognizer v3.0 supported feature you should use.
| What type of document do you want to analyze? | How is the document formatted? | Your best solution |
|--|-|-|
-|<ul><li>**W-2 Form**</li></yl>| Is your W-2 document composed in United States English (en-US) text?|<ul><li>If **Yes**, use the [**W-2 Form**](concept-w2.md) model.<li>If **No**, use the [**Layout**](concept-layout.md) or [**General document (preview)**](concept-general-document.md) model.</li></ul>|
+|<ul><li>**W-2 Form**</li></ul>| Is your W-2 document composed in United States English (en-US) text?|<ul><li>If **Yes**, use the [**W-2 Form**](concept-w2.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document**](concept-general-document.md) model.</li></ul>|
|<ul><li>**Primarily text content**</li></ul>| Is your document _printed_ in a [supported language](language-support.md#read-layout-and-custom-form-template-model) and are you only interested in text and not tables, selection marks, and the structure?|<ul><li>If **Yes** to text-only extraction, use the [**Read**](concept-read.md) model.</li><li>If **No**, because you also need structure information, use the [**Layout**](concept-layout.md) model.</li></ul>
-|<ul><li>**General structured document**</li></yl>| Is your document mostly structured and does it contain a few fields and values that may not be covered by the other prebuilt models?|<ul><li>If **Yes**, use the [**General document (preview)**](concept-general-document.md) model.</li><li> If **No**, because the fields and values are complex and highly variable, train and build a [**Custom**](how-to-guides/build-custom-model-v3.md) model.</li></ul>
-|<ul><li>**Invoice**</li></yl>| Is your invoice document composed in a [supported language](language-support.md#invoice-model) text?|<ul><li>If **Yes**, use the [**Invoice**](concept-invoice.md) model.<li>If **No**, use the [**Layout**](concept-layout.md) or [**General document (preview)**](concept-general-document.md) model.</li></ul>
-|<ul><li>**Receipt**</li><li>**Business card**</li></ul>| Is your receipt or business card document composed in English text? | <ul><li>If **Yes**, use the [**Receipt**](concept-receipt.md) or [**Business Card**](concept-business-card.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document (preview)**](concept-general-document.md) model.</li></ul>|
-|<ul><li>**ID document**</li></ul>| Is your ID document a US driver's license or an international passport?| <ul><li>If **Yes**, use the [**ID document**](concept-id-document.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document (preview)**](concept-general-document.md) model</li></ul>|
- |<ul><li>**Form** or **Document**</li></ul>| Is your form or document an industry-standard format commonly used in your business or industry?| <ul><li>If **Yes**, use the [**Layout**](concept-layout.md) or [**General document (preview)**](concept-general-document.md).</li><li>If **No**, you can [**Train and build a custom model**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model).
+|<ul><li>**General structured document**</li></ul>| Is your document mostly structured and does it contain a few fields and values that may not be covered by the other prebuilt models?|<ul><li>If **Yes**, use the [**General document**](concept-general-document.md) model.</li><li>If **No**, because the fields and values are complex and highly variable, train and build a [**Custom**](how-to-guides/build-custom-model-v3.md) model.</li></ul>
+|<ul><li>**Invoice**</li></ul>| Is your invoice document written or printed in a [supported language](language-support.md#invoice-model)?|<ul><li>If **Yes**, use the [**Invoice**](concept-invoice.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document**](concept-general-document.md) model.</li></ul>
+|<ul><li>**Receipt**</li><li>**Business card**</li></ul>| Is your receipt or business card document composed in English text? | <ul><li>If **Yes**, use the [**Receipt**](concept-receipt.md) or [**Business Card**](concept-business-card.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document**](concept-general-document.md) model.</li></ul>|
+|<ul><li>**ID document**</li></ul>| Is your ID document a US driver's license or an international passport?| <ul><li>If **Yes**, use the [**ID document**](concept-id-document.md) model.</li><li>If **No**, use the [**Layout**](concept-layout.md) or [**General document**](concept-general-document.md) model.</li></ul>|
+|<ul><li>**Form** or **Document**</li></ul>| Is your form or document an industry-standard format commonly used in your business or industry?| <ul><li>If **Yes**, use the [**Layout**](concept-layout.md) or [**General document**](concept-general-document.md) model.</li><li>If **No**, you can [**Train and build a custom model**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model).</li></ul>|
## Form Recognizer features and development options
-### [Form Recognizer preview (v3.0)](#tab/v3-0)
+### [Form Recognizer v3.0](#tab/v3-0)
The following features and development options are supported by the Form Recognizer service v3.0. Use the links in the table to learn more about each feature and browse the API references. A brief Python sketch of the shared calling pattern follows the table.

| Feature | Description | Development options |
|-|--|-|
-|[🆕 **Read**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</li><li>[**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</li><li>[**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</li><li>[**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</li><li>[**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript)</li></ul> |
-|[🆕 **W-2 Form**](concept-w2.md) | Extract information reported in each box on a W-2 form.|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)<li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul> |
-|[🆕 **General document model**](concept-general-document.md)|Extract text, tables, structure, key-value pairs and, named entities.|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#reference-table)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#general-document-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#general-document-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#general-document-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#general-document-model)</li></ul> |
-|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#reference-table)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#layout-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#layout-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#layout-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#layout-model)</li></ul>|
-|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.<ul><li>Custom model API v3.0 supports **signature detection for custom template (custom form) models**.</br></br></li><li>Custom model API v3.0 offers a new model type **Custom Neural** or custom document to analyze unstructured documents.</li></ul>| [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|
-|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li></ul>|
-|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
+|[**Read**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.|<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-rest-api)</li><li>[**C# SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-csharp)</li><li>[**Python SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-python)</li><li>[**Java SDK**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-java)</li><li>[**JavaScript**](how-to-guides/use-prebuilt-read.md?pivots=programming-language-javascript)</li></ul> |
+|[**W-2 Form**](concept-w2.md) | Extract information reported in each box on a W-2 form.|<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul> |
+|[**General document model**](concept-general-document.md)|Extract text, tables, structure, key-value pairs, and named entities.|<ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#general-document-model)</li></ul> |
+|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#layout-model)</li></ul>|
+|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.<ul><li>Custom model API v3.0 supports **signature detection for custom template (custom form) models**.</li><li>Custom model API v3.0 offers a new model type, **Custom Neural** (custom document), to analyze unstructured documents.</li></ul>| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md)</li></ul>|
+|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
+|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/get-started-v3-sdk-rest-api.md)</li><li>[**C# SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md#prebuilt-model)</li></ul>|
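As referenced above, the prebuilt features share one calling pattern: pass a model ID and a document, then read typed fields from the result. Here's a minimal sketch with the `azure-ai-formrecognizer` 3.2.x package, using the documented receipt fields `MerchantName` and `Total`; the endpoint, key, and URL are placeholders.

```python
# Minimal sketch: analyzing a receipt with the v3.0 prebuilt receipt model.
# Assumes azure-ai-formrecognizer >= 3.2.0; endpoint, key, and URL are placeholders.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<your-key>"),
)

poller = client.begin_analyze_document_from_url(
    "prebuilt-receipt", "<public-url-to-a-receipt-image>"
)
result = poller.result()

for receipt in result.documents:
    merchant = receipt.fields.get("MerchantName")
    total = receipt.fields.get("Total")
    if merchant:
        print(f"Merchant: {merchant.value} (confidence {merchant.confidence:.2f})")
    if total:
        print(f"Total: {total.value} (confidence {total.confidence:.2f})")
```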
### [Form Recognizer GA (v2.1)](#tab/v2-1)
The following features are supported by Form Recognizer v2.1. Use the links in the table to learn more about each feature and browse the API references.
| Feature | Description | Development options |
|-|--|-|
-|[**Layout API**](concept-layout.md) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-layout-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Layout API**](concept-layout.md) | Extraction and analysis of text, selection marks, tables, and bounding box coordinates, from forms and documents. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-layout)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-layout-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
|[**Custom model**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#train-a-custom-form-model)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Receipt model**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**ID document model**](concept-id-document.md) | Automated data processing and extraction of key information from US driver's licenses and international passports.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-|[**Business card model**](concept-business-card.md) | Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Receipt model**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**ID document model**](concept-id-document.md) | Automated data processing and extraction of key information from US driver's licenses and international passports.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+|[**Business card model**](concept-business-card.md) | Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer labeling tool**](quickstarts/try-sample-label-tool.md#analyze-using-a-prebuilt-model)</li><li>[**REST API**](quickstarts/get-started-v2-1-sdk-rest-api.md#try-it-prebuilt-model)</li><li>[**Client-library SDK**](how-to-guides/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
This documentation contains the following article types:
> [!div class="checklist"]
>
> * Try our [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)
-> * Explore the [**REST API reference documentation**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) to learn more.
+> * Explore the [**REST API reference documentation**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument) to learn more.
> * If you're familiar with a previous version of the API, see the [**What's new**](./whats-new.md) article to learn of recent changes.

### [Form Recognizer v2.1](#tab/v2-1)
applied-ai-services Get Started V2 1 Sdk Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/get-started-v2-1-sdk-rest-api.md
+
+ Title: "Quickstart: Form Recognizer client library SDKs | REST API"
+
+description: Use the Form Recognizer client library SDKs or REST API to create a forms processing app that extracts key/value pairs and table data from your custom documents.
+ Last updated : 06/21/2021
+zone_pivot_groups: programming-languages-set-formre
+recommendations: false
+# Get started with Form Recognizer client library SDKs or REST API
+
+Get started with Azure Form Recognizer using the programming language of your choice. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+
+To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
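For orientation, here's a minimal sketch of the v2.1 client-library pattern in Python, using the `FormRecognizerClient` from the `azure-ai-formrecognizer` 3.1.x package. The endpoint, key, and receipt URL are placeholders; see the language-specific quickstart sections for complete walkthroughs.

```python
# Minimal sketch: recognizing a receipt with the v2.1 API via the Python SDK.
# Assumes azure-ai-formrecognizer 3.1.x; endpoint, key, and URL are placeholders.
from azure.ai.formrecognizer import FormRecognizerClient
from azure.core.credentials import AzureKeyCredential

client = FormRecognizerClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<your-key>"),
)

poller = client.begin_recognize_receipts_from_url("<public-url-to-a-receipt-image>")
for receipt in poller.result():
    for name, field in receipt.fields.items():
        print(f"{name}: {field.value} (confidence {field.confidence})")
```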
applied-ai-services Get Started V3 Sdk Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/get-started-v3-sdk-rest-api.md
+
+ Title: "Quickstart: Form Recognizer SDKs | REST API v3.0"
+
+description: Use a Form Recognizer SDK or the REST API to create a forms processing app that extracts key data from your documents.
+ Last updated : 08/22/2022
+zone_pivot_groups: programming-languages-set-formre
+recommendations: false
+# Quickstart: Form Recognizer SDKs | REST API v3.0
+
+Get started with the latest version of Azure Form Recognizer. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, tables, and key data from your documents. You can easily integrate Form Recognizer models into your workflows and applications by using an SDK in the programming language of your choice or by calling the REST API. For this quickstart, we recommend that you use the free service while you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+
+To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
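For orientation, here's a minimal sketch of the v3.0 pattern in Python with the `DocumentAnalysisClient` from the `azure-ai-formrecognizer` 3.2.x package, analyzing a local file with the prebuilt layout model. The endpoint, key, and file name are placeholders; the language-specific quickstart sections cover the full setup.

```python
# Minimal sketch: extracting layout (pages and tables) with the v3.0 API.
# Assumes azure-ai-formrecognizer >= 3.2.0; endpoint, key, and file are placeholders.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

client = DocumentAnalysisClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    AzureKeyCredential("<your-key>"),
)

with open("sample-form.pdf", "rb") as f:
    poller = client.begin_analyze_document("prebuilt-layout", document=f)
result = poller.result()

print(f"Pages analyzed: {len(result.pages)}")
for table in result.tables:
    print(f"Table: {table.row_count} rows x {table.column_count} columns")
```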
+That's it, congratulations!
+
+In this quickstart, you used a Form Recognizer model to analyze various forms and documents. Next, explore the Form Recognizer Studio and reference documentation to learn about the Form Recognizer API in depth.
+
+## Next steps
+
+>[!div class="nextstepaction"]
+> [**Try the Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
+
+> [!div class="nextstepaction"]
+> [**Explore the Form Recognizer REST API v3.0**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Try Sample Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-sample-label-tool.md
keywords: document processing
>[!TIP]
>
-> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio (preview)](https://formrecognizer.appliedai.azure.com/studio).
+> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the API migration guide for detailed information about migrating from v2.1 to v3.0.
-> * *See* our [**REST API**](try-v3-rest-api.md) or [**C#**](try-v3-csharp-sdk.md), [**Java**](try-v3-java-sdk.md), [**JavaScript**](try-v3-javascript-sdk.md), or [Python](try-v3-python-sdk.md) SDK quickstarts to get started with the V3.0 preview.
+> * *See* our [**REST API**](get-started-v3-sdk-rest-api.md) or [**C#**](get-started-v3-sdk-rest-api.md), [**Java**](get-started-v3-sdk-rest-api.md), [**JavaScript**](get-started-v3-sdk-rest-api.md), or [**Python**](get-started-v3-sdk-rest-api.md) SDK quickstarts to get started with v3.0.
The Form Recognizer Sample Labeling tool is an open source tool that enables you to test the latest features of Azure Form Recognizer and Optical Character Recognition (OCR) services.
applied-ai-services Try V3 Form Recognizer Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-form-recognizer-studio.md
- Title: "Quickstart: Form Recognizer Studio | Preview"
+ Title: "Quickstart: Form Recognizer Studio | v3.0"
-description: Form and document processing, data extraction, and analysis using Form Recognizer Studio (preview)
+description: Form and document processing, data extraction, and analysis using Form Recognizer Studio
-# Get started: Form Recognizer Studio | Preview
+# Get started: Form Recognizer Studio | v3.0
->[!NOTE]
-> Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
-[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service in your applications. You can get started by exploring the pre-trained models with sample or your own documents. You can also create projects to build custom template models and reference the models in your applications using the [Python SDK preview](try-v3-python-sdk.md) and other quickstarts.
+[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. You can get started by exploring the pre-trained models with sample documents or your own documents. You can also create projects to build custom template models and reference the models in your applications using the [Python SDK](get-started-v3-sdk-rest-api.md) and other quickstarts.
:::image border="true" type="content" source="../media/quickstarts/form-recognizer-demo-preview3.gif" alt-text="Selecting the Layout API to analyze a newspaper document in the Form Recognizer Studio.":::
Prebuilt models help you add Form Recognizer features to your apps without having to build, train, and publish your own models. You can choose from several prebuilt models, each of which has its own set of supported data fields. The choice of model to use for the analyze operation depends on the type of document to be analyzed. The following prebuilt models are currently supported by Form Recognizer (a short sketch of selecting among them by model ID follows the list):
-* [🆕 **General document**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=document): extract text, tables, structure, key-value pairs and named entities.
-* [🆕**W-2**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2): extract text and key information from W-2 tax forms.
-* [🆕 **Read**](https://formrecognizer.appliedai.azure.com/studio/read): extract text lines, words, their locations, detected languages, and handwritten style if detected from documents (PDF, TIFF) and images (JPG, PNG, BMP).
+* [**General document**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=document): extract text, tables, structure, key-value pairs and named entities.
+* [**W-2**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2): extract text and key information from W-2 tax forms.
+* [**Read**](https://formrecognizer.appliedai.azure.com/studio/read): extract text lines, words, their locations, detected languages, and handwritten style if detected from documents (PDF, TIFF) and images (JPG, PNG, BMP).
* [**Layout**](https://formrecognizer.appliedai.azure.com/studio/layout): extract text, tables, selection marks, and structure information from documents (PDF, TIFF) and images (JPG, PNG, BMP).
* [**Invoice**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice): extract text, selection marks, tables, key-value pairs, and key information from invoices.
* [**Receipt**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt): extract text and key information from receipts.
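As noted above, the choice of prebuilt model depends on the document type. The following hypothetical helper is a Python sketch of selecting a model ID by document type; the model IDs are the ones the v3.0 API exposes, while the mapping and function themselves are illustrative only.

```python
# Hypothetical helper: pick the v3.0 prebuilt model ID for a document type.
# The model IDs are the ones exposed by the v3.0 API; the helper is illustrative.
from azure.ai.formrecognizer import DocumentAnalysisClient

PREBUILT_MODELS = {
    "general document": "prebuilt-document",
    "w-2": "prebuilt-tax.us.w2",
    "read": "prebuilt-read",
    "layout": "prebuilt-layout",
    "invoice": "prebuilt-invoice",
    "receipt": "prebuilt-receipt",
    "id document": "prebuilt-idDocument",
    "business card": "prebuilt-businessCard",
}

def analyze_by_type(client: DocumentAnalysisClient, doc_type: str, url: str):
    """Select the prebuilt model for the document type and analyze from a URL."""
    poller = client.begin_analyze_document_from_url(PREBUILT_MODELS[doc_type], url)
    return poller.result()
```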
To label for signature detection: (Custom form only)
## Next steps

* Follow our [**Form Recognizer v3.0 migration guide**](../v3-migration-guide.md) to learn the differences from the previous version of the REST API.
-* Explore our [**preview SDK quickstarts**](try-v3-python-sdk.md) to try the preview features in your applications using the new SDKs.
-* Refer to our [**preview REST API quickstarts**](try-v3-rest-api.md) to try the preview features using the new RESt API.
+* Explore our [**v3.0 SDK quickstarts**](get-started-v3-sdk-rest-api.md) to try the v3.0 features in your applications using the new SDKs.
+* Refer to our [**v3.0 REST API quickstarts**](get-started-v3-sdk-rest-api.md) to try the v3.0 features using the new REST API.
-[Get started with the Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com).
+[Get started with the Form Recognizer Studio](https://formrecognizer.appliedai.azure.com).
applied-ai-services Try V3 Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-rest-api.md
- Title: "Quickstart: Form Recognizer REST API v3.0 | Preview"-
-description: Form and document processing, data extraction, and analysis using Form Recognizer REST API v3.0 (preview)
----- Previously updated : 06/28/2022---
-# Get started: Form Recognizer REST API 2022-06-30-preview
-
-<!-- markdownlint-disable MD036 -->
-
->[!NOTE]
-> Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-The current API version is **2022-06-30-preview**.
-
-| [Form Recognizer REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) | [Azure SDKs](https://azure.github.io/azure-sdk/releases/latest/index.html) |
-
-Get started with Azure Form Recognizer using the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models using the REST API or by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
-
-To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
-
-## Form Recognizer models
-
- The REST API supports the following models and capabilities:
-
-**Document Analysis**
-
-* 🆕 Read—Analyze and extract printed (typeface) and handwritten text lines, words, locations, and detected languages.
-* 🆕 General document—Analyze and extract text, tables, structure, key-value pairs, and named entities.
-* Layout—Analyze and extract tables, lines, words, and selection marks from documents, without the need to train a model.
-
-**Prebuilt Models**
-
-* 🆕 W-2—Analyze and extract fields from US W-2 tax documents (used to report income), using a pre-trained W-2 model.
-* Invoices—Analyze and extract common fields from invoices, using a pre-trained invoice model.
-* Receipts—Analyze and extract common fields from receipts, using a pre-trained receipt model.
-* ID documents—Analyze and extract common fields from ID documents like passports or driver's licenses, using a pre-trained ID documents model.
-* Business Cards—Analyze and extract common fields from business cards, using a pre-trained business cards model.
-
-**Custom Models**
-
-* Custom—Analyze and extract form fields and other content from your custom forms, using models you trained with your own form types.
-* Composed custom—Compose a collection of custom models and assign them to a single model ID.
-
-## Prerequisites
-
-* Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services)
-
-* curl command line tool installed.
-
- * [Windows](https://curl.haxx.se/windows/)
- * [Mac or Linux](https://learn2torials.com/thread/how-to-install-curl-on-mac-or-linux-(ubuntu)-or-windows)
-
-* **PowerShell version 7.*+** (or a similar command-line application):
- * [Windows](/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.2&preserve-view=true)
- * [macOS](/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.2&preserve-view=true)
- * [Linux](/powershell/scripting/install/installing-powershell-on-linux?view=powershell-7.2&preserve-view=true)
-
-* To check your PowerShell version, type the following:
- * Windows: `Get-Host | Select-Object Version`
- * macOS or Linux: `$PSVersionTable`
-
-* A Form Recognizer (single-service) or Cognitive Services (multi-service) resource. Once you have your Azure subscription, create a [single-service](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [multi-service](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) Form Recognizer resource in the Azure portal to get your key and endpoint. You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
-
-> [!TIP]
-> Create a Cognitive Services resource if you plan to access multiple cognitive services under a single endpoint/key. For Form Recognizer access only, create a Form Recognizer resource. Please note that you'll need a single-service resource if you intend to use [Azure Active Directory authentication](../../../active-directory/authentication/overview-authentication.md).
-
-* After your resource deploys, select **Go to resource**. You need the key and endpoint from the resource you create to connect your application to the Form Recognizer API. You'll paste your key and endpoint into the code below later in the quickstart:
-
- :::image type="content" source="../media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
-
-## Analyze documents and get results
-
- A POST request is used to analyze documents with a prebuilt or custom model. A GET request is used to retrieve the result of a document analysis call. The `modelId` is used with POST and `resultId` with GET operations.
-
-### Analyze document (POST Request)
-
-Before you run the cURL command, make the following changes:
-
-1. Replace `{endpoint}` with the endpoint value from your Azure portal Form Recognizer instance.
-
-1. Replace `{key}` with the key value from your Azure portal Form Recognizer instance.
-
-1. Using the table below as a reference, replace `{modelID}` and `{your-document-url}` with your desired values.
-
-1. You'll need a document file at a URL. For this quickstart, you can use the sample forms provided in the table below for each feature.
-
-> [!IMPORTANT]
-> Remember to remove the key from your code when you're done, and never post it publicly. For production, use a secure way of storing and accessing your credentials like [Azure Key Vault](../../../key-vault/general/overview.md). See the Cognitive Services [security](../../../cognitive-services/cognitive-services-security.md) article for more information.
-
-#### POST request
-
-```bash
-curl -v -i -X POST "{endpoint}/formrecognizer/documentModels/{modelID}:analyze?api-version=2022-06-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {key}" --data-ascii "{'urlSource': '{your-document-url}'}"
-```
-
-#### Reference table
-
-| **Feature** | **{modelID}** | **{your-document-url}** |
-| | |--|
-| General Document | prebuilt-document | [Sample document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-layout.pdf) |
-| Read | prebuilt-read | [Sample document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/read.png) |
-| Layout | prebuilt-layout | [Sample document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/layout.png) |
-| W-2 | prebuilt-tax.us.w2 | [Sample W-2](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/w2.png) |
-| Invoices | prebuilt-invoice | [Sample invoice](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/raw/master/curl/form-recognizer/rest-api/invoice.pdf) |
-| Receipts | prebuilt-receipt | [Sample receipt](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/receipt.png) |
-| ID Documents | prebuilt-idDocument | [Sample ID document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/identity_documents.png) |
-| Business Cards | prebuilt-businessCard | [Sample business card](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/de5e0d8982ab754823c54de47a47e8e499351523/curl/form-recognizer/rest-api/business_card.jpg) |
-
-#### POST response
-
-You'll receive a `202 (Success)` response that includes an **Operation-location** header. The value of this header contains a `resultID` that can be queried to get the status of the asynchronous operation:
--
-### Get analyze results (GET Request)
-
-After you've called the [**Analyze document**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) API, call the [**Get analyze result**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/GetAnalyzeDocumentResult) API to get the status of the operation and the extracted data. Before you run the command, make these changes:
-
-1. Replace `{POST response}` with the **Operation-location** header value from the [POST response](#post-response).
-
-1. Replace `{key}` with the key value from your Form Recognizer instance in the Azure portal.
-
-<!-- markdownlint-disable MD024 -->
-
-#### GET request
-
-```bash
-curl -v -X GET "{POST response}" -H "Ocp-Apim-Subscription-Key: {key}"
-```
-
-#### Examine the response
-
-You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation isn't complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
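To see the full submit-and-poll pattern in one place, here's a minimal Python sketch (an illustration only, assuming the `requests` package is installed; the endpoint and key placeholders come from your Azure portal resource, and the sample invoice URL is taken from the reference table above):

```python
import time
import requests

endpoint = "<your-endpoint>"  # placeholder: your Form Recognizer endpoint
key = "<your-key>"            # placeholder: your Form Recognizer key

# Submit the document for analysis (POST).
response = requests.post(
    f"{endpoint}/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2022-06-30-preview",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"urlSource": "https://github.com/Azure-Samples/cognitive-services-REST-api-samples/raw/master/curl/form-recognizer/rest-api/invoice.pdf"},
)
response.raise_for_status()

# The 202 response carries the result URL in the Operation-Location header.
result_url = response.headers["Operation-Location"]

# Poll the result URL (GET) until the operation finishes,
# waiting at least one second between calls.
while True:
    result = requests.get(result_url, headers={"Ocp-Apim-Subscription-Key": key}).json()
    if result["status"] not in ("notStarted", "running"):
        break
    time.sleep(1)

print(result["status"])  # "succeeded" on success; extracted data is under result["analyzeResult"]
```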
-
-#### Sample response for prebuilt-invoice
-
-```json
-{
- "status": "succeeded",
- "createdDateTime": "2022-03-25T19:31:37Z",
- "lastUpdatedDateTime": "2022-03-25T19:31:43Z",
- "analyzeResult": {
- "apiVersion": "2022-06-30",
- "modelId": "prebuilt-invoice",
- "stringIndexType": "textElements"...
- ..."pages": [
- {
- "pageNumber": 1,
- "angle": 0,
- "width": 8.5,
- "height": 11,
- "unit": "inch",
- "words": [
- {
- "content": "CONTOSO",
- "boundingBox": [
- 0.5911,
- 0.6857,
- 1.7451,
- 0.6857,
- 1.7451,
- 0.8664,
- 0.5911,
- 0.8664
- ],
- "confidence": 1,
- "span": {
- "offset": 0,
- "length": 7
- }
- },
-}
-```
-
-#### Supported document fields
-
-The prebuilt models extract pre-defined sets of document fields. See [Model data extraction](../concept-model-overview.md#model-data-extraction) for extracted field names, types, descriptions, and examples.
-
-## Next steps
-
-In this quickstart, you used the Form Recognizer REST API preview (v3.0) to analyze forms in different ways. Next, further explore the Form Recognizer Studio and latest reference documentation to learn more about the Form Recognizer API.
-
->[!div class="nextstepaction"]
-> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio)
-
-> [!div class="nextstepaction"]
-> [REST API preview (v3.0) reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
applied-ai-services Sdk Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/sdk-overview.md
+
+ Title: What is the Form Recognizer SDK?
+
+description: The Form Recognizer software development kit (SDK) exposes Form Recognizer models, features and capabilities, making it easier to develop document-processing applications.
+++++ Last updated : 08/22/2022+
+recommendations: false
++
+<!-- markdownlint-disable MD024 -->
+<!-- markdownlint-disable MD023 -->
+
+# What is the Form Recognizer SDK?
+
+Azure Cognitive Services Form Recognizer is a cloud service that uses machine learning to analyze text and structured data from documents. The Form Recognizer software development kit (SDK) is a set of libraries and tools that enable you to easily integrate Form Recognizer models and capabilities into your applications. The Form Recognizer SDK is available across platforms in the C#/.NET, Java, JavaScript, and Python programming languages.
+
+## Supported languages
+
+Form Recognizer SDK supports the following languages and platforms:
+
+> [!NOTE]
+>
+> * The current Form Recognizer SDK releases don't yet support Form Recognizer REST API 2022-08-31.
+> * The current SDKs support [REST API 2022-06-30-preview](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument) and earlier API releases.
+
+| Programming language/SDK | Package| Azure SDK client-library |Supported API version| Platform support |
+|:-:|:-|:-| :-|--|
|[C#/4.0.0-beta.5](quickstarts/get-started-v3-sdk-rest-api.md?pivots=programming-language-csharp#set-up)| [NuGet](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.5) | [Azure SDK for .NET](https://azuresdkdocs.blob.core.windows.net/$web/dotnet/Azure.AI.FormRecognizer/4.0.0-beta.5/index.html)|[2022-06-30-preview, 2022-01-30-preview, 2021-09-30-preview, **v2.1-ga**, v2.0](https://westus.dev.cognitive.microsoft.com/docs/services?pattern=form+recognizer) |[Windows, macOS, Linux, Docker](https://dotnet.microsoft.com/download)|
|[Java/4.0.0-beta.6](quickstarts/get-started-v3-sdk-rest-api.md?pivots=programming-language-java#set-up) |[Maven](https://search.maven.org/artifact/com.azure/azure-ai-formrecognizer/4.0.0-beta.5/jar) | [Azure SDK for Java](https://azuresdkdocs.blob.core.windows.net/$web/java/azure-ai-formrecognizer/4.0.0-beta.6/index.html)|[2022-06-30-preview, 2022-01-30-preview, 2021-09-30-preview, **v2.1-ga**, v2.0](https://westus.dev.cognitive.microsoft.com/docs/services?pattern=form+recognizer)|[Windows, macOS, Linux](/java/openjdk/install)|
|[JavaScript/4.0.0-beta.6](quickstarts/get-started-v3-sdk-rest-api.md?pivots=programming-language-javascript#set-up)| [npm](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.6)| [Azure SDK for JavaScript](https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-ai-form-recognizer/4.0.0-beta.6/index.html) | [2022-06-30-preview, 2022-01-30-preview, 2021-09-30-preview, **v2.1-ga**, v2.0](https://westus.dev.cognitive.microsoft.com/docs/services?pattern=form+recognizer) | [Browser, Windows, macOS, Linux](https://nodejs.org/en/download/) |
|[Python/3.2.0b6](quickstarts/get-started-v3-sdk-rest-api.md?pivots=programming-language-python#set-up) | [PyPI](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b6/)| [Azure SDK for Python](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-ai-formrecognizer/3.2.0b6/index.html)| [2022-06-30-preview, 2022-01-30-preview, 2021-09-30-preview, **v2.1-ga**, v2.0](https://westus.dev.cognitive.microsoft.com/docs/services?pattern=form+recognizer) |[Windows, macOS, Linux](/azure/developer/python/configure-local-development-environment?tabs=windows%2Capt%2Ccmd#use-the-azure-cli)|
+
+## How to use the Form Recognizer SDK in your applications
+
+The Form Recognizer SDK enables the use and management of the Form Recognizer service in your application. The SDK builds on the underlying Form Recognizer REST API, allowing you to easily use those APIs within your programming language paradigm. Here's how to use the Form Recognizer SDK for your preferred language:
+
+### 1. Install the SDK client library
+
+### [C#/.NET](#tab/csharp)
+
+```dotnetcli
+dotnet add package Azure.AI.FormRecognizer --version 4.0.0-beta.5
+```
+
+```powershell
+Install-Package Azure.AI.FormRecognizer -Version 4.0.0-beta.5
+```
+
+### [Java](#tab/java)
+
+```xml
+ <dependency>
+ <groupId>com.azure</groupId>
+ <artifactId>azure-ai-formrecognizer</artifactId>
+ <version>4.0.0-beta.5</version>
+ </dependency>
+```
+
+```kotlin
+implementation("com.azure:azure-ai-formrecognizer:4.0.0-beta.5")
+```
+
+### [JavaScript](#tab/javascript)
+
+```console
+npm i @azure/ai-form-recognizer@4.0.0-beta.6
+```
+
+### [Python](#tab/python)
+
+```console
+pip install azure-ai-formrecognizer==3.2.0b6
+```
+++
+### 2. Import the SDK client library into your application
+
+### [C#/.NET](#tab/csharp)
+
+```csharp
+using Azure;
+using Azure.AI.FormRecognizer.DocumentAnalysis;
+```
+
+### [Java](#tab/java)
+
+```java
+import com.azure.ai.formrecognizer.*;
+import com.azure.ai.formrecognizer.models.*;
+import com.azure.ai.formrecognizer.DocumentAnalysisClient.*;
+
+import com.azure.core.credential.AzureKeyCredential;
+```
+
+### [JavaScript](#tab/javascript)
+
+```javascript
+const { AzureKeyCredential, DocumentAnalysisClient } = require("@azure/ai-form-recognizer");
+```
+
+### [Python](#tab/python)
+
+```python
+from azure.ai.formrecognizer import DocumentAnalysisClient
+from azure.core.credentials import AzureKeyCredential
+```
+++
+### 3. Set up authentication
+
+There are two supported methods for authentication:
+
+* Use a [Form Recognizer API key](#use-your-api-key) with AzureKeyCredential from azure.core.credentials.
+
+* Use a [token credential from azure-identity](#use-an-azure-active-directory-azure-ad-token-credential) to authenticate with [Azure Active Directory](/azure/active-directory/fundamentals/active-directory-whatis).
+
+#### Use your API key
+
+Here's where to find your Form Recognizer API key in the Azure portal:
++
+### [C#/.NET](#tab/csharp)
+
+```csharp
+
+//set `<your-endpoint>` and `<your-key>` variables with the values from the Azure portal to create your `AzureKeyCredential` and `DocumentAnalysisClient` instance
+string key = "<your-key>";
+string endpoint = "<your-endpoint>";
+AzureKeyCredential credential = new AzureKeyCredential(key);
+DocumentAnalysisClient client = new DocumentAnalysisClient(new Uri(endpoint), credential);
+```
+
+### [Java](#tab/java)
+
+```java
+
+// create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
+DocumentAnalysisClient client = new DocumentAnalysisClientBuilder()
+ .credential(new AzureKeyCredential("<your-key>"))
+ .endpoint("<your-endpoint>")
+ .buildClient();
+```
+
+### [JavaScript](#tab/javascript)
+
+```javascript
+
+// create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
+async function main() {
+  const client = new DocumentAnalysisClient("<your-endpoint>", new AzureKeyCredential("<your-key>"));
+  // call analysis methods on the client here
+}
+
+main().catch(console.error);
+```
+
+### [Python](#tab/python)
+
+```python
+
+# create your `DocumentAnalysisClient` instance and `AzureKeyCredential` variable
+document_analysis_client = DocumentAnalysisClient(endpoint="<your-endpoint>", credential=AzureKeyCredential("<your-key>"))
+```
+++
+#### Use an Azure Active Directory (Azure AD) token credential
+
+> [!NOTE]
+> Regional endpoints do not support AAD authentication. Create a [custom subdomain](/azure/cognitive-services/authentication?tabs=powershell#create-a-resource-with-a-custom-subdomain) for your resource in order to use this type of authentication.
+
+Authorization is easiest using the `DefaultAzureCredential`. It provides a default token credential, based upon the running environment, capable of handling most Azure authentication scenarios.
+
+### [C#/.NET](#tab/csharp)
+
+Here's how to acquire and use the [DefaultAzureCredential](/dotnet/api/azure.identity.defaultazurecredential?view=azure-dotnet&preserve-view=true) for .NET applications:
+
+1. Install the [Azure Identity library for .NET](/dotnet/api/overview/azure/identity-readme):
+
+ ```console
+ dotnet add package Azure.Identity
+ ```
+
+ ```powershell
+ Install-Package Azure.Identity
+ ```
+
+1. [Register an Azure AD application and create a new service principal](/azure/cognitive-services/authentication?tabs=powershell#assign-a-role-to-a-service-principal).
+
+1. Grant access to Form Recognizer by assigning the **`Cognitive Services User`** role to your service principal.
+
+1. Set the values of the client ID, tenant ID, and client secret in the Azure AD application as environment variables: **`AZURE_CLIENT_ID`**, **`AZURE_TENANT_ID`**, and **`AZURE_CLIENT_SECRET`**, respectively.
+
+1. Create your **`DocumentAnalysisClient`** instance including the **`DefaultAzureCredential`**:
+
+ ```csharp
+ string endpoint = "<your-endpoint>";
+ var client = new DocumentAnalysisClient(new Uri(endpoint), new DefaultAzureCredential());
+ ```
+
+For more information, *see* [Authenticate the client](https://github.com/Azure/azure-sdk-for-net/tree/Azure.AI.FormRecognizer_4.0.0-beta.4/sdk/formrecognizer/Azure.AI.FormRecognizer#authenticate-the-client)
+
+### [Java](#tab/java)
+
+Here's how to acquire and use the [DefaultAzureCredential](/java/api/com.azure.identity.defaultazurecredential?view=azure-java-stable&preserve-view=true) for Java applications:
+
+1. Install the [Azure Identity library for Java](/java/api/overview/azure/identity-readme?view=azure-java-stable&preserve-view=true):
+
+ ```xml
+ <dependency>
+ <groupId>com.azure</groupId>
+ <artifactId>azure-identity</artifactId>
+ <version>1.5.3</version>
+ </dependency>
+ ```
+
+1. [Register an Azure AD application and create a new service principal](/azure/cognitive-services/authentication?tabs=powershell#assign-a-role-to-a-service-principal).
+
+1. Grant access to Form Recognizer by assigning the **`Cognitive Services User`** role to your service principal.
+
+1. Set the values of the client ID, tenant ID, and client secret of the Azure AD application as environment variables: **`AZURE_CLIENT_ID`**, **`AZURE_TENANT_ID`**, and **`AZURE_CLIENT_SECRET`**, respectively.
+
+1. Create your **`DocumentAnalysisClient`** instance and **`TokenCredential`** variable:
+
+ ```java
+ TokenCredential credential = new DefaultAzureCredentialBuilder().build();
+ DocumentAnalysisClient documentAnalysisClient = new DocumentAnalysisClientBuilder()
+ .endpoint("{your-endpoint}")
+ .credential(credential)
+ .buildClient();
+ ```
+
+For more information, *see* [Authenticate the client](https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/formrecognizer/azure-ai-formrecognizer#authenticate-the-client)
+
+### [JavaScript](#tab/javascript)
+
+Here's how to acquire and use the [DefaultAzureCredential](/javascript/api/@azure/identity/defaultazurecredential?view=azure-node-latest&preserve-view=true) for JavaScript applications:
+
+1. Install the [Azure Identity library for JavaScript](/javascript/api/overview/azure/identity-readme?view=azure-node-latest&preserve-view=true):
+
    ```console
+ npm install @azure/identity
+ ```
+
+1. [Register an Azure AD application and create a new service principal](/azure/cognitive-services/authentication?tabs=powershell#assign-a-role-to-a-service-principal).
+
+1. Grant access to Form Recognizer by assigning the **`Cognitive Services User`** role to your service principal.
+
+1. Set the values of the client ID, tenant ID, and client secret of the Azure AD application as environment variables: **`AZURE_CLIENT_ID`**, **`AZURE_TENANT_ID`**, and **`AZURE_CLIENT_SECRET`**, respectively.
+
+1. Create your **`DocumentAnalysisClient`** instance including the **`DefaultAzureCredential`**:
+
+ ```javascript
+ const { DocumentAnalysisClient } = require("@azure/ai-form-recognizer");
+ const { DefaultAzureCredential } = require("@azure/identity");
+
+ const client = new DocumentAnalysisClient("<your-endpoint>", new DefaultAzureCredential());
+ ```
+
+For more information, *see* [Create and authenticate a client](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/formrecognizer/ai-form-recognizer#create-and-authenticate-a-client).
+
+### [Python](#tab/python)
+
+Here's how to acquire and use the [DefaultAzureCredential](/python/api/azure-identity/azure.identity.defaultazurecredential?view=azure-python&preserve-view=true) for Python applications.
+
+1. Install the [Azure Identity library for Python](/python/api/overview/azure/identity-readme?view=azure-python&preserve-view=true):
+
    ```console
+ pip install azure-identity
+ ```
+1. [Register an Azure AD application and create a new service principal](/azure/cognitive-services/authentication?tabs=powershell#assign-a-role-to-a-service-principal).
+
+1. Grant access to Form Recognizer by assigning the **`Cognitive Services User`** role to your service principal.
+
+1. Set the values of the client ID, tenant ID, and client secret of the Azure AD application as environment variables: **`AZURE_CLIENT_ID`**, **`AZURE_TENANT_ID`**, and **`AZURE_CLIENT_SECRET`**, respectively.
+
+1. Create your **`DocumentAnalysisClient`** instance including the **`DefaultAzureCredential`**:
+
+ ```python
+ from azure.identity import DefaultAzureCredential
+ from azure.ai.formrecognizer import DocumentAnalysisClient
+
+ credential = DefaultAzureCredential()
+ document_analysis_client = DocumentAnalysisClient(
+ endpoint="https://<my-custom-subdomain>.cognitiveservices.azure.com/",
+ credential=credential
+ )
+ ```
+
+For more information, *see* [Authenticate the client](https://github.com/Azure/azure-sdk-for-python/tree/azure-ai-formrecognizer_3.2.0b5/sdk/formrecognizer/azure-ai-formrecognizer#authenticate-the-client)
+++
+### 4. Build your application
+
+First, you'll create a client object to interact with the Form Recognizer service, and then call methods on that client object to analyze documents and manage models. The SDKs provide both synchronous and asynchronous methods. For more insight, try a [quickstart](quickstarts/get-started-v3-sdk-rest-api.md) in a language of your choice.
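For example, here's a minimal end-to-end sketch using the Python client library (`azure-ai-formrecognizer` 3.2.0b6); the endpoint and key values are placeholders you supply, and the sample document URL is one of the Form Recognizer sample files:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

endpoint = "<your-endpoint>"  # placeholder: from the Azure portal
key = "<your-key>"            # placeholder: from the Azure portal

client = DocumentAnalysisClient(endpoint, AzureKeyCredential(key))

# Start a long-running analysis with the prebuilt layout model, then wait for the result.
poller = client.begin_analyze_document_from_url(
    "prebuilt-layout",
    "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/rest-api/layout.png",
)
result = poller.result()

# Walk the extracted pages and print each recognized line of text.
for page in result.pages:
    print(f"Page {page.page_number} has {len(page.lines)} lines")
    for line in page.lines:
        print(f"  {line.content}")
```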
+
+## Help options
+
+The [Microsoft Q&A](/answers/topics/azure-form-recognizer.html) and [Stack Overflow](https://stackoverflow.com/questions/tagged/azure-form-recognizer) forums are available for the developer community to ask and answer questions about Azure Form Recognizer and other services. Microsoft monitors the forums and replies to questions that the community has yet to answer. To make sure that we see your question, tag it with **`azure-form-recognizer`**.
+
+## Next steps
+
+>[!div class="nextstepaction"]
+> [**Try a Form Recognizer quickstart**](quickstarts/get-started-v3-sdk-rest-api.md)
+
+> [!div class="nextstepaction"]
+> [**Explore the Form Recognizer REST API v3.0**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
applied-ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/service-limits.md
Previously updated : 06/06/2022 Last updated : 08/22/2022
This article contains a quick reference and the **detailed description** of Azure Form Recognizer service Quotas and Limits for all [pricing tiers](https://azure.microsoft.com/pricing/details/form-recognizer/). It also contains some best practices to avoid request throttling.
-For the usage with [Form Recognizer SDK](quickstarts/try-v3-csharp-sdk.md), [Form Recognizer REST API](quickstarts/try-v3-rest-api.md), [Form Recognizer Studio](quickstarts/try-v3-form-recognizer-studio.md) and [Sample Labeling Tool](https://fott-2-1.azurewebsites.net/).
+These quotas and limits apply to usage with the [Form Recognizer SDK](quickstarts/get-started-v3-sdk-rest-api.md), [Form Recognizer REST API](quickstarts/get-started-v3-sdk-rest-api.md), [Form Recognizer Studio](quickstarts/try-v3-form-recognizer-studio.md), and the [Sample Labeling Tool](https://fott-2-1.azurewebsites.net/).
| Quota | Free (F0)<sup>1</sup> | Standard (S0) |
|--|--|--|
For the usage with [Form Recognizer SDK](quickstarts/try-v3-csharp-sdk.md), [For
| **Max number of Neural models** | 100 | 500 |
| Adjustable | No | No |
-# [Form Recognizer v3.0 (Preview)](#tab/v30)
+# [Form Recognizer v3.0](#tab/v30)
| Quota | Free (F0)<sup>1</sup> | Standard (S0) |
|--|--|--|
For the usage with [Form Recognizer SDK](quickstarts/try-v3-csharp-sdk.md), [For
<sup>3</sup> Open a support request to increase the monthly training limit.
-# [Form Recognizer v2.1 (GA)](#tab/v21)
+# [Form Recognizer v2.1](#tab/v21)
| Quota | Free (F0)<sup>1</sup> | Standard (S0) |
|--|--|--|
Generally, it's highly recommended to test the workload and the workload pattern before going to production.
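When a client does hit these limits, a simple mitigation is to back off and retry. Here's a minimal Python sketch of that pattern (an illustration only, assuming the `requests` package and that throttled calls return HTTP 429, optionally with a `Retry-After` header):

```python
import time
import requests

def post_with_backoff(url, headers, body, max_retries=5):
    """POST a request, retrying with exponential backoff while throttled (HTTP 429)."""
    delay = 1.0
    for _ in range(max_retries):
        response = requests.post(url, headers=headers, json=body)
        if response.status_code != 429:
            return response
        # Honor the server's Retry-After hint when present; otherwise back off exponentially.
        time.sleep(float(response.headers.get("Retry-After", delay)))
        delay *= 2
    raise RuntimeError("Request still throttled after retries")
```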
## Next steps

> [!div class="nextstepaction"]
-> [Learn about error codes and troubleshooting](preview-error-guide.md)
+> [Learn about error codes and troubleshooting](v3-error-guide.md)
applied-ai-services Supervised Table Tags https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/supervised-table-tags.md
>[!TIP] >
-> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio (preview)](https://formrecognizer.appliedai.azure.com/studio).
+> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
-> * *See* our [**REST API**](quickstarts/try-v3-rest-api.md) or [**C#**](quickstarts/try-v3-csharp-sdk.md), [**Java**](quickstarts/try-v3-java-sdk.md), [**JavaScript**](quickstarts/try-v3-javascript-sdk.md), or [Python](quickstarts/try-v3-python-sdk.md) SDK quickstarts to get started with the V3.0 preview.
+> * *See* our [**REST API**](quickstarts/get-started-v3-sdk-rest-api.md) or [**C#**](quickstarts/get-started-v3-sdk-rest-api.md), [**Java**](quickstarts/get-started-v3-sdk-rest-api.md), [**JavaScript**](quickstarts/get-started-v3-sdk-rest-api.md), or [**Python**](quickstarts/get-started-v3-sdk-rest-api.md) SDK quickstarts to get started with v3.0.
In this article, you'll learn how to train your custom template model with table tags (labels). Some scenarios require more complex labeling than simply aligning key-value pairs. Such scenarios include extracting information from forms with complex hierarchical structures or encountering items that aren't automatically detected and extracted by the service. In these cases, you can use table tags to train your custom template model.
applied-ai-services Tutorial Logic Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/tutorial-logic-apps.md
Previously updated : 01/11/2022 Last updated : 08/22/2022 recommendations: false #Customer intent: As a form-processing software developer, I want to learn how to use the Form Recognizer service with Logic Apps.
recommendations: false
> [!IMPORTANT] >
-> This tutorial and the Logic App Form Recognizer connector targets Form Recognizer REST API v2.1.
+> This tutorial and the Logic App Form Recognizer connector target Form Recognizer REST API v2.1 and must be used in conjunction with the [FOTT sample labeling tool](https://fott-2-1.azurewebsites.net/).
Azure Logic Apps is a cloud-based platform that can be used to automate workflows without writing a single line of code. The platform enables you to easily integrate Microsoft and third-party applications with your apps, data, services, and systems. A Logic App is the Azure resource you create when you want to develop a workflow. Here are a few examples of what you can do with a Logic App:
For more information, *see* [Logic Apps Overview](../../logic-apps/logic-apps-ov
## Prerequisites
-To complete this tutorial, You'll need the following resources:
+To complete this tutorial, you'll need the following resources:
* **An Azure subscription**. You can [create a free Azure subscription](https://azure.microsoft.com/free/cognitive-services/)
To complete this tutorial, You'll need the following resources:
1. After the resource deploys, select **Go to resource**.
- 1. Copy the **Keys and Endpoint** values from the resource you created and paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API.
+ 1. Copy the **Keys and Endpoint** values from your resource in the Azure portal and paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API.
:::image border="true" type="content" source="media/containers/keys-and-endpoint.png" alt-text="Still photo showing how to access resource key and endpoint URL."::: > [!TIP]
- > For further guidance, *see* [**create a Form Recognizer resource**](create-a-form-recognizer-resource.md).
+ > For more information, *see* [**create a Form Recognizer resource**](create-a-form-recognizer-resource.md).
* A free [**OneDrive**](https://onedrive.live.com/signup) or [**OneDrive for Business**](https://www.microsoft.com/microsoft-365/onedrive/onedrive-for-business) cloud storage account.
At this point, you should have a Form Recognizer resource and a OneDrive folder
1. A short validation check should run. After it completes successfully, select **Create** in the bottom-left corner.
-1. You will be redirected to a screen that says **Deployment in progress**. Give Azure some time to deploy; it can take a few minutes. After the deployment is complete, you should see a banner that says, **Your deployment is complete**. When you reach this screen, select **Go to resource**.
+1. You'll be redirected to a screen that says **Deployment in progress**. Give Azure some time to deploy; it can take a few minutes. After the deployment is complete, you should see a banner that says, **Your deployment is complete**. When you reach this screen, select **Go to resource**.
:::image border="true" type="content" source="media/logic-apps-tutorial/logic-app-connector-demo-seven.gif" alt-text="GIF showing how to get to newly created Logic App resource.":::
-1. You'll be redirected to the **Logic Apps Designer** page. There is a short video for a quick introduction to Logic Apps available on the home screen. When you're ready to begin designing your Logic App, select the **Blank Logic App** button.
+1. You'll be redirected to the **Logic Apps Designer** page. There's a short video for a quick introduction to Logic Apps available on the home screen. When you're ready to begin designing your Logic App, select the **Blank Logic App** button.
:::image border="true" type="content" source="media/logic-apps-tutorial/logic-app-connector-demo-eight.png" alt-text="Image showing how to enter the Logic App Designer.":::
At this point, you should have a Form Recognizer resource and a OneDrive folder
## Create automation flow
-Now that you have the Logic App connector resource set up and configured, the only thing left to do is to create the automation flow and test it out!
+Now that you have the Logic App connector resource set up and configured, the only thing left is to create the automation flow and test it out!
1. Search for and select **OneDrive** or **OneDrive for Business** in the search bar.
Now that you have the Logic App connector resource set up and configured, the on
1. Next, we're going to add a new step to the workflow. Select the plus button underneath the newly created OneDrive node.
-1. A new node should be added to the Logic App designer view. Search for "Form Recognizer" in the search bar and select **Analyze invoice (preview)** from the list.
+1. A new node should be added to the Logic App designer view. Search for "Form Recognizer" in the search bar and select **Analyze invoice** from the list.
-1. Now, you should see a window where you will create your connection. Specifically, you're going to connect your Form Recognizer resource to the Logic Apps Designer Studio:
+1. Now, you should see a window where you'll create your connection. Specifically, you're going to connect your Form Recognizer resource to the Logic Apps Designer Studio:
* Enter a **Connection name**. It should be something easy to remember.
* Enter the Form Recognizer resource **Endpoint URL** and **Account Key** that you copied previously. If you skipped this step earlier or lost the strings, you can navigate back to your Form Recognizer resource and copy them again. When you're done, select **Create**.
applied-ai-services V3 Error Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/v3-error-guide.md
+
+ Title: "Reference: Form Recognizer Errors"
+
+description: Learn how errors are represented in Form Recognizer and find a list of possible errors returned by the service.
+++++ Last updated : 08/22/2022+++
+# Form Recognizer error guide v3.0
+
+Form Recognizer uses a unified design to represent all errors encountered in the REST APIs. Whenever an API operation returns a 4xx or 5xx status code, additional information about the error is returned in the response JSON body as follows:
+
+```json
+{
+ "error": {
+ "code": "InvalidRequest",
+ "message": "Invalid request.",
+ "innererror": {
+ "code": "InvalidContent",
+ "message": "The file format is unsupported or corrupted. Refer to documentation for the list of supported formats."
+ }
+ }
+}
+```
+
+For long-running operations where multiple errors may be encountered, the top-level error code is set to the most severe error, with the individual errors listed under the *error.details* property. In such scenarios, the *target* property of each individual error specifies the trigger of the error.
+
+```json
+{
+ "status": "failed",
+ "createdDateTime": "2021-07-14T10:17:51Z",
+ "lastUpdatedDateTime": "2021-07-14T10:17:51Z",
+ "error": {
+ "code": "InternalServerError",
+ "message": "An unexpected error occurred.",
+ "details": [
+ {
+ "code": "InternalServerError",
+ "message": "An unexpected error occurred."
+ },
+ {
+ "code": "InvalidContentDimensions",
+ "message": "The input image dimensions are out of range. Refer to documentation for supported image dimensions.",
+ "target": "2"
+ }
+ ]
+ }
+}
+```
+
+The top-level *error.code* property can be one of the following error code messages:
+
+| Error Code | Message | Http Status |
+| -- | -- | -- |
+| InvalidRequest | Invalid request. | 400 |
+| InvalidArgument | Invalid argument. | 400 |
+| Forbidden | Access forbidden due to policy or other configuration. | 403 |
+| NotFound | Resource not found. | 404 |
+| MethodNotAllowed | The requested HTTP method is not allowed. | 405 |
+| Conflict | The request could not be completed due to a conflict. | 409 |
+| UnsupportedMediaType | Request content type is not supported. | 415 |
+| InternalServerError | An unexpected error occurred. | 500 |
+| ServiceUnavailable | A transient error has occurred. Try again. | 503 |
+
+When possible, more details are specified in the *inner error* property.
+
+| Top Error Code | Inner Error Code | Message |
+| -- | - | - |
+| Conflict | ModelExists | A model with the provided name already exists. |
+| Forbidden | AuthorizationFailed | Authorization failed: {details} |
+| Forbidden | InvalidDataProtectionKey | Data protection key is invalid: {details} |
+| Forbidden | OutboundAccessForbidden | The request contains a domain name that is not allowed by the current access control policy. |
+| InternalServerError | Unknown | Unknown error. |
+| InvalidArgument | InvalidContentSourceFormat | Invalid content source: {details} |
+| InvalidArgument | InvalidParameter | The parameter {parameterName} is invalid: {details} |
+| InvalidArgument | InvalidParameterLength | Parameter {parameterName} length must not exceed {maxChars} characters. |
+| InvalidArgument | InvalidSasToken | The shared access signature (SAS) is invalid: {details} |
+| InvalidArgument | ParameterMissing | The parameter {parameterName} is required. |
+| InvalidRequest | ContentSourceNotAccessible | Content is not accessible: {details} |
+| InvalidRequest | ContentSourceTimeout | Timeout while receiving the file from client. |
+| InvalidRequest | DocumentModelLimit | Account cannot create more than {maximumModels} models. |
+| InvalidRequest | DocumentModelLimitNeural | Account cannot create more than 10 custom neural models per month. Please contact support to request additional capacity. |
+| InvalidRequest | DocumentModelLimitComposed | Account cannot create a model with more than {details} component models. |
+| InvalidRequest | InvalidContent | The file is corrupted or format is unsupported. Refer to documentation for the list of supported formats. |
+| InvalidRequest | InvalidContentDimensions | The input image dimensions are out of range. Refer to documentation for supported image dimensions. |
+| InvalidRequest | InvalidContentLength | The input image is too large. Refer to documentation for the maximum file size. |
+| InvalidRequest | InvalidFieldsDefinition | Invalid fields: {details} |
+| InvalidRequest | InvalidTrainingContentLength | Training content contains {bytes} bytes. Training is limited to {maxBytes} bytes. |
+| InvalidRequest | InvalidTrainingContentPageCount | Training content contains {pages} pages. Training is limited to {pages} pages. |
+| InvalidRequest | ModelAnalyzeError | Could not analyze using a custom model: {details} |
+| InvalidRequest | ModelBuildError | Could not build the model: {details} |
+| InvalidRequest | ModelComposeError | Could not compose the model: {details} |
+| InvalidRequest | ModelNotReady | Model is not ready for the requested operation. Wait for training to complete or check for operation errors. |
+| InvalidRequest | ModelReadOnly | The requested model is read-only. |
+| InvalidRequest | NotSupportedApiVersion | The requested operation requires {minimumApiVersion} or later. |
+| InvalidRequest | OperationNotCancellable | The operation can no longer be canceled. |
+| InvalidRequest | TrainingContentMissing | Training data is missing: {details} |
+| InvalidRequest | UnsupportedContent | Content is not supported: {details} |
+| NotFound | ModelNotFound | The requested model was not found. It may have been deleted or is still building. |
+| NotFound | OperationNotFound | The requested operation was not found. The identifier may be invalid or the operation may have expired. |
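To illustrate how an application might surface this error shape, here's a minimal Python sketch. It assumes you're calling the REST API with the `requests` package; the helper name is hypothetical, and the field names follow the JSON structure shown above:

```python
import requests

def raise_for_form_recognizer_error(response: requests.Response) -> None:
    """Raise a readable exception built from a Form Recognizer 4xx/5xx response body."""
    if response.ok:
        return
    error = response.json().get("error", {})
    inner = error.get("innererror", {})
    # Collect per-item errors reported for long-running operations.
    details = [
        f"{d.get('code')}: {d.get('message')} (target={d.get('target')})"
        for d in error.get("details", [])
    ]
    message = f"{error.get('code')}: {error.get('message')}"
    if inner:
        message += f" | inner {inner.get('code')}: {inner.get('message')}"
    if details:
        message += " | details: " + "; ".join(details)
    raise RuntimeError(message)
```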
applied-ai-services V3 Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/v3-migration-guide.md
Previously updated : 07/20/2022 Last updated : 08/22/2022 recommendations: false
-# Form Recognizer v3.0 migration | Preview
+# Form Recognizer v3.0 migration
> [!IMPORTANT]
>
> Form Recognizer REST API v3.0 introduces breaking changes in the REST API request and analyze response JSON.
-Form Recognizer v3.0 (preview) introduces several new features and capabilities:
+Form Recognizer v3.0 introduces several new features and capabilities:
-* [Form Recognizer REST API](quickstarts/try-v3-rest-api.md) has been redesigned for better usability.
+* [Form Recognizer REST API](quickstarts/get-started-v3-sdk-rest-api.md) has been redesigned for better usability.
* [**General document (v3.0)**](concept-general-document.md) model is a new API that extracts text, tables, structure, and key-value pairs, from forms and documents.
-* [**Custom document model (v3.0)**](concept-custom-neural.md) is a new custom model type to extract fields from structured and unstructured documents.
+* [**Custom neural model (v3.0)**](concept-custom-neural.md) is a new custom model type to extract fields from structured and unstructured documents.
* [**Receipt (v3.0)**](concept-receipt.md) model supports single-page hotel receipt processing. * [**ID document (v3.0)**](concept-id-document.md) model supports endorsements, restrictions, and vehicle classification extraction from US driver's licenses. * [**Custom model API (v3.0)**](concept-custom.md) supports signature detection for custom template models.
In this article, you'll learn the differences between Form Recognizer v2.1 and v
> [!CAUTION]
>
-> * REST API **2022-06-30-preview** release includes a breaking change in the REST API analyze response JSON.
+> * REST API **2022-08-31** release includes a breaking change in the REST API analyze response JSON.
> * The `boundingBox` property is renamed to `polygon` in each instance. ## Changes to the REST API endpoints
In this article, you'll learn the differences between Form Recognizer v2.1 and v
### POST request ```http
-https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-06-30
+https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-08-31
``` ### GET request ```http
-https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/AnalyzeResult/{resultId}?api-version=2022-06-30
+https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/AnalyzeResult/{resultId}?api-version=2022-08-31
``` ### Analyze operation
https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/
* The request payload and call pattern remain unchanged.
* The Analyze operation specifies the input document and content-specific configurations, and returns the analyze result URL via the Operation-Location header in the response.
* Poll this analyze result URL via a GET request to check the status of the analyze operation (minimum recommended interval between requests is 1 second).
-* Upon success, status is set to succeeded and [analyzeResult](#changes-to-analyze-result) is returned in the response body. If errors are encountered, status will be set to failed and an error will be returned.
+* Upon success, status is set to succeeded and [analyzeResult](#changes-to-analyze-result) is returned in the response body. If errors are encountered, status will be set to `failed`, and an error will be returned.
| Model | v2.1 | v3.0 |
|:--| :--| :--|
| **Request URL prefix**| **https://{your-form-recognizer-endpoint}/formrecognizer/v2.1** | **https://{your-form-recognizer-endpoint}/formrecognizer** |
-|🆕 **General document**|N/A|`/documentModels/prebuilt-document:analyze` |
+| **General document**|N/A|`/documentModels/prebuilt-document:analyze` |
| **Layout**| /layout/analyze |`/documentModels/prebuilt-layout:analyze`|
|**Custom**| /custom/{modelId}/analyze |`/documentModels/{modelId}:analyze` |
| **Invoice** | /prebuilt/invoice/analyze | `/documentModels/prebuilt-invoice:analyze` |
Base64 encoding is also supported in Form Recognizer v3.0:

```json
{
  "base64Source": "{base64EncodedContent}"
}
```
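As an illustration, here's a minimal Python sketch that sends a local file this way (assuming the `requests` package; the endpoint, key, and file name are placeholders):

```python
import base64
import requests

endpoint = "<your-endpoint>"  # placeholder
key = "<your-key>"            # placeholder

# Read the document and base64-encode its bytes for the request body.
with open("sample-invoice.pdf", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"{endpoint}/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2022-08-31",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"base64Source": encoded},
)
# Poll the Operation-Location URL for the analysis result.
print(response.headers["Operation-Location"])
```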
-### Additional parameters
+### Additional supported parameters
Parameters that continue to be supported:
Analyze response has been refactored to the following top-level results to suppo
{ // Basic analyze result metadata
-"apiVersion": "2022-06-30", // REST API version used
+"apiVersion": "2022-08-31", // REST API version used
"modelId": "prebuilt-invoice", // ModelId used "stringIndexType": "textElements", // Character unit used for string offsets and lengths: // textElements, unicodeCodePoint, utf16CodeUnit // Concatenated content in global reading order across pages.
Analyze response has been refactored to the following top-level results to suppo
"boundingRegions": [ // Polygons or Bounding boxes potentially across pages covered by table { "pageNumber": 1, // 1-indexed page number
-"polygon": [ ... ], // Previously Bounding box, renamed to polygon in the 2022-06-30-preview API
+"polygon": [ ... ], // Previously Bounding box, renamed to polygon in the 2022-08-31 API
} ], "spans": [ ... ], // Parts of top-level content covered by table // List of cells in table
The model object has three updates in the new API:

* ```modelId``` is now a property that can be set on a model for a human-readable name.
* ```modelName``` has been renamed to ```description```.
-* ```buildMode``` is a new property with values of ```template``` for custom form models or ```neural``` for custom document models.
+* ```buildMode``` is a new property with values of ```template``` for custom form models or ```neural``` for custom neural models.
-The ```build``` operation is invoked to train a model. The request payload and call pattern remain unchanged. The build operation specifies the model and training dataset, it returns the result via the Operation-Location header in the response. Poll this model operation URL, via a GET request to check the status of the build operation (minimum recommended interval between requests is 1 second). Unlike v2.1, this URL isn't the resource location of the model. Instead, the model URL can be constructed from the given modelId, also retrieved from the resourceLocation property in the response. Upon success, status is set to ```succeeded``` and result contains the custom model info. If errors are encountered, status is set to ```failed``` and the error is returned.
+The ```build``` operation is invoked to train a model. The request payload and call pattern remain unchanged. The build operation specifies the model and training dataset, it returns the result via the Operation-Location header in the response. Poll this model operation URL, via a GET request to check the status of the build operation (minimum recommended interval between requests is 1 second). Unlike v2.1, this URL isn't the resource location of the model. Instead, the model URL can be constructed from the given modelId, also retrieved from the resourceLocation property in the response. Upon success, status is set to ```succeeded``` and result contains the custom model info. If errors are encountered, status is set to ```failed```, and the error is returned.
The following code is a sample build request using a SAS token. Note the trailing slash when setting the prefix or folder path.

```json
-POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build?api-version=2022-06-30
+POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build?api-version=2022-08-31
{ "modelId": {modelId},
POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build
Model compose is now limited to a single level of nesting. Composed models are now consistent with custom models with the addition of ```modelId``` and ```description``` properties.

```json
-POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compose?api-version=2022-06-30
+POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compose?api-version=2022-08-31
{ "modelId": "{composedModelId}", "description": "{composedModelDescription}",
The only changes to the copy model function are:
***Authorize the copy***

```json
-POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-version=2022-06-30
+POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-version=2022-08-31
{ "modelId": "{targetModelId}", "description": "{targetModelDescription}",
POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-versio
Use the response body from the authorize action to construct the request for the copy.

```json
-POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copy-to?api-version=2022-06-30
+POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copy-to?api-version=2022-08-31
{ "targetResourceId": "{targetResourceId}", "targetResourceRegion": "{targetResourceRegion}",
List models have been extended to now return prebuilt and custom models. All pre
***Sample list models request***

```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-version=2022-06-30
+GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-version=2022-08-31
``` ## Change to get model
GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-ve
As get model now includes prebuilt models, the get operation returns a ```docTypes``` dictionary. Each document type is described by its name, optional description, field schema, and optional field confidence. The field schema describes the list of fields potentially returned with the document type.

```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-06-30
+GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-08-31
``` ## New get info operation
GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{model
The ```info``` operation on the service returns the custom model count and custom model limit.

```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/info? api-version=2022-06-30
+GET https://{your-form-recognizer-endpoint}/formrecognizer/info?api-version=2022-08-31
```

***Sample response***
GET https://{your-form-recognizer-endpoint}/formrecognizer/info? api-version=202
## Next steps
-In this migration guide, you've learned how to upgrade your existing Form Recognizer application to use the v3.0 APIs. Continue to use the 2.1 API for all GA features and use the 3.0 API for any of the preview features.
+In this migration guide, you've learned how to upgrade your existing Form Recognizer application to use the v3.0 APIs.
-* [Review the new REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-06-30-preview/operations/AnalyzeDocument)
+* [Review the new REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-2022-08-31/operations/AnalyzeDocument)
* [What is Form Recognizer?](overview.md)
* [Form Recognizer quickstart](./quickstarts/try-sdk-rest-api.md)
applied-ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/whats-new.md
Previously updated : 06/29/2022 Last updated : 08/22/2022 <!-- markdownlint-disable MD024 -->
Form Recognizer service is updated on an ongoing basis. Bookmark this page to stay up to date with release notes, feature enhancements, and documentation updates.
+## August 2022
+
+### Form Recognizer v3.0 generally available
+
+**Form Recognizer REST API v3.0 is now generally available and ready for use in production applications!**
+
+#### The August release introduces the following performance updates:
+
+##### Form Recognizer Studio updates
+
+* **Next steps**. Under each model page, the Studio now has a next steps section. Users can quickly reference sample code, troubleshooting guidelines, and pricing information.
+
+* **Custom models**. The Studio now includes the ability to reorder labels in custom model projects to improve labeling efficiency.
+
+* **Copy models**. Custom models can be copied across Form Recognizer services from within the Studio. This enables the promotion of a trained model to other environments and regions.
+
+* **Delete documents**. The Studio now supports deleting documents from the labeled dataset within custom projects.
+
+##### Form Recognizer service updates
+
+* [**prebuilt-invoice**](concept-invoice.md). The TotalVAT and Line/VAT fields now resolve to the existing TotalTax and Line/Tax fields, respectively.
+
+* [**prebuilt-idDocument**](concept-id-document.md). Data extraction support for US state IDs, social security cards, and green cards, as well as passport visa information.
+
+* [**prebuilt-receipt**](concept-receipt.md). Expanded locale support for French (fr-FR), Spanish (es-ES), Portuguese (pt-PT), Italian (it-IT) and German (de-DE).
+
+* [**prebuilt-businessCard**](concept-business-card.md). Address parsing support to extract sub-fields for address components like address, city, state, country, and zip code.
+
+* **AI quality improvements**
+
+ * [**custom-neural**](concept-custom-neural.md). Improved accuracy for table detection and extraction.
+
+ * [**prebuilt-layout**](concept-layout.md). Support for better detection of cropped tables, borderless tables, and improved recognition of long spanning cells, along with improved paragraph grouping detection and logical identification of headers and titles.
+
+ * [**prebuilt-document**](concept-general-document.md). Improved value and check box detection.
+## June 2022

+### [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio) June Update
-The June release is the latest update to the Form Recognizer Studio. There are considerable UX and accessbility improvements addressed in this update:
+The June release is the latest update to the Form Recognizer Studio. There are considerable user experience and accessibility improvements addressed in this update:
-* 🆕 **Code sample for Javascript and C#**. The Studio code tab now adds Javascript and C# code samples in addition to the existing Python one.
-* 🆕 **New document upload UI**. Studio now supports uploading a document with drag & drop into the new upload user interface.
-* 🆕 **New feature for custom projects**. Custom projects now support creating storage account and blobs when configuring the project. In addition, custom project now supports uploading training files directly within the Studio and copying the existing custom model.
+* **Code sample for JavaScript and C#**. The Studio code tab now adds JavaScript and C# code samples in addition to the existing Python one.
+* **New document upload UI**. Studio now supports uploading a document with drag & drop into the new upload user interface.
+* **New feature for custom projects**. Custom projects now support creating storage accounts and blobs when configuring the project. In addition, custom projects now support uploading training files directly within the Studio and copying an existing custom model.
### Form Recognizer v3.0 preview release
-The **2022-06-30-preview** release is the latest update to the Form Recognizer service for v3.0 capabilities and presents extensive updates across the feature APIs:
+The **2022-06-30-preview** release presents extensive updates across the feature APIs:
-* [🆕 **Layout extends structure extraction**](concept-layout.md). Layout now includes added structure elements including sections, section headers, and paragraphs. This update enables finer grain document segmentation scenarios. For a complete list of structure elements identified, _see_ [enhanced structure](concept-layout.md#data-extraction).
-* [🆕 **Custom neural model tabular fields support**](concept-custom-neural.md). Custom document models now support tabular fields. Tabular fields by default are also multi page. To learn more about tabular fields in custom neural models, _see_ [tabular fields](concept-custom-neural.md#tabular-fields).
-* [🆕 **Custom template model tabular fields support for cross page tables**](concept-custom-template.md). Custom form models now support tabular fields across pages. To learn more about tabular fields in custom template models, _see_ [tabular fields](concept-custom-neural.md#tabular-fields).
-* [🆕 **Invoice model output now includes general document key-value pairs**](concept-invoice.md). Where invoices contain required fields beyond the fields included in the prebuilt model, the general document model supplements the output with key-value pairs. _See_ [key value pairs](concept-invoice.md#key-value-pairs-preview).
-* [🆕 **Invoice language expansion**](concept-invoice.md). The invoice model includes expanded language support. _See_ [supported languages](concept-invoice.md#supported-languages-and-locales).
-* [🆕 **Prebuilt business card**](concept-business-card.md) now includes Japanese language support. _See_ [supported languages](concept-business-card.md#supported-languages-and-locales).
-* [🆕 **Prebuilt ID document model**](concept-id-document.md). The ID document model now extracts DateOfIssue, Height, Weight, EyeColor, HairColor, and DocumentDiscriminator from US driver's licenses. _See_ [field extraction](concept-id-document.md#id-document-preview-field-extraction).
-* [🆕 **Read model now supports common Microsoft Office document types**](concept-read.md). Document types like Word (docx) and PowerPoint (ppt) are now supported with the Read API. See [page extraction](concept-read.md#pages).
+* [**Layout extends structure extraction**](concept-layout.md). Layout now includes added structure elements including sections, section headers, and paragraphs. This update enables finer grain document segmentation scenarios. For a complete list of structure elements identified, _see_ [enhanced structure](concept-layout.md#data-extraction).
+* [**Custom neural model tabular fields support**](concept-custom-neural.md). Custom document models now support tabular fields. Tabular fields are also multi-page by default. To learn more about tabular fields in custom neural models, _see_ [tabular fields](concept-custom-neural.md#tabular-fields).
+* [**Custom template model tabular fields support for cross page tables**](concept-custom-template.md). Custom form models now support tabular fields across pages. To learn more about tabular fields in custom template models, _see_ [tabular fields](concept-custom-neural.md#tabular-fields).
+* [**Invoice model output now includes general document key-value pairs**](concept-invoice.md). Where invoices contain required fields beyond the fields included in the prebuilt model, the general document model supplements the output with key-value pairs. _See_ [key value pairs](concept-invoice.md#key-value-pairs).
+* [**Invoice language expansion**](concept-invoice.md). The invoice model includes expanded language support. _See_ [supported languages](concept-invoice.md#supported-languages-and-locales).
+* [**Prebuilt business card**](concept-business-card.md) now includes Japanese language support. _See_ [supported languages](concept-business-card.md#supported-languages-and-locales).
+* [**Prebuilt ID document model**](concept-id-document.md). The ID document model now extracts DateOfIssue, Height, Weight, EyeColor, HairColor, and DocumentDiscriminator from US driver's licenses. _See_ [field extraction](concept-id-document.md).
+* [**Read model now supports common Microsoft Office document types**](concept-read.md). Document types like Word (docx) and PowerPoint (ppt) are now supported with the Read API. See [page extraction](concept-read.md#pages) and the sketch after this list.
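+
+As a companion to the Read update above, here's a hedged sketch (same assumed `DocumentAnalysisClient` setup and placeholder values as the August example on this page, with .NET 6 top-level statements) that runs the `prebuilt-read` model over a Word document and prints the text lines per page:
+
+```csharp
+using System;
+using Azure;
+using Azure.AI.FormRecognizer.DocumentAnalysis;
+
+var client = new DocumentAnalysisClient(
+    new Uri("https://<your-resource>.cognitiveservices.azure.com/"),
+    new AzureKeyCredential("<your-key>"));
+
+// The Read model accepts Office document types such as .docx in this preview.
+AnalyzeDocumentOperation operation = await client.AnalyzeDocumentFromUriAsync(
+    WaitUntil.Completed, "prebuilt-read", new Uri("https://example.com/report.docx"));
+
+foreach (DocumentPage page in operation.Value.Pages)
+{
+    Console.WriteLine($"Page {page.PageNumber}:");
+    foreach (DocumentLine line in page.Lines)
+    {
+        Console.WriteLine($"  {line.Content}");
+    }
+}
+```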
#### Form Recognizer SDK beta preview release
This new release includes the following updates:
Form Recognizer v3.0 preview release introduces several new features and capabilities and enhances existing ones:
-* [🆕 **Custom neural model**](concept-custom-neural.md) or custom document model is a new custom model to extract text and selection marks from structured forms, semi-strutured and **unstructured documents**.
-* [🆕 **W-2 prebuilt model**](concept-w2.md) is a new prebuilt model to extract fields from W-2 forms for tax reporting and income verification scenarios.
-* [🆕 **Read**](concept-read.md) API extracts printed text lines, words, text locations, detected languages, and handwritten text, if detected.
+* [**Custom neural model**](concept-custom-neural.md) or custom document model is a new custom model to extract text and selection marks from structured forms, semi-structured, and **unstructured documents**.
+* [**W-2 prebuilt model**](concept-w2.md) is a new prebuilt model to extract fields from W-2 forms for tax reporting and income verification scenarios.
+* [**Read**](concept-read.md) API extracts printed text lines, words, text locations, detected languages, and handwritten text, if detected.
* [**General document**](concept-general-document.md) pre-trained model is now updated to support selection marks in addition to API text, tables, structure, key-value pairs, and named entities from forms and documents. * [**Invoice API**](language-support.md#invoice-model) Invoice prebuilt model expands support to Spanish invoices. * [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com) adds new demos for Read, W2, Hotel receipt samples, and support for training the new custom neural models. * [**Language Expansion**](language-support.md) Form Recognizer Read, Layout, and Custom Form add support for 42 new languages including Arabic, Hindi, and other languages using Arabic and Devanagari scripts to expand the coverage to 164 languages. Handwritten language support expands to Japanese and Korean.
-Get started with the new [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument), [Python](quickstarts/try-v3-python-sdk.md), or [.NET](quickstarts/try-v3-csharp-sdk.md) SDK for the v3.0 preview API.
+Get started with the new [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument), [Python](quickstarts/get-started-v3-sdk-rest-api.md), or [.NET](quickstarts/get-started-v3-sdk-rest-api.md) SDK for the v3.0 preview API.
#### Form Recognizer model data extraction
 - | **Model** | **Text extraction** |**Key-Value pairs** |**Selection Marks** | **Tables** |**Entities** |**Signatures**|
 + | **Model** | **Text extraction** |**Key-Value pairs** |**Selection Marks** | **Tables** |**Signatures**|
 | | :: |::| :: | :: |:: |
 - |🆕Read | ✓ | | | | | |
 - |🆕General document | ✓ | ✓ | ✓ | ✓ | ✓ | |
 - | Layout | ✓ | | ✓ | ✓ | | |
 - | Invoice | ✓ | ✓ | ✓ | ✓ || |
 - |Receipt | ✓ | ✓ | | || |
 - | ID document | ✓ | ✓ | | || |
 - | Business card | ✓ | ✓ | | || |
 - | Custom template |✓ | ✓ | ✓ | ✓ | | ✓ |
 - | Custom neural |✓ | ✓ | ✓ | ✓ | | |
 + |Read | ✓ | | | | |
 + |General document | ✓ | ✓ | ✓ | ✓ | |
 + | Layout | ✓ | | ✓ | ✓ | |
 + | Invoice | ✓ | ✓ | ✓ | ✓ ||
 + |Receipt | ✓ | ✓ | | |✓|
 + | ID document | ✓ | ✓ | | ||
 + | Business card | ✓ | ✓ | | ||
 + | Custom template |✓ | ✓ | ✓ | ✓ | ✓ |
 + | Custom neural |✓ | ✓ | ✓ | ✓ | |
#### Form Recognizer SDK beta preview release
The latest beta release version of the Azure Form Recognizer SDKs incorporates n
This new release includes the following updates:
-* 🆕 [Custom Document models and modes](concept-custom.md):
+* [Custom Document models and modes](concept-custom.md):
  * [Custom template](concept-custom-template.md) (formerly custom form)
  * [Custom neural](concept-custom-neural.md).
  * [Custom model build mode](concept-custom.md#build-mode).
-* 🆕 [W-2 prebuilt model](concept-w2.md) (prebuilt-tax.us.w2).
+* [W-2 prebuilt model](concept-w2.md) (prebuilt-tax.us.w2).
-* 🆕 [Read prebuilt model](concept-read.md) (prebuilt-read).
+* [Read prebuilt model](concept-read.md) (prebuilt-read).
-* 🆕 [Invoice prebuilt model (Spanish)](concept-invoice.md#supported-languages-and-locales) (prebuilt-invoice).
+* [Invoice prebuilt model (Spanish)](concept-invoice.md#supported-languages-and-locales) (prebuilt-invoice).
### [**C#**](#tab/csharp)
The `BuildModelOperation` and `CopyModelOperation` now correctly populate the `P
* [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com) To simplify use of the service, you can now access the Form Recognizer Studio to test the different prebuilt models or label and train a custom model
-Get stared with the new [REST API](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm), [Python](quickstarts/try-v3-python-sdk.md), or [.NET](quickstarts/try-v3-csharp-sdk.md) SDK for the v3.0 preview API.
+Get started with the new [REST API](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm), [Python](quickstarts/get-started-v3-sdk-rest-api.md), or [.NET](quickstarts/get-started-v3-sdk-rest-api.md) SDK for the v3.0 preview API.
#### Form Recognizer model data extraction

 | **Model** | **Text extraction** |**Key-Value pairs** |**Selection Marks** | **Tables** |**Entities** |
 | | :: |::| :: | :: |:: |
- |🆕General document | ✓ | ✓ | ✓ | ✓ | ✓ |
 + |General document | ✓ | ✓ | ✓ | ✓ | ✓ |
 | Layout | ✓ | | ✓ | ✓ | |
 | Invoice | ✓ | ✓ | ✓ | ✓ ||
 |Receipt | ✓ | ✓ | | ||
The patch addresses invoices that don't have subline item fields detected such a
## May 2021
-### Form Recognizer 2.1 API Generally Available (GA) release
+### Form Recognizer 2.1 API Generally Available release
-* Form Recognizer 2.1 is generally available. The General Availability (GA) release marks the stability of the changes introduced in prior 2.1 preview package versions. This release enables you to detect and extract information and data from the following document types:
+* Form Recognizer 2.1 is generally available. The General Availability release marks the stability of the changes introduced in prior 2.1 preview package versions. This release enables you to detect and extract information and data from the following document types:
* [Documents](concept-layout.md) * [Receipts](./concept-receipt.md)
pip package version 3.1.0b4
### New features
-* **SDK support for Form Recognizer API v2.0 Public Preview** - This month we expanded our service support to include a preview SDK for Form Recognizer v2.0 (preview) release. Use the links below to get started with your language of choice:
+* **SDK support for Form Recognizer API v2.0 Public Preview** - This month we expanded our service support to include a preview SDK for Form Recognizer v2.0 release. Use the links below to get started with your language of choice:
* [.NET SDK](/dotnet/api/overview/azure/ai.formrecognizer-readme) * [Java SDK](/java/api/overview/azure/ai-formrecognizer-readme) * [Python SDK](/python/api/overview/azure/ai-formrecognizer-readme)
TLS 1.2 is now enforced for all HTTP requests to this service. For more informat
## January 2020
-This release introduces the Form Recognizer 2.0 (preview). In the sections below, you'll find more information about new features, enhancements, and changes.
+This release introduces Form Recognizer 2.0. In the sections below, you'll find more information about new features, enhancements, and changes.
### New features
Complete a [quickstart](./quickstarts/try-sdk-rest-api.md) to get started writin
## See also
-* [What is Form Recognizer?](./overview.md)
+* [What is Form Recognizer?](./overview.md)
automation Automation Hrw Run Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-hrw-run-runbooks.md
There are two ways to use the Managed Identities in Hybrid Runbook Worker script
-**An Arc-enabled server running as a Hybrid Runbook Worker** already has a built-in System Managed Identity assigned to it which can be used for authentication.
+**An Arc-enabled server or Arc-enabled VMware vSphere VM** running as a Hybrid Runbook Worker already has a built-in system-assigned managed identity, which can be used for authentication.
1. You can grant this managed identity access to resources in your subscription by adding the appropriate role assignment on the resource's Access control (IAM) blade; a token-acquisition sketch follows.
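
For a .NET-based script running on such a worker, a minimal sketch of acquiring a token with the built-in identity might look like the following. This assumes the `Azure.Identity` package; Hybrid Runbook Worker runbooks are more commonly PowerShell, so treat this purely as an illustration of the flow:

```csharp
using System;
using Azure.Core;
using Azure.Identity;

// Uses the system-assigned managed identity of the Arc-enabled machine.
var credential = new ManagedIdentityCredential();

// Request a token for Azure Resource Manager; the identity must already
// hold a role assignment on the target resource (step 1 above).
AccessToken token = credential.GetToken(
    new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

Console.WriteLine($"Token expires on: {token.ExpiresOn}");
```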
automation Automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-hybrid-runbook-worker.md
Runbooks in Azure Automation might not have access to resources in other clouds or in your on-premises environment because they run on the Azure cloud platform. You can use the Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on the machine hosting the role and against resources in the environment to manage those local resources. Runbooks are stored and managed in Azure Automation and then delivered to one or more assigned machines.
-Azure Automation provides native integration of the Hybrid Runbook Worker role through the Azure virtual machine (VM) extension framework. The Azure VM agent is responsible for management of the extension on Azure VMs on Windows and Linux VMs, and on non-Azure machines through the Arc-enabled servers Connected Machine agent. Now there are two Hybrid Runbook Workers installation platforms supported by Azure Automation.
-
+Azure Automation provides native integration of the Hybrid Runbook Worker role through the Azure virtual machine (VM) extension framework. The Azure VM agent manages the extension on Azure Windows and Linux VMs, and the [Azure Connected Machine agent](../azure-arc/servers/agent-overview.md) manages it on non-Azure machines, including [Azure Arc-enabled servers](../azure-arc/servers/overview.md) and [Azure Arc-enabled VMware vSphere](../azure-arc/vmware-vsphere/overview.md). There are now two Hybrid Runbook Worker installation platforms supported by Azure Automation.
+
| Platform | Description |
|---|---|
|**Extension-based (V2)** |Installed using the [Hybrid Runbook Worker VM extension](./extension-based-hybrid-runbook-worker-install.md), without any dependency on the Log Analytics agent reporting to an Azure Monitor Log Analytics workspace. **This is the recommended platform**.|
|**Agent-based (V1)** |Installed after the [Log Analytics agent](../azure-monitor/agents/log-analytics-agent.md) reporting to an Azure Monitor [Log Analytics workspace](../azure-monitor/logs/log-analytics-workspace-overview.md) is completed.|

Here's a list of benefits available with the extension-based Hybrid Runbook Worker role:
automation Extension Based Hybrid Runbook Worker Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/extension-based-hybrid-runbook-worker-install.md
The extension-based onboarding is only for **User** Hybrid Runbook Workers. This
For **System** Hybrid Runbook Worker onboarding, see [Deploy an agent-based Windows Hybrid Runbook Worker in Automation](./automation-windows-hrw-install.md) or [Deploy an agent-based Linux Hybrid Runbook Worker in Automation](./automation-linux-hrw-install.md).
-You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on an Azure machine or a non-Azure machine through servers registered with [Azure Arc-enabled servers](../azure-arc/servers/overview.md). From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources.
+You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on an Azure or non-Azure machine, including [Azure Arc-enabled servers](../azure-arc/servers/overview.md) and [Arc-enabled VMware vSphere](../azure-arc/vmware-vsphere/overview.md). From the machine or server that's hosting the role, you can run runbooks directly against it and against resources in the environment to manage those local resources.
Azure Automation stores and manages runbooks and then delivers them to one or more chosen machines. After you successfully deploy a runbook worker, review [Run runbooks on a Hybrid Runbook Worker](automation-hrw-run-runbooks.md) to learn how to configure your runbooks to automate processes in your on-premises datacenter or other cloud environment.
Azure Automation stores and manages runbooks and then delivers them to one or mo
- Two cores
- 4 GB of RAM
-- The system-assigned managed identity must be enabled on the Azure virtual machine or Arc-enabled server. If the system-assigned managed identity isn't enabled, it will be enabled as part of the adding process.
-- Non-Azure machines must have the Azure Arc-enabled servers agent (the connected machine agent) installed. To install the `AzureConnectedMachineAgent`, see [Connect hybrid machines to Azure from the Azure portal](../azure-arc/servers/onboard-portal.md).
+- The system-assigned managed identity must be enabled on the Azure virtual machine, Arc-enabled server or Arc-enabled VMware vSphere VM. If the system-assigned managed identity isn't enabled, it will be enabled as part of the adding process.
+- Non-Azure machines must have the [Azure Connected Machine agent](../azure-arc/servers/agent-overview.md) installed. To install the `AzureConnectedMachineAgent`, see [Connect hybrid machines to Azure from the Azure portal](../azure-arc/servers/onboard-portal.md) for Arc-enabled servers, or see [Manage VMware virtual machines with Azure Arc](../azure-arc/vmware-vsphere/manage-vmware-vms-in-azure.md#enable-guest-management) for Arc-enabled VMware vSphere VMs.
### Supported operating systems
Azure Automation stores and manages runbooks and then delivers them to one or mo
If you use a proxy server for communication between Azure Automation and machines running the extension-based Hybrid Runbook Worker, ensure that the appropriate resources are accessible. The timeout for requests from the Hybrid Runbook Worker and Automation services is 30 seconds. After three attempts, a request fails. > [!NOTE]
-> You can set up the proxy settings by PowerShell cmdlets or API.
+> For Azure VMs and Arc-enabled Servers, you can set up the proxy settings using PowerShell cmdlets or API. This is currently not supported for Arc-enabled VMware vSphere VMs.
To install the extension using cmdlets:
To create a hybrid worker group in the Azure portal, follow these steps:
- If you select **Default**, the hybrid extension will be installed using the local system account. - If you select **Custom**, then from the drop-down list, select the credential asset.
-1. Select **Next** to advance to the **Hybrid workers** tab. You can select Azure virtual machines or Azure Arc-enabled servers to be added to this Hybrid worker group. If you don't select any machines, an empty Hybrid worker group will be created. You can still add machines later.
+1. Select **Next** to advance to the **Hybrid workers** tab. You can select Azure virtual machines, Azure Arc-enabled servers, or Azure Arc-enabled VMware vSphere VMs to be added to this Hybrid worker group. If you don't select any machines, an empty Hybrid worker group will be created. You can still add machines later.
:::image type="content" source="./media/extension-based-hybrid-runbook-worker-install/basics-tab-portal.png" alt-text="Screenshot showing to enter name and credentials in basics tab.":::
You can also add machines to an existing hybrid worker group.
1. Select **Add** to add the machine to the group.
- Once added, you can see the machine type as Azure virtual machine or Arc-enabled server. The **Platform** field shows the worker as **Agent based (V1)** or **Extension based (V2)**.
+   After the machine is added, you can see the machine type as Azure virtual machine, Server-Azure Arc, or VMware virtual machine-Azure Arc. The **Platform** field shows the worker as **Agent based (V1)** or **Extension based (V2)**.
- :::image type="content" source="./media/extension-based-hybrid-runbook-worker-install/hybrid-worker-group-platform.png" alt-text="Platform field showing agent or extension based.":::
+ :::image type="content" source="./media/extension-based-hybrid-runbook-worker-install/hybrid-worker-group-platform-inline.png" alt-text="Screenshot of platform field showing agent or extension based." lightbox="./media/extension-based-hybrid-runbook-worker-install/hybrid-worker-group-platform-expanded.png":::
## Install Extension-based (V2) on existing Agent-based (V1) Hybrid Worker
Using [VM insights](../azure-monitor/vm/vminsights-overview.md), you can monitor
- To learn about Azure VM extensions, see [Azure VM extensions and features for Windows](../virtual-machines/extensions/features-windows.md) and [Azure VM extensions and features for Linux](../virtual-machines/extensions/features-linux.md). -- To learn about VM extensions for Arc-enabled servers, see [VM extension management with Azure Arc-enabled servers](../azure-arc/servers/manage-vm-extensions.md).
+- To learn about VM extensions for Arc-enabled servers, see [VM extension management with Azure Arc-enabled servers](../azure-arc/servers/manage-vm-extensions.md).
+- To learn about VM extensions for Arc-enabled VMware vSphere VMs, see [Manage VMware VMs in Azure through Arc-enabled VMware vSphere](../azure-arc/vmware-vsphere/manage-vmware-vms-in-azure.md).
+
azure-arc Conceptual Connectivity Modes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/conceptual-connectivity-modes.md
Title: "Azure Arc-enabled Kubernetes connectivity modes" Previously updated : 11/23/2021 Last updated : 08/22/2022 description: "This article provides an overview of the connectivity modes supported by Azure Arc-enabled Kubernetes" keywords: "Kubernetes, Arc, Azure, containers"
keywords: "Kubernetes, Arc, Azure, containers"
# Azure Arc-enabled Kubernetes connectivity modes
-Azure Arc-enabled Kubernetes requires deployment of Azure Arc agents on your Kubernetes clusters using which capabilities like configurations (GitOps), extensions, Cluster Connect and Custom Location are made available on the cluster. Kubernetes clusters deployed on the edge may not have constant network connectivity and as a result the agents may not be able to always reach the Azure Arc services. This semi-connected mode however is a supported scenario. To support semi-connected modes of deployment, for features like configurations and extensions, agents rely on pulling of desired state specification from the Arc services and later realizing this state on the cluster.
+Azure Arc-enabled Kubernetes requires deployment of Azure Arc agents on your Kubernetes clusters so that capabilities such as configurations (GitOps), extensions, Cluster Connect and Custom Location are made available on the cluster. Kubernetes clusters deployed on the edge may not have constant network connectivity, and as a result, in a semi-connected mode the agents may not always be able to reach the Azure Arc services. This topic explains how Azure Arc features can be used with semi-connected modes of deployment.
## Understand connectivity modes
-| Connectivity mode | Description |
-| -- | -- |
-| Fully connected | Agents can consistently communicate with Azure with little delay in propagating GitOps configurations, enforcing Azure Policy and Gatekeeper policies, and collecting workload metrics and logs in Azure Monitor. |
-| Semi-connected | The managed identity certificate pulled down by the `clusteridentityoperator` is valid for up to 90 days before the certificate expires. Upon expiration, the Azure Arc-enabled Kubernetes resource stops working. To reactivate all Azure Arc features on the cluster, delete, and recreate the Azure Arc-enabled Kubernetes resource and agents. During the 90 days, connect the cluster at least once every 30 days. |
-| Disconnected | Kubernetes clusters in disconnected environments unable to access Azure are currently unsupported by Azure Arc-enabled Kubernetes. If this capability is of interest to you, submit or up-vote an idea on [Azure Arc's UserVoice forum](https://feedback.azure.com/d365community/forum/5c778dec-0625-ec11-b6e6-000d3a4f0858).
+When working with Azure Arc-enabled Kubernetes clusters, it's important to understand how network connectivity modes impact your operations.
+- **Fully connected**: With ongoing network connectivity, agents can consistently communicate with Azure. In this mode, there is typically little delay with tasks such as propagating GitOps configurations, enforcing Azure Policy and Gatekeeper policies, or collecting workload metrics and logs in Azure Monitor.
+- **Semi-connected**: Azure Arc agents can pull the desired state specification from the Arc services, then later realize this state on the cluster.
+ > [!IMPORTANT]
+ > The managed identity certificate pulled down by the `clusteridentityoperator` is valid for up to 90 days before it expires. The agents will try to renew the certificate during this time period; however, if there is no network connectivity, the certificate may expire, and the Azure Arc-enabled Kubernetes resource will stop working. Because of this, we recommend ensuring that the connected cluster has network connectivity at least once every 30 days. If the certificate expires, you'll need to delete and then recreate the Azure Arc-enabled Kubernetes resource and agents in order to reactivate Azure Arc features on the cluster.
+- **Disconnected**: Kubernetes clusters in disconnected environments that are unable to access Azure are not currently supported by Azure Arc-enabled Kubernetes.
## Connectivity status
The connectivity status of a cluster is determined by the time of the latest hea
| Status | Description | | | -- |
-| Connecting | Azure Arc-enabled Kubernetes resource is created in Azure Resource Manager, but service hasn't received the agent heartbeat yet. |
-| Connected | Azure Arc-enabled Kubernetes service received an agent heartbeat sometime within the previous 15 minutes. |
-| Offline | Azure Arc-enabled Kubernetes resource was previously connected, but the service hasn't received any agent heartbeat for 15 minutes. |
-| Expired | Managed identity certificate of the cluster has an expiration window of 90 days after it is issued. Once this certificate expires, the resource is considered `Expired` and all features such as configuration, monitoring, and policy stop working on this cluster. More information on how to address expired Azure Arc-enabled Kubernetes resources can be found [in the FAQ article](./faq.md#how-do-i-address-expired-azure-arc-enabled-kubernetes-resources). |
+| Connecting | The Azure Arc-enabled Kubernetes resource has been created in Azure, but the service hasn't received the agent heartbeat yet. |
+| Connected | The Azure Arc-enabled Kubernetes service received an agent heartbeat within the previous 15 minutes. |
+| Offline | The Azure Arc-enabled Kubernetes resource was previously connected, but the service hasn't received any agent heartbeat for 15 minutes. |
+| Expired | The managed identity certificate of the cluster has expired. In this state, Azure Arc features will no longer work on the cluster. For more information on how to address expired Azure Arc-enabled Kubernetes resources, see the [FAQ](./faq.md#how-do-i-address-expired-azure-arc-enabled-kubernetes-resources). |
## Next steps
-* Walk through our quickstart to [connect a Kubernetes cluster to Azure Arc](./quickstart-connect-cluster.md).
-* Learn more about the creating connections between your cluster and a Git repository as a [configuration resource with Azure Arc-enabled Kubernetes](./conceptual-configurations.md).
+- Walk through our quickstart to [connect a Kubernetes cluster to Azure Arc](./quickstart-connect-cluster.md).
+- Learn more about creating connections between your cluster and a Git repository as a [configuration resource with Azure Arc-enabled Kubernetes](./conceptual-configurations.md).
azure-arc Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/faq.md
Title: "Azure Arc-enabled Kubernetes and GitOps frequently asked questions" Previously updated : 04/06/2022 Last updated : 08/22/2022 description: "This article contains a list of frequently asked questions related to Azure Arc-enabled Kubernetes and Azure GitOps" keywords: "Kubernetes, Arc, Azure, containers, configuration, GitOps, faq"
If the Azure Arc-enabled Kubernetes cluster is on Azure Stack Edge, AKS on Azure
## How do I address expired Azure Arc-enabled Kubernetes resources?
-The system assigned managed identity associated with your Azure Arc-enabled Kubernetes cluster is only used by the Azure Arc agents to communicate with the Azure Arc services. The certificate associated with this system assigned managed identity has an expiration window of 90 days, and the agents will attempt to renew this certificate between Day 46 to Day 90. Once this certificate expires, the resource is considered `Expired` and all features (such as configuration, monitoring, and policy) stop working on this cluster and you'll then need to delete and connect the cluster to Azure Arc once again. It is thus advisable to have the cluster come online at least once between Day 46 to Day 90 time window to ensure renewal of the managed identity certificate.
+The system-assigned managed identity associated with your Azure Arc-enabled Kubernetes cluster is only used by the Azure Arc agents to communicate with the Azure Arc services. The certificate associated with this system-assigned managed identity has an expiration window of 90 days, and the agents will attempt to renew this certificate between Day 46 and Day 90. To avoid having your managed identity certificate expire, be sure that the cluster comes online at least once between Day 46 and Day 90 so that the certificate can be renewed.
-To check when the certificate is about to expire for any given cluster, run the following command:
+If the managed identity certificate expires, the resource is considered `Expired` and all Azure Arc features (such as configuration, monitoring, and policy) will stop working on the cluster.
+
+To check when the managed identity certificate will expire for a given cluster, run the following command:
```azurecli
az connectedk8s show -n <name> -g <resource-group>
```
-In the output, the value of the `managedIdentityCertificateExpirationTime` indicates when the managed identity certificate will expire (90D mark for that certificate).
+In the output, the value of the `managedIdentityCertificateExpirationTime` indicates when the managed identity certificate will expire (the 90-day mark for that certificate).
If the value of `managedIdentityCertificateExpirationTime` indicates a timestamp from the past, then the `connectivityStatus` field in the above output will be set to `Expired`. In such cases, to get your Kubernetes cluster working with Azure Arc again:
-1. Delete Azure Arc-enabled Kubernetes resource and agents on the cluster.
+1. Delete the Azure Arc-enabled Kubernetes resource and agents on the cluster.
   ```azurecli
   az connectedk8s delete -n <name> -g <resource-group>
   ```
azure-arc Day2 Operations Resource Bridge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/vmware-vsphere/day2-operations-resource-bridge.md
There are two different sets of credentials stored on the Arc resource bridge. B
- **Account for Arc resource bridge**. This account is used for deploying the Arc resource bridge VM and will be used for upgrade. - **Account for VMware cluster extension**. This account is used to discover inventory and perform all VM operations through Azure Arc-enabled VMware vSphere
-To update the credentials of the account for Arc resource bridge, use the Azure CLI command [`az arcappliance update-infracredentials vmware`](/cli/azure/arcappliance/update-infracredential#az-arcappliance-update-infracredentials-vmware). Run the command from a workstation that can access cluster configuration IP address of the Arc resource bridge locally:
+To update the credentials of the account for Arc resource bridge, use the Azure CLI command [`az arcappliance update-infracredentials vmware`](/cli/azure/arcappliance/update-infracredentials#az-arcappliance-update-infracredentials-vmware). Run the command from a workstation that can access cluster configuration IP address of the Arc resource bridge locally:
```azurecli
az arcappliance update-infracredentials vmware --kubeconfig <kubeconfig>
```
azure-cache-for-redis Cache Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-private-link.md
To create a private endpoint, follow these steps.
1. In the **Resource** tab, select your subscription, choose the resource type as `Microsoft.Cache/Redis`, and then select the cache you want to connect the private endpoint to. 1. Select the **Next: Configuration** button at the bottom of the page.-
+1. Select the **Next: Virtual Network** button at the bottom of the page.
1. In the **Configuration** tab, select the virtual network and subnet you created in the previous section.-
+1. In the **Virtual Network** tab, select the virtual network and subnet you created in the previous section.
1. Select the **Next: Tags** button at the bottom of the page. 1. Optionally, in the **Tags** tab, enter the name and value if you wish to categorize the resource.
It's only linked to your VNet. Because it's not in your VNet, NSG rules don't ne
## Next steps - To learn more about Azure Private Link, see the [Azure Private Link documentation](../private-link/private-link-overview.md).-- To compare various network isolation options for your cache, see [Azure Cache for Redis network isolation options documentation](cache-network-isolation.md).
+- To compare various network isolation options for your cache, see [Azure Cache for Redis network isolation options documentation](cache-network-isolation.md).
azure-functions Functions Add Output Binding Cosmos Db Vs Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-add-output-binding-cosmos-db-vs-code.md
Title: Connect Azure Functions to Azure Cosmos DB using Visual Studio Code description: Learn how to connect Azure Functions to an Azure Cosmos DB account by adding an output binding to your Visual Studio Code project.- Last updated 08/17/2021 - zone_pivot_groups: programming-languages-set-functions-temp ms.devlang: csharp, javascript
azure-monitor Azure Monitor Agent Extension Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-extension-versions.md
description: This article describes the version details for the Azure Monitor ag
Previously updated : 7/21/2022 Last updated : 8/22/2022
We strongly recommend updating to the latest version at all times, or opt in
## Version details | Release Date | Release notes | Windows | Linux | |:|:|:|:|
-| June 2022 | Bugfixes with user assigned identity support, and reliability improvements | 1.6.0.0 | Coming soon |
+| July 2022 | Fix for mismatch event timestamps for Sentinel Windows Event Forwarding | 1.7.0.0 | None |
+| June 2022 | Bugfixes with user assigned identity support, and reliability improvements | 1.6.0.0 | None |
| May 2022 | <ul><li>Fixed issue where agent stops functioning due to faulty XPath query. With this version, only query related Windows events will fail, other data types will continue to be collected</li><li>Collection of Windows network troubleshooting logs added to 'CollectAMAlogs.ps1' tool</li><li>Linux support for Debian 11 distro</li><li>Fixed issue to list mount paths instead of device names for Linux disk metrics</li></ul> | 1.5.0.0 | 1.21.0 | | April 2022 | <ul><li>Private IP information added in Log Analytics <i>Heartbeat</i> table for Windows and Linux</li><li>Fixed bugs in Windows IIS log collection (preview) <ul><li>Updated IIS site column name to match backend KQL transform</li><li>Added delay to IIS upload task to account for IIS buffering</li></ul></li><li>Fixed Linux CEF syslog forwarding for Sentinel</li><li>Removed 'error' message for Azure MSI token retrieval failure on Arc to show as 'Info' instead</li><li>Support added for Ubuntu 22.04, RHEL 8.5, 8.6, AlmaLinux and RockyLinux distros</li></ul> | 1.4.1.0<sup>Hotfix</sup> | 1.19.3 | | March 2022 | <ul><li>Fixed timestamp and XML format bugs in Windows Event logs</li><li>Full Windows OS information in Log Analytics Heartbeat table</li><li>Fixed Linux performance counters to collect instance values instead of 'total' only</li></ul> | 1.3.0.0 | 1.17.5.0 |
azure-monitor Alerts Classic Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-classic-portal.md
This section shows how to use PowerShell commands to create, view, and manage class
Get-AzAlertRule -ResourceGroup montest -TargetResourceId /subscriptions/s1/resourceGroups/montest/providers/Microsoft.Compute/virtualMachines/testconfig
```
-8. Classic alert rules can no longer be created via PowerShell. To create an alert rule you need to use the new ['Add-AzMetricAlertRule'](/powershell/module/az.monitor/add-azmetricalertrule) command.
+8. Classic alert rules can no longer be created via PowerShell. Use the new ['Add-AzMetricAlertRuleV2'](/powershell/module/az.monitor/add-azmetricalertrulev2) command to create a metric alert rule instead.
## Next steps
azure-monitor Alerts Types https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-types.md
Metric alert rules include these features:
- You can configure if metric alerts are [stateful or stateless](alerts-overview.md#alerts-and-state). Metric alerts are stateful by default. The target of the metric alert rule can be:-- A single resource, such as a VM. See this article for supported resource types.
+- A single resource, such as a VM. See [this article](alerts-metric-near-real-time.md) for supported resource types.
- [Multiple resources](#monitor-multiple-resources) of the same type in the same Azure region, such as a resource group. ### Multiple conditions
azure-monitor Sdk Support Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/sdk-support-guidance.md
Title: Application Insights SDK support guidance
description: Support guidance for Application Insights legacy and preview SDKs Previously updated : 03/24/2022 Last updated : 08/22/2022
Support engineers are expected to provide SDK update guidance according to the f
|Current SDK version in use |Alternative version available |Update policy for support | ||||
-|Latest GA SDK | No newer supported stable version | **NO UPDATE NECESSARY** |
-|Stable minor version of a GA SDK | Newer supported stable version | **UPDATE RECOMMENDED** |
+|Latest GA SDK | Newer preview version available | **NO UPDATE NECESSARY** |
+|GA SDK | Newer GA released < one year ago | **UPDATE RECOMMENDED** |
+|GA SDK | Newer GA released > one year ago | **UPDATE REQUIRED** |
|Unsupported ([support policy](/lifecycle/faq/azure)) | Any supported version | **UPDATE REQUIRED** |
-|Preview | Stable version | **UPDATE REQUIRED** |
-|Preview | Older stable version | **UPDATE RECOMMENDED** |
-|Preview | Newer preview version, no older stable version | **UPDATE RECOMMENDED** |
+|Latest Preview | No newer version available | **NO UPDATE NECESSARY** |
+|Latest Preview | Newer GA SDK | **UPDATE REQUIRED** |
+|Preview | Newer preview version | **UPDATE REQUIRED** |
+
+> [!NOTE]
+> * General Availability (GA) refers to non-beta versions.
+> * Preview refers to beta versions.
> [!TIP] > Switching to [auto-instrumentation](codeless-overview.md) eliminates the need for manual SDK updates.
azure-monitor Tutorial Asp Net Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/tutorial-asp-net-core.md
+
+ Title: Application Insights SDK for ASP.NET Core applications | Microsoft Docs
+description: Application Insights SDK tutorial to monitor ASP.NET Core web applications for availability, performance, and usage.
+
+ms.devlang: csharp
+ Last updated : 08/22/2022+++
+# Enable Application Insights for ASP.NET Core applications
+
+This article describes how to enable Application Insights for an [ASP.NET Core](/aspnet/core) application deployed as an Azure Web App. This implementation uses an SDK-based approach; an [auto-instrumentation approach](./codeless-overview.md) is also available.
+
+Application Insights can collect the following telemetry from your ASP.NET Core application:
+
+> [!div class="checklist"]
+> * Requests
+> * Dependencies
+> * Exceptions
+> * Performance counters
+> * Heartbeats
+> * Logs
+
+We'll use an [ASP.NET Core MVC application](/aspnet/core/tutorials/first-mvc-app) example that targets `net6.0`. You can apply these instructions to all ASP.NET Core applications. If you're using the [Worker Service](/aspnet/core/fundamentals/host/hosted-services#worker-service-template), use [these instructions](./worker-service.md) instead.
+
+> [!NOTE]
+> A preview [OpenTelemetry-based .NET offering](./opentelemetry-enable.md?tabs=net) is available. [Learn more](./opentelemetry-overview.md).
++
+## Supported scenarios
+
+The [Application Insights SDK for ASP.NET Core](https://nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore) can monitor your applications no matter where or how they run. If your application is running and has network connectivity to Azure, telemetry can be collected. Application Insights monitoring is supported everywhere .NET Core is supported. Support covers the following scenarios:
+* **Operating system**: Windows, Linux, or Mac
+* **Hosting method**: In process or out of process
+* **Deployment method**: Framework dependent or self-contained
+* **Web server**: IIS (Internet Information Services) or Kestrel
+* **Hosting platform**: The Web Apps feature of Azure App Service, Azure VM, Docker, Azure Kubernetes Service (AKS), and so on
+* **.NET Core version**: All officially [supported .NET Core versions](https://dotnet.microsoft.com/download/dotnet-core) that aren't in preview
+* **IDE**: Visual Studio, Visual Studio Code, or command line
+
+## Prerequisites
+
+If you'd like to follow along with the guidance in this article, you'll need the following prerequisites:
+
+* Visual Studio 2022
+* Visual Studio Workloads: ASP.NET and web development, Data storage and processing, and Azure development
+* .NET 6.0
+* Azure subscription and user account (with the ability to create and delete resources)
+
+## Deploy Azure resources
+
+Follow the guidance to deploy the sample application from its [GitHub repository](https://github.com/solliancenet/appinsights-azurecafe).
+
+To provide globally unique names for some resources, a five-character suffix has been assigned. Make note of this suffix for use later in this article.
+
+![The deployed Azure resource listing displays with the 5 character suffix highlighted.](./media/tutorial-asp-net-core/naming-suffix.png "Record the 5 character suffix")
+
+## Create an Application Insights resource
+
+1. In the [Azure portal](https://portal.azure.com), locate and select the **application-insights-azure-cafe** resource group.
+
+2. From the top toolbar menu, select **+ Create**.
+
+ ![The resource group application-insights-azure-cafe displays with the + Create button highlighted on the toolbar menu.](./media/tutorial-asp-net-core/create-resource-menu.png "Create new resource")
+
+3. On the **Create a resource** screen, search for and select `Application Insights` in the marketplace search textbox.
+
+ ![The Create a resource screen displays with Application Insights entered into the search box and Application Insights highlighted from the search results.](./media/tutorial-asp-net-core/search-application-insights.png "Search for Application Insights")
+
+4. On the Application Insights resource overview screen, select **Create**.
+
+ ![The Application Insights overview screen displays with the Create button highlighted.](./media/tutorial-asp-net-core/create-application-insights-overview.png "Create Application Insights resource")
+
+5. On the **Basics** tab of the Application Insights screen, complete the form as follows, then select the **Review + create** button. Fields not specified in the table below may retain their default values.
+
+ | Field | Value |
+ |-|-|
+ | Name | Enter `azure-cafe-application-insights-{SUFFIX}`, replacing **{SUFFIX}** with the appropriate suffix value recorded earlier. |
+ | Region | Select the same region chosen when deploying the article resources. |
    | Log Analytics Workspace | Select `azure-cafe-log-analytics-workspace`; alternatively, a new Log Analytics workspace can be created here. |
+
+ ![The Application Insights Basics tab displays with a form populated with the preceding values.](./media/tutorial-asp-net-core/application-insights-basics-tab.png "Application Insights Basics tab")
+
+6. Once validation has passed, select **Create** to deploy the resource.
+
+ ![The Application Insights validation screen displays indicating Validation passed and the Create button is highlighted.](./media/tutorial-asp-net-core/application-insights-validation-passed.png "Validation passed")
+
+7. Once deployment has completed, return to the `application-insights-azure-cafe` resource group, and select the deployed Application Insights resource.
+
+ ![The Azure Cafe resource group displays with the Application Insights resource highlighted.](./media/tutorial-asp-net-core/application-insights-resource-group.png "Application Insights")
+
+8. On the Overview screen of the Application Insights resource, copy the **Connection String** value for use in the next section of this article.
+
+ ![The Application Insights Overview screen displays with the Connection String value highlighted and the Copy button selected.](./media/tutorial-asp-net-core/application-insights-connection-string-overview.png "Copy Connection String value")
+
+## Configure the Application Insights connection string application setting in the web App Service
+
+1. Return to the `application-insights-azure-cafe` resource group, locate and open the **azure-cafe-web-{SUFFIX}** App Service resource.
+
+ ![The Azure Cafe resource group displays with the azure-cafe-web-{SUFFIX} resource highlighted.](./media/tutorial-asp-net-core/web-app-service-resource-group.png "Web App Service")
+
+2. From the left menu, beneath the Settings header, select **Configuration**. Then, on the **Application settings** tab, select **+ New application setting** beneath the Application settings header.
+
+ ![The App Service resource screen displays with the Configuration item selected from the left menu and the + New application setting toolbar button highlighted.](./media/tutorial-asp-net-core/app-service-app-setting-button.png "Create New application setting")
+
+3. In the Add/Edit application setting blade, complete the form as follows and select **OK**.
+
+ | Field | Value |
+ |-|-|
+ | Name | APPLICATIONINSIGHTS_CONNECTION_STRING |
+ | Value | Paste the Application Insights connection string obtained in the preceding section. |
+
+ ![The Add/Edit application setting blade displays populated with the preceding values.](./media/tutorial-asp-net-core/add-edit-app-setting.png "Add/Edit application setting")
+
+4. On the App Service Configuration screen, select the **Save** button from the toolbar menu. When prompted to save the changes, select **Continue**.
+
+ ![The App Service Configuration screen displays with the Save button highlighted on the toolbar menu.](./media/tutorial-asp-net-core/save-app-service-configuration.png "Save the App Service Configuration")
+
+## Install the Application Insights NuGet Package
+
+We need to configure the ASP.NET Core MVC web application to send telemetry. This is accomplished using the [Application Insights for ASP.NET Core web applications NuGet package](https://nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore).
+
+1. With Visual Studio, open `1 - Starter Application\src\AzureCafe.sln`.
+
+2. In the Solution Explorer panel, right-click the AzureCafe project file, and select **Manage NuGet Packages**.
+
+ ![The Solution Explorer displays with Manage NuGet Packages selected from the context menu.](./media/tutorial-asp-net-core/manage-nuget-packages-menu.png "Manage NuGet Packages")
+
+3. Select the **Browse** tab, then search for and select **Microsoft.ApplicationInsights.AspNetCore**. Select **Install**, and accept the license terms. We recommend using the latest stable version. Find the full release notes for the SDK on the [open-source GitHub repo](https://github.com/Microsoft/ApplicationInsights-dotnet/releases).
+
+ ![The NuGet tab displays with the Browse tab selected and Microsoft.ApplicationInsights.AspNetCore is entered in the search box. The Microsoft.ApplicationInsights.AspNetCore package is selected from a list of results. In the right pane, the latest stable version is selected from a drop down list and the Install button is highlighted.](./media/tutorial-asp-net-core/asp-net-core-install-nuget-package.png "Install NuGet Package")
+
+4. Keep Visual Studio open for the next section of the article.
+
+## Enable Application Insights server-side telemetry
+
+The Application Insights for ASP.NET Core web applications NuGet package encapsulates features to enable sending server-side telemetry to the Application Insights resource in Azure.
+
+1. From the Visual Studio Solution Explorer, locate and open the **Program.cs** file.
+
+ ![The Visual Studio Solution Explorer displays with the Program.cs highlighted.](./media/tutorial-asp-net-core/solution-explorer-programcs.png "Program.cs")
+
+2. Insert the following code prior to the `builder.Services.AddControllersWithViews()` statement. This code automatically reads the Application Insights connection string value from configuration. The `AddApplicationInsightsTelemetry` method registers the `ApplicationInsightsLoggerProvider` with the built-in dependency injection container, which will then be used to fulfill [ILogger](/dotnet/api/microsoft.extensions.logging.ilogger) and [ILogger\<TCategoryName\>](/dotnet/api/microsoft.extensions.logging.iloggerprovider) implementation requests. A fuller sketch appears at the end of this section.
+
+ ```csharp
+ builder.Services.AddApplicationInsightsTelemetry();
+ ```
+
+ ![A code window displays with the preceding code snippet highlighted.](./media/tutorial-asp-net-core/enable-server-side-telemetry.png "Enable server-side telemetry")
+
+ > [!TIP]
+ > Learn more about [configuration options in ASP.NET Core](/aspnet/core/fundamentals/configuration).
+
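+If you prefer to see the pieces spelled out, the following sketch shows an assumed Program.cs shape for the .NET 6 MVC template: the connection string is passed explicitly through `ApplicationInsightsServiceOptions` (instead of relying solely on the app setting), and a controller takes `ILogger<T>` from the container so its log entries flow to Application Insights. Names such as `HomeController` are illustrative:
+
+```csharp
+using Microsoft.ApplicationInsights.AspNetCore.Extensions;
+using Microsoft.AspNetCore.Mvc;
+
+var builder = WebApplication.CreateBuilder(args);
+
+// Explicitly pass the connection string; by default the SDK reads it
+// from configuration (for example, the app setting created earlier).
+builder.Services.AddApplicationInsightsTelemetry(new ApplicationInsightsServiceOptions
+{
+    ConnectionString = builder.Configuration["APPLICATIONINSIGHTS_CONNECTION_STRING"]
+});
+
+builder.Services.AddControllersWithViews();
+
+var app = builder.Build();
+app.MapDefaultControllerRoute();
+app.Run();
+
+// Any ILogger<T> resolved from the container also writes to Application Insights.
+public class HomeController : Controller
+{
+    private readonly ILogger<HomeController> _logger;
+    public HomeController(ILogger<HomeController> logger) => _logger = logger;
+
+    public IActionResult Index()
+    {
+        _logger.LogInformation("Index page requested");
+        return View();
+    }
+}
+```
+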
+## Enable client-side telemetry for web applications
+
+The preceding steps are enough to help you start collecting server-side telemetry. Because this application has client-side components, follow the next steps to start collecting [usage telemetry](./usage-overview.md).
+
+1. In Visual Studio Solution explorer, locate and open `\Views\_ViewImports.cshtml`. Add the following code at the end of the existing file.
+
+ ```cshtml
+ @inject Microsoft.ApplicationInsights.AspNetCore.JavaScriptSnippet JavaScriptSnippet
+ ```
+
+ ![The _ViewImports.cshtml file displays with the preceding line of code highlighted.](./media/tutorial-asp-net-core/view-imports-injection.png "JavaScriptSnippet injection")
+
+2. To properly enable client-side monitoring for your application, the JavaScript snippet must appear in the `<head>` section of each page of your application that you want to monitor. In Visual Studio Solution Explorer, locate and open `\Views\Shared\_Layout.cshtml`, then insert the following code immediately preceding the closing `</head>` tag.
+
+ ```cshtml
+ @Html.Raw(JavaScriptSnippet.FullScript)
+ ```
+
+ ![The _Layout.cshtml file displays with the preceding line of code highlighted within the head section of the page.](./media/tutorial-asp-net-core/layout-head-code.png "The head section of _Layout.cshtml")
+
+ > [!TIP]
+ > As an alternative to using the `FullScript`, the `ScriptBody` is available. Use `ScriptBody` if you need to control the `<script>` tag to set a Content Security Policy:
+
+ ```cshtml
+ <script> // apply custom changes to this script tag.
+ @Html.Raw(JavaScriptSnippet.ScriptBody)
+ </script>
+ ```
+
+> [!NOTE]
+> JavaScript injection provides a default configuration experience. If you require [configuration](./javascript.md#configuration) beyond setting the connection string, you are required to remove auto-injection as described above and manually add the [JavaScript SDK](./javascript.md#add-the-javascript-sdk).
+
+## Enable monitoring of database queries
+
+When investigating causes for performance degradation, it's important to include insights into database calls. Enable monitoring through configuration of the [dependency module](./asp-net-dependencies.md). Dependency monitoring, including SQL, is enabled by default. Follow these steps to capture the full SQL query text.
+
+> [!NOTE]
+> SQL text may contain sensitive data such as passwords and PII. Be careful when enabling this feature.
+
+1. From the Visual Studio Solution Explorer, locate and open the **Program.cs** file.
+
+2. At the top of the file, add the following `using` statement.
+
+ ```csharp
+ using Microsoft.ApplicationInsights.DependencyCollector;
+ ```
+
+3. Immediately following the `builder.Services.AddApplicationInsightsTelemetry()` code, insert the following to enable SQL command text instrumentation.
+
+ ```csharp
+ builder.Services.ConfigureTelemetryModule<DependencyTrackingTelemetryModule>((module, o) => { module.EnableSqlCommandTextInstrumentation = true; });
+ ```
+
+ ![A code window displays with the preceding code highlighted.](./media/tutorial-asp-net-core/enable-sql-command-text-instrumentation.png "Enable SQL command text instrumentation")
+
+## Run the Azure Cafe web application
+
+After the web application code is deployed, telemetry will flow to Application Insights. The Application Insights SDK automatically collects incoming web requests to your application.
+
+1. Right-click the **AzureCafe** project in Solution Explorer and select **Publish** from the context menu.
+
+ ![The Visual Studio Solution Explorer displays with the Azure Cafe project selected and the Publish context menu item highlighted.](./media/tutorial-asp-net-core/web-project-publish-context-menu.png "Publish Web App")
+
+2. Select **Publish** to promote the new code to the Azure App Service.
+
+ ![The AzureCafe publish profile displays with the Publish button highlighted.](./media/tutorial-asp-net-core/publish-profile.png "Publish profile")
+
+3. Once the publish has succeeded, a new browser window opens to the Azure Cafe web application.
+
+ ![The Azure Cafe web application displays.](./media/tutorial-asp-net-core/azure-cafe-index.png "Azure Cafe web application")
+
+4. Perform various activities in the web application to generate some telemetry.
+
+ 1. Select **Details** next to a Cafe to view its menu and reviews.
+
+ ![A portion of the Azure Cafe list displays with the Details button highlighted.](./media/tutorial-asp-net-core/cafe-details-button.png "Azure Cafe Details")
+
+ 2. On the Cafe screen, select the **Reviews** tab to view and add reviews. Select the **Add review** button to add a review.
+
+ ![The Cafe details screen displays with the Add review button highlighted.](./media/tutorial-asp-net-core/cafe-add-review-button.png "Add review")
+
+   3. On the Create a review dialog, enter a name, a rating, and comments, and upload a photo for the review. Once completed, select **Add review**.
+
+ ![The Create a review dialog displays.](./media/tutorial-asp-net-core/create-a-review-dialog.png "Create a review")
+
+ 4. Repeat adding reviews as desired to generate additional telemetry.
+
+### Live metrics
+
+[Live Metrics](./live-stream.md) can be used to quickly verify if Application Insights monitoring is configured correctly. It might take a few minutes for telemetry to appear in the portal and analytics, but Live Metrics shows CPU usage of the running process in near real time. It can also show other telemetry like Requests, Dependencies, and Traces.
+
+### Application map
+
+The sample application makes calls to multiple Azure resources, including Azure SQL, Azure Blob Storage, and the Azure Language Service (for review sentiment analysis).
+
+![The Azure Cafe sample application architecture displays.](./media/tutorial-asp-net-core/azure-cafe-app-insights.png "Azure Cafe sample application architecture")
+
+Application Insights introspects incoming telemetry data and can generate a visual map of detected system integrations.
+
+1. Access and log into the [Azure portal](https://portal.azure.com).
+
+2. Open the sample application resource group `application-insights-azure-cafe`.
+
+3. From the list of resources, select the `azure-cafe-insights-{SUFFIX}` Application Insights resource.
+
+4. Select **Application map** from the left menu, beneath the **Investigate** heading. Observe the generated Application map.
+
+ ![The Application Insights application map displays.](./media/tutorial-asp-net-core/application-map.png "Application map")
+
+### Viewing HTTP calls and database SQL command text
+
+1. In the Azure portal, open the Application Insights resource.
+
+2. Beneath the **Investigate** header on the left menu, select **Performance**.
+
+3. The **Operations** tab contains details of the HTTP calls received by the application. You can also toggle between Server and Browser (client-side) views of data.
+
+ ![The Performance screen of Application Insights displays with the toggle between Server and Browser highlighted along with the list of HTTP calls received by the application.](./media/tutorial-asp-net-core/server-performance.png "Server performance HTTP calls")
+
+4. Select an Operation from the table, and choose to drill into a sample of the request.
+
+ ![The Performance screen displays with a POST operation selected, the Drill into samples button is highlighted and a sample is selected from the suggested list.](./media/tutorial-asp-net-core/select-operation-performance.png "Drill into an operation")
+
+5. The End-to-end transaction displays for the selected request. In this case, a review was created (including an image), so the transaction includes calls to Azure Storage and the Language Service (for sentiment analysis), as well as database calls into Azure SQL to persist the review. In this example, the first selected Event displays information relative to the HTTP POST call.
+
+ ![The End-to-end transaction displays with the HTTP Post call selected.](./media/tutorial-asp-net-core/e2e-http-call.png "HTTP POST details")
+
+6. Select a SQL item to review the SQL command text issued to the database.
+
+ ![The End-to-end transaction displays with SQL command details.](./media/tutorial-asp-net-core/e2e-db-call.png "SQL Command text details")
+
+7. Optionally select Dependency (outgoing) requests to Azure Storage or the Language Service.
+
+8. Return to the **Performance** screen, and select the **Dependencies** tab to investigate calls into external resources. Notice the Operations table includes calls into Sentiment Analysis, Blob Storage, and Azure SQL.
+
+ ![The Performance screen displays with the Dependencies tab selected and the Operations table highlighted.](./media/tutorial-asp-net-core/performance-dependencies.png "Dependency Operations")
+
+## Application logging with Application Insights
+
+### Logging overview
+
+Application Insights is one type of [logging provider](/dotnet/core/extensions/logging-providers) for ASP.NET Core applications. It becomes available when the [Application Insights for ASP.NET Core](#install-the-application-insights-nuget-package) NuGet package is installed and [server-side telemetry collection is enabled](#enable-application-insights-server-side-telemetry). As a reminder, the following code in **Program.cs** registers the `ApplicationInsightsLoggerProvider` with the built-in dependency injection container.
+
+```csharp
+builder.Services.AddApplicationInsightsTelemetry();
+```
+
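+If you'd rather set the connection string in code than in configuration, the registration accepts an options object. This is a hedged sketch, assuming a recent SDK version that exposes `ConnectionString` on the options; the connection string value is a placeholder:
+
+```csharp
+using Microsoft.ApplicationInsights.AspNetCore.Extensions;
+
+builder.Services.AddApplicationInsightsTelemetry(new ApplicationInsightsServiceOptions
+{
+    // Placeholder; use the connection string from your Application Insights resource.
+    ConnectionString = "InstrumentationKey=00000000-0000-0000-0000-000000000000"
+});
+```
+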
+With the `ApplicationInsightsLoggerProvider` registered as the logging provider, the app is ready to log to Application Insights using either constructor injection with <xref:Microsoft.Extensions.Logging.ILogger> or the generic-type alternative <xref:Microsoft.Extensions.Logging.ILogger%601>.
+
+> [!NOTE]
+> With default settings, the logging provider is configured to automatically capture log events with a severity of <xref:Microsoft.Extensions.Logging.LogLevel.Warning?displayProperty=nameWithType> or greater.
+
+Consider the following example controller, which demonstrates injection of an `ILogger` that is resolved with the `ApplicationInsightsLoggerProvider` registered with the dependency injection container. Observe that the **Get** method records an Informational, a Warning, and an Error message.
+
+> [!NOTE]
+> By default, the Information-level trace isn't recorded. Only traces with a severity of Warning and above are captured.
+
+```csharp
+using Microsoft.AspNetCore.Mvc;
+
+[Route("api/[controller]")]
+[ApiController]
+public class ValuesController : ControllerBase
+{
+ private readonly ILogger _logger;
+
+ public ValuesController(ILogger<ValuesController> logger)
+ {
+ _logger = logger;
+ }
+
+ [HttpGet]
+ public ActionResult<IEnumerable<string>> Get()
+ {
+        //Info level traces aren't captured by default
+        _logger.LogInformation("An example of an Info trace..");
+ _logger.LogWarning("An example of a Warning trace..");
+ _logger.LogError("An example of an Error level message");
+
+ return new string[] { "value1", "value2" };
+ }
+}
+```
+
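+To capture the Information trace above as well, lower the provider's threshold with a logging filter. A minimal sketch; the filter API is covered in more detail later in this article:
+
+```csharp
+using Microsoft.Extensions.Logging.ApplicationInsights;
+
+// An empty category string lowers the threshold for all categories on this provider.
+builder.Logging.AddFilter<ApplicationInsightsLoggerProvider>("", LogLevel.Information);
+```
+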
+For more information, see [Logging in ASP.NET Core](/aspnet/core/fundamentals/logging).
+
+## View logs in Application Insights
+
+The ValuesController above is deployed with the sample application and is located in the **Controllers** folder of the project.
+
+1. Using an internet browser, open the sample application. In the address bar, append `/api/Values` and press <kbd>Enter</kbd>.
+
+ ![A browser window displays with /api/Values appended to the URL in the address bar.](media/tutorial-asp-net-core/values-api-url.png "Values API URL")
+
+2. Wait a few moments, then return to the **Application Insights** resource in the [Azure portal](https://portal.azure.com).
+
+ ![A resource group displays with the Application Insights resource highlighted.](./media/tutorial-asp-net-core/application-insights-resource-group.png "Resource Group")
+
+3. From the left menu of the Application Insights resource, select **Logs** from beneath the **Monitoring** section. In the **Tables** pane, double-click on the **traces** table, located under the **Application Insights** tree. Modify the query to retrieve traces for the **Values** controller as follows, then select **Run** to filter the results.
+
+ ```kql
+ traces
+ | where operation_Name == "GET Values/Get"
+ ```
+
+4. Observe that the results display the logging messages present in the controller. A log severity of 2 indicates a Warning, and a log severity of 3 indicates an Error.
+
+5. Alternatively, the query can be written to retrieve results based on the category of the log. By default, the category is the fully qualified name of the class where the `ILogger` is injected, in this case **ValuesController** (if the class had a namespace, the category would be prefixed with it). Rewrite and run the following query to retrieve results based on category. A short sketch showing how structured log parameters surface in **customDimensions** follows this list.
+
+ ```kql
+ traces
+ | where customDimensions.CategoryName == "ValuesController"
+ ```
+
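+Structured log parameters are queryable too: when a message template uses named placeholders, the provider typically records them as **customDimensions** properties alongside **CategoryName**. A minimal sketch; `reviewId` is an illustrative variable, not part of the sample app:
+
+```csharp
+// "ReviewId" surfaces as a property on the trace record in customDimensions.
+_logger.LogWarning("Review {ReviewId} failed validation", reviewId);
+```
+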
+## Control the level of logs sent to Application Insights
+
+`ILogger` implementations have a built-in mechanism to apply [log filtering](/dotnet/core/extensions/logging#how-filtering-rules-are-applied). This filtering lets you control the logs that are sent to each registered provider, including the Application Insights provider. You can use the filtering either in configuration (using an *appsettings.json* file) or in code. For more information about log levels and guidance on appropriate use, see the [Log Level](/aspnet/core/fundamentals/logging#log-level) documentation.
+
+The following examples show how to apply filter rules to the `ApplicationInsightsLoggerProvider` to control the level of logs sent to Application Insights.
+
+### Create filter rules with configuration
+
+The `ApplicationInsightsLoggerProvider` is aliased as **ApplicationInsights** in configuration. The following section of an *appsettings.json* file sets the default log level for all providers to <xref:Microsoft.Extensions.Logging.LogLevel.Warning?displayProperty=nameWithType>. For the ApplicationInsights provider specifically, categories that start with "ValuesController" override this default with <xref:Microsoft.Extensions.Logging.LogLevel.Error?displayProperty=nameWithType>, so only logs of Error and higher are sent for that category.
+
+```json
+{
+ //... additional code removed for brevity
+ "Logging": {
+ "LogLevel": { // No provider, LogLevel applies to all the enabled providers.
+ "Default": "Warning"
+ },
+ "ApplicationInsights": { // Specific to the provider, LogLevel applies to the Application Insights provider.
+ "LogLevel": {
+ "ValuesController": "Error" //Log Level for the "ValuesController" category
+ }
+ }
+ }
+}
+```
+
+If you deploy the sample application with the preceding *appsettings.json* configuration, only the Error trace is sent to Application Insights when you interact with the **ValuesController**. Because the **LogLevel** for the **ValuesController** category is set to **Error**, the **Warning** trace is suppressed.
+
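+The same rule can be expressed in code instead of configuration. A minimal sketch:
+
+```csharp
+using Microsoft.Extensions.Logging.ApplicationInsights;
+
+// Send only Error and above to Application Insights for the ValuesController category.
+builder.Logging.AddFilter<ApplicationInsightsLoggerProvider>("ValuesController", LogLevel.Error);
+```
+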
+## Turn off logging to Application Insights
+
+To disable logging using configuration, set all LogLevel values to "None".
+
+```json
+{
+ //... additional code removed for brevity
+ "Logging": {
+ "LogLevel": { // No provider, LogLevel applies to all the enabled providers.
+ "Default": "None"
+ },
+ "ApplicationInsights": { // Specific to the provider, LogLevel applies to the Application Insights provider.
+ "LogLevel": {
+ "ValuesController": "None" //Log Level for the "ValuesController" category
+ }
+ }
+ }
+}
+```
+
+Similarly, within code, set both the default level for the `ApplicationInsightsLoggerProvider` and any category-specific levels to **None**.
+
+```csharp
+var builder = WebApplication.CreateBuilder(args);
+// An empty category string applies the rule to every category for this provider.
+builder.Logging.AddFilter<Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider>("", LogLevel.None);
+// A category-specific rule; "ValuesController" is the log category.
+builder.Logging.AddFilter<Microsoft.Extensions.Logging.ApplicationInsights.ApplicationInsightsLoggerProvider>("ValuesController", LogLevel.None);
+```
+
+## Open-source SDK
+
+* [Read and contribute to the code](https://github.com/microsoft/ApplicationInsights-dotnet).
+
+For the latest updates and bug fixes, see the [release notes](./release-notes.md).
+
+## Next steps
+
+* [Explore user flows](./usage-flows.md) to understand how users navigate through your app.
+* [Configure a snapshot collection](./snapshot-debugger.md) to see the state of source code and variables at the moment an exception is thrown.
+* [Use the API](./api-custom-events-metrics.md) to send your own events and metrics for a detailed view of your app's performance and usage.
+* Use [availability tests](./monitor-web-app-availability.md) to check your app constantly from around the world.
+* [Dependency Injection in ASP.NET Core](/aspnet/core/fundamentals/dependency-injection)
+* [Logging in ASP.NET Core](/aspnet/core/fundamentals/logging)
+* [.NET trace logs in Application Insights](./asp-net-trace-logs.md)
+* [Auto-instrumentation for Application Insights](./codeless-overview.md)
+
azure-monitor Metrics Dynamic Scope https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/metrics-dynamic-scope.md
Title: View multiple resources in the Azure metrics explorer description: Learn how to visualize multiple resources by using the Azure metrics explorer.- Last updated 12/14/2020- # View multiple resources in the Azure metrics explorer
azure-monitor Metrics Supported https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/metrics-supported.md
This latest update adds a new column and reorders the metrics to be alphabetical
|Metric|Exportable via Diagnostic Settings?|Metric Display Name|Unit|Aggregation Type|Description|Dimensions| ||||||||
-|ByteCount|Yes|Bytes|Bytes|Total|Total number of Bytes transmitted within time period|Protocol, Direction|
-|DatapathAvailability|Yes|Datapath Availability (Preview)|Count|Average|NAT Gateway Datapath Availability|No Dimensions|
-|PacketCount|Yes|Packets|Count|Total|Total number of Packets transmitted within time period|Protocol, Direction|
-|PacketDropCount|Yes|Dropped Packets|Count|Total|Count of dropped packets|No Dimensions|
-|SNATConnectionCount|Yes|SNAT Connection Count|Count|Total|Total concurrent active connections|Protocol, ConnectionState|
-|TotalConnectionCount|Yes|Total SNAT Connection Count|Count|Total|Total number of active SNAT connections|Protocol|
+|ByteCount|No|Bytes|Bytes|Total|Total number of Bytes transmitted within time period|Protocol, Direction|
+|DatapathAvailability|No|Datapath Availability (Preview)|Count|Average|NAT Gateway Datapath Availability|No Dimensions|
+|PacketCount|No|Packets|Count|Total|Total number of Packets transmitted within time period|Protocol, Direction|
+|PacketDropCount|No|Dropped Packets|Count|Total|Count of dropped packets|No Dimensions|
+|SNATConnectionCount|No|SNAT Connection Count|Count|Total|Total concurrent active connections|Protocol, ConnectionState|
+|TotalConnectionCount|No|Total SNAT Connection Count|Count|Total|Total number of active SNAT connections|Protocol|
## Microsoft.Network/networkInterfaces
azure-monitor Resource Logs Blob Format https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/resource-logs-blob-format.md
Title: Prepare for format change to Azure Monitor resource logs description: Azure resource logs moved to use append blobs on November 1, 2018.- Last updated 07/06/2018- # Prepare for format change to Azure Monitor platform logs archived to a storage account
azure-monitor Logs Export Logic App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/logs-export-logic-app.md
The method described in this article describes a scheduled export from a log que
## Overview This procedure uses the [Azure Monitor Logs connector](/connectors/azuremonitorlogs/), which lets you run a log query from a Logic App and use its output in other actions in the workflow. The [Azure Blob Storage connector](/connectors/azureblob/) is used in this procedure to send the query output to Azure storage.
-[![Logic app overview](media/logs-export-logic-app/logic-app-overview.png)](media/logs-export-logic-app/logic-app-overview.png#lightbox)
+[![Logic app overview](media/logs-export-logic-app/logic-app-overview.png "Screenshot of Logic app flow.")](media/logs-export-logic-app/logic-app-overview.png#lightbox)
When you export data from a Log Analytics workspace, filter and aggregate your log data and optimize your query to limit the amount of data processed by your Logic App workflow to the required data. If you need to archive sign-in events, for instance, filter for the required events and project only the required fields. For example:
Log Analytics workspace and log queries in Azure Monitor are multitenancy servic
1. **Create Logic App**
- 1. Go to **Logic Apps** in the Azure portal and click **Add**. Select a **Subscription**, **Resource group**, and **Region** to store the new Logic App and then give it a unique name. You can turn on **Log Analytics** setting to collect information about runtime data and events as described in [Set up Azure Monitor logs and collect diagnostics data for Azure Logic Apps](../../logic-apps/monitor-logic-apps-log-analytics.md). This setting isn't required for using the Azure Monitor Logs connector.<br>
- [![Create Logic App](media/logs-export-logic-app/create-logic-app.png)](media/logs-export-logic-app/create-logic-app.png#lightbox)
+ 1. Go to **Logic Apps** in the Azure portal and click **Add**. Select a **Subscription**, **Resource group**, and **Region** to store the new Logic App and then give it a unique name. You can turn on **Log Analytics** setting to collect information about runtime data and events as described in [Set up Azure Monitor logs and collect diagnostics data for Azure Logic Apps](../../logic-apps/monitor-logic-apps-log-analytics.md). This setting isn't required for using the Azure Monitor Logs connector.
+\
+ [![Create Logic App](media/logs-export-logic-app/create-logic-app.png "Screenshot of Logic App resource create.")](media/logs-export-logic-app/create-logic-app.png#lightbox)
- 1. Click **Review + create** and then **Create**. When the deployment is complete, click **Go to resource** to open the **Logic Apps Designer**.
+ 2. Click **Review + create** and then **Create**. When the deployment is complete, click **Go to resource** to open the **Logic Apps Designer**.
-1. **Create a trigger for the Logic App**
+2. **Create a trigger for the Logic App**
- 1. Under **Start with a common trigger**, select **Recurrence**. This creates a Logic App that automatically runs at a regular interval. In the **Frequency** box of the action, select **Day** and in the **Interval** box, enter **1** to run the workflow once per day.<br>
- [![Recurrence action](media/logs-export-logic-app/recurrence-action.png)](media/logs-export-logic-app/recurrence-action.png#lightbox)
+ 1. Under **Start with a common trigger**, select **Recurrence**. This creates a Logic App that automatically runs at a regular interval. In the **Frequency** box of the action, select **Day** and in the **Interval** box, enter **1** to run the workflow once per day.
+ \
+ [![Recurrence action](media/logs-export-logic-app/recurrence-action.png "Screenshot of recurrence action create.")](media/logs-export-logic-app/recurrence-action.png#lightbox)
-2. **Add Azure Monitor Logs action**
+3. **Add Azure Monitor Logs action**
The Azure Monitor Logs action lets you specify the query to run. The log query used in this example is optimized for hourly recurrence and collects the data ingested for the particular execution time. For example, if the workflow runs at 4:35, the time range would be 3:00 to 4:00. If you change the Logic App to run at a different frequency, you need to change the query as well. For example, if you set the recurrence to run daily, you would set `startTime` in the query to `startofday(make_datetime(year,month,day,0,0))`. You're prompted to select a tenant to grant access to the Log Analytics workspace with the account that the workflow will use to run the query.
- 1. Click **+ New step** to add an action that runs after the recurrence action. Under **Choose an action**, type **azure monitor** and then select **Azure Monitor Logs**.<br>
- [![Azure Monitor Logs action](media/logs-export-logic-app/select-azure-monitor-connector.png)](media/logs-export-logic-app/select-azure-monitor-connector.png#lightbox)
+ 1. Click **+ New step** to add an action that runs after the recurrence action. Under **Choose an action**, type **azure monitor** and then select **Azure Monitor Logs**.
+ \
+ [![Azure Monitor Logs action](media/logs-export-logic-app/select-azure-monitor-connector.png "Screenshot of Azure Monitor Logs action create.")](media/logs-export-logic-app/select-azure-monitor-connector.png#lightbox)
- 2. Click **Azure Log Analytics ΓÇô Run query and list results**.<br>
- [![Screenshot of a new action being added to a step in the Logic App Designer. Azure Monitor Logs is highlighted under Choose an action.](media/logs-export-logic-app/select-query-action-list.png)](media/logs-export-logic-app/select-query-action-list.png#lightbox)
+ 1. Click **Azure Log Analytics ΓÇô Run query and list results**.
+ \
+ [![Azure Monitor Logs is highlighted under Choose an action.](media/logs-export-logic-app/select-query-action-list.png "Screenshot of a new action being added to a step in the Logic App Designer.")](media/logs-export-logic-app/select-query-action-list.png#lightbox)
- 3. Select the **Subscription** and **Resource Group** for your Log Analytics workspace. Select *Log Analytics Workspace* for the **Resource Type** and then select the workspace's name under **Resource Name**.
+ 2. Select the **Subscription** and **Resource Group** for your Log Analytics workspace. Select *Log Analytics Workspace* for the **Resource Type** and then select the workspace's name under **Resource Name**.
- 4. Add the following log query to the **Query** window.
+ 3. Add the following log query to the **Query** window.
```Kusto let dt = now();
Log Analytics workspace and log queries in Azure Monitor are multitenancy servic
ResourceId = _ResourceId ```
- 5. The **Time Range** specifies the records that will be included in the query based on the **TimeGenerated** column. This should be set to a value greater than the time range selected in the query. Since this query isn't using the **TimeGenerated** column, then **Set in query** option isn't available. See [Query scope](./scope.md) for more details about the time range. Select **Last 4 hours** for the **Time Range**. This will ensure that any records with an ingestion time larger than **TimeGenerated** will be included in the results.<br>
- [![Screenshot of the settings for the new Azure Monitor Logs action named Run query and visualize results.](media/logs-export-logic-app/run-query-list-action.png)](media/logs-export-logic-app/run-query-list-action.png#lightbox)
+ 4. The **Time Range** specifies the records that will be included in the query based on the **TimeGenerated** column. This should be set to a value greater than the time range selected in the query. Since this query isn't using the **TimeGenerated** column, then **Set in query** option isn't available. See [Query scope](./scope.md) for more details about the time range. Select **Last 4 hours** for the **Time Range**. This will ensure that any records with an ingestion time larger than **TimeGenerated** will be included in the results.
+ \
+   [![Screenshot of the settings for the new Azure Monitor Logs action named Run query and list results.](media/logs-export-logic-app/run-query-list-action.png "Screenshot of the settings for the new Azure Monitor Logs action named Run query and list results.")](media/logs-export-logic-app/run-query-list-action.png#lightbox)
-3. **Add Parse JSON activity (optional)**
+4. **Add Parse JSON activity (optional)**
The output from the **Run query and list results** action is formatted in JSON. You can parse this data and manipulate it as part of the preparation for the **Compose** action.
- You can provide a JSON schema that describes the payload you expect to receive. The designer parses JSON content by using this schema and generates user-friendly tokens that represent the properties in your JSON content. You can then easily reference and use those properties throughout your Logic App's workflow.
-
- 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **json** and then select **Parse JSON**.<br>
- [![Select Parse JSON activity](media/logs-export-logic-app/select-parse-json.png)](media/logs-export-logic-app/select-parse-json.png#lightbox)
-
- 2. Click in the **Content** box to display a list of values from previous activities. Select **Body** from the **Run query and list results** action. This is the output from the log query.<br>
- [![Select Body](media/logs-export-logic-app/select-body.png)](media/logs-export-logic-app/select-body.png#lightbox)
-
- 3. Click **Use sample payload to generate schema**. Run the log query and copy the output to use for the sample payload. For the sample query here, you can use the following output:
-
- ```json
- {
- "TimeGenerated": "2020-09-29T23:11:02.578Z",
- "BlobTime": "2020-09-29T23:00:00Z",
- "OperationName": "Returns Storage Account SAS Token",
- "OperationNameValue": "MICROSOFT.RESOURCES/DEPLOYMENTS/WRITE",
- "Level": "Informational",
- "ActivityStatus": "Started",
- "ResourceGroup": "monitoring",
- "SubscriptionId": "00000000-0000-0000-0000-000000000000",
- "Category": "Administrative",
- "EventSubmissionTimestamp": "2020-09-29T23:11:02Z",
- "ClientIpAddress": "192.168.1.100",
- "ResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/monitoring/providers/microsoft.storage/storageaccounts/my-storage-account"
- }
- ```
-
- [![Parse JSON payload](media/logs-export-logic-app/parse-json-payload.png)](media/logs-export-logic-app/parse-json-payload.png#lightbox)
-
-4. **Add the Compose action**
+ You can provide a JSON schema that describes the payload you expect to receive. The designer parses JSON content by using this schema and generates user-friendly tokens that represent the properties in your JSON content. You can then easily reference and use those properties throughout your Logic App's workflow.
+
+   You can use a sample output from the **Run query and list results** step. Click **Run Trigger** in the Logic App ribbon, then **Run**; download and save an output record. For the sample query in the previous step, you can use the following sample output:
+
+ ```json
+ {
+ "TimeGenerated": "2020-09-29T23:11:02.578Z",
+ "BlobTime": "2020-09-29T23:00:00Z",
+ "OperationName": "Returns Storage Account SAS Token",
+ "OperationNameValue": "MICROSOFT.RESOURCES/DEPLOYMENTS/WRITE",
+ "Level": "Informational",
+ "ActivityStatus": "Started",
+ "ResourceGroup": "monitoring",
+ "SubscriptionId": "00000000-0000-0000-0000-000000000000",
+ "Category": "Administrative",
+ "EventSubmissionTimestamp": "2020-09-29T23:11:02Z",
+ "ClientIpAddress": "192.168.1.100",
+ "ResourceId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/monitoring/providers/microsoft.storage/storageaccounts/my-storage-account"
+ }
+ ```
+
+ 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **json** and then select **Parse JSON**.
+ \
+ [![Select Parse JSON operator](media/logs-export-logic-app/select-parse-json.png "Screenshot of Parse JSON operator.")](media/logs-export-logic-app/select-parse-json.png#lightbox)
+
+ 1. Click in the **Content** box to display a list of values from previous activities. Select **Body** from the **Run query and list results** action. This is the output from the log query.
+ \
+   [![Select Body](media/logs-export-logic-app/select-body.png "Screenshot of the Parse JSON Content setting with output Body from the previous step.")](media/logs-export-logic-app/select-body.png#lightbox)
+
+ 1. Copy the sample record saved earlier, click **Use sample payload to generate schema** and paste.
+\
+ [![Parse JSON payload](media/logs-export-logic-app/parse-json-payload.png "Screenshot of Parse JSON schema.")](media/logs-export-logic-app/parse-json-payload.png#lightbox)
+
+5. **Add the Compose action**
The **Compose** action takes the parsed JSON output and creates the object that you need to store in the blob.
- 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **compose** and then select the **Compose** action.<br>
- [![Select Compose action](media/logs-export-logic-app/select-compose.png)](media/logs-export-logic-app/select-compose.png#lightbox)
+ 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **compose** and then select the **Compose** action.
+ \
+ [![Select Compose action](media/logs-export-logic-app/select-compose.png "Screenshot of Compose action.")](media/logs-export-logic-app/select-compose.png#lightbox)
- 2. Click the **Inputs** box display a list of values from previous activities. Select **Body** from the **Parse JSON** action. This is the parsed output from the log query.<br>
- [![Select body for Compose action](media/logs-export-logic-app/select-body-compose.png)](media/logs-export-logic-app/select-body-compose.png#lightbox)
+   1. Click the **Inputs** box to display a list of values from previous activities. Select **Body** from the **Parse JSON** action. This is the parsed output from the log query.
+ \
+ [![Select body for Compose action](media/logs-export-logic-app/select-body-compose.png "Screenshot of body for Compose action.")](media/logs-export-logic-app/select-body-compose.png#lightbox)
-5. **Add the Create Blob action**
+6. **Add the Create Blob action**
The Create Blob action writes the composed JSON to storage.
- 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **blob** and then select the **Create Blob** action.<br>
- [![Select Create blob](media/logs-export-logic-app/select-create-blob.png)](media/logs-export-logic-app/select-create-blob.png#lightbox)
+ 1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **blob** and then select the **Create Blob** action.
+ \
+ [![Select Create blob](media/logs-export-logic-app/select-create-blob.png "Screenshot of blob storage action create.")](media/logs-export-logic-app/select-create-blob.png#lightbox)
- 2. Type a name for the connection to your Storage Account in **Connection Name** and then click the folder icon in the **Folder path** box to select the container in your Storage Account. Click the **Blob name** to see a list of values from previous activities. Click **Expression** and enter an expression that matches your time interval. For this query which is run hourly, the following expression sets the blob name per previous hour:
+ 1. Type a name for the connection to your Storage Account in **Connection Name** and then click the folder icon in the **Folder path** box to select the container in your Storage Account. Click the **Blob name** to see a list of values from previous activities. Click **Expression** and enter an expression that matches your time interval. For this query which is run hourly, the following expression sets the blob name per previous hour:
```json subtractFromTime(formatDateTime(utcNow(),'yyyy-MM-ddTHH:00:00'), 1,'Hour') ```
+ \
+ [![Blob expression](media/logs-export-logic-app/blob-expression.png "Screenshot of blob action connection.")](media/logs-export-logic-app/blob-expression.png#lightbox)
- [![Blob expression](media/logs-export-logic-app/blob-expression.png)](media/logs-export-logic-app/blob-expression.png#lightbox)
-
- 3. Click the **Blob content** box to display a list of values from previous activities and then select **Outputs** in the **Compose** section.<br>
- [![Create blob expression](media/logs-export-logic-app/create-blob.png)](media/logs-export-logic-app/create-blob.png#lightbox)
+ 2. Click the **Blob content** box to display a list of values from previous activities and then select **Outputs** in the **Compose** section.
+ \
+ [![Create blob expression](media/logs-export-logic-app/create-blob.png "Screenshot of blob action output configuration.")](media/logs-export-logic-app/create-blob.png#lightbox)
-6. **Test the Logic App**
+7. **Test the Logic App**
- Test the workflow by clicking **Run**. If the workflow has errors, it will be indicated on the step with the problem. You can view the executions and drill in to each step to view the input and output to investigate failures. See [Troubleshoot and diagnose workflow failures in Azure Logic Apps](../../logic-apps/logic-apps-diagnosing-failures.md) if necessary.<br>
- [![Runs history](media/logs-export-logic-app/runs-history.png)](media/logs-export-logic-app/runs-history.png#lightbox)
+ Test the workflow by clicking **Run**. If the workflow has errors, it will be indicated on the step with the problem. You can view the executions and drill in to each step to view the input and output to investigate failures. See [Troubleshoot and diagnose workflow failures in Azure Logic Apps](../../logic-apps/logic-apps-diagnosing-failures.md) if necessary.
+ \
+ [![Runs history](media/logs-export-logic-app/runs-history.png "Screenshot of trigger run history.")](media/logs-export-logic-app/runs-history.png#lightbox)
-7. **View logs in Storage**
+8. **View logs in Storage**
- Go to the **Storage accounts** menu in the Azure portal and select your Storage Account. Click the **Blobs** tile and select the container you specified in the Create blob action. Select one of the blobs and then **Edit blob**.<br>
- [![Blob data](media/logs-export-logic-app/blob-data.png)](media/logs-export-logic-app/blob-data.png#lightbox)
+ Go to the **Storage accounts** menu in the Azure portal and select your Storage Account. Click the **Blobs** tile and select the container you specified in the Create blob action. Select one of the blobs and then **Edit blob**.
+ \
+ [![Blob data](media/logs-export-logic-app/blob-data.png "Screenshot of sample data exported to blob.")](media/logs-export-logic-app/blob-data.png#lightbox)
## Next steps
azure-netapp-files Performance Impact Kerberos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/performance-impact-kerberos.md
na Previously updated : 07/22/2022 Last updated : 08/22/2022 # Performance impact of Kerberos on Azure NetApp Files NFSv4.1 volumes
There are two areas of focus: light load and upper limit. The following lists de
* Average throughput decreased by 77% * Average latency increased by 1.6 ms
+## Performance considerations with `nconnect`
++ ## Next steps * [Configure NFSv4.1 Kerberos encryption for Azure NetApp Files](configure-kerberos-encryption.md)
azure-percept Azure Percept On Azure Stack Hci Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-percept/hci/azure-percept-on-azure-stack-hci-overview.md
Previously updated : 08/15/2022 Last updated : 08/22/2022 # Azure Percept on Azure Stack HCI overview
The Percept VM leverages Azure IoT Edge to communicate with [Azure IoT Hub](http
Whether you're a beginner, an expert, or anywhere in between, from zero to low code, to creating or bringing your own models, Azure Percept has a solution development path for you to build your Edge artificial intelligence (AI) solution. Azure Percept has three solution development paths that you can use to build Edge AI solutions: Azure Percept Studio, Azure Percept for DeepStream, and Azure Percept Open-Source Project. You aren't limited to one path; you can choose any or all of them depending on your business needs. For more information about the solution development paths, visit [Azure Percept solution development paths overview](https://microsoft.sharepoint-df.com/:w:/t/AzurePerceptHCIDocumentation/EU92ZnNynDBGuVn3P5Xr5gcBFKS5HQguZm7O5sEENPUvPA?e=33T6Vi). #### *Azure Percept Studio*
-[Azure Percept Studio](https://microsoft.sharepoint-df.com/:w:/t/AzurePerceptHCIDocumentation/EeyEj0dBcplEs9LSFaz95DsBApnmxRMdjZ9I3QinSgO0yA?e=cbIJkI) is a user-friendly portal for creating, deploying, and operating Edge artificial intelligence (AI) solutions. Using a low-code to no-code approach, you can discover and complete guided workflows and create an end-to-end Edge AI solution. This solution integrates Azure IoT and Azure AI cloud services like Azure IoT Hub, IoT Edge, Azure Storage, Log Analytics, and Spatial Analysis from Azure Cognitive Services.
+[Azure Percept Studio](/azure/azure-percept/studio/azure-percept-studio-overview) is a user-friendly portal for creating, deploying, and operating Edge artificial intelligence (AI) solutions. Using a low-code to no-code approach, you can discover and complete guided workflows and create an end-to-end Edge AI solution. This solution integrates Azure IoT and Azure AI cloud services like Azure IoT Hub, IoT Edge, Azure Storage, Log Analytics, and Spatial Analysis from Azure Cognitive Services.
#### *Azure Percept for DeepStream*
-[Azure Percept for DeepStream](https://microsoft.sharepoint-df.com/:w:/t/AzurePerceptHCIDocumentation/ETDSdi6ruptBkwMqvLPRL90Bzv3ORhpmAZ1YLeGt1LvtVA?e=lY2Q4f&CID=DDDB383F-4BFE-4C97-86A7-70766B16EB93&wdLOR=cDA23C19C-5685-46EC-BA28-7C9DEC460A5B&isSPOFile=1&clickparams=eyJBcHBOYW1lIjoiVGVhbXMtRGVza3RvcCIsIkFwcFZlcnNpb24iOiIyNy8yMjA3MzEwMTAwNSIsIkhhc0ZlZGVyYXRlZFVzZXIiOmZhbHNlfQ%3D%3D) includes developer tools that provide a custom developer experience. It enables you to create NVIDIA DeepStream containers using Microsoft-based images and guidance, supported models from NVIDIA out of the box, and/or bring your own models (BYOM). DeepStream is NVIDIAΓÇÖs toolkit to develop and deploy Vision AI applications and services. It provides multi-platform, scalable, Transport Layer Security (TLS)-encrypted security that can be deployed on-premises, on the edge, and in the cloud.
+[Azure Percept for DeepStream](/azure/azure-percept/deepstream/azure-percept-for-deepstream-overview) includes developer tools that provide a custom developer experience. It enables you to create NVIDIA DeepStream containers using Microsoft-based images and guidance, supported models from NVIDIA out of the box, and/or bring your own models (BYOM). DeepStream is NVIDIAΓÇÖs toolkit to develop and deploy Vision AI applications and services. It provides multi-platform, scalable, Transport Layer Security (TLS)-encrypted security that can be deployed on-premises, on the edge, and in the cloud.
#### *Azure Percept Open-Source Project*
-[Azure Percept Open-Source Project](https://microsoft.sharepoint-df.com/:w:/t/AzurePerceptHCIDocumentation/Eeoh0pZk5g1MqwJZUAZFEvEBMYmfAqdibII6Znm-PnnDIQ?e=4ZDfUT) is a framework for creating, deploying, and operating Edge artificial intelligence (AI) solutions at scale with the control and flexibility of open-source natively on your environment. Azure Percept Open-Source Project is fully open-sourced and leverages the open-source software (OSS) community to deliver enhanced experiences. It's a self-managed solution where you host the environment in your own cluster.
+[Azure Percept Open-Source Project](/azure/azure-percept/open-source/azure-percept-open-source-project-overview) is a framework for creating, deploying, and operating Edge artificial intelligence (AI) solutions at scale with the control and flexibility of open-source natively on your environment. Azure Percept Open-Source Project is fully open-sourced and leverages the open-source software (OSS) community to deliver enhanced experiences. It's a self-managed solution where you host the environment in your own cluster.
## Next steps
azure-portal How To Manage Azure Support Request https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-portal/supportability/how-to-manage-azure-support-request.md
Title: Manage an Azure support request
description: Learn about viewing support requests and how to send messages, upload files, and manage options. tags: billing Previously updated : 02/07/2022 Last updated : 07/21/2022 # To add: close and reopen, review request status, update contact info # Manage an Azure support request
-After you [create an Azure support request](how-to-create-azure-support-request.md), you can manage it in the [Azure portal](https://portal.azure.com). You can also create and manage requests programmatically by using the [Azure support ticket REST API](/rest/api/support) or [Azure CLI](/cli/azure/azure-cli-support-request). Additionally, you can view your open requests in the [Azure mobile app](https://azure.microsoft.com/get-started/azure-portal/mobile-app/).
+After you [create an Azure support request](how-to-create-azure-support-request.md), you can manage it in the [Azure portal](https://portal.azure.com).
+
+> [!TIP]
+> You can create and manage requests programmatically by using the [Azure support ticket REST API](/rest/api/support) or [Azure CLI](/cli/azure/azure-cli-support-request). Additionally, you can view open requests, reply to your support engineer, or edit the severity of your ticket in the [Azure mobile app](https://azure.microsoft.com/get-started/azure-portal/mobile-app/).
To manage a support request, you must have the [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), or [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role at the subscription level. To manage a support request that was created without a subscription, you must be an [Admin](../../active-directory/roles/permissions-reference.md). ## View support requests
-View the details and status of support requests by going to **Help + support** > **All support requests**.
+View the details and status of support requests by going to **Help + support** > **All support requests** in the Azure portal.
:::image type="content" source="media/how-to-manage-azure-support-request/all-requests-lower.png" alt-text="All support requests":::
You can use the file upload option to upload diagnostic files or any other files
1. On the **All support requests** page, select the support request.
-1. On the **Support Request** page, browse to find your file, then select **Upload**. Repeat the process if you have multiple files.
-
- :::image type="content" source="media/how-to-manage-azure-support-request/file-upload.png" alt-text="Upload file":::
+1. On the **Support Request** page, select the **File upload** box, then browse to find your file and select **Upload**. Repeat the process if you have multiple files.
### File upload guidelines
azure-resource-manager Linter Rule Outputs Should Not Contain Secrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/linter-rule-outputs-should-not-contain-secrets.md
The following example shows a secure pattern for retrieving a storageAccount key
output storageId string = stg.id ```
-Which can be used in a subsequent deployment as sown in the following example
+Which can be used in a subsequent deployment as shown in the following example
```bicep someProperty: listKeys(myStorageModule.outputs.storageId.value, '2021-09-01').keys[0].value
azure-resource-manager Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/managed-applications/overview.md
Title: Overview of Azure Managed Applications
-description: Describes the concepts for Azure Managed Applications that provide cloud solutions that are easy for consumers to deploy and operate.
+description: Describes the concepts for Azure Managed Applications that provide cloud solutions that are easy for customers to deploy and operate.
Previously updated : 08/03/2022 Last updated : 08/19/2022 # Azure Managed Applications overview
-Azure Managed Applications enable you to offer cloud solutions that are easy for consumers to deploy and operate. You implement the infrastructure and provide ongoing support. To make a managed application available to all customers, publish it in Azure Marketplace. To make it available to only users in your organization, publish it to an internal catalog.
+Azure Managed Applications enable you to offer cloud solutions that are easy for customers to deploy and operate. You implement the infrastructure and provide ongoing support. To make a managed application available to all customers, publish it in Azure Marketplace. To make it available to only users in your organization, publish it to an internal catalog.
-A managed application is similar to a solution template in Azure Marketplace, with one key difference. In a managed application, the resources are deployed to a resource group that's managed by the publisher of the app. The resource group is present in the consumer's subscription, but an identity in the publisher's tenant has access to the resource group. As the publisher, you specify the cost for ongoing support of the solution.
+A managed application is similar to a solution template in Azure Marketplace, with one key difference. In a managed application, the resources are deployed to a resource group that's managed by the publisher of the app. The resource group is present in the customer's subscription, but an identity in the publisher's tenant has access to the resource group. As the publisher, you specify the cost for ongoing support of the solution.
> [!NOTE] > The documentation for Azure Custom Providers used to be included with Managed Applications. That documentation was moved to [Azure Custom Providers](../custom-providers/overview.md). ## Advantages of managed applications
-Managed applications reduce barriers to consumers using your solutions. They don't need expertise in cloud infrastructure to use your solution. Consumers have limited access to the critical resources and don't need to worry about making a mistake when managing it.
+Managed applications reduce barriers to customers using your solutions. They don't need expertise in cloud infrastructure to use your solution. Customers have limited access to the critical resources and don't need to worry about making a mistake when managing it.
-Managed applications enable you to establish an ongoing relationship with your consumers. You define terms for managing the application and all charges are handled through Azure billing.
+Managed applications enable you to establish an ongoing relationship with your customers. You define terms for managing the application and all charges are handled through Azure billing.
Although customers deploy managed applications in their subscriptions, they don't have to maintain, update, or service them. You can make sure that all customers are using approved versions. Customers don't have to develop application-specific domain knowledge to manage these applications. Customers automatically acquire application updates without the need to worry about troubleshooting and diagnosing issues with the applications. For IT teams, managed applications enable you to offer pre-approved solutions to users in the organization. You know these solutions are compliant with organizational standards.
-Managed Applications support [managed identities for Azure resources](./publish-managed-identity.md).
+Managed applications support [managed identities for Azure resources](./publish-managed-identity.md).
## Types of managed applications You can publish your managed application either internally in the service catalog or externally in Azure Marketplace. ### Service catalog
For information about publishing a managed application to Azure Marketplace, see
## Resource groups for managed applications
-Typically, the resources for a managed application are in two resource groups. The consumer manages one resource group, and the publisher manages the other resource group. When the managed application is defined, the publisher specifies the levels of access. The publisher can request either a permanent role assignment, or [just-in-time access](request-just-in-time-access.md) for an assignment that is constrained to a time period.
+Typically, the resources for a managed application are in two resource groups. The customer manages one resource group, and the publisher manages the other resource group. When the managed application is defined, the publisher specifies the levels of access. The publisher can request either a permanent role assignment, or [just-in-time access](request-just-in-time-access.md) for an assignment that's constrained to a time period.
Restricting access for [data operations](../../role-based-access-control/role-definitions.md) is currently not supported for all data providers in Azure.
-The following image shows a scenario where the publisher requests the owner role for the managed resource group. The publisher placed a read-only lock on this resource group for the consumer. The publisher's identities that are granted access to the managed resource group are exempt from the lock.
+The following image shows the relationship between the customer's Azure subscription and the publisher's Azure subscription. The managed application and managed resource group are in the customer's subscription. The publisher has management access to the managed resource group to maintain the managed application's resources. The publisher places a read-only lock on the managed resource group that limits the customer's access to manage resources. The publisher's identities that have access to the managed resource group are exempt from the lock.
### Application resource group
-This resource group holds the managed application instance. This resource group may only contain one resource. The resource type of the managed application is [Microsoft.Solutions/applications](/azure/templates/microsoft.solutions/applications).
+This resource group holds the managed application instance. This resource group may only contain one resource. The resource type of the managed application is [Microsoft.Solutions/applications](#resource-provider).
-The consumer has full access to the resource group and uses it to manage the lifecycle of the managed application.
+The customer has full access to the resource group and uses it to manage the lifecycle of the managed application.
### Managed resource group
-This resource group holds all the resources that are required by the managed application. For example, this resource group contains the virtual machines, storage accounts, and virtual networks for the solution. The consumer has limited access to this resource group because the consumer doesn't manage the individual resources for the managed application. The publisher's access to this resource group corresponds to the role specified in the managed application definition. For example, the publisher might request the Owner or Contributor role for this resource group. The access is either permanent or limited to a specific time.
+This resource group holds all the resources that are required by the managed application. For example, this resource group contains the virtual machines, storage accounts, and virtual networks for the solution. The customer has limited access to this resource group because the customer doesn't manage the individual resources for the managed application. The publisher's access to this resource group corresponds to the role specified in the managed application definition. For example, the publisher might request the Owner or Contributor role for this resource group. The access is either permanent or limited to a specific time.
-When the [managed application is published to the marketplace](../../marketplace/azure-app-offer-setup.md), the publisher can grant consumers the ability to perform specific actions on resources in the managed resource group. For example, the publisher can specify that consumers can restart virtual machines. All other actions beyond read actions are still denied. Changes to resources in a managed resource group by a consumer with granted actions are subject to the [Azure Policy](../../governance/policy/overview.md) assignments within the consumer's tenant scoped to include the managed resource group.
+When the [managed application is published to the marketplace](../../marketplace/azure-app-offer-setup.md), the publisher can grant customers the ability to perform specific actions on resources in the managed resource group. For example, the publisher can specify that customers can restart virtual machines. All other actions beyond read actions are still denied. Changes to resources in a managed resource group by a customer with granted actions are subject to the [Azure Policy](../../governance/policy/overview.md) assignments within the customer's tenant scoped to include the managed resource group.
-When the consumer deletes the managed application, the managed resource group is also deleted.
+When the customer deletes the managed application, the managed resource group is also deleted.
+
+## Resource provider
+
+Managed applications use the `Microsoft.Solutions` resource provider with ARM template JSON. For more information, see the following resource types and API versions:
+
+- [Microsoft.Solutions/applicationDefinitions](/azure/templates/microsoft.solutions/applicationdefinitions?pivots=deployment-language-arm-template)
+- [Microsoft.Solutions/applications](/azure/templates/microsoft.solutions/applications?pivots=deployment-language-arm-template)
+- [Microsoft.Solutions/jitRequests](/azure/templates/microsoft.solutions/jitrequests?pivots=deployment-language-arm-template)
## Azure Policy
You can apply an [Azure Policy](../../governance/policy/overview.md) to audit yo
In this article, you learned about the benefits of using managed applications. Go to the next article to create a managed application definition. > [!div class="nextstepaction"]
-> [Quickstart: Create and publish a managed application definition](publish-service-catalog-app.md)
+> [Quickstart: Create and publish an Azure managed application definition](publish-service-catalog-app.md)
azure-resource-manager Publish Service Catalog App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/managed-applications/publish-service-catalog-app.md
Previously updated : 08/16/2022 Last updated : 08/22/2022 # Quickstart: Create and publish an Azure Managed Application definition
Add the following JSON and save the file.
} ```
-For more information about the ARM template's properties, see [Microsoft.Solutions](/azure/templates/microsoft.solutions/applicationdefinitions).
+For more information about the ARM template's properties, see [Microsoft.Solutions/applicationDefinitions](/azure/templates/microsoft.solutions/applicationdefinitions?pivots=deployment-language-arm-template). Managed applications only use ARM template JSON.
### Deploy the definition
azure-resource-manager Lock Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/lock-resources.md
Applying locks can lead to unexpected results. Some operations, which don't seem
- A read-only lock on a **Log Analytics workspace** prevents **User and Entity Behavior Analytics (UEBA)** from being enabled.
+- A delete lock on a **Log Analytics workspace** doesn't prevent [data purge operations](../../azure-monitor/logs/personal-data-mgmt.md#delete). To prevent purges, remove the [Data Purger](../../role-based-access-control/built-in-roles.md#data-purger) role from the user instead.
+ - A read-only lock on a **subscription** prevents **Azure Advisor** from working correctly. Advisor is unable to store the results of its queries. - A read-only lock on an **Application Gateway** prevents you from getting the backend health of the application gateway. That [operation uses a POST method](/rest/api/application-gateway/application-gateways/backend-health), which a read-only lock blocks.
In the request, include a JSON object that specifies the lock properties.
- To learn about logically organizing your resources, see [Using tags to organize your resources](tag-resources.md). - You can apply restrictions and conventions across your subscription with customized policies. For more information, see [What is Azure Policy?](../../governance/policy/overview.md).-- For guidance on how enterprises can use Resource Manager to effectively manage subscriptions, see [Azure enterprise scaffold - prescriptive subscription governance](/azure/architecture/cloud-adoption-guide/subscription-governance).
+- For guidance on how enterprises can use Resource Manager to effectively manage subscriptions, see [Azure enterprise scaffold - prescriptive subscription governance](/azure/architecture/cloud-adoption-guide/subscription-governance).
azure-resource-manager Template Tutorial Add Tags https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-tutorial-add-tags.md
Title: Tutorial - add tags to resources in template description: Add tags to resources that you deploy in your Azure Resource Manager template (ARM template). Tags let you logically organize resources. Previously updated : 03/27/2020 Last updated : 08/22/2022
# Tutorial: Add tags in your ARM template
-In this tutorial, you learn how to add tags to resources in your Azure Resource Manager template (ARM template). [Tags](../management/tag-resources.md) help you logically organize your resources. The tag values show up in cost reports. This tutorial takes **8 minutes** to complete.
+In this tutorial, you learn how to add tags to resources in your Azure Resource Manager template (ARM template). [Tags](../management/tag-resources.md) are metadata elements made up of key-value pairs that help you identify resources and show up in cost reports. This tutorial takes **8 minutes** to complete.
## Prerequisites
-We recommend that you complete the [tutorial about Quickstart templates](template-tutorial-quickstart-template.md), but it's not required.
+We recommend that you complete the [tutorial about Quickstart Templates](template-tutorial-quickstart-template.md), but it's not required.
-You must have Visual Studio Code with the Resource Manager Tools extension, and either Azure PowerShell or Azure CLI. For more information, see [template tools](template-tutorial-create-first-template.md#get-tools).
+You need to have Visual Studio Code with the Resource Manager Tools extension and either Azure PowerShell or Azure Command-Line Interface (CLI). For more information, see [template tools](template-tutorial-create-first-template.md#get-tools).
## Review template
-Your previous template deployed a storage account, App Service plan, and web app.
+Your previous template deployed a storage account, an App Service plan, and a web app.
:::code language="json" source="~/resourcemanager-templates/get-started-with-templates/quickstart-template/azuredeploy.json":::
-After deploying these resources, you might need to track costs and find resources that belong to a category. You can add tags to help solve these issues.
+After you deploy these resources, you might need to track costs and find resources that belong to a category. You can add tags to help solve these issues.
## Add tags
-You tag resources to add values that help you identify their use. For example, you can add tags that list the environment and the project. You could add tags that identify a cost center or the team that owns the resource. Add any values that make sense for your organization.
+You tag resources to add values that help you identify their use. You can add tags that list the environment and the project. You can also add them to identify a cost center or the team that owns the resource. Add any values that make sense for your organization.
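If you want to try out such tags before editing the template, you can apply them to an existing resource from the CLI. A sketch, with all resource names as placeholders:

```azurecli
# A sketch: replaces the tags on one storage account (names are placeholders).
az resource tag \
  --resource-group "myResourceGroup" \
  --name "mystorageaccount" \
  --resource-type "Microsoft.Storage/storageAccounts" \
  --tags Environment=Dev Project=Tutorial
```

Note that `az resource tag` replaces the existing tag set; recent CLI versions offer `--is-incremental` to merge instead.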
The following example highlights the changes to the template. Copy the whole file and replace your template with its contents.
New-AzResourceGroupDeployment `
# [Azure CLI](#tab/azure-cli)
-To run this deployment command, you must have the [latest version](/cli/azure/install-azure-cli) of Azure CLI.
+To run this deployment command, you need to have the [latest version](/cli/azure/install-azure-cli) of Azure CLI.
```azurecli
az deployment group create \
az deployment group create \
> [!NOTE]
-> If the deployment failed, use the `verbose` switch to get information about the resources being created. Use the `debug` switch to get more information for debugging.
+> If the deployment fails, use the `verbose` switch to get information about the resources you're creating. Use the `debug` switch to get more information for debugging.
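For reference, a complete invocation might look like the following sketch; the deployment name, group, and template file are illustrative:

```azurecli
# A sketch: deploys the tagged template with verbose output (names are illustrative).
az deployment group create \
  --name addtagsdeployment \
  --resource-group myResourceGroup \
  --template-file azuredeploy.json \
  --verbose
```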
## Verify deployment
You can verify the deployment by exploring the resource group from the Azure por
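You can also cross-check from the CLI. A sketch that lists every resource in the group together with its tags (the group name is a placeholder):

```azurecli
# A sketch: prints each resource's name and tags as JSON.
az resource list \
  --resource-group myResourceGroup \
  --query "[].{name:name, tags:tags}"
```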
If you're moving on to the next tutorial, you don't need to delete the resource group.
-If you're stopping now, you might want to clean up the resources you deployed by deleting the resource group.
+If you're stopping now, you might want to delete the resource group.
-1. From the Azure portal, select **Resource group** from the left menu.
-2. Enter the resource group name in the **Filter by name** field.
-3. Select the resource group name.
+1. From the Azure portal, select **Resource groups** from the left menu.
+2. Type the resource group name in the **Filter for any field...** text field.
+3. Check the box next to **myResourceGroup** and select **myResourceGroup** or your resource group name.
4. Select **Delete resource group** from the top menu.

## Next steps
-In this tutorial, you added tags to the resources. In the next tutorial, you'll learn how to use parameter files to simplify passing in values to the template.
+In this tutorial, you add tags to the resources. In the next tutorial, you learn how to use parameter files to simplify passing in values to the template.
> [!div class="nextstepaction"]
> [Use parameter file](template-tutorial-use-parameter-file.md)
azure-video-indexer Accounts Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/accounts-overview.md
With a trial account, you don't have to set up an Azure subscription. When creat
## Create accounts
-* ARM accounts: [Get started with Azure Video Indexer in Azure portal](create-account-portal.md). **The recommended paid account type is the ARM-based account**.
-
- * Upgrade a trial account to an ARM based account and [**import** your content for free](connect-to-azure.md#import-your-content-from-the-trial-account).
+* ARM accounts: **The recommended paid account type is the ARM-based account**.
+
+ * You can create an Azure Video Indexer **ARM-based** account through one of the following:
+
+ 1. [Azure Video Indexer portal](https://aka.ms/vi-portal-link)
+ 2. [Azure portal](https://portal.azure.com/#home)
+
+ For the detailed description, see [Get started with Azure Video Indexer in Azure portal](create-account-portal.md).
+* Upgrade a trial account to an ARM-based account and [import your content for free](import-content-from-trial.md).
* Classic accounts: [Create classic accounts using API](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Create-Paid-Account).
* Connect a classic account to ARM: [Connect an existing classic paid Azure Video Indexer account to an ARM-based account](connect-classic-account-to-arm.md).
azure-video-indexer Connect To Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/connect-to-azure.md
Title: Create an Azure Video Indexer account connected to Azure
-description: Learn how to create an Azure Video Indexer account connected to Azure.
+ Title: Create a classic Azure Video Indexer account connected to Azure
+description: Learn how to create a classic Azure Video Indexer account connected to Azure.
Last updated 05/03/2022
-# Create an Azure Video Indexer account
+# Create a classic Azure Video Indexer account
[!INCLUDE [Gate notice](./includes/face-limited-access.md)]
-When creating an Azure Video Indexer account, you can choose a free trial account (where you get a certain number of free indexing minutes) or a paid option (where you're not limited by the quota). With a free trial, Azure Video Indexer provides up to 600 minutes of free indexing to users and up to 2400 minutes of free indexing to users that subscribe to the Azure Video Indexer API on the [developer portal](https://aka.ms/avam-dev-portal). With the paid options, Azure Video Indexer offers two types of accounts: classic accounts(General Availability), and ARM-based accounts(Public Preview). Main difference between the two is account management platform. While classic accounts are built on the API Management, ARM-based accounts management is built on Azure, which enables apply access control to all services with role-based access control (Azure RBAC) natively.
+This topic shows how to create a new classic account connected to Azure using the [Azure Video Indexer website](https://aka.ms/vi-portal-link). You can also create an Azure Video Indexer classic account through our [API](https://aka.ms/avam-dev-portal).
-> [!NOTE]
-> Before creating a new account, review [Account types](accounts-overview.md).
-
-* You can create an Azure Video Indexer **classic** account through our [API](https://aka.ms/avam-dev-portal).
-* You can create an Azure Video Indexer **ARM-based** account through one of the following:
-
- 1. [Azure Video Indexer portal](https://aka.ms/vi-portal-link)
- 2. [Azure portal](https://portal.azure.com/#home)
-
-To read more on how to create a **new ARM-Based** Azure Video Indexer account, read this [article](create-video-analyzer-for-media-account.md)
+The topic discusses prerequisites that you need to connect to your Azure subscription and how to configure an Azure Media Services account.
-For more details, see [pricing](https://azure.microsoft.com/pricing/details/video-indexer/).
+A few Azure Video Indexer account types are available to you. For a detailed explanation, review [Account types](accounts-overview.md).
-## How to create classic accounts
-
-This article shows how to create an Azure Video Indexer classic account. The topic provides steps for connecting to Azure using the automatic (default) flow. It also shows how to connect to Azure manually (advanced).
-
-If you are moving from a *trial* to *paid ARM-Based* Azure Video Indexer account, you can choose to copy all of the videos and model customization to the new account, as discussed in the [Import your content from the trial account](#import-your-content-from-the-trial-account) section.
-
-The article also covers [Linking an Azure Video Indexer account to Azure Government](#azure-video-indexer-in-azure-government).
+For the pricing details, see [pricing](https://azure.microsoft.com/pricing/details/video-indexer/).
## Prerequisites for connecting to Azure
The article also covers [Linking an Azure Video Indexer account to Azure Governm
This user should be an Azure AD user with a work or school account. Don't use a personal account, such as outlook.com, live.com, or hotmail.com. :::image type="content" alt-text="Screenshot that shows how to choose a user in your Azure A D domain." source="./media/create-account/all-aad-users.png":::-
-### Additional prerequisites for automatic flow
- * A user and member in your Azure AD domain. You'll use this member when connecting your Azure Video Indexer account to Azure.
The article also covers [Linking an Azure Video Indexer account to Azure Governm
This user should be a member in your Azure subscription with either an **Owner** role, or both **Contributor** and **User Access Administrator** roles. A user can be added twice, with two roles. Once with Contributor and once with user Access Administrator. For more information, see [View the access a user has to Azure resources](../role-based-access-control/check-access.md). :::image type="content" alt-text="Screenshot that shows the access control settings." source="./media/create-account/access-control-iam.png":::-
-### Additional prerequisites for manual flow
- * Register the Event Grid resource provider using the Azure portal. In the [Azure portal](https://portal.azure.com/), go to **Subscriptions**->[subscription]->**ResourceProviders**.
- Search for **Microsoft.Media** and **Microsoft.EventGrid**. If not in the "Registered" state, click **Register**. It takes a couple of minutes to register.
+ Search for **Microsoft.Media** and **Microsoft.EventGrid**. If not in the "Registered" state, select **Register**. It takes a couple of minutes to register.
:::image type="content" alt-text="Screenshot that shows how to select an Event Grid subscription." source="./media/create-account/event-grid.png":::
-## Connect to Azure manually (advanced option)
-
-If the connection to Azure failed, you can attempt to troubleshoot the problem by connecting manually.
+## Connect to Azure
> [!NOTE]
-> It's mandatory to have the following three accounts in the same region: the Azure Video Indexer account that you're connecting with the Media Services account, as well as the Azure storage account connected to the same Media Services account. When you create an Azure Video Indexer account and connect it to Media Services, the media and metadata files are stored in the Azure storage account associated with that Media Services account.
+> Use the same Azure AD user you used when connecting to Azure.
+
+It's mandatory to have the following three accounts located in the same region:
+
+* The Azure Video Indexer account that you're creating.
+* The Media Services account that you're connecting to the Azure Video Indexer account.
+* The Azure storage account connected to the same Media Services account.
+
+ When you create an Azure Video Indexer account and connect it to Media Services, the media and metadata files are stored in the Azure storage account associated with that Media Services account.
+
+If your storage account is behind a firewall, see [storage account that is behind a firewall](faq.yml#can-a-storage-account-connected-to-the-media-services-account-be-behind-a-firewall).
### Create and configure a Media Services account
If the connection to Azure failed, you can attempt to troubleshoot the problem b
:::image type="content" alt-text="Screenshot that shows how to specify a storage account." source="./media/create-account/create-new-ams-account.png"::: > [!NOTE]
- > Make sure to write down the Media Services resource and account names. You'll need them for the steps in the next section.
-
+ > Make sure to write down the Media Services resource and account names.
1. Before you can play your videos in the Azure Video Indexer web app, you must start the default **Streaming Endpoint** of the new Media Services account. In the new Media Services account, select **Streaming endpoints**. Then select the streaming endpoint and press start. :::image type="content" alt-text="Screenshot that shows how to specify streaming endpoints." source="./media/create-account/create-ams-account-se.png":::
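If you prefer to script this step, here's a sketch with the Media Services CLI; the account and group names are placeholders:

```azurecli
# A sketch: starts the default streaming endpoint of the Media Services account.
az ams streaming-endpoint start \
  --account-name "myAmsAccount" \
  --resource-group "myResourceGroup" \
  --name "default"
```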
-4. For Azure Video Indexer to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](/azure/media-services/previous/media-services-portal-get-started-with-aad):
+1. For Azure Video Indexer to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](/azure/media-services/previous/media-services-portal-get-started-with-aad):
1. In the new Media Services account, select **API access**. 2. Select [Service principal authentication method](/azure/media-services/previous/media-services-portal-get-started-with-aad).
If the connection to Azure failed, you can attempt to troubleshoot the problem b
> [!NOTE]
> Make sure to write down the key value and the Application ID. You'll need them for the steps in the next section.
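If you script the service principal creation instead of using the portal, a CLI sketch follows; the name and scope are placeholders, and the command prints the `appId` and password (the key) that the note above asks you to record:

```azurecli
# A sketch: creates a service principal with Contributor rights on one Media Services account.
az ad sp create-for-rbac \
  --name "myVideoIndexerSp" \
  --role Contributor \
  --scopes "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Media/mediaservices/<ams-account>"
```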
-### Connect manually
-
-In the **Create a new account on an Azure subscription** dialog of your [Azure Video Indexer](https://www.videoindexer.ai/) page, select the **Switch to manual configuration** link.
-
-In the dialog, provide the following information:
-
-|Setting|Description|
-|||
-|Azure Video Indexer account region|The name of the Azure Video Indexer account region. For better performance and lower costs, it's highly recommended to specify the name of the region where the Azure Media Services resource and Azure Storage account are located. |
-|Azure AD tenant|The name of the Azure AD tenant, for example "contoso.onmicrosoft.com". The tenant information can be retrieved from the Azure portal. Place your cursor over the name of the signed-in user in the top-right corner. Find the name to the right of **Domain**.|
-|Subscription ID|The Azure subscription under which this connection should be created. The subscription ID can be retrieved from the Azure portal. Select **All services** in the left panel, and search for "subscriptions". Select **Subscriptions** and choose the desired ID from the list of your subscriptions.|
-|Azure Media Services resource group name|The name for the resource group in which you created the Media Services account.|
-|Media service resource name|The name of the Azure Media Services account that you created in the previous section.|
-|Application ID|The Azure AD application ID (with permissions for the specified Media Services account) that you created in the previous section.|
-|Application key|The Azure AD application key that you created in the previous section. |
-
-### Import your content from the *trial* account
-
-When creating a new **ARM-Based** account, you have an option to import your content from the *trial* account into the new **ARM-Based** account free of charge.
-> [!NOTE]
-> * Import from trial can be performed only once per trial account.
-> * The target ARM-Based account needs to be created and available before import is assigned.
-> * Target ARM-Based account has to be an empty account (never indexed any media files).
-
-To import your data, follow the steps:
- 1. Go to [Azure Video Indexer portal](https://aka.ms/vi-portal-link)
- 2. Select your trial account and go to the *account settings* page
- 3. Click the *Import content to an ARM-based account*
- 4. From the dropdown menu choose the ARM-based account you wish to import the data to.
- * If the account ID isn't showing, you can copy and paste the account ID from Azure portal or the account list, on the side blade in the Azure Video Indexer Portal.
- 5. Click **Import content**
-
- :::image type="content" alt-text="Screenshot that shows how to import your data." source="./media/create-account/import-to-arm-account.png":::
-
-All media and content model customizations will be copied from the *trial* account into the new ARM-Based account.
--
-> [!NOTE]
->
-> The *trial* account is not available on the Azure Government cloud.
-
-## Azure Media Services considerations
+### Azure Media Services considerations
The following Azure Media Services related considerations apply:
The following Azure Media Services related considerations apply:
![Media Services reserved units](./media/create-account/ams-reserved-units.png)
+## Create a classic account
+
+1. On the [Azure Video Indexer website](https://aka.ms/vi-portal-link), select **Create unlimited account** (the paid account).
+2. To create a classic account, select **Switch to manual configuration**.
+
+In the dialog, provide the following information:
+
+|Setting|Description|
+|||
+|Azure Video Indexer account region|The name of the Azure Video Indexer account region. For better performance and lower costs, it's highly recommended to specify the name of the region where the Azure Media Services resource and Azure Storage account are located. |
+|Azure AD tenant|The name of the Azure AD tenant, for example "contoso.onmicrosoft.com". The tenant information can be retrieved from the Azure portal. Place your cursor over the name of the signed-in user in the top-right corner. Find the name to the right of **Domain**.|
+|Subscription ID|The Azure subscription under which this connection should be created. The subscription ID can be retrieved from the Azure portal. Select **All services** in the left panel, and search for "subscriptions". Select **Subscriptions** and choose the desired ID from the list of your subscriptions.|
+|Azure Media Services resource group name|The name for the resource group in which you created the Media Services account.|
+|Media service resource name|The name of the Azure Media Services account that you created in the previous section.|
+|Application ID|The Azure AD application ID (with permissions for the specified Media Services account) that you created in the previous section.|
+|Application key|The Azure AD application key that you created in the previous section. |
+
+## Import your content from the trial account
+
+See [Import your content from the trial account](import-content-from-trial.md).
+## Automate creation of the Azure Video Indexer account

Automating the creation of the account is a two-step process:
To automate the creation of the account is a two steps process:
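Assuming the deployment step uses an ARM template that declares the account, a minimal CLI sketch (the template file and parameter names are illustrative, not from this article):

```azurecli
# A sketch: deploys an ARM template that declares the Video Indexer account.
az deployment group create \
  --resource-group "myResourceGroup" \
  --template-file "videoindexer.template.json" \
  --parameters accountName=myVideoIndexerAccount
```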
### Prerequisites for connecting to Azure Government

-- An Azure subscription in [Azure Government](../azure-government/index.yml).
+- An Azure subscription in [Azure Government](../azure-government/index.yml).
- An Azure AD account in Azure Government.
-- All pre-requirements of permissions and resources as described above in [Prerequisites for connecting to Azure](#prerequisites-for-connecting-to-azure). Make sure to check [Additional prerequisites for automatic flow](#additional-prerequisites-for-automatic-flow) and [Additional prerequisites for manual flow](#additional-prerequisites-for-manual-flow).
+- All prerequisites for permissions and resources as described above in [Prerequisites for connecting to Azure](#prerequisites-for-connecting-to-azure).
### Create new account via the Azure Government portal
To automate the creation of the account is a two steps process:
To create a paid account via the Azure Video Indexer portal:

1. Go to https://videoindexer.ai.azure.us
-1. Log in with your Azure Government Azure AD account.
-1. If you do not have any Azure Video Indexer accounts in Azure Government that you are an owner or a contributor to, you will get an empty experience from which you can start creating your account.
+1. Sign-in with your Azure Government Azure AD account.
+1. If you don't have any Azure Video Indexer accounts in Azure Government that you're an owner or a contributor to, you'll get an empty experience from which you can start creating your account.
 The rest of the flow is as described above; the only difference is that the regions available for selection are the Government regions in which Azure Video Indexer is available.
- If you already are a contributor or an admin of an existing one or more Azure Video Indexer accounts in Azure Government, you will be taken to that account and from there you can start a following steps for creating an additional account if needed, as described above.
 If you're already a contributor or an admin of one or more existing Azure Video Indexer accounts in Azure Government, you'll be taken to that account. From there, you can follow the steps to create an additional account if needed, as described above.
### Create new account via the API on Azure Government
To create a paid account in Azure Government, follow the instructions in [Create
In the public cloud, when content is deemed offensive based on content moderation, the customer can ask for a human to look at that content and potentially revert that decision.
* No trial accounts.
-* Bing description - in Gov cloud we will not present a description of celebrities and named entities identified. This is a UI capability only.
+* Bing description - in Gov cloud we won't present a description of celebrities and named entities identified. This is a UI capability only.
## Clean up resources
-After you are done with this tutorial, delete resources that you are not planning to use.
+After you're done with this tutorial, delete resources that you aren't planning to use.
### Delete an Azure Video Indexer account
Select the account -> **Settings** -> **Delete this account**.
The account will be permanently deleted in 90 days.
-## Firewall
-
-See [Storage account that is behind a firewall](faq.yml#can-a-storage-account-connected-to-the-media-services-account-be-behind-a-firewall).
## Next steps

You can programmatically interact with your trial account and/or with your Azure Video Indexer accounts that are connected to Azure by following the instructions in: [Use APIs](video-indexer-use-apis.md).
-You should use the same Azure AD user you used when connecting to Azure.
azure-video-indexer Import Content From Trial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/import-content-from-trial.md
+
+ Title: Import your content from the trial account
+description: Learn how to import your content from the trial account.
+ Last updated : 05/03/2022++++
+# Import your content from the trial account
+
+When creating a new ARM-based account, you have an option to import your content from the trial account into the new ARM-based account free of charge.
+
+> [!NOTE]
+> Make sure to review the following considerations.
+
+## Considerations
+
+Review the following considerations.
+
+* Import from trial can be performed only once per trial account.
+* The target ARM-based account needs to be created and available before import is assigned.
+* Target ARM-based account has to be an empty account (never indexed any media files).
+
+## Import your data
+
+To import your data, follow the steps:
+
+ 1. Go to the [Azure Video Indexer portal](https://aka.ms/vi-portal-link).
+ 2. Select your trial account and go to the **Account settings** page.
+ 3. Select **Import content to an ARM-based account**.
+ 4. From the dropdown menu, choose the ARM-based account you wish to import the data to.
+
+    * If the account ID isn't showing, you can copy and paste the account ID from the Azure portal or from the account list, on the side blade in the Azure Video Indexer portal.
+ 5. Select **Import content**.
+
+ :::image type="content" alt-text="Screenshot that shows how to import your data." source="./media/create-account/import-to-arm-account.png":::
+
+All media and content model customizations will be copied from the trial account into the new ARM-based account.
+
+## Next steps
+
+You can programmatically interact with your trial account and/or with your Azure Video Indexer accounts that are connected to Azure by following the instructions in: [Use APIs](video-indexer-use-apis.md).
+
+You should use the same Azure AD user you used when connecting to Azure.
azure-vmware Enable Hcx Access Over Internet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/enable-hcx-access-over-internet.md
Last updated 7/19/2022
# Enable HCX access over the internet -
-In this article, you'll learn how to perform HCX migration over a Public IP address using Azure VMware Solution.
+In this article, you'll learn how to perform HCX migration over a public IP address using Azure VMware Solution.
>[!IMPORTANT]
->Before configuring a Public IP on your Azure VMware Solution private cloud, please consult your Network Administrator to understand the implications and the impact to your environment.
-
-You'll also learn how to pair HCX sites and create service mesh from on-premises to an Azure VMware Solution private cloud using a Public IP. The service mesh allows you to migrate a workload from an on-premises datacenter to an Azure VMware Solution private cloud over the public internet. This solution is useful when the customer is not using ExpressRoute or VPN connectivity with the Azure cloud.
-
-
-> [!IMPORTANT]
-> The on-premises HCX appliance should be reachable from the internet to establish HCX communication from on-premises to the Azure VMware Solution private cloud.
-
-## Configure Public IP block
-
-To perform HCX Migration over the public internet, you'll need a minimum of six Public IP addresses. Five of these Public IP addresses will be used for the Public IP segment, and one will be used for configuring Network Address Translation (NAT). You can obtain the Public IP block by reserving a /29 from the Azure VMware Solution portal. Configure a Public IP block through portal by using the Public IP feature of the Azure VMware Solution private cloud.
+>Before configuring a public IP on your Azure VMware Solution private cloud, consult your network administrator to understand the implications and the impact to your environment.
-1. Sign in to Azure VMware Solution portal.
-1. Under **Workload Networking**, select **Public IP (preview)**.
-1. Select **+Public IP**.
-1. Enter the **Public IP name** and select the address space from the **Address space** drop-down list according to the number of IPs required, then select **Configure**.
- >[!Note]
- > It will take 15-20 minutes to configure the Public IP block on private cloud.
+You'll also learn how to pair HCX sites and create service mesh from on-premises to an Azure VMware Solution private cloud using Public IP. The service mesh allows you to migrate a workload from an on-premises datacenter to an Azure VMware Solution private cloud over the public internet. This solution is useful when the customer isn't using ExpressRoute or VPN connectivity with the Azure cloud.
-After the Public IP is configured successfully, you should see it appear under the Public IP section. The provisioning state shows **Succeeded**. This Public IP block is configured as NSX-T segment on the Tier-1 router.
+> [!IMPORTANT]
+> The on-premises HCX appliance should be reachable from the internet to establish HCX communication from on-premises to the Azure VMware Solution private cloud.
-For more information about how to enable a public IP to the NSX Edge for Azure VMware Solution, see [Enable Public IP to the NSX Edge for Azure VMware Solution](./enable-public-ip-nsx-edge.md).
+## Configure public IP block
-## Create Public IP segment on NSX-T
-Before you create a Public IP segment, get your credentials for NSX-T Manager from Azure VMware Solution portal.
+For HCX Manager to be available over a public IP address, you'll need one public IP address for the DNAT rule.
-1. Sign in to NSX-T Manager using credentials provided by the Azure VMware Solution portal.
-1. Under the **Manage** section, select **Identity**.
-1. Copy the NSX-T Manager admin user password.
+To perform HCX migration over the public internet, you'll need additional IP addresses. A /29 subnet is the minimum configuration for defining the HCX network profile (the usable IPs in the subnet are assigned to the IX and NE appliances); you can choose a bigger subnet based on your requirements. You'll create an NSX-T segment using this public subnet, and that segment can then be used for creating the HCX network profile.
-1. Browse the NSX-T Manger and paste the admin password in the password field, and select **Login**.
-1. Under the **Networking** section select **Connectivity** and **Segments**, then select **ADD SEGMENT**.
-1. Provide Segment name, select Tier-1 router as connected gateway, and provide the reserved Public IP under subnets. The Public IP block for this Public IP segment shouldn't include the first and last Public IPs from the overall Public IP block. For example, if you reserved 20.95.1.16/29, you would input 20.95.1.16/30.
-1. Select **Save**. ΓÇ»
+>[!Note]
+> After assigning a subnet to an NSX-T segment, you can't use an IP from that subnet to create a DNAT rule. Both subnets should be different.
-## Assign public IP to HCX manager
-HCX manager of destination Azure VMware Solution SDDC should be reachable from the internet to do site pairing with source site. HCX Manager can be exposed by way of DNAT rule and a static null route. Because HCX Manager is in the provider space, not within the NSX-T environment, the null route is necessary to allow HCX Manager to route back to the client by way of the DNAT rule.
+Configure a public IP block through the portal by using the [Public IP feature of the Azure VMware Solution private cloud](./enable-public-ip-nsx-edge.md).
-### Add static null route to the T1 router
+## Use public IP address for Cloud HCX Manager public access
+Cloud HCX Manager can be made available over a public IP address by using a DNAT rule. However, since Cloud HCX Manager is in the provider space, a null route is necessary to allow HCX Manager to route back to the client by way of the DNAT rule. The null route forces the NAT traffic through the NSX-T Tier-0 router.
-The static null route is used to allow HCX private IP to route through the NSX T1 for public endpoints.
+## Add static null route to the Tier-1 router
+The static null route is used to allow the HCX private IP to route through the NSX Tier-1 for public endpoints. You can add this static route on the default Tier-1 router created in your private cloud, or you can create a new Tier-1 router.
1. Sign in to NSX-T Manager, and select **Networking**.
1. Under the **Connectivity** section, select **Tier-1 Gateways**.
-1. Edit the existing T1 gateway.
+1. Edit the existing Tier-1 gateway.
1. Expand **STATIC ROUTES**.
1. Select the number next to **Static Routes**.
1. Select **ADD STATIC ROUTE**. A pop-up window is displayed.
1. Under **Name**, enter the name of the route.
-1. Under **network**, enter a non-overlapping /32 IP address under Network.
+1. Under **Network**, enter a non-overlapping /32 IP address.
>[!NOTE]
- > This address should not overlap with any other IP addresses on the network.
+ > This address should not overlap with any other IP addresses on the private cloud network and the customer network.
+
+ :::image type="content" source="media/hcx-over-internet/hcx-sample-static-route.png" alt-text="Diagram showing a sample static route configuration." border="false" lightbox="media/hcx-over-internet/hcx-sample-static-route.png":::
1. Under **Next hops**, select **Set**.
1. Select **NULL** as IP Address.
- Leave defaults for Admin distance and scope.
-1. Select **ADD**, then select **APPLY**.
+ Leave defaults for Admin distance and scope.
+1. Select **ADD**, then select **APPLY**.
1. Select **SAVE**, then select **CLOSE**.
+ :::image type="content" source="media/hcx-over-internet/hcx-sample-null-route.png" alt-text="Diagram showing a sample Null route configuration." border="false" lightbox="media/hcx-over-internet/hcx-sample-null-route.png":::
1. Select **CLOSE EDITING**.
-### Add NAT rule to T1 gateway
+## Add NAT rule to Tier-1 gateway
->[!Note]
->The NAT rules should use a different Public IP address than your Public IP segment.
1. Sign in to NSX-T Manager, and select **Networking**.
1. Select **NAT**.
-1. Select the T1 Gateway.
+1. Select the Tier-1 gateway. Use the same Tier-1 router to create the NAT rule that you used to create the null route in the previous steps.
1. Select **ADD NAT RULE**.
-1. Add one SNAT rule for HCX Manager.
+1. Add one SNAT rule and one DNAT rule for HCX Manager.
1. The DNAT Rule Destination is the Public IP for HCX Manager. The Translated IP is the HCX Manager IP in the cloud.
- 1. The SNAT Rule Source is the HCX Manager IP in the cloud. The Translated IP is the non-overlapping /32 IP from the Static Route.
-1. Make sure to set the Firewall option on DNAT rule to **Match External Address**.
-1. Create T1 Gateway Firewall rules to allow only expected traffic to the Public IP for HCX Manager and drop everything else.
- 1. Create a Gateway Firewall rule on the T1 that allows your On-Premise as the **Source IP** and the Azure VMware Solution reserved Public as the **Destination IP**. This rule should be the highest priority.
- 1. Create a Gateway Firewall rule on the T1 that denies all other traffic where the **Source IP** is and ΓÇ£AnyΓÇ¥ and **Destination IP** is the Azure VMware Solution reserved Public IP.
+ 1. The SNAT Rule Destination is the HCX Manager IP in the cloud. The Translated IP is the non-overlapping /32 IP from the Static Route.
+ 1. Make sure to set the Firewall option on DNAT rule to **Match External Address**.
+ :::image type="content" source="media/hcx-over-internet/hcx-sample-public-access-route.png" alt-text="Diagram showing a sample NAT rule for public access of HCX Virtual machine." border="false" lightbox="media/hcx-over-internet/hcx-sample-public-access-route.png":::
+
+1. Create Tier-1 Gateway Firewall rules to allow only expected traffic to the Public IP for HCX Manager and drop everything else.
+ 1. Create a Gateway Firewall rule on the Tier-1 that allows your on-premises network as the **Source IP** and the Azure VMware Solution reserved public IP as the **Destination IP**. This rule should be the highest priority.
+ 1. Create a Gateway Firewall rule on the Tier-1 that denies all other traffic where the **Source IP** is **Any** and **Destination IP** is the Azure VMware Solution reserved Public IP.
+
+For more information, see [HCX ports](https://ports.esp.vmware.com/home/VMware-HCX).
->[!NOTE]
+> [!NOTE]
> HCX manager can now be accessed over the internet using public IP.
-### Create network profile for HCX at destination site
-1. Sign in to Destination HCX Manager.
-1. Select **Interconnect** and then select the **Network Profiles** tab.
-1. Select **Create Network Profile**.
-1. Select **NSX Networks** as network type under **Network**.
-1. Select the **Public-IP-Segment** created on NSX-T.
-1. Enter **Name**.
-1. Under IP pools, enter the **IP Ranges** for HCX uplink, **Prefix Length**, and **Gateway** of public IP segment.
-1. Scroll down and select the **HCX Uplink** checkbox under **HCX Traffic Type** as this profile will be used for HCX uplink.
-1. To create the Network Profile, select **Create**.
+## Pair sites using HCX Cloud manager's public IP address
-### Pair site
-Site pairing is required to create service mesh between source and destination sites.
+Site pairing is required before you create a service mesh between the source and destination sites.
1. Sign in to the **Source** site HCX Manager.
-1. Select **Site Pairing** and select **ADD SITE PAIRING**.
-1. Enter the remote HCX URL and sign in credentials, then select **Connect**.
+1. Select **Site Pairing** and select **ADD SITE PAIRING**.
+1. Enter the **Cloud HCX Manager Public URL** as the remote site, enter your sign-in credentials, and then select **Connect**.
After pairing is done, it will appear under site pairing.
-### Create service mesh
-Service Mesh will deploy HCX WAN Optimizer, HCX Network Extension and HCX-IX appliances.
+## Create public IP segment on NSX-T
+Before you create a public IP segment, get your NSX-T Manager credentials from the Azure VMware Solution portal.
+
+1. Under the **Networking** section, select **Connectivity** and **Segments**, and then select **ADD SEGMENT**.
+1. Provide a segment name, select **Tier-1 router** as the connected gateway, and provide the reserved public IP subnet under **Subnets**.
+1. Select **Save**.
+
+## Create network profile for HCX at destination site
+1. Sign in to Destination HCX Manager (cloud manager in this case).
+1. Select **Interconnect** and then select the **Network Profiles** tab.
+1. Select **Create Network Profile**.
+1. Select **NSX Networks** as network type under **Network**.
+1. Select the **Public-IP-Segment** created on NSX-T.
+1. Enter **Name**.
+1. Under IP pools, enter the **IP Ranges** for HCX uplink, **Prefix Length**, and **Gateway** of public IP segment.
+1. Scroll down and select the **HCX Uplink** checkbox under **HCX Traffic Type** as this profile will be used for HCX uplink.
+1. Select **Create** to create the network profile.
+
+## Create service mesh
+Service Mesh will deploy HCX WAN Optimizer, HCX Network Extension and HCX-IX appliances.
1. Sign in to **Source** site HCX Manager.
1. Select **Interconnect** and then select the **Service Mesh** tab.
-1. Select **CREATE SERVICE MESH**.
-1. Select the **destination** site to create service mesh with and select **Continue**.
+1. Select **CREATE SERVICE MESH**.
+1. Select the **destination** site to create service mesh with and then select **Continue**.
1. Select the compute profiles for both sites and select **Continue**.
-1. Select the HCX services to be activated and select **Continue**.
+1. Select the HCX services to be activated and select **Continue**.
>[!Note]
- >Premium services require an additional HCX Enterprise license.
-1. Select the Network Profile of source site.
-1. Select the Network Profile of Destination that you created in the Network Profile section.
+ >Premium services require an additional HCX Enterprise license.
+1. Select the network profile of source site.
+1. Select the network profile of destination that you created in the **Network Profile** section.
1. Select **Continue**.
-1. Review the Transport Zone information, and then select **Continue**.
-1. Review the Topological view, and select **Continue**.
-1. Enter the Service Mesh name and select **FINISH**.
-
-### Extend network
-The HCX Network Extension service provides layer 2 connectivity between sites. The extension service also allows you to keep the same IP and MAC addresses during virtual machine migrations.
-1. Sign in to **source** HCX Manager.
-1. Under the **Network Extension** section, select the site for which you want to extend the network, and then select **EXTEND NETWORKS**.
-1. Select the network that you want to extend to destination site, and select **Next**.
+1. Review the **Transport Zone** information, and then select **Continue**.
+1. Review the **Topological view**, and select **Continue**.
+1. Enter the **Service Mesh name** and select **FINISH**.
+1. Add the public IP addresses to the firewall to allow only the required ports.
+
+## Extend network
+The HCX Network Extension service provides layer 2 connectivity between sites. The extension service also allows you to keep the same IP and MAC addresses during virtual machine migrations.
+1. Sign in to **source** HCX Manager.
+1. Under the **Network Extension** section, select the site for which you want to extend the network, and then select **EXTEND NETWORKS**.
+1. Select the network that you want to extend to destination site, and select **Next**.
1. Enter the subnet details of the network that you're extending.
-1. Select the destination first hop route (T1), and select **Submit**.
-1. Sign in to the **destination** NSX, you'll see Network 10.14.27.1/24 has been extended.
+1. Select the destination first hop route (Tier-1), and select **Submit**.
+1. Sign in to the **destination** NSX, you'll see Network 10.14.27.1/24 has been extended.
-After the network is extended to destination site, VMs can be migrated over Layer 2 Extension.
+After the network is extended to destination site, VMs can be migrated over Layer 2 extension.
-## Next steps
+## Next steps
[Enable Public IP to the NSX Edge for Azure VMware Solution](./enable-public-ip-nsx-edge.md)

For detailed information on HCX network underlay minimum requirements, see [Network Underlay Minimum Requirements](https://docs.vmware.com/en/VMware-HCX/4.3/hcx-user-guide/GUID-8128EB85-4E3F-4E0C-A32C-4F9B15DACC6D.html).
cognitive-services Get Analytics Knowledge Base https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/QnAMaker/How-To/get-analytics-knowledge-base.md
QnA Maker stores all chat logs and other telemetry, if you have enabled Applicat
| join kind= inner ( traces | extend id = operation_ParentId ) on id
+ | where message == "QnAMaker GenerateAnswer"
| extend question = tostring(customDimensions['Question'])
| extend answer = tostring(customDimensions['Answer'])
| extend score = tostring(customDimensions['Score'])
cognitive-services How To Configure Azure Ad Auth https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-configure-azure-ad-auth.md
$resourceId = resource.Id
***

## Create the Speech SDK configuration object

With an Azure AD access token, you can now create a Speech SDK configuration object. The method of providing the token, and the method to construct the corresponding Speech SDK ```Config``` object, varies by the object you'll be using.
-### SpeechRecognizer, IntentRecognizer, ConversationTranscriber, and SpeechSynthesizer
+### SpeechRecognizer, SpeechSynthesizer, IntentRecognizer, ConversationTranscriber, and SourceLanguageRecognizer
-For ```SpeechRecognizer```, ```IntentRecognizer```, ```ConversationTranscriber```, and ```SpeechSynthesizer``` objects, build the authorization token from the resource ID and the Azure AD access token and then use it to create a ```SpeechConfig``` object.
+For ```SpeechRecognizer```, ```SpeechSynthesizer```, ```IntentRecognizer```, ```ConversationTranscriber```, and ```SourceLanguageRecognizer``` objects, build the authorization token from the resource ID and the Azure AD access token and then use it to create a ```SpeechConfig``` object.
::: zone pivot="programming-language-csharp"

```C#
cognitive-services How To Custom Voice Create Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-custom-voice-create-voice.md
Unresolved errors listed in the next table affect the quality of training, but d
| Script | Non-normalized text|This script contains symbols. Normalize the symbols to match the audio. For example, normalize *50%* to *fifty percent*.| | Script | Not enough question utterances| At least 10 percent of the total utterances should be question sentences. This helps the voice model properly express a questioning tone.| | Script | Not enough exclamation utterances| At least 10 percent of the total utterances should be exclamation sentences. This helps the voice model properly express an excited tone.|
+| Script | No valid end punctuation| Add one of the following at the end of the line: full stop (half-width '.' or full-width '。'), exclamation point (half-width '!' or full-width '!' ), or question mark ( half-width '?' or full-width '?').|
| Audio| Low sampling rate for neural voice | It's recommended that the sampling rate of your .wav files should be 24 KHz or higher for creating neural voices. If it's lower, it will be automatically raised to 24 KHz.|
| Volume |Overall volume too low|Volume shouldn't be lower than -18 dB (10 percent of max volume). Control the volume average level within proper range during the sample recording or data preparation.|
| Volume | Volume overflow| Overflowing volume is detected at {}s. Adjust the recording equipment to avoid the volume overflow at its peak value.|
For more information, [learn more about the capabilities and limits of this feat
- [Deploy and use your voice model](how-to-deploy-and-use-endpoint.md)
- [How to record voice samples](record-custom-voice-samples.md)
- [Text-to-Speech API reference](rest-text-to-speech.md)
-- [Long Audio API](long-audio-api.md)
+- [Long Audio API](long-audio-api.md)
communication-services Teams User Calling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/interop/teams-user-calling.md
# Support for Teams identity in Calling SDK

The Azure Communication Services Calling SDK for JavaScript enables Teams user devices to drive voice and video communication experiences. This page provides detailed descriptions of Calling features, including platform and browser support information. To get started right away, check out [Calling quickstarts](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md). Key features of the Calling SDK:
communication-services Join Teams Meeting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/join-teams-meeting.md
# Join a Teams meeting
-> [!IMPORTANT]
-> BYOI interoperability is now generally available to all Communication Services applications and Teams organizations.
- Azure Communication Services can be used to build applications that enable users to join and participate in Teams meetings. [Standard Azure Communication Services pricing](https://azure.microsoft.com/pricing/details/communication-services/) applies to these users, but there's no additional fee for the interoperability capability itself. With the bring your own identity (BYOI) model, you control user authentication and users of your applications don't need Teams licenses to join Teams meetings. This is ideal for applications that enable licensed Teams users and external users using a custom application to join into a virtual consultation experience. For example, healthcare providers using Teams can conduct teleheath virtual visits with their patients who use a custom application. It's also possible to use Teams identities with the Azure Communication Services SDKs. More information is available [here](./teams-interop.md).
Microsoft will indicate to you via the Azure Communication Services API that rec
- [How-to: Join a Teams meeting](../how-tos/calling-sdk/teams-interoperability.md)
- [Quickstart: Join a BYOI calling app to a Teams meeting](../quickstarts/voice-video-calling/get-started-teams-interop.md)
-- [Quickstart: Join a BYOI chat app to a Teams meeting](../quickstarts/chat/meeting-interop.md)
+- [Quickstart: Join a BYOI chat app to a Teams meeting](../quickstarts/chat/meeting-interop.md)
communication-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/service-limits.md
If you require sending an amount of messages that exceeds the rate-limits, pleas
For more information on the SMS SDK and service, see the [SMS SDK overview](./sms/sdk-features.md) page or the [SMS FAQ](./sms/sms-faq.md) page.
+## Email
+Sending a high volume of messages is subject to limits on the number of email messages you can send. If you hit these limits, your messages won't be queued to be sent. You can submit the requests again once the Retry-After time expires.
+
+### Rate Limits
+
+|Operation|Scope|Timeframe (minutes)| Limit (number of emails) |
+||--|-|-|
+|Send Email|Per Subscription|1|10|
+|Send Email|Per Subscription|60|25|
+|Get Email Status|Per Subscription|1|20|
+|Get Email Status|Per Subscription|60|50|
+
+### Size Limits
+
+| **Name** | Limit |
+|--|--|
+|Number of recipients in Email|50 |
+|Attachment size - per message|10 MB |
+
+### Action to take
+This sandbox setup helps developers start building the application, and you can gradually request an increase in sending volume as the application gets ready to go live. If you require sending a number of messages that exceeds the rate limits, submit a support request to raise your sending limit.
+## Chat

### Size Limits
communication-services Teams Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/teams-endpoint.md
# Communication as Teams user

You can use Azure Communication Services and Graph API to integrate communication as Teams users into your products. Teams users can communicate with other people in and outside their organization. The benefits for enterprises are:

- No requirement to download Teams desktop, mobile or web clients for Teams users
- Teams users don't lose context by switching between applications for day-to-day work and Teams client for communication
communication-services Teams Interop https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/teams-interop.md
# Teams interoperability
-> [!IMPORTANT]
-> Teams external users interoperability for Teams meetings is now generally available to all Communication Services applications and Teams organizations.
->
-> Support for Teams users in Azure Communication Services SDK is in public preview and available to Web-based applications.
->
-> Preview APIs and SDKs are provided without a service-level agreement and are not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
- Azure Communication Services can be used to build custom applications and experiences that enable interaction with Microsoft Teams users over voice, video, chat, and screen sharing. The [Communication Services UI Library](ui-library/ui-library-overview.md) provides customizable, production-ready UI components that can be easily added to these applications. The following video demonstrates some of the capabilities of Teams interoperability: <br>
communication-services Manage Teams Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/manage-teams-identity.md
# Quickstart: Set up and manage access tokens for Teams users

In this quickstart, you'll build a .NET console application to authenticate a Microsoft 365 user by using the Microsoft Authentication Library (MSAL) and retrieving a Microsoft Azure Active Directory (Azure AD) user token. You'll then exchange that token for an access token of Teams user with the Azure Communication Services Identity SDK. The access token for Teams user can then be used by the Communication Services Calling SDK to integrate calling capability as Teams user.

> [!NOTE]
communication-services Get Started With Voice Video Calling Custom Teams Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md
# QuickStart: Add 1:1 video calling as a Teams user to your application

[!INCLUDE [Video calling with JavaScript](./includes/custom-teams-endpoint/voice-video-calling-cte-javascript.md)]

## Clean up resources
confidential-computing Application Development https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/application-development.md
Title: Azure confidential computing development tools description: Use tools and libraries to develop applications for confidential computing on Intel SGX -+ Last updated 11/01/2021-+
confidential-computing Attestation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/attestation.md
Title: Attestation for SGX enclaves description: You can use attestation to verify that your Azure confidential computing SGX enclave is secure. -+ Last updated 12/20/2021-+
confidential-computing Confidential Computing Deployment Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/confidential-computing-deployment-models.md
Title: Choose Between Deployment Models description: Choose Between Deployment Models-+ Last updated 11/04/2021-+
confidential-computing Confidential Computing Enclaves https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/confidential-computing-enclaves.md
Title: Build with SGX enclaves - Azure Virtual Machines description: Learn about Intel SGX hardware to enable your confidential computing workloads.-+ Last updated 11/01/2021-+
confidential-computing Confidential Computing Solutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/confidential-computing-solutions.md
Title: Building Azure confidential solutions description: Learn how to build solutions on Azure confidential computing-+ Last updated 11/01/2021-+
confidential-computing Enclave Development Oss https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/enclave-development-oss.md
Title: Develop application enclaves with open-source solutions in Azure Confidential Computing description: Learn how to use tools to develop Intel SGX applications for Azure confidential computing.-+ Last updated 11/01/2021-+
confidential-computing How To Fortanix Confidential Computing Manager Node Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/how-to-fortanix-confidential-computing-manager-node-agent.md
Title: Run an app with Fortanix Confidential Computing Manager description: Learn how to use Fortanix Confidential Computing Manager to convert your containerized images. -+ Last updated 03/24/2021-+ # Run an application by using Fortanix Confidential Computing Manager
confidential-computing How To Fortanix Confidential Computing Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/how-to-fortanix-confidential-computing-manager.md
Title: Fortanix Confidential Computing Manager in an Azure managed application description: Learn how to deploy Fortanix Confidential Computing Manager (CCM) in a managed application in the Azure portal.-+ Last updated 02/03/2021-+
confidential-computing Overview Azure Products https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/overview-azure-products.md
Title: Azure confidential computing products description: Learn about all the confidential computing services that Azure provides-+ Last updated 11/04/2021-+
confidential-computing Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/overview.md
Title: Azure Confidential Computing Overview description: Overview of Azure Confidential (ACC) Computing -+ Last updated 11/01/2021-+
confidential-computing Quick Create Marketplace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/quick-create-marketplace.md
Title: Quickstart - Create Intel SGX VM in the Azure Marketplace description: Get started with your deployments by learning how to quickly create an Intel SGX VM with Marketplace.-+ Last updated 11/01/2021-+
confidential-computing Quick Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/quick-create-portal.md
Title: Quickstart - Create Intel SGX VM in the Azure Portal description: Get started with your deployments by learning how to quickly create an Intel SGX VM in the Azure Portal-+ Last updated 11/1/2021-+
confidential-computing Use Cases Scenarios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/use-cases-scenarios.md
Title: Common Azure confidential computing scenarios and use cases description: Understand how to use confidential computing in your scenario. -+ Last updated 11/04/2021-+ # Use cases and scenarios
confidential-computing Virtual Machine Solutions Sgx https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/confidential-computing/virtual-machine-solutions-sgx.md
Title: Deploy Intel SGX virtual machines description: Learn about using Intel SGX virtual machines (VMs) in Azure confidential computing.-+ Last updated 12/20/2021-+
connectors Connectors Create Api Sendgrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/connectors-create-api-sendgrid.md
- Title: Connect to SendGrid from Azure Logic Apps
-description: Automate tasks and workflows that send emails and manage mailing lists in SendGrid using Azure Logic Apps.
--- Previously updated : 08/24/2018
-tags: connectors
--
-# Connect to SendGrid from Azure Logic Apps
-
-With Azure Logic Apps and the SendGrid connector,
-you can create automated tasks and workflows that
-send emails and manage your recipient lists,
-for example:
-
-* Send email.
-* Add recipients to lists.
-* Get, add, and manage global suppression.
-
-You can use SendGrid actions in your logic apps to perform these tasks.
-You can also have other actions use the output from SendGrid actions.
-
-This connector provides only actions, so to start your logic app,
-use a separate trigger, such as a **Recurrence** trigger.
-For example, if you regularly add recipients to your lists,
-you can send email about those recipients and lists by using the
-Office 365 Outlook connector or Outlook.com connector.
-If you're new to logic apps, review
-[What is Azure Logic Apps?](../logic-apps/logic-apps-overview.md)
-
-## Prerequisites
-
-* An Azure account and subscription. If you don't have an Azure subscription,
-[sign up for a free Azure account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-
-* A [SendGrid account](https://www.sendgrid.com/)
-and a [SendGrid API key](https://sendgrid.com/docs/ui/account-and-settings/api-keys/)
-
- Your API key authorizes your logic app to create
- a connection and access your SendGrid account.
-
-* Basic knowledge about
-[how to create logic apps](../logic-apps/quickstart-create-first-logic-app-workflow.md)
-
-* The logic app where you want to access your SendGrid account.
-To use a SendGrid action, start your logic app with another trigger,
-for example, the **Recurrence** trigger.
-
-## Connect to SendGrid
--
-1. Sign in to the [Azure portal](https://portal.azure.com),
-and open your logic app in Logic App Designer, if not open already.
-
-1. Choose a path:
-
- * Under the last step where you want to add an action,
- choose **New step**.
-
- -or-
-
- * Between the steps where you want to add an action,
- move your pointer over the arrow between steps.
- Choose the plus sign (**+**) that appears,
- and then select **Add an action**.
-
-1. In the search box, enter "sendgrid" as your filter.
-Under the actions list, select the action you want.
-
-1. Provide a name for your connection.
-
-1. Enter your SendGrid API key,
-and then choose **Create**.
-
-1. Provide the necessary details for your selected action
-and continue building your logic app's workflow.
-
-## Connector reference
-
-For technical details about triggers, actions, and limits, which are
-described by the connector's OpenAPI (formerly Swagger) description,
-review the connector's [reference page](/connectors/sendgrid/).
-
-## Get support
-
-* For questions, visit the [Microsoft Q&A question page for Azure Logic Apps](/answers/topics/azure-logic-apps.html).
-* To submit or vote on feature ideas, visit the [Logic Apps user feedback site](https://aka.ms/logicapps-wish).
-
-## Next steps
-
-* Learn about other [Logic Apps connectors](../connectors/apis-list.md)
cosmos-db Automated Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/automated-recommendations.md
Title: Automated performance, cost, security recommendations for Azure Cosmos DB description: Learn how to view customized performance, cost, security, and other recommendations for Azure Cosmos DB based on your workload patterns.--++ Last updated 08/26/2021
cosmos-db Compliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/compliance.md
Title: Azure Cosmos DB compliance description: This article describes compliance coverage for Azure Cosmos DB.--++ Last updated 09/11/2021
cosmos-db Database Encryption At Rest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/database-encryption-at-rest.md
Title: Encryption at rest in Azure Cosmos DB description: Learn how Azure Cosmos DB provides encryption of data at rest and how it is implemented.--++ Last updated 10/26/2021
cosmos-db Database Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/database-security.md
Title: Database security - Azure Cosmos DB description: Learn how Azure Cosmos DB provides database protection and data security for your data.--++ Last updated 07/18/2022
cosmos-db How To Always Encrypted https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-always-encrypted.md
description: Learn how to use client-side encryption with Always Encrypted for A
Last updated 04/04/2022--++ # Use client-side encryption with Always Encrypted for Azure Cosmos DB
cosmos-db How To Configure Firewall https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-configure-firewall.md
description: Learn how to configure IP access control policies for firewall supp
Last updated 02/18/2022--++
cosmos-db How To Configure Private Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-configure-private-endpoints.md
Title: Configure Azure Private Link for an Azure Cosmos account description: Learn how to set up Azure Private Link to access an Azure Cosmos account by using a private IP address in a virtual network. -+ Last updated 06/08/2021-+
cosmos-db How To Configure Vnet Service Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-configure-vnet-service-endpoint.md
Title: Configure virtual network based access for an Azure Cosmos account description: This document describes the steps required to set up a virtual network service endpoint for Azure Cosmos DB. -+ Last updated 07/07/2021-+
cosmos-db How To Define Unique Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-define-unique-keys.md
Title: Define unique keys for an Azure Cosmos container description: Learn how to define unique keys for an Azure Cosmos container using Azure portal, PowerShell, .NET, Java, and various other SDKs. -+ Last updated 12/02/2019-+
cosmos-db How To Setup Cmk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-setup-cmk.md
Title: Configure customer-managed keys for your Azure Cosmos DB account description: Learn how to configure customer-managed keys for your Azure Cosmos DB account with Azure Key Vault-+ Last updated 07/20/2022-+ ms.devlang: azurecli
cosmos-db How To Setup Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-setup-managed-identity.md
Title: Configure managed identities with Azure AD for your Azure Cosmos DB account description: Learn how to configure managed identities with Azure Active Directory for your Azure Cosmos DB account-+ Last updated 10/15/2021-+ # Configure managed identities with Azure Active Directory for your Azure Cosmos DB account
cosmos-db How To Setup Rbac https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-setup-rbac.md
Title: Configure role-based access control for your Azure Cosmos DB account with Azure AD description: Learn how to configure role-based access control with Azure Active Directory for your Azure Cosmos DB account-+ Last updated 02/16/2022-+
cosmos-db Limit Total Account Throughput https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/limit-total-account-throughput.md
Title: Limit the total throughput provisioned on your Azure Cosmos DB account description: Learn how to limit the total throughput provisioned on your Azure Cosmos DB account-+ Last updated 03/31/2022-+
cosmos-db Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/role-based-access-control.md
description: Learn how Azure Cosmos DB provides database protection with Active
Last updated 05/11/2022--++
cosmos-db Secure Access To Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/secure-access-to-data.md
Title: Learn how to secure access to data in Azure Cosmos DB description: Learn about access control concepts in Azure Cosmos DB, including primary keys, read-only keys, users, and permissions.--++
cosmos-db Defender For Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/defender-for-cosmos-db.md
Last updated 06/21/2022--++ # Microsoft Defender for Azure Cosmos DB
cosmos-db How To Model Partition Example https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/how-to-model-partition-example.md
Title: Model and partition data on Azure Cosmos DB with a real-world example description: Learn how to model and partition a real-world example using the Azure Cosmos DB Core API-+ Last updated 08/26/2021-+ ms.devlang: javascript
cosmos-db Create Table Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/table/create-table-dotnet.md
ms.devlang: dotnet Previously updated : 06/24/2022 Last updated : 08/22/2022 # Quickstart: Azure Cosmos DB Table API for .NET+ [!INCLUDE[appliesto-table-api](../includes/appliesto-table-api.md)] This quickstart shows how to get started with the Azure Cosmos DB Table API from a .NET application. The Cosmos DB Table API is a schemaless data store allowing applications to store structured NoSQL data in the cloud. You'll learn how to create tables and rows, and perform basic tasks within your Cosmos DB resource using the [Azure.Data.Tables Package (NuGet)](https://www.nuget.org/packages/Azure.Data.Tables/).
This quickstart shows how to get started with the Azure Cosmos DB Table API from
## Prerequisites * An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
-* [.NET 6.0](https://dotnet.microsoft.com/en-us/download)
+* [.NET 6.0](https://dotnet.microsoft.com/download)
* [Azure Command-Line Interface (CLI)](/cli/azure/) or [Azure PowerShell](/powershell/azure/) ### Prerequisite check
This quickstart shows how to get started with the Azure Cosmos DB Table API from
## Setting up
-This section walks you through how to create an Azure Cosmos account and set up a project that uses the Table API NuGet packages.
+This section walks you through how to create an Azure Cosmos account and set up a project that uses the Table API NuGet packages.
### Create an Azure Cosmos DB account
This quickstart will create a single Azure Cosmos DB account using the Table API
### Create a new .NET app
-Create a new .NET application in an empty folder using your preferred terminal. Use the [``dotnet new console``](/dotnet/core/tools/dotnet-new) to create a new console app.
+Create a new .NET application in an empty folder using your preferred terminal. Use the [``dotnet new console``](/dotnet/core/tools/dotnet-new) to create a new console app.
```console
-dotnet new console -output <app-name>
+dotnet new console --output <app-name>
``` ### Install the NuGet package
The sample code described in this article creates a table named ``adventureworks
You'll use the following Table API classes to interact with these resources: -- [``TableServiceClient``](/dotnet/api/azure.data.tables.tableserviceclient) - This class provides methods to perform service level operations with Azure Cosmos DB Table API.-- [``TableClient``](/dotnet/api/azure.data.tables.tableclient) - This class allows you to interact with tables hosted in the Azure Cosmos DB table API.-- [``TableEntity``](/dotnet/api/azure.data.tables.tableentity) - This class is a reference to a row in a table that allows you to manage properties and column data.
+* [``TableServiceClient``](/dotnet/api/azure.data.tables.tableserviceclient) - This class provides methods to perform service level operations with Azure Cosmos DB Table API.
+* [``TableClient``](/dotnet/api/azure.data.tables.tableclient) - This class allows you to interact with tables hosted in the Azure Cosmos DB table API.
+* [``TableEntity``](/dotnet/api/azure.data.tables.tableentity) - This class is a reference to a row in a table that allows you to manage properties and column data.
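Taken together, a minimal sketch of how these three classes cooperate might look like the following; the connection string, table name, and entity values are illustrative placeholders, not values from the quickstart's sample repository.

```csharp
using Azure.Data.Tables;

// TableServiceClient performs service-level operations; replace the
// placeholder with your account's connection string.
var serviceClient = new TableServiceClient("<your-connection-string>");

// TableClient scopes operations to a single table.
TableClient tableClient = serviceClient.GetTableClient("adventureworks");
await tableClient.CreateIfNotExistsAsync();

// TableEntity is a dictionary-like row addressed by partition key and row key.
var entity = new TableEntity("gear-surf-surfboards", "68719518388")
{
    ["Name"] = "Ocean Surfboard",
    ["Quantity"] = 8
};
await tableClient.AddEntityAsync(entity);
```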
### Authenticate the client
The easiest way to create a new item in a table is to create a class that implem
:::code language="csharp" source="~/azure-cosmos-tableapi-dotnet/001-quickstart/Product.cs" id="type" :::
-Create an item in the collection using the `Product` class by calling [``TableClient.AddEntityAsync<T>``](/dotnet/api/azure.data.tables.tableclient.addentityasync).
+Create an item in the collection using the `Product` class by calling [``TableClient.AddEntityAsync<T>``](/dotnet/api/azure.data.tables.tableclient.addentityasync).
:::code language="csharp" source="~/azure-cosmos-tableapi-dotnet/001-quickstart/Program.cs" id="create_object_add" :::
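The `:::code` includes above pull the actual definitions from the sample repository, which aren't visible in this excerpt. As a rough sketch of the pattern (the property names are assumed for illustration; the real `Product` class may differ), an `ITableEntity` model looks like this:

```csharp
using Azure;
using Azure.Data.Tables;

// Illustrative ITableEntity model; the real Product class in the sample
// repository may declare different properties.
public class Product : ITableEntity
{
    public string PartitionKey { get; set; } = "";   // used here as the product category
    public string RowKey { get; set; } = "";         // unique ID within the partition
    public string Name { get; set; } = "";
    public int Quantity { get; set; }
    public ETag ETag { get; set; }                   // required by ITableEntity
    public DateTimeOffset? Timestamp { get; set; }   // required by ITableEntity
}
```

Creating the row is then a single call on the `TableClient` (values illustrative):

```csharp
var product = new Product
{
    PartitionKey = "gear-surf-surfboards",
    RowKey = "68719518388",
    Name = "Ocean Surfboard",
    Quantity = 8
};
await tableClient.AddEntityAsync(product);
```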
You can retrieve a specific item from a table using the [``TableEntity.GetEntity
### Query items
-After you insert an item, you can also run a query to get all items that match a specific filter by using the `TableClient.Query<T>` method. This example filters products by category using [Linq](/dotnet/standard/linq) syntax, which is a benefit of using strongly typed `ITableEntity` models like the `Product` class.
+After you insert an item, you can also run a query to get all items that match a specific filter by using the `TableClient.Query<T>` method. This example filters products by category using [Linq](/dotnet/standard/linq) syntax, which is a benefit of using typed `ITableEntity` models like the `Product` class.
> [!NOTE] > You can also query items using [OData](/rest/api/storageservices/querying-tables-and-entities) syntax. You can see an example of this approach in the [Query Data](./tutorial-query-table.md) tutorial.
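As a sketch of the LINQ pattern, reusing the `tableClient` and hypothetical `Product` model from the sketches above (the category value is illustrative):

```csharp
// Query<T> accepts a LINQ predicate and translates it to an OData filter.
var products = tableClient.Query<Product>(
    product => product.PartitionKey == "gear-surf-surfboards");

foreach (Product item in products)
{
    Console.WriteLine($"{item.Name}: {item.Quantity}");
}
```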
cosmos-db Find Request Unit Charge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/table/find-request-unit-charge.md
Title: Find request unit (RU) charge for a Table API queries in Azure Cosmos DB description: Learn how to find the request unit (RU) charge for Table API queries executed against an Azure Cosmos container. You can use the Azure portal, .NET, Java, Python, and Node.js languages to find the RU charge. -+ Last updated 10/14/2020-+ ms.devlang: csharp
data-factory Connect Data Factory To Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/connect-data-factory-to-azure-purview.md
You have two options to connect data factory to Microsoft Purview:
### Connect to Microsoft Purview account in Data Factory
-You need to have **Owner** or **Contributor** role on your data factory to connect to a Microsoft Purview account.
+You need the **Owner** or **Contributor** role on your data factory to connect to a Microsoft Purview account. Your data factory must also have its system-assigned managed identity enabled.
To establish the connection on Data Factory authoring UI:
databox-online Azure Stack Edge Gpu Manage Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox-online/azure-stack-edge-gpu-manage-certificates.md
To delete a signing chain certificate from your Azure Stack Edge device, take th
1. Select the signing chain certificate you want to delete. Then select **Delete**.
- [ ![Screenshot of the Certificates blade of the local Web UI of an Azure Stack Edge device. The Delete option for the signing certificates is highlighted.](media/azure-stack-edge-gpu-manage-certificates/delete-signing-certificate-01.png) ](media/azure-stack-edge-gpu-manage-certificates/delete-signing-certificate-01.png)
+ [ ![Screenshot of the Certificates blade of the local Web UI of an Azure Stack Edge device. The Delete option for the signing certificates is highlighted.](media/azure-stack-edge-gpu-manage-certificates/delete-signing-certificate-01.png) ](media/azure-stack-edge-gpu-manage-certificates/delete-signing-certificate-01.png#lightbox)
1. On the **Delete certificate** pane, verify the certificate's thumbprint, and then select **Delete**. Certificate deletion can't be reversed.
defender-for-cloud Onboard Management Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/onboard-management-group.md
To onboard a management group and all its subscriptions:
The remediation task will then enable Defender for Cloud, for free, on the non-compliant subscriptions. > [!IMPORTANT]
-> The policy definition will only enable Defender for Cloud on **existing** subscriptions. To register newly created subscriptions, open the compliance tab, select the relevant non-compliant subscriptions, and create a remediation task.Repeat this step when you have one or more new subscriptions you want to monitor with Defender for Cloud.
+> The policy definition will only enable Defender for Cloud on **existing** subscriptions. To register newly created subscriptions, open the compliance tab, select the relevant non-compliant subscriptions, and create a remediation task. Repeat this step when you have one or more new subscriptions you want to monitor with Defender for Cloud.
## Optional modifications
defender-for-iot Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/release-notes.md
For more information, see the [Microsoft Security Development Lifecycle practice
| Version | Date released | End support date | |--|--|--|
+| 22.2.5 | 08/2022 | 04/2023 |
| 22.2.4 | 07/2022 <br> There's a known compatibility issue with Hyper-V, please use version 22.1.7 | 04/2023 | | 22.2.3 | 07/2022 <br> There's a known compatibility issue with Hyper-V, please use version 22.1.7 | 04/2023 | | 22.1.7 | 07/2022 | 04/2023 |
For more information, see the [Microsoft Security Development Lifecycle practice
## August 2022
+- **Sensor software version 22.2.5**: Minor version with stability improvements
- [New alert columns with timestamp data](#new-alert-columns-with-timestamp-data) - [Sensor health from the Azure portal (Public preview)](#sensor-health-from-the-azure-portal-public-preview)
digital-twins How To Ingest Iot Hub Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-ingest-iot-hub-data.md
When the twin is created successfully, the CLI output from the command should lo
In this section, you'll create an Azure function to access Azure Digital Twins and update twins based on IoT telemetry events that it receives. Follow the steps below to create and publish the function.
-1. First, create a new Azure Functions project.
+1. First, create a new Azure Functions project that uses an Event Grid trigger.
You can do this using **Visual Studio** (for instructions, see [Develop Azure Functions using Visual Studio](../azure-functions/functions-develop-vs.md#create-an-azure-functions-project)), **Visual Studio Code** (for instructions, see [Create a C# function in Azure using Visual Studio Code](../azure-functions/create-first-function-vs-code-csharp.md?tabs=in-process#create-an-azure-functions-project)), or the **Azure CLI** (for instructions, see [Create a C# function in Azure from the command line](../azure-functions/create-first-function-cli-csharp.md?tabs=azure-cli%2Cin-process#create-a-local-function-project)).
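As a minimal sketch of that starting point (the function name is illustrative, and the twin-update logic is left as a comment), an in-process C# Event Grid-triggered skeleton might look like:

```csharp
using Azure.Messaging.EventGrid;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class IoTHubToTwins
{
    // Fires for each telemetry event routed from IoT Hub through Event Grid.
    [FunctionName("IoTHubToTwins")]
    public static void Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
    {
        log.LogInformation(eventGridEvent.Data.ToString());
        // Parse the telemetry payload here, then call the Azure Digital Twins
        // SDK to update the matching twin's properties.
    }
}
```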
dns Tutorial Alias Tm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/tutorial-alias-tm.md
Create an alias record that points to the Traffic Manager profile.
1. From a web browser, browse to `contoso.com` or your apex domain name. You see the IIS default page with `Hello World from Web-01`. The Traffic Manager directed traffic to **Web-01** IIS web server because it has the highest priority. Close the web browser and shut down **Web-01** virtual machine. Wait a few minutes for the virtual machine to completely shut down. 1. Open a new web browser, and browse again to `contoso.com` or your apex domain name.
-1. You should see the IIS default page with `Hello World from Web-01`. The Traffic Manager handled the situation and directed traffic to the second IIS server after shutting down the first server that has the highest priority.
+1. You should see the IIS default page with `Hello World from Web-02`. The Traffic Manager handled the situation and directed traffic to the second IIS server after shutting down the first server that has the highest priority.
## Clean up resources
expressroute Expressroute Howto Erdirect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-howto-erdirect.md
ExpressRoute Direct and ExpressRoute circuit(s) in different subscriptions or Az
```powershell Add-AzExpressRoutePortAuthorization -Name $Name -ExpressRoutePort $ERPort
- Set-AzExpressRoutePort -ExpressRoutePort $ERPort
``` Sample output:
hdinsight Apache Hive Warehouse Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/interactive-query/apache-hive-warehouse-connector.md
value. The value may be similar to: `thrift://iqgiro.rekufuk2y2cezcbowjkbwfnyvd.
| Configuration | Value | |-|-|
- |`spark.datasource.hive.warehouse.load.staging.dir`|`wasbs://STORAGE_CONTAINER_NAME@STORAGE_ACCOUNT_NAME.blob.core.windows.net/tmp`. <br> Set to a suitable HDFS-compatible staging directory. If you have two different clusters, the staging directory should be a folder in the staging directory of the LLAP cluster's storage account so that HiveServer2 has access to it. Replace `STORAGE_ACCOUNT_NAME` with the name of the storage account being used by the cluster, and `STORAGE_CONTAINER_NAME` with the name of the storage container. |
+ |`spark.datasource.hive.warehouse.load.staging.dir`| If you're using an ADLS Gen2 storage account, use `abfss://STORAGE_CONTAINER_NAME@STORAGE_ACCOUNT_NAME.dfs.core.windows.net/tmp`.<br>If you're using an Azure Blob storage account, use `wasbs://STORAGE_CONTAINER_NAME@STORAGE_ACCOUNT_NAME.blob.core.windows.net/tmp`. <br> Set to a suitable HDFS-compatible staging directory. If you have two different clusters, the staging directory should be a folder in the staging directory of the LLAP cluster's storage account so that HiveServer2 has access to it. Replace `STORAGE_ACCOUNT_NAME` with the name of the storage account being used by the cluster, and `STORAGE_CONTAINER_NAME` with the name of the storage container. |
|`spark.sql.hive.hiveserver2.jdbc.url`| The value you obtained earlier from **HiveServer2 Interactive JDBC URL** | |`spark.datasource.hive.warehouse.metastoreUri`| The value you obtained earlier from **hive.metastore.uris**. | |`spark.security.credentials.hiveserver2.enabled`|`true` for YARN cluster mode and `false` for YARN client mode. |
healthcare-apis Overview Of Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/fhir/overview-of-search.md
Title: Overview of FHIR search in Azure Health Data Services description: This article describes an overview of FHIR search that is implemented in Azure Health Data Services-+ Last updated 06/06/2022-+ # Overview of FHIR search
-The Fast Healthcare Interoperability Resources (FHIR&#174;) specification defines the fundamentals of search for FHIR resources. This article will guide you through some key aspects to searching resources in FHIR. For complete details about searching FHIR resources, refer to [Search](https://www.hl7.org/fhir/search.html) in the HL7 FHIR Specification. Throughout this article, we'll give examples of search syntax. Each search will be against your FHIR server, which typically has a URL of `https://<WORKSPACE NAME>-<ACCOUNT-NAME>.fhir.azurehealthcareapis.com`. In the examples, we'll use the placeholder {{FHIR_URL}} for this URL.
+The Fast Healthcare Interoperability Resources (FHIR&#174;) specification defines the fundamentals of search for FHIR resources. This article will guide you through some key aspects of searching resources in FHIR. For complete details about searching FHIR resources, refer to [Search](https://www.hl7.org/fhir/search.html) in the HL7 FHIR Specification. Throughout this article, we'll give examples of search syntax. Each search will be against your FHIR server, which typically has a URL of `https://<WORKSPACE NAME>-<ACCOUNT-NAME>.fhir.azurehealthcareapis.com`. In the examples, we'll use the placeholder `{{FHIR_URL}}` for this URL.
FHIR searches can be against a specific resource type, a specified [compartment](https://www.hl7.org/fhir/compartmentdefinition.html), or all resources. The simplest way to execute a search in FHIR is to use a `GET` request. For example, if you want to pull all patients in the database, you could use the following request:
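The request itself falls outside this excerpt; per the FHIR specification, searching every resource of a type is a `GET` on the resource type, that is, `GET {{FHIR_URL}}/Patient`. A minimal C# sketch of issuing that search (the URL and access token are placeholders you'd supply before running):

```csharp
using System.Net.Http.Headers;

// Replace both placeholders before running.
var fhirUrl = "https://<WORKSPACE NAME>-<ACCOUNT-NAME>.fhir.azurehealthcareapis.com";

using var client = new HttpClient();
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", "<access-token>");

// A search against a resource type with no parameters returns a Bundle
// of all resources of that type.
var response = await client.GetAsync($"{fhirUrl}/Patient");
Console.WriteLine(await response.Content.ReadAsStringAsync());
```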
import-export Storage Import Export View Drive Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/import-export/storage-import-export-view-drive-status.md
If you created your import or export job in Azure Data Box (the Preview experien
A list of Import/Export jobs appears on the page.
- [ ![Screenshot of Data Box resources in the Azure portal filtered to Import Export jobs. The job name, transfer type, status, and model are highlighted.](./media/storage-import-export-view-drive-status/preview-jobs-list.png) ](./media/storage-import-export-view-drive-status/preview-jobs-list.png)
+ [ ![Screenshot of Data Box resources in the Azure portal filtered to Import Export jobs. The job name, transfer type, status, and model are highlighted.](./media/storage-import-export-view-drive-status/preview-jobs-list.png) ](./media/storage-import-export-view-drive-status/preview-jobs-list.png#lightbox)
4. Select a job name to view job details.
iot-central Concepts Device Implementation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/concepts-device-implementation.md
A DTDL model can be a _no-component_ or a _multi-component_ model:
- Multi-component model. A more complex model that includes two or more components. These components include a single root component, and one or more nested components. For an example, see the [Temperature Controller](https://github.com/Azure/opendigitaltwins-dtdl/blob/master/DTDL/v2/samples/TemperatureController.json) model. > [!TIP]
-> You can [export a device model](howto-set-up-template.md#interfaces-and-components) from an IoT Central device template as a DTDL v2 file.
+> You can [import and export a complete device model or individual interface](howto-set-up-template.md#interfaces-and-components) from an IoT Central device template as a DTDL v2 file.
To learn more about device models, see the [IoT Plug and Play modeling guide](../../iot-develop/concepts-modeling-guide.md)
iot-central Concepts Device Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/concepts-device-templates.md
To learn more about the DPS payload, see the sample code used in the [Tutorial: C
## Device models
-A device model defines how a device interacts with your IoT Central application. The device developer must make sure that the device implements the behaviors defined in the device model so that IoT Central can monitor and manage the device. A device model is made up of one or more _interfaces_, and each interface can define a collection of _telemetry_ types, _device properties_, and _commands_. A solution developer can import a JSON file that defines the device model into a device template, or use the web UI in IoT Central to create or edit a device model.
+A device model defines how a device interacts with your IoT Central application. The device developer must make sure that the device implements the behaviors defined in the device model so that IoT Central can monitor and manage the device. A device model is made up of one or more _interfaces_, and each interface can define a collection of _telemetry_ types, _device properties_, and _commands_. A solution developer can import a JSON file that defines a complete device model or individual interface into a device template, or use the web UI in IoT Central to create or edit a device model.
To learn more about editing a device model, see [Edit an existing device template](howto-edit-device-template.md)
-A solution developer can also export a JSON file that contains the device model. A device developer can use this JSON document to understand how the device should communicate with the IoT Central application.
+A solution developer can also export a JSON file from the device template that contains a complete device model or individual interface. A device developer can use this JSON document to understand how the device should communicate with the IoT Central application.
The JSON file that defines the device model uses the [Digital Twin Definition Language (DTDL) V2](https://github.com/Azure/opendigitaltwins-dtdl/blob/master/DTDL/v2/dtdlv2.md). IoT Central expects the JSON file to contain the device model with the interfaces defined inline, rather than in separate files. To learn more, see [IoT Plug and Play modeling guide](../../iot-develop/concepts-modeling-guide.md).
You can choose queue commands if a device is currently offline by enabling the *
Offline commands are one-way notifications to the device from your solution. Offline commands can have request parameters but don't return a response. > [!NOTE]
-> This option is only available in the IoT Central web UI. This setting isn't included if you export a model or interface from the device template.
+> Offline commands are marked as `durable` if you export the model as DTDL.
## Cloud properties
iot-central Concepts Telemetry Properties Commands https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/concepts-telemetry-properties-commands.md
When the device has finished processing the request, it should send a property t
In the IoT Central web UI, you can select the **Queue if offline** option for a command. Offline commands are one-way notifications to the device from your solution that are delivered as soon as a device connects. Offline commands can have a request parameter but don't return a response.
-The **Queue if offline** setting isn't included if you export a model or interface from the device template. You can't tell by looking at an exported model or interface JSON that a command is an offline command.
+Offline commands are marked as `durable` if you export the model as DTDL.
Offline commands use [IoT Hub cloud-to-device messages](../../iot-hub/iot-hub-devguide-messages-c2d.md) to send the command and payload to the device.
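For a sense of what that looks like on the device side, here's a hedged C# sketch of a device receiving such a message with the Azure IoT device SDK; the connection string is a placeholder, and a real device would add error handling:

```csharp
using System.Text;
using Microsoft.Azure.Devices.Client;

// Placeholder connection string; supply your device's own.
var deviceClient = DeviceClient.CreateFromConnectionString("<device-connection-string>");

// Offline commands arrive as cloud-to-device messages once the device connects.
await deviceClient.SetReceiveMessageHandlerAsync(async (message, userContext) =>
{
    string payload = Encoding.UTF8.GetString(message.GetBytes());
    Console.WriteLine($"Received offline command payload: {payload}");

    // Acknowledge the message so IoT Hub removes it from the device's queue.
    await deviceClient.CompleteAsync(message);
}, userContext: null);
```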
iot-central Howto Create Organizations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-create-organizations.md
To reassign an organization to a new parent, select **Edit** and choose a new pa
To delete an organization, you must first delete, or move to another organization, any associated items such as dashboards, devices, users, device groups, and jobs. > [!TIP]
-> You can also use the REST API to [create and manage organizations](/rest/api/iotcentral/1.2-previewdataplane/organizations).
+> You can also use the REST API to [create and manage organizations](/rest/api/iotcentral/2022-07-31dataplane/organizations).
## Assign devices
The following limits apply to organizations:
## Next steps Now that you've learned how to manage Azure IoT Central organizations, the suggested next step is to learn how to [Export IoT data to cloud destinations using Blob Storage](howto-export-to-blob-storage.md).-
iot-central Howto Export Data Legacy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-export-data-legacy.md
instanceOf: .device.templateId,
properties: .device.properties.reported | map({ key: .name, value: .value }) | from_entries ```
-Device templates: If you're currently using legacy data exports with the device templates data type, then you can obtain the same data using the [Device Templates - Get API call](/rest/api/iotcentral/2022-05-31dataplane/device-templates/get).
+Device templates: If you're currently using legacy data exports with the device templates data type, then you can obtain the same data using the [Device Templates - Get API call](/rest/api/iotcentral/2022-07-31dataplane/device-templates/get).
### Destination migration considerations
This example snapshot shows a message that contains device and properties data i
If you have an existing data export in your preview application with the *Devices* and *Device templates* streams turned on, update your export by **30 June 2020**. This requirement applies to exports to Azure Blob storage, Azure Event Hubs, and Azure Service Bus.
-Starting 3 February 2020, all new exports in applications with Devices and Device templates enabled will have the data format described above. All exports created before this date remain on the old data format until 30 June 2020, at which time these exports will automatically be migrated to the new data format. The new data format matches the [device](/rest/api/iotcentral/2022-05-31dataplane/devices/get), [device property](/rest/api/iotcentral/2022-05-31dataplane/devices/get-properties), and [device template](/rest/api/iotcentral/2022-05-31dataplane/device-templates/get) objects in the IoT Central public API.
+Starting 3 February 2020, all new exports in applications with Devices and Device templates enabled will have the data format described above. All exports created before this date remain on the old data format until 30 June 2020, at which time these exports will automatically be migrated to the new data format. The new data format matches the [device](/rest/api/iotcentral/2022-07-31dataplane/devices/get), [device property](/rest/api/iotcentral/2022-07-31dataplane/devices/get-properties), and [device template](/rest/api/iotcentral/2022-07-31dataplane/device-templates/get) objects in the IoT Central public API.
For **Devices**, notable differences between the old data format and the new data format include: - `@id` for device is removed, `deviceId` is renamed to `id`
iot-central Howto Manage Device Templates With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-device-templates-with-rest-api.md
The request body has some required fields:
* `capabilityModel` : Every device template has a capability model. A relationship is established between each module capability model and a device model. A capability model implements one or more module interfaces. > [!TIP]
-> The device template JSON is not a standard DTDL document. The device template JSON includes IoT Central specific data such as cloud property definitions and display units. You can use the device template JSON format to import and export device templates in IoT Central by using the REST API and the CLI.
+> The device template JSON is not a standard DTDL document. The device template JSON includes IoT Central specific data such as cloud property definitions and display units. You can use the device template JSON format to import and export device templates in IoT Central by using the REST API, the CLI, and the UI.
There are some optional fields you can use to add more details to the capability model, such as display name and description.
iot-central Howto Manage Users Roles With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-users-roles-with-rest-api.md
The response to this request looks like the following example. The role value id
} ```
-You can also add a service principal user which is useful if you need to use service principal authentication for REST API calls. To learn more, see [Add or update a service principal user](/rest/api/iotcentral/2022-05-31dataplane/users/create#add-or-update-a-service-principal-user).
+You can also add a service principal user, which is useful if you need to use service principal authentication for REST API calls. To learn more, see [Add or update a service principal user](/rest/api/iotcentral/2022-07-31dataplane/users/create#add-or-update-a-service-principal-user).
### Change the role of a user
DELETE https://{your app subdomain}.azureiotcentral.com/api/users/user-001?api-v
## Next steps
-Now that you've learned how to manage users and roles with the REST API, a suggested next step is to [How to use the IoT Central REST API to manage organizations.](howto-manage-organizations-with-rest-api.md)
+Now that you've learned how to manage users and roles with the REST API, a suggested next step is [How to use the IoT Central REST API to manage organizations](howto-manage-organizations-with-rest-api.md).
iot-central Howto Migrate To Iot Hub https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-migrate-to-iot-hub.md
npm install
npm start ```
-After the migrator app starts, navigate to [http://localhost:3000](http://localhost:3000) to view the tool.
+After the migrator app starts, navigate to `http://localhost:3000` to view the tool.
## Migrate devices
iot-central Howto Query With Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-query-with-rest-api.md
To find a device template ID, navigate to the **Devices** page in your IoT Centr
:::image type="content" source="media/howto-query-with-rest-api/show-device-template-id.png" alt-text="Screenshot that shows how to find the device template ID in the page URL.":::
-You can also use the [Devices - Get](/rest/api/iotcentral/1.2-previewdataplane/devices/get) REST API call to get the device template ID for a device.
+You can also use the [Devices - Get](/rest/api/iotcentral/2022-07-31dataplane/devices/get) REST API call to get the device template ID for a device.
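As a sketch of that call (the app subdomain, device ID, and API token are placeholders; the `template` property of the response JSON holds the device template ID):

```csharp
using var client = new HttpClient();

// An IoT Central API token goes in the Authorization header as-is.
client.DefaultRequestHeaders.Add("Authorization", "<api-token>");

// GET a single device; read the "template" property from the response
// to find its device template ID.
var json = await client.GetStringAsync(
    "https://{your-app-subdomain}.azureiotcentral.com/api/devices/{device-id}?api-version=2022-07-31");
Console.WriteLine(json);
```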
## WHERE clause
iot-central Howto Set Up Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-set-up-template.md
Cloud-to-device messages:
- Require the device to implement a message handler to process the cloud-to-device message. > [!NOTE]
-> This option is only available in the IoT Central web UI. This setting isn't included if you export a model or component from the device template.
+> Offline commands are marked as `durable` if you export the model as DTDL.
## Cloud properties
iot-central Howto Use Commands https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-use-commands.md
The following snippet shows the JSON representation of the command in the device
``` > [!TIP]
-> You can export a device model from the device template page.
+> You can export a device model or interface from the device template page.
You can relate this command definition to the screenshot of the UI using the following fields:
iot-central Overview Iot Central Api Tour https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/overview-iot-central-api-tour.md
This article introduces you to Azure IoT Central REST API. Use the API to create
The REST API operations are grouped into the: -- *Data plane* operations that let you work with resources inside IoT Central applications. Data plane operations let you automate tasks that can also be completed using the IoT Central UI. Currently, there are [generally available](/rest/api/iotcentral/2022-05-31dataplane/api-tokens) and [preview](/rest/api/iotcentral/1.2-previewdataplane/api-tokens) versions of the data plane API.
+- *Data plane* operations that let you work with resources inside IoT Central applications. Data plane operations let you automate tasks that can also be completed using the IoT Central UI. Currently, there are [generally available](/rest/api/iotcentral/2022-07-31dataplane/api-tokens) and [preview](/rest/api/iotcentral/2022-06-30-previewdataplane/api-tokens) versions of the data plane API.
- *Control plane* operations that let you work with the Azure resources associated with IoT Central applications. Control plane operations let you automate tasks that can also be completed in the Azure portal. ## Data plane operations
iot-edge Tutorial C Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/tutorial-c-module.md
# Mandatory fields. See more on aka.ms/skyeye/meta. Title: Tutorial develop C module for Linux - Azure IoT Edge | Microsoft Docs
+ Title: Tutorial - develop C module for Linux - Azure IoT Edge | Microsoft Docs
description: This tutorial shows you how to create an IoT Edge module with C code and deploy it to a Linux device running IoT Edge
Use the following table to understand your options for developing and deploying
| - | - | - | | **Linux AMD64** | ![Use VS Code for C modules on Linux AMD64](./media/tutorial-c-module/green-check.png) | ![Use VS for C modules on Linux AMD64](./media/tutorial-c-module/green-check.png) | | **Linux ARM32** | ![Use VS Code for C modules on Linux ARM32](./media/tutorial-c-module/green-check.png) | ![Use VS for C modules on Linux ARM32](./media/tutorial-c-module/green-check.png) |
+| **Linux ARM64** | ![Use VS Code for C modules on Linux ARM64](./media/tutorial-c-module/green-check.png) | ![Use VS for C modules on Linux ARM64](./media/tutorial-c-module/green-check.png) |
Before beginning this tutorial, you should have gone through the previous tutorial to set up your development environment for Linux container development: [Develop IoT Edge modules using Linux containers](tutorial-develop-for-linux.md). By completing that tutorial, you should have the following prerequisites in place:
Before beginning this tutorial, you should have gone through the previous tutori
* [Visual Studio Code](https://code.visualstudio.com/) configured with the [Azure IoT Tools](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools). * [Docker CE](https://docs.docker.com/install/) configured to run Linux containers.
-To develop an IoT Edge module in C, install the following additional prerequisites on your development machine:
+To develop an IoT Edge module in C, install the following prerequisites on your development machine:
* [C/C++ extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode.cpptools) for Visual Studio Code.
-Installing the Azure IoT C SDK is not required for this tutorial, but can provide helpful functionality like intellisense and reading program definitions. For installation information, see [Azure IoT C SDKs and Libraries](https://github.com/Azure/azure-iot-sdk-c).
+Installing the Azure IoT C SDK isn't required for this tutorial, but can provide helpful functionality like intellisense and reading program definitions. For installation information, see [Azure IoT C SDKs and Libraries](https://github.com/Azure/azure-iot-sdk-c).
## Create a module project
Currently, Visual Studio Code can develop C modules for Linux AMD64 and Linux AR
### Update the module with custom code
-The default module code receives messages on an input queue and passes them along through an output queue. Let's add some additional code so that the module processes the messages at the edge before forwarding them to IoT Hub. Update the module so that it analyzes the temperature data in each message, and only sends the message to IoT Hub if the temperature exceeds a certain threshold.
+The default module code receives messages on an input queue and passes them along through an output queue. Let's add more code so the module processes messages at the edge before forwarding them to IoT Hub. Update the module so that it analyzes the temperature data in each message, and only sends the message to IoT Hub if the temperature exceeds a certain threshold.
1. The data from the sensor in this scenario comes in JSON format. To filter messages in JSON format, import a JSON library for C. This tutorial uses Parson.
Make sure that your IoT Edge device is up and running.
2. Right-click the name of your IoT Edge device, then select **Create Deployment for Single Device**.
-3. Select the **deployment.amd64.json** file in the **config** folder and then click **Select Edge Deployment Manifest**. Do not use the deployment.template.json file.
+3. Select the **deployment.amd64.json** file in the **config** folder and then click **Select Edge Deployment Manifest**. Don't use the deployment.template.json file, as that file is only a template.
4. Under your device, expand **Modules** to see a list of deployed and running modules. Click the refresh button. You should see the new **CModule** running along with the **SimulatedTemperatureSensor** module and the **$edgeAgent** and **$edgeHub**.
We used the CModule module twin in the deployment manifest to set the temperatur
## Clean up resources
-If you plan to continue to the next recommended article, you can keep the resources and configurations that you created and reuse them. You can also keep using the same IoT Edge device as a test device.
+If you continue to the next recommended article, you can keep your resources and configurations and reuse them. You can also keep using the same IoT Edge device as a test device.
Otherwise, you can delete the local configurations and the Azure resources that you used in this article to avoid charges.
iot-edge Tutorial Java Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/tutorial-java-module.md
Use the following table to understand your options for developing and deploying
| - | | | | **Linux AMD64** | ![Use VS Code for Java modules on Linux AMD64](./media/tutorial-c-module/green-check.png) | | | **Linux ARM32** | ![Use VS Code for Java modules on Linux ARM32](./media/tutorial-c-module/green-check.png) | |
+| **Linux ARM64** | ![Use VS Code for Java modules on Linux ARM64](./media/tutorial-c-module/green-check.png) | |
Before beginning this tutorial, you should have gone through the previous tutorial to set up your development environment for Linux container development: [Develop IoT Edge modules for Linux devices](tutorial-develop-for-linux.md). By completing either of those tutorials, you should have the following prerequisites in place:
iot-edge Tutorial Python Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/tutorial-python-module.md
Use the following table to understand your options for developing and deploying
| - | | | | **Linux AMD64** | ![Use VS Code for Python modules on Linux AMD64](./media/tutorial-c-module/green-check.png) | | | **Linux ARM32** | ![Use VS Code for Python modules on Linux ARM32](./media/tutorial-c-module/green-check.png) | |
+| **Linux ARM64** | ![Use VS Code for Python modules on Linux ARM64](./media/tutorial-c-module/green-check.png) | |
Before beginning this tutorial, you should have gone through the previous tutorial to set up your development environment for Linux container development: [Develop IoT Edge modules using Linux containers](tutorial-develop-for-linux.md). By completing that tutorial, you should have the following prerequisites in place:
iot-hub-device-update Device Update Azure Real Time Operating System https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub-device-update/device-update-azure-real-time-operating-system.md
description: Get started with Device Update for Azure RTOS.
Last updated 3/18/2021-+
-# Tutorial: Device Update for Azure IoT Hub using Azure RTOS
+# Device Update for Azure IoT Hub using Azure RTOS
-This tutorial shows you how to create the Device Update for Azure IoT Hub agent in Azure RTOS NetX Duo. It also provides simple APIs for developers to integrate the Device Update capability in their application. Explore [samples](https://github.com/azure-rtos/samples/tree/PublicPreview/ADU) of key semiconductors evaluation boards that include the get-started guides to learn how to configure, build, and deploy over-the-air updates to the devices.
-
-In this tutorial, you'll learn how to:
-> [!div class="checklist"]
-> * Get started.
-> * Tag your device.
-> * Create a device group.
-> * Deploy an image update.
-> * Monitor the update deployment.
+This article shows you how to create the Device Update for Azure IoT Hub agent in Azure RTOS NetX Duo. It also provides simple APIs for developers to integrate the Device Update capability in their application. Explore [samples](https://github.com/azure-rtos/samples/tree/PublicPreview/ADU) of key semiconductors evaluation boards that include the get-started guides to learn how to configure, build, and deploy over-the-air updates to the devices.
If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
Learn more about [Azure RTOS](/azure/rtos/).
1. On the left pane, under **IoT Devices**, find your IoT device and go to the device twin. 1. In the device twin, delete any existing Device Update tag values by setting them to null. 1. Add a new Device Update tag value to the root JSON object, as shown:
-
+ ```JSON "tags": { "ADUGroup": "<CustomTagValue>"
Learn more about [Azure RTOS](/azure/rtos/).
You've now completed a successful end-to-end image update by using Device Update for IoT Hub on an Azure RTOS embedded device.
-## Clean up resources
-
-When no longer needed, clean up your device update account, instance, IoT hub, and IoT device.
- ## Next steps To learn more about Azure RTOS and how it works with IoT Hub, see the [Azure RTOS webpage](https://azure.com/rtos).
load-balancer Cross Region Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/cross-region-overview.md
This region doesn't affect how the traffic will be routed. If a home region goes
* North Europe * East Asia * US Gov Virginia
+* UK West
+* UK South
> [!NOTE] > You can only deploy your cross-region load balancer or Public IP in Global tier in one of the regions above.
load-balancer Ipv6 Dual Stack Standard Internal Load Balancer Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/ipv6-dual-stack-standard-internal-load-balancer-powershell.md
-# Deploy an IPv6 dual stack application using Standard Internal Load Balancer in Azure - PowerShell (Preview)
+# Deploy an IPv6 dual stack application using Standard Internal Load Balancer in Azure - PowerShell
This article shows you how to deploy a dual stack (IPv4 + IPv6) application in Azure that includes a dual stack virtual network and subnet, a Standard Internal Load Balancer with dual (IPv4 + IPv6) front-end configurations, VMs with NICs that have a dual IP configuration, network security group, and public IPs.
logic-apps Add Artifacts Integration Service Environment Ise https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/add-artifacts-integration-service-environment-ise.md
ms.suite: integration Previously updated : 02/28/2021 Last updated : 08/20/2022 # Add resources to your integration service environment (ISE) in Azure Logic Apps
logic-apps Block Connections Connectors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/block-connections-connectors.md
ms.suite: integration Previously updated : 05/18/2022 Last updated : 08/22/2022 # Block connector usage in Azure Logic Apps + If your organization doesn't permit connecting to restricted or unapproved resources using their [managed connectors](../connectors/managed.md) in Azure Logic Apps, you can block the capability to create and use those connections in logic app workflows. With [Azure Policy](../governance/policy/overview.md), you can define and enforce [policies](../governance/policy/overview.md#policy-definition) that prevent creating or using connections for connectors that you want to block. For example, for security reasons, you might want to block connections to specific social media platforms or other services and systems. This article shows how to set up a policy that blocks specific connections by using the Azure portal, but you can create policy definitions in other ways. For example, you can use the Azure REST API, Azure PowerShell, Azure CLI, and Azure Resource Manager templates. For more information, see [Tutorial: Create and manage policies to enforce compliance](../governance/policy/tutorials/create-and-manage.md).
logic-apps Concepts Schedule Automated Recurring Tasks Workflows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md
Last updated 08/20/2022
# Schedules for recurring triggers in Azure Logic Apps workflows + Azure Logic Apps helps you create and run automated recurring workflows on a schedule. By creating a logic app workflow that starts with a built-in Recurrence trigger or Sliding Window trigger, which are Schedule-type triggers, you can run tasks immediately, at a later time, or on a recurring interval. You can call services inside and outside Azure, such as HTTP or HTTPS endpoints, post messages to Azure services such as Azure Storage and Azure Service Bus, or get files uploaded to a file share. With the Recurrence trigger, you can also set up complex schedules and advanced recurrences for running tasks. To learn more about the built-in Schedule triggers and actions, see [Schedule triggers](#schedule-triggers) and [Schedule actions](#schedule-actions). > [!NOTE]
logic-apps Connect Virtual Network Vnet Isolated Environment Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/connect-virtual-network-vnet-isolated-environment-overview.md
ms.suite: integration Previously updated : 05/16/2021 Last updated : 08/20/2022 # Access to Azure virtual networks from Azure Logic Apps using an integration service environment (ISE) + Sometimes, your logic app workflows need access to protected resources, such as virtual machines (VMs) and other systems or services, that are inside or connected to an Azure virtual network. To directly access these resources from workflows that usually run in multi-tenant Azure Logic Apps, you can create and run your logic apps in an *integration service environment* (ISE) instead. An ISE is actually an instance of Azure Logic Apps that runs separately on dedicated resources, apart from the global multi-tenant Azure environment, and doesn't [store, process, or replicate data outside the region where you deploy the ISE](https://azure.microsoft.com/global-infrastructure/data-residency#select-geography). For example, some Azure virtual networks use private endpoints ([Azure Private Link](../private-link/private-link-overview.md)) for providing access to Azure PaaS services, such as Azure Storage, Azure Cosmos DB, or Azure SQL Database, partner services, or customer services that are hosted on Azure. If your logic app workflows require access to virtual networks that use private endpoints, you have these options:
logic-apps Connect Virtual Network Vnet Isolated Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/connect-virtual-network-vnet-isolated-environment.md
Last updated 08/20/2022
# Connect to Azure virtual networks from Azure Logic Apps using an integration service environment (ISE) + For scenarios where Consumption logic app resources and integration accounts need access to an [Azure virtual network](../virtual-network/virtual-networks-overview.md), create an [*integration service environment* (ISE)](connect-virtual-network-vnet-isolated-environment-overview.md). An ISE is an environment that uses dedicated storage and other resources that are kept separate from the "global" multi-tenant Azure Logic Apps. This separation also reduces any impact that other Azure tenants might have on your apps' performance. An ISE also provides you with your own static IP addresses. These IP addresses are separate from the static IP addresses that are shared by the logic apps in the public, multi-tenant service. When you create an ISE, Azure *injects* that ISE into your Azure virtual network, which then deploys Azure Logic Apps into your virtual network. When you create a logic app or integration account, select your ISE as their location. Your logic app or integration account can then directly access resources, such as virtual machines (VMs), servers, systems, and services, in your virtual network.
logic-apps Connect Virtual Network Vnet Set Up Single Ip Address https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/connect-virtual-network-vnet-set-up-single-ip-address.md
ms.suite: integration Previously updated : 05/06/2020 Last updated : 08/20/2022 # Set up a single IP address for one or more integration service environments in Azure Logic Apps + When you work with Azure Logic Apps, you can set up an [*integration service environment* (ISE)](../logic-apps/connect-virtual-network-vnet-isolated-environment-overview.md) for hosting logic apps that need access to resources in an [Azure virtual network](../virtual-network/virtual-networks-overview.md). When you have multiple ISE instances that need access to other endpoints that have IP restrictions, deploy an [Azure Firewall](../firewall/overview.md) or a [network virtual appliance](../virtual-network/virtual-networks-overview.md#filter-network-traffic) into your virtual network and route outbound traffic through that firewall or network virtual appliance. You can then have all the ISE instances in your virtual network use a single, public, static, and predictable IP address to communicate with the destination systems that you want. That way, you don't have to set up additional firewall openings at your destination systems for each ISE. This topic shows how to route outbound traffic through an Azure Firewall, but you can apply similar concepts to a network virtual appliance such as a third-party firewall from the Azure Marketplace. While this topic focuses on setup for multiple ISE instances, you can also use this approach for a single ISE when your scenario requires limiting the number of IP addresses that need access. Consider whether the additional costs for the firewall or virtual network appliance make sense for your scenario. Learn more about [Azure Firewall pricing](https://azure.microsoft.com/pricing/details/azure-firewall/).
logic-apps Create Integration Service Environment Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-integration-service-environment-rest-api.md
ms.suite: integration Previously updated : 02/03/2021 Last updated : 08/20/2022 # Create an integration service environment (ISE) by using the Logic Apps REST API + For scenarios where your logic apps and integration accounts need access to an [Azure virtual network](../virtual-network/virtual-networks-overview.md), you can create an [*integration service environment* (ISE)](../logic-apps/connect-virtual-network-vnet-isolated-environment-overview.md) by using the Logic Apps REST API. To learn more about ISEs, see [Access to Azure Virtual Network resources from Azure Logic Apps](connect-virtual-network-vnet-isolated-environment-overview.md). This article shows you how to create an ISE by using the Logic Apps REST API in general. Optionally, you can also enable a [system-assigned or user-assigned managed identity](../active-directory/managed-identities-azure-resources/overview.md#managed-identity-types) on your ISE, but only by using the Logic Apps REST API at this time. This identity lets your ISE authenticate access to secured resources, such as virtual machines and other systems or services, that are in or connected to an Azure virtual network. That way, you don't have to sign in with your credentials.
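As a rough illustration of the call's shape, here's a hedged Python sketch that sends the PUT request with the `requests` package. The resource names, region, subnet IDs, and bearer token are placeholders, and the request body mirrors the format that this article describes:

```python
# A minimal sketch: create an ISE through Azure Resource Manager.
import requests

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
ise_name = "<ise-name>"

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Logic"
    f"/integrationServiceEnvironments/{ise_name}?api-version=2019-05-01"
)

body = {
    "location": "<region>",
    "sku": {"name": "Premium", "capacity": 1},
    # Optional: enable a system-assigned managed identity on the ISE.
    "identity": {"type": "SystemAssigned"},
    "properties": {
        "networkConfiguration": {
            "accessEndpoint": {"type": "Internal"},  # or "External"
            "subnets": [
                {"id": "<subnet-1-resource-id>"},
                {"id": "<subnet-2-resource-id>"},
                {"id": "<subnet-3-resource-id>"},
                {"id": "<subnet-4-resource-id>"},
            ],
        }
    },
}

response = requests.put(url, json=body, headers={"Authorization": "Bearer <access-token>"})
response.raise_for_status()
print(response.json()["properties"]["provisioningState"])
```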
logic-apps Create Managed Service Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-managed-service-identity.md
ms.suite: integration Previously updated : 07/30/2022 Last updated : 08/22/2022 # Authenticate access to Azure resources with managed identities in Azure Logic Apps + In logic app workflows, some triggers and actions support using a [managed identity](../active-directory/managed-identities-azure-resources/overview.md) to authenticate access to resources protected by Azure Active Directory (Azure AD). This identity was previously known as a *Managed Service Identity (MSI)*. When you enable your logic app resource to use a managed identity for authentication, you don't have to provide credentials, secrets, or Azure AD tokens. Azure manages this identity and helps keep authentication information secure because you don't have to manage this sensitive information. Azure Logic Apps supports the [*system-assigned* managed identity](../active-directory/managed-identities-azure-resources/overview.md) and the [*user-assigned* managed identity](../active-directory/managed-identities-azure-resources/overview.md), but the following differences exist between these identity types:
logic-apps Create Parameters Workflows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-parameters-workflows.md
Last updated 08/20/2022
# Create cross-environment parameters for workflow inputs in Azure Logic Apps + In Azure Logic Apps, you can abstract values that might change in workflows across development, test, and production environments by defining *parameters*. When you use parameters rather than environment-specific variables, you can initially focus more on designing your workflows, and insert your environment-specific variables later. This article introduces how to create, use, and edit parameters for multi-tenant Consumption logic app workflows and for single-tenant Standard logic app workflows. You'll also learn how to manage environment variables.
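For orientation, here's a minimal sketch of the idea, written as a Python dict purely for illustration; in a deployed Consumption workflow, this structure is plain JSON, and the parameter name and URL below are invented:

```python
# Sketch: declare a parameter, then reference it with the parameters() function.
workflow_definition = {
    "parameters": {
        "ApiEndpoint": {
            "type": "String",
            "defaultValue": "https://dev.example.com/api",  # per-environment value
        }
    },
    "triggers": {},
    "actions": {
        "Call_API": {
            "type": "Http",
            "inputs": {
                "method": "GET",
                # Resolved at run time from the parameter definition above.
                "uri": "@parameters('ApiEndpoint')",
            },
        }
    },
}
```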
logic-apps Create Single Tenant Workflows Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-single-tenant-workflows-azure-portal.md
# Create an integration workflow with single-tenant Azure Logic Apps (Standard) in the Azure portal + This article shows how to create an example automated integration workflow that runs in the *single-tenant* Azure Logic Apps environment by using the **Logic App (Standard)** resource type and the Azure portal. This resource type can host multiple [stateful and stateless workflows](single-tenant-overview-compare.md#stateful-stateless). Also, workflows in the same logic app and tenant run in the same process as the redesigned Azure Logic Apps runtime, so they share the same resources and provide better performance. For more information about the single-tenant Azure Logic Apps offering, review [Single-tenant versus multi-tenant and integration service environment](single-tenant-overview-compare.md). While this example workflow is cloud-based and has only two steps, you can create workflows from hundreds of operations that can connect a wide range of apps, data, services, and systems across cloud, on premises, and hybrid environments. The example workflow starts with the built-in Request trigger and follows with an Office 365 Outlook action. The trigger creates a callable endpoint for the workflow and waits for an inbound HTTPS request from any caller. When the trigger receives a request and fires, the next action runs by sending email to the specified email address along with selected outputs from the trigger.
logic-apps Create Single Tenant Workflows Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/create-single-tenant-workflows-visual-studio-code.md
# Create an integration workflow with single-tenant Azure Logic Apps (Standard) in Visual Studio Code + This article shows how to create an example automated integration workflow that runs in the *single-tenant* Azure Logic Apps environment by using Visual Studio Code with the **Azure Logic Apps (Standard)** extension. The logic app that you create with this extension is based on the **Logic App (Standard)** resource type, which provides the following capabilities: * You can locally run and test logic app workflows in the Visual Studio Code development environment.
logic-apps Custom Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/custom-connector-overview.md
ms.suite: integration Previously updated : 06/10/2022 Last updated : 08/22/2022 # As a developer, I want to learn about the capability to create custom connectors with operations that I can use in my Azure Logic Apps workflows. # Custom connectors in Azure Logic Apps + Without writing any code, you can quickly create automated integration workflows when you use the prebuilt connector operations in Azure Logic Apps. A connector helps your workflows connect and access data, events, and actions across other apps, services, systems, protocols, and platforms. Each connector offers operations as triggers, actions, or both that you can add to your workflows. By using these operations, you expand the capabilities for your cloud apps and on-premises apps to work with new and existing data. Connectors in Azure Logic Apps are either *built in* or *managed*. A *built-in* connector runs natively on the Azure Logic Apps runtime, which means it's hosted in the same process as the runtime and provides higher throughput, lower latency, and local connectivity. A *managed connector* is a proxy or a wrapper around an API, such as Office 365 or Salesforce, that helps the underlying service talk to Azure Logic Apps. Managed connectors are powered by the connector infrastructure in Azure and are deployed, hosted, run, and managed by Microsoft. You can choose from [hundreds of managed connectors](/connectors/connector-reference/connector-reference-logicapps-connectors) to use with your workflows in Azure Logic Apps.
logic-apps Customer Managed Keys Integration Service Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/customer-managed-keys-integration-service-environment.md
ms.suite: integration Previously updated : 01/20/2021 Last updated : 08/20/2022 # Set up customer-managed keys to encrypt data at rest for integration service environments (ISEs) in Azure Logic Apps
logic-apps Logic Apps Author Definitions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-author-definitions.md
ms.suite: integration Previously updated : 01/01/2018 Last updated : 08/21/2022 # Create, edit, or extend JSON for logic app workflow definitions in Azure Logic Apps
When you create enterprise integration solutions with automated workflows in [Azure Logic Apps](../logic-apps/logic-apps-overview.md),
-the underlying logic app definitions use simple
+the underlying workflow definitions use simple
and declarative JavaScript Object Notation (JSON) along with the [Workflow Definition Language (WDL) schema](../logic-apps/logic-apps-workflow-definition-language.md) for their description and validation. These formats
-make logic app definitions easier to read and
+make workflow definitions easier to read and
understand without knowing much about code.
-When you want to automate creating and deploying logic apps,
-you can include logic app definitions as
+When you want to automate creating and deploying logic app resources,
+you can include workflow definitions as
[Azure resources](../azure-resource-manager/management/overview.md) inside [Azure Resource Manager templates](../azure-resource-manager/templates/overview.md). To create, manage, and deploy logic apps, you can then use
[Azure CLI](../azure-resource-manager/templates/deploy-cli.md), or the [Azure Logic Apps REST APIs](/rest/api/logic/).
-To work with logic app definitions in JSON,
+To work with workflow definitions in JSON,
open the Code View editor when working in the Azure portal or in Visual Studio, or copy the definition into any editor that you want.
-If you're new to logic apps, review
-[how to create your first logic app](../logic-apps/quickstart-create-first-logic-app-workflow.md).
+If you're new to Azure Logic Apps, review
+[how to create your first logic app workflow](../logic-apps/quickstart-create-first-logic-app-workflow.md).
> [!NOTE] > Some Azure Logic Apps capabilities, such as defining
-> parameters and multiple triggers in logic app definitions,
-> are available only in JSON, not the Logic Apps Designer.
+> parameters and multiple triggers in workflow definitions,
+> are available only in JSON, not the workflow designer.
> So for these tasks, you must work in Code View or another editor. ## Edit JSON - Azure portal
and then from the results, select your logic app.
select **Logic App Code View**. The Code View editor opens and shows
- your logic app definition in JSON format.
+ your workflow definition in JSON format.
## Edit JSON - Visual Studio
-Before you can work on your logic app definition
+Before you can work on your workflow definition
in Visual Studio, make sure that you've [installed the required tools](../logic-apps/quickstart-create-logic-apps-with-visual-studio.md#prerequisites). To create a logic app with Visual Studio, review
or as Azure Resource Manager projects from Visual Studio.
or [Azure Resource Group](../azure-resource-manager/management/overview.md) project, that contains your logic app.
-2. Find and open your logic app's definition,
+2. Find and open your workflow definition,
which, by default, appears in a [Resource Manager template](../azure-resource-manager/templates/overview.md) named **LogicApp.json**.
You can use and customize this template for
deployment to different environments. 3. Open the shortcut menu for your
-logic app definition and template.
+workflow definition and template.
Select **Open With Logic App Designer**. ![Open logic app in a Visual Studio solution](./media/logic-apps-author-definitions/open-logic-app-designer.png)
> [!TIP] > If you don't have this command in Visual Studio 2019, check that you have the latest updates for Visual Studio.
-4. At the bottom of the designer, choose **Code View**.
+4. At the bottom of the workflow designer, choose **Code View**.
The Code View editor opens and shows
- your logic app definition in JSON format.
+ your workflow definition in JSON format.
5. To return to designer view, at the bottom of the Code View editor,
Follow these general steps to *parameterize*, or define and use parameters for,
## Process strings with functions
-Logic Apps has various functions for working with strings.
+Azure Logic Apps has various functions for working with strings.
For example, suppose you want to pass a company name from an order to another system. However, you're not sure about proper handling for character encoding. You could perform base64 encoding on this string, but to avoid escapes in the URL, you can replace several characters instead. Also, you only need a substring for
-the company name because the first five characters are not used.
+the company name because the first five characters aren't used.
``` json {
logic-apps Logic Apps Batch Process Send Receive Messages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-batch-process-send-receive-messages.md
Title: Batch process messages as a group
description: Send and receive messages in groups between your workflows by using batch processing in Azure Logic Apps. ms.suite: integration-- Previously updated : 08/20/2022 Last updated : 08/21/2022 # Send, receive, and batch process messages in Azure Logic Apps
logic-apps Logic Apps Create Api App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-create-api-app.md
Title: Create web APIs & REST APIs for Azure Logic Apps
description: Create web APIs & REST APIs to call your APIs, services, or systems for system integrations in Azure Logic Apps ms.suite: integration-+ Previously updated : 05/26/2017 Last updated : 08/21/2022 # Create custom APIs you can call from Azure Logic Apps
Although Azure Logic Apps offers [hundreds of connectors](../connectors/apis-list.md) that you can use in logic app workflows, you might want to call APIs, systems, and services that aren't available as connectors.
-You can create your own APIs that provide actions and triggers to use in logic apps.
+You can create your own APIs that provide actions and triggers to use in workflows.
Here are other reasons why you might want to create your own APIs
-that you can call from logic app workflows:
+that you can call from workflows:
* Extend your current system integration and data integration workflows. * Help customers use your service to manage professional or personal tasks.
that you can call from logic app workflows:
Basically, connectors are web APIs that use REST for pluggable interfaces, [Swagger metadata format](https://swagger.io/specification/) for documentation, and JSON as their data exchange format. Because connectors are REST APIs
-that communicate through HTTP endpoints, you can use any language,
-like .NET, Java, Python, or Node.js, for building connectors.
+that communicate through HTTP endpoints, you can use any language to build connectors,
+such as .NET, Java, Python, or Node.js.
You can also host your APIs on [Azure App Service](../app-service/overview.md), a platform-as-a-service (PaaS) offering that provides one of the best, easiest, and most scalable ways for API hosting.
-For custom APIs to work with logic apps, your API can provide
+For custom APIs to work with logic app workflows, your API can provide
[*actions*](./logic-apps-overview.md#logic-app-concepts)
-that perform specific tasks in logic app workflows. Your API can also act as a
+that perform specific tasks in workflows. Your API can also act as a
[*trigger*](./logic-apps-overview.md#logic-app-concepts)
-that starts a logic app workflow when new data or an event meets a specified condition.
+that starts a workflow run when new data or an event meets a specified condition.
This topic describes common patterns that you can follow for building actions and triggers in your API, based on the behavior that you want your API to provide.
easy API hosting.
> consider deploying your APIs as API apps, > which can make your job easier when you build, host, and consume APIs > in the cloud and on premises. You don't have to change any code in your
-> APIs -- just deploy your code to an API app. For example, learn how to
+> APIs--just deploy your code to an API app. For example, learn how to
> build API apps created with these languages: > > * [ASP.NET](../app-service/quickstart-dotnetcore.md).
like custom APIs but also have these attributes:
* Registered as Logic Apps Connector resources in Azure. * Appear with icons alongside Microsoft-managed connectors in the Logic Apps Designer.
-* Available only to the connectors' authors and logic app users who have the same
+* Available only to the connectors' authors and logic app resource users who have the same
Azure Active Directory tenant and Azure subscription in the region where the logic apps are deployed.
You can also nominate registered connectors for Microsoft certification.
This process verifies that registered connectors meet the criteria for public use and makes those connectors available for users in Power Automate and Microsoft Power Apps.
-For more information about custom connectors, see
+For more information, review the following documentation:
* [Custom connectors overview](../logic-apps/custom-connector-overview.md) * [Create custom connectors from Web APIs](/connectors/custom-connectors/create-web-api-connector)
For logic apps to perform tasks, your custom API should provide
[*actions*](./logic-apps-overview.md#logic-app-concepts). Each operation in your API maps to an action. A basic action is a controller that accepts HTTP requests and returns HTTP responses.
-So for example, a logic app sends an HTTP request to your web app or API app.
-Your app then returns an HTTP response, along with content that the logic app can process.
+So for example, a workflow sends an HTTP request to your web app or API app.
+Your app then returns an HTTP response, along with content that the workflow can process.
For a standard action, you can write an HTTP request method in your API and describe that method in a Swagger file. You can then call your API directly
By default, responses must be returned within the [request timeout limit](./logi
![Standard action pattern](./media/logic-apps-create-api-app/standard-action.png) <a name="pattern-overview"></a>
-To make a logic app wait while your API finishes longer-running tasks,
+To make a workflow wait while your API finishes longer-running tasks,
your API can follow the [asynchronous polling pattern](#async-pattern) or the [asynchronous webhook pattern](#webhook-actions) described in this topic. For an analogy that helps you visualize these patterns' different behaviors,
To have your API perform tasks that could run longer than the
[request timeout limit](./logic-apps-limits-and-config.md), you can use the asynchronous polling pattern. This pattern has your API do work in a separate thread,
-but keep an active connection to the Logic Apps engine.
-That way, the logic app does not time out or continue with
+but keep an active connection to the Azure Logic Apps engine.
+That way, the workflow doesn't time out or continue with
the next step in the workflow before your API finishes working. Here's the general pattern: 1. Make sure that the engine knows that your API accepted the request and started working.
-2. When the engine makes subsequent requests for job status, let the engine know when your API finishes the task.
-3. Return relevant data to the engine so that the logic app workflow can continue.
+1. When the engine makes subsequent requests for job status, let the engine know when your API finishes the task.
+1. Return relevant data to the engine so that the workflow can continue.
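To make these steps concrete, here's a hedged sketch of such an API in Python with Flask. The route names, the in-memory job store, and the 10-second retry interval are illustrative assumptions, not requirements:

```python
# Minimal sketch of the asynchronous polling pattern (illustration only).
import uuid

from flask import Flask, jsonify, url_for

app = Flask(__name__)
jobs = {}  # job_id -> result; None means the job is still running


@app.route("/api/orders", methods=["POST"])
def start_work():
    job_id = str(uuid.uuid4())
    jobs[job_id] = None  # a background worker would store the result when done
    response = jsonify({"jobId": job_id})
    response.status_code = 202  # the request was accepted and work started
    # Required: absolute URL that the engine polls for job status.
    response.headers["location"] = url_for("job_status", job_id=job_id, _external=True)
    # Optional: ask the engine to poll every 10 seconds instead of the default 20.
    response.headers["retry-after"] = "10"
    return response


@app.route("/api/orders/<job_id>", methods=["GET"])
def job_status(job_id):
    result = jobs.get(job_id)
    if result is None:
        # Still working: return 202 again with the same headers.
        response = jsonify({"status": "running"})
        response.status_code = 202
        response.headers["location"] = url_for("job_status", job_id=job_id, _external=True)
        response.headers["retry-after"] = "10"
        return response
    # Done: a non-202 response ends the polling, and the payload
    # becomes input for the next step in the workflow.
    return jsonify({"status": "done", "output": result})
```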
<a name="bakery-polling-action"></a> Now apply the previous bakery analogy to the polling pattern,
This back-and-forth process continues until you call,
and the bakery tells you that your order is ready and delivers your cake. So let's map this polling pattern back. The bakery represents your custom API,
-while you, the cake customer, represent the Logic Apps engine.
+while you, the cake customer, represent the Azure Logic Apps engine.
When the engine calls your API with a request, your API confirms the request and responds with the time interval when the engine can check job status. The engine continues checking job status until your API responds
described from the API's perspective:
1. When your API gets an HTTP request to start work, immediately return an HTTP `202 ACCEPTED` response with the `location` header described later in this step.
-This response lets the Logic Apps engine know that your API got the request,
+This response lets the Azure Logic Apps engine know that your API got the request,
accepted the request payload (data input), and is now processing. The `202 ACCEPTED` response should include these headers: * *Required*: A `location` header that specifies the absolute path
- to a URL where the Logic Apps engine can check your API's job status
+ to a URL where the Azure Logic Apps engine can check your API's job status
* *Optional*: A `retry-after` header that specifies the number of seconds that the engine should wait before checking the `location` URL for job status.
accepted the request payload (data input), and is now processing.
By default, the engine checks every 20 seconds. To specify a different interval, include the `retry-after` header and the number of seconds until the next poll.
-2. After the specified time passes, the Logic Apps engine polls
+2. After the specified time passes, the Azure Logic Apps engine polls
the `location` URL to check job status. Your API should perform these checks and return these responses:
checks and return these responses:
but with the same headers as the original response. When your API follows this pattern, you don't have to do anything in the
-logic app workflow definition to continue checking job status.
+workflow definition to continue checking job status.
When the engine gets an HTTP `202 ACCEPTED` response and a
-valid `location` header, the engine respects the asynchronous pattern
+valid `location` header, the engine respects the asynchronous pattern,
and checks the `location` header until your API returns a non-202 response. > [!TIP]
and checks the `location` header until your API returns a non-202 response.
As an alternative, you can use the webhook pattern for long-running tasks and asynchronous processing.
-This pattern has the logic app pause and wait for a "callback"
+This pattern pauses the workflow and waits for a "callback"
from your API to finish processing before continuing the workflow. This callback is an HTTP POST that sends a message to a URL when an event happens.
you give them your phone number so they can call you when the cake is done.
This time, the bakery tells you when your order is ready and delivers your cake. When we map this webhook pattern back, the bakery represents your custom API,
-while you, the cake customer, represent the Logic Apps engine.
+while you, the cake customer, represent the Azure Logic Apps engine.
The engine calls your API with a request and includes a "callback" URL. When the job is done, your API uses the URL to notify the engine and return data to your logic app, which then continues the workflow.
For this pattern, set up two endpoints on your controller: `subscribe` and `unsubscribe` * `subscribe` endpoint: When execution reaches your API's action in the workflow,
-the Logic Apps engine calls the `subscribe` endpoint. This step causes the
-logic app to create a callback URL that your API stores and then wait for the
+the Azure Logic Apps engine calls the `subscribe` endpoint. This step causes the
+workflow to create a callback URL that your API stores and then wait for the
callback from your API when work is complete. Your API then calls back with an HTTP POST to the URL and passes any returned content and headers as input to the logic app.
-* `unsubscribe` endpoint: If the logic app run is canceled, the Logic Apps engine calls the `unsubscribe` endpoint. Your API can then unregister the callback URL and stop any processes as necessary.
+* `unsubscribe` endpoint: If the workflow run is canceled, the Azure Logic Apps engine calls the `unsubscribe` endpoint. Your API can then unregister the callback URL and stop any processes as necessary.
![Webhook action pattern](./media/logic-apps-create-api-app/custom-api-webhook-action-pattern.png)
-Currently, the Logic App Designer doesn't support discovering webhook endpoints through Swagger. So for this pattern, you have to add a [**Webhook** action](../connectors/connectors-native-webhook.md) and specify the URL, headers, and body for your request. See also [Workflow actions and triggers](logic-apps-workflow-actions-triggers.md#apiconnection-webhook-action). For an example webhook pattern, review this [webhook trigger sample in GitHub](https://github.com/logicappsio/LogicAppTriggersExample/blob/master/LogicAppTriggers/Controllers/WebhookTriggerController.cs).
+Currently, the workflow designer doesn't support discovering webhook endpoints through Swagger. So for this pattern, you have to add a [**Webhook** action](../connectors/connectors-native-webhook.md) and specify the URL, headers, and body for your request. See also [Workflow actions and triggers](logic-apps-workflow-actions-triggers.md#apiconnection-webhook-action). For an example webhook pattern, review this [webhook trigger sample in GitHub](https://github.com/logicappsio/LogicAppTriggersExample/blob/master/LogicAppTriggers/Controllers/WebhookTriggerController.cs).
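For illustration, here's a hedged Python (Flask) sketch of the two endpoints. The JSON body shape and the in-memory callback store are assumptions for the sketch; the workflow's webhook action would pass `@listCallbackUrl()` in the subscribe request body:

```python
# Minimal sketch of the webhook action pattern (illustration only).
import requests
from flask import Flask, request

app = Flask(__name__)
callbacks = {}  # subscription ID -> callback URL


@app.route("/api/subscribe", methods=["POST"])
def subscribe():
    # Assumes the subscribe body carries the callback URL as JSON,
    # for example {"callbackUrl": "<value of @listCallbackUrl()>"}.
    callbacks["current"] = request.get_json()["callbackUrl"]
    # Start the long-running work here, then return quickly.
    return "", 201


@app.route("/api/unsubscribe", methods=["POST"])
def unsubscribe():
    # Called if the workflow run is canceled: forget the URL and stop the work.
    callbacks.pop("current", None)
    return "", 200


def complete_work(result: dict):
    # When the long-running job finishes, POST to the stored callback URL.
    # The body and headers pass as input to the waiting workflow.
    url = callbacks.pop("current", None)
    if url:
        requests.post(url, json=result)
```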
Here are some other tips and notes: * To pass in the callback URL, you can use the `@listCallbackUrl()` workflow function in any of the previous fields as necessary.
-* If you own both the logic app and the subscribed service, you don't have to call the `unsubscribe` endpoint after the callback URL is called. Otherwise, the Logic Apps runtime needs to call the `unsubscribe` endpoint to signal that no more calls are expected and to allow for resource clean up on the server side.
+* If you own both the logic app resource and the subscribed service, you don't have to call the `unsubscribe` endpoint after the callback URL is called. Otherwise, the Azure Logic Apps runtime needs to call the `unsubscribe` endpoint to signal that no more calls are expected and to allow resource cleanup on the server side.
<a name="triggers"></a> ## Trigger patterns Your custom API can act as a [*trigger*](./logic-apps-overview.md#logic-app-concepts)
-that starts a logic app when new data or an event meets a specified condition.
+that starts a workflow run when new data or an event meets a specified condition.
This trigger can either check regularly, or wait and listen, for new data or events at your service endpoint. If new data or an event meets the specified condition,
Also, learn more about [usage metering for triggers](logic-apps-pricing.md).
### Check for new data or events regularly with the polling trigger pattern A *polling trigger* acts much like the [polling action](#async-pattern)
-previously described in this topic. The Logic Apps engine periodically
+previously described in this topic. The Azure Logic Apps engine periodically
calls and checks the trigger endpoint for new data or events. If the engine finds new data or an event that meets your specified condition,
-the trigger fires. Then, the engine creates a logic app instance that processes the data as input.
+the trigger fires. Then, the engine creates a workflow instance that processes the data as input.
![Polling trigger pattern](./media/logic-apps-create-api-app/custom-api-polling-trigger-pattern.png) > [!NOTE]
-> Each polling request counts as an action execution, even when no logic app instance is created.
+> Each polling request counts as an action execution, even when no workflow instance is created.
> To prevent processing the same data multiple times, > your trigger should clean up data that was already read and passed to the logic app.
Here are specific steps for a polling trigger, described from the API's perspect
| Found new data or event? | API response | | - | |
-| Found | Return an HTTP `200 OK` status with the response payload (input for next step). <br/>This response creates a logic app instance and starts the workflow. |
-| Not found | Return an HTTP `202 ACCEPTED` status with a `location` header and a `retry-after` header. <br/>For triggers, the `location` header should also contain a `triggerState` query parameter, which is usually a "timestamp." Your API can use this identifier to track the last time that the logic app was triggered. |
+| Found | Return an HTTP `200 OK` status with the response payload (input for next step). <br/>This response creates a workflow instance and starts the workflow. |
+| Not found | Return an HTTP `202 ACCEPTED` status with a `location` header and a `retry-after` header. <br/>For triggers, the `location` header should also contain a `triggerState` query parameter, which is usually a "timestamp." Your API can use this identifier to track the last time that the workflow was triggered. |
||| For example, to periodically check your service for new files,
you might build a polling trigger that has these behaviors:
A webhook trigger is a *push trigger* that waits and listens for new data or events at your service endpoint. If new data or an event meets the specified condition,
-the trigger fires and creates a logic app instance, which then processes the data as input.
+the trigger fires and creates a workflow instance, which then processes the data as input.
Webhook triggers act much like the [webhook actions](#webhook-actions) previously described in this topic, and are set up with `subscribe` and `unsubscribe` endpoints. * `subscribe` endpoint: When you add and save a webhook trigger in your logic app,
-the Logic Apps engine calls the `subscribe` endpoint. This step causes
-the logic app to create a callback URL that your API stores.
+the Azure Logic Apps engine calls the `subscribe` endpoint. This step causes
+the workflow to create a callback URL that your API stores.
When there's new data or an event that meets the specified condition, your API calls back with an HTTP POST to the URL. The content payload and headers pass as input to the logic app.
-* `unsubscribe` endpoint: If the webhook trigger or entire logic app is deleted, the Logic Apps engine calls the `unsubscribe` endpoint.
+* `unsubscribe` endpoint: If the webhook trigger or entire logic app resource is deleted, the Azure Logic Apps engine calls the `unsubscribe` endpoint.
Your API can then unregister the callback URL and stop any processes as necessary. ![Webhook trigger pattern](./media/logic-apps-create-api-app/custom-api-webhook-trigger-pattern.png)
-Currently, the Logic App Designer doesn't support discovering webhook endpoints through Swagger. So for this pattern, you have to add a [**Webhook** trigger](../connectors/connectors-native-webhook.md) and specify the URL, headers, and body for your request. See also [HTTPWebhook trigger](logic-apps-workflow-actions-triggers.md#httpwebhook-trigger). For an example webhook pattern, review this [webhook trigger controller sample in GitHub](https://github.com/logicappsio/LogicAppTriggersExample/blob/master/LogicAppTriggers/Controllers/WebhookTriggerController.cs).
+Currently, the workflow designer doesn't support discovering webhook endpoints through Swagger. So for this pattern, you have to add a [**Webhook** trigger](../connectors/connectors-native-webhook.md) and specify the URL, headers, and body for your request. See also [HTTPWebhook trigger](logic-apps-workflow-actions-triggers.md#httpwebhook-trigger). For an example webhook pattern, review this [webhook trigger controller sample in GitHub](https://github.com/logicappsio/LogicAppTriggersExample/blob/master/LogicAppTriggers/Controllers/WebhookTriggerController.cs).
Here are some other tips and notes:
Here are some other tips and notes:
* To prevent processing the same data multiple times, your trigger should clean up data that was already read and passed to the logic app.
-* If you own both the logic app and the subscribed service, you don't have to call the `unsubscribe` endpoint after the callback URL is called. Otherwise, the Logic Apps runtime needs to call the `unsubscribe` endpoint to signal that no more calls are expected and to allow for resource clean up on the server side.
+* If you own both the logic app resource and the subscribed service, you don't have to call the `unsubscribe` endpoint after the callback URL is called. Otherwise, the Azure Logic Apps runtime needs to call the `unsubscribe` endpoint to signal that no more calls are expected and to allow resource cleanup on the server side.
## Improve security for calls to your APIs from logic apps
Learn [how to deploy and call custom APIs from logic apps](../logic-apps/logic-a
## Publish custom APIs to Azure
-To make your custom APIs available for other Logic Apps users in Azure,
-you must add security and register them as Logic App connectors.
+To make your custom APIs available for other Azure Logic Apps users,
+you must add security and register them as Azure Logic Apps connectors.
For more information, see [Custom connectors overview](../logic-apps/custom-connector-overview.md). To make your custom APIs available to all users in Logic Apps, Power Automate, and Microsoft Power Apps, you must add security,
-register your APIs as Logic App connectors, and nominate your connectors for the
+register your APIs as Azure Logic Apps connectors, and nominate your connectors for the
[Microsoft Azure Certified program](https://azure.microsoft.com/marketplace/programs/certified/logic-apps/). ## Get support
register your APIs as Logic App connectors, and nominate your connectors for the
* For questions, visit the [Microsoft Q&A question page for Azure Logic Apps](/answers/topics/azure-logic-apps.html).
-* To help improve Logic Apps, vote on or submit ideas at the
- [Logic Apps user feedback site](https://aka.ms/logicapps-wish).
- ## Next steps * [Handle errors and exceptions](../logic-apps/logic-apps-exception-handling.md)
logic-apps Logic Apps Diagnosing Failures https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-diagnosing-failures.md
ms.suite: integration Previously updated : 05/24/2022 Last updated : 08/20/2022 # Troubleshoot and diagnose workflow failures in Azure Logic Apps
logic-apps Logic Apps Gateway Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-gateway-install.md
ms.suite: integration Previously updated : 03/16/2021 Last updated : 08/20/2022 #Customer intent: As a software developer, I want to install and set up the on-premises data gateway so that I can create logic app workflows that can access data in on-premises systems.
logic-apps Logic Apps Using Sap Connector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-using-sap-connector.md
Previously updated : 08/16/2022 Last updated : 08/22/2022 tags: connectors
If you're receiving this error message and experience intermittent failures call
1. Check SAP settings in your on-premises data gateway service configuration file, `Microsoft.PowerBI.EnterpriseGateway.exe.config`.
- The retry count setting looks like `WebhookRetryMaximumCount="2"`. The retry interval setting looks like `WebhookRetryDefaultDelay="00:00:00.10"` and the timespan format is `HH:mm:ss.ff`.
+ 1. Under the `configuration` root node, add a `configSections` element, if none exists.
+ 1. Under the `configSections` node, add a `section` element with the following attributes, if none exist: `name="SapAdapterSection" type="Microsoft.Adapters.SAP.Common.SapAdapterSection, Microsoft.Adapters.SAP.Common"`
+
+ > [!IMPORTANT]
+ > Don't change the attributes in existing `section` elements, if such elements already exist.
+
+ Your `configSections` element looks like the following version, if no other section or section group is declared in the gateway service configuration:
+
+ ```xml
+ <configSections>
+ <section name="SapAdapterSection" type="Microsoft.Adapters.SAP.Common.SapAdapterSection, Microsoft.Adapters.SAP.Common"/>
+ </configSections>
+ ```
+
+ 1. Under the `configuration` root node, add an `SapAdapterSection` element, if none exists.
+ 1. Under the `SapAdapterSection` node, add a `Broker` element with the following attributes, if none exist: `WebhookRetryDefaultDelay="00:00:00.10" WebhookRetryMaximumCount="2"`
+
+ > [!IMPORTANT]
+ > Change the attributes for the `Broker` element, even if the element already exists.
+
+ The `SapAdapterSection` element looks like the following version, if no other element or attribute is declared in the SAP adapter configuration:
+
+ ```xml
+ <SapAdapterSection>
+ <Broker WebhookRetryDefaultDelay="00:00:00.10" WebhookRetryMaximumCount="2" />
+ </SapAdapterSection>
+ ```
+
+ The retry count setting looks like `WebhookRetryMaximumCount="2"`. The retry interval setting looks like `WebhookRetryDefaultDelay="00:00:00.10"` where the timespan format is `HH:mm:ss.ff`.
+
+ > [!NOTE]
+ > For more information about the configuration file,
+ > review [Configuration file schema for .NET Framework](/dotnet/framework/configure-apps/file-schema/).
1. Save your changes. Restart your on-premises data gateway.
logic-apps Support Non Unicode Character Encoding https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/support-non-unicode-character-encoding.md
Title: Convert non-Unicode encoded text for compatibility description: Handle non-Unicode characters in Azure Logic Apps by converting text payloads to UTF-8 with base64 encoding and Azure Functions. Previously updated : 10/05/2021+ - Last updated : 08/20/2022
-# Support non-Unicode character encoding in Logic Apps
+
+# Support non-Unicode character encoding in Azure Logic Apps
When you work with text payloads, Azure Logic Apps assumes that the text is encoded in a Unicode format, such as UTF-8. You might have problems receiving, sending, or processing characters with different encodings in your workflow. For example, you might get corrupted characters in flat files when working with legacy systems that don't support Unicode.
If you set the `Content-Type` header to `application/octet-stream`, you also mig
## Base64 encode content
-Before you [base64 encode](workflow-definition-language-functions-reference.md#base64) content, make sure you've [converted the text to UTF-8](#convert-payload-encoding). If you base64 decode the content to a string before converting the text to UTF-8, characters might return corrupted.
+Before you [base64 encode](workflow-definition-language-functions-reference.md#base64) content to a string, make sure that you [converted the text to UTF-8](#convert-payload-encoding). Otherwise, characters might return corrupted.
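The same idea in a hedged Python sketch, where the windows-1252 source encoding is just an example of a legacy encoding:

```python
# Sketch: decode legacy bytes, normalize to UTF-8, then base64 encode.
import base64


def to_utf8_base64(payload: bytes, source_encoding: str = "windows-1252") -> str:
    text = payload.decode(source_encoding)  # interpret the legacy bytes as text
    utf8_bytes = text.encode("utf-8")       # re-encode the text as UTF-8
    return base64.b64encode(utf8_bytes).decode("ascii")


# Example: a name with accented characters survives the round trip.
legacy_payload = "Héloïse".encode("windows-1252")
print(to_utf8_base64(legacy_payload))
```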
Next, convert any .NET-supported encoding to another .NET-supported encoding. Review the [Azure Functions code example](#azure-functions-version) or the [.NET code example](#net-version):
Using these same concepts, you can also [send a non-Unicode payload from your wo
## Sample payload conversions
-In this example, the base64-encoded sample input string is a personal name, *H&eacute;lo&iuml;se*, that contains accented characters.
+In this example, the base64-encoded sample input string is a personal name that contains accented characters: *H&eacute;lo&iuml;se*
Example input:
machine-learning Concept Data Encryption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-data-encryption.md
To use the key when deploying a model to Azure Container Instance, create a new
For more information on creating and using a deployment configuration, see the following articles: * [AciWebservice.deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--primary-key-none--secondary-key-none--collect-model-data-none--cmk-vault-base-url-none--cmk-key-name-none--cmk-key-version-none-) reference
-* [Where and how to deploy](/azure/machine-learning/how-to-deploy-managed-online-endpoints)
+* [Where and how to deploy](how-to-deploy-managed-online-endpoints.md)
For more information on using a customer-managed key with ACI, see [Encrypt data with a customer-managed key](../container-instances/container-instances-encrypt-data.md#encrypt-data-with-a-customer-managed-key).
machine-learning Concept Sourcing Human Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-sourcing-human-data.md
# What is "human data" and why is it important to source responsibly? + Human data is data collected directly from, or about, people. Human data may include personal data such as names, age, images, or voice clips and sensitive data such as genetic data, biometric data, gender identity, religious beliefs, or political affiliations. Collecting this data can be important to building AI systems that work for all users. But certain practices should be avoided, especially ones that can cause physical and psychological harm to data contributors.
machine-learning How To Deploy Mlflow Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-deploy-mlflow-models.md
For no-code-deployment, Azure Machine Learning
There are three workflows for deploying MLflow models to Azure Machine Learning: - [Deploy using the MLflow plugin](#deploy-using-the-mlflow-plugin)-- [Deploy using Azure ML CLI (v2)](#deploy-using-azure-ml-cli-v2)
+- [Deploy using Azure ML CLI/SDK (v2)](#deploy-using-azure-ml-clisdk-v2)
- [Deploy using Azure Machine Learning studio](#deploy-using-azure-machine-learning-studio) Each workflow has different capabilities, particularly around which type of compute it can target. The following table shows these capabilities:
The MLflow plugin [azureml-mlflow](https://pypi.org/project/azureml-mlflow/) can
) ```
-## Deploy using Azure ML CLI (v2)
+## Deploy using Azure ML CLI/SDK (v2)
-You can use Azure ML CLI v2 to deploy models trained and logged with MLflow to [managed endpoints (Online/batch)](concept-endpoints.md). When you deploy your MLflow model using the Azure ML CLI v2, it's a no-code-deployment so you don't have to provide a scoring script or an environment, but you can if needed.
+You can use Azure ML CLI/SDK v2 to deploy models trained and logged with MLflow to [managed endpoints (Online/batch)](concept-endpoints.md). Deployments of MLflow models support no-code deployment, so you don't have to provide a scoring script or an environment, but you can if needed.
### Prerequisites
You can use Azure ML CLI v2 to deploy models trained and logged with MLflow to [
* You must have an MLflow model. If your model is not in MLflow format and you want to use this feature, you can [convert your custom ML model to MLflow format](how-to-convert-custom-model-to-mlflow.md). -
-In this code snippet used in this article, the `ENDPOINT_NAME` environment variable contains the name of the endpoint to create and use. To set this, use the following command from the CLI. Replace `<YOUR_ENDPOINT_NAME>` with the name of your endpoint:
-- ### Steps - This example shows how you can deploy an MLflow model to an online endpoint using CLI (v2). > [!IMPORTANT] > For MLflow no-code-deployment, **[testing via local endpoints](how-to-deploy-managed-online-endpoints.md#deploy-and-debug-locally-by-using-local-endpoints)** is currently not supported.
-1. Create a YAML configuration file for your endpoint. The following example configures the name and authentication mode of the endpoint:
+1. Connect to your Azure Machine Learning workspace.
- __create-endpoint.yaml__
+ # [Azure ML CLI (v2)](#tab/cli)
+
+ ```bash
+ az account set --subscription <subscription>
+ az configure --defaults workspace=<workspace> group=<resource-group> location=<location>
+ ```
+
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
+
+ The workspace is the top-level resource for Azure Machine Learning, providing a centralized place to work with all the artifacts you create when you use Azure Machine Learning. In this section, we'll connect to the workspace in which you'll perform deployment tasks.
+
+ 1. Import the required libraries:
+
+ ```python
+ from azure.ai.ml import MLClient
+ from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment, Model
+ from azure.ai.ml.constants import AssetTypes
+ from azure.identity import DefaultAzureCredential
+ ```
+
+ 2. Configure workspace details and get a handle to the workspace:
+
+ ```python
+ subscription_id = "<subscription>"
+ resource_group = "<resource-group>"
+ workspace = "<workspace>"
+
+ ml_client = MLClient(DefaultAzureCredential(), subscription_id, resource_group, workspace)
+ ```
+
- :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/create-endpoint.yaml":::
+1. The following example configures the name and authentication mode of the endpoint:
+
+ # [Azure ML CLI (v2)](#tab/cli)
+
+ Create a YAML configuration file for your endpoint:
+
+ __create-endpoint.yaml__
-1. To create a new endpoint using the YAML configuration, use the following command:
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/create-endpoint.yaml":::
+
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
+
+ Create an endpoint using the SDK:
+
+ ```python
+ endpoint = ManagedOnlineEndpoint(
+ name="my-endpoint",
+ description="this is a sample local endpoint",
+ auth_mode="key"
+ )
+ ```
- :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_endpoint":::
+1. Execute the endpoint creation. This operation will create the endpoint in the Azure Machine Learning workspace:
+
+ # [Azure ML CLI (v2)](#tab/cli)
+
+ To create a new endpoint using the YAML configuration, use the following command:
-1. Create a YAML configuration file for the deployment.
+ :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_endpoint":::
- # [From a training job](#tab/fromjob)
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
+
+ To create a new endpoint using the endpoint configuration just created, use the following command:
+
+ ```python
+ ml_client.online_endpoints.begin_create_or_update(endpoint)
+ ```
+
+1. Before going further, we need to register the model we want to deploy. Deployment of unregistered models is not supported in Azure Machine Learning.
- The following example configures a deployment `sklearn-diabetes` to the endpoint created in the previous step. The model is registered from a job previously run:
+ # [Azure ML CLI (v2)](#tab/cli)
- a. Get the job name of the training job. In this example we are assuming the job you want is the last one submitted to the platform.
+ We first need to register the model we want to deploy. Deployment of unregistered models is not supported in Azure Machine Learning.
+ #### From a training job
+
+ In this example, the model is registered from a job previously run. Assuming that your model was registered with an instruction similar to this one:
+
+ ```python
+ mlflow.sklearn.log_model(scikit_model, "model")
+ ```
+
+ To register the model from a previous run we would need the job name/run ID in question. For simplicity, let's assume that we are looking to register the model trained in the last run submitted to the workspace:
+ ```bash JOB_NAME=$(az ml job list --query "[0].name" | tr -d '"') ```+
+ Then, let's register the model in the registry.
+
+ ```bash
+ az ml model create --name "mir-sample-sklearn-mlflow-model" \
+ --type "mlflow_model" \
+ --path "azureml://jobs/$JOB_NAME/outputs/artifacts/model"
+ ```
+
+ #### From a local model
- b. Register the model in the registry.
+ If your model is located in the local file system or compute, then you can register it as follows:
```bash az ml model create --name "mir-sample-sklearn-mlflow-model" \ --type "mlflow_model" \
- --path "azureml://jobs/$JOB_NAME/outputs/artifacts/model"
+ --path "sklearn-diabetes/model"
```
+
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
- c. Create the deployment `YAML` file:
+ We first need to register the model we want to deploy. Deployment of unregistered models is not supported in Azure Machine Learning.
- __sklearn-deployment.yaml__
+ #### From a training job
+
+ In this example, the model is registered from a job previously run. Assuming that your model was registered with an instruction similar to this one:
+
+ ```python
+ mlflow.sklearn.log_model(scikit_model, "model")
+ ```
+
+ To register the model from a previous run we would need the job name/run ID in question. For simplicity, let's assume that we are looking to register the model trained in the last run submitted to the workspace:
+
+ ```python
+ job_name = next(iter(ml_client.jobs.list())).name
+ ```
+
+ Then, let's register the model in the registry.
+
+ ```python
+ model = Model(name="mir-sample-sklearn-mlflow-model",
+ path=f"azureml://jobs/{job_name}/outputs/artifacts/model",
+ type=AssetTypes.MLFLOW_MODEL)
+ ml_client.models.create_or_update(model)
+ ```
+
+ #### From a local model
+ If your model is located in the local file system or compute, then you can register it as follows:
+
+ ```python
+ model = Model(name="mir-sample-sklearn-mlflow-model",
+ path="sklearn-diabetes/model",
+ type=AssetTypes.MLFLOW_MODEL)
+ ml_client.models.create_or_update(model)
+ ```
+
+1. Once the endpoint is created, we need to create a deployment on it. Remember that endpoints can contain one or multiple deployments and traffic can be configured for each of them. In this example, we are going to create only one deployment to serve all the traffic, named `sklearn-deployment`.
+
+ # [Azure ML CLI (v2)](#tab/cli)
+
+ Create the deployment `YAML` file:
+
+ __sklearn-deployment.yaml__
+ ```yaml $schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json name: sklearn-deployment
This example shows how you can deploy an MLflow model to an online endpoint usin
instance_type: Standard_DS2_v2 instance_count: 1 ```+
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
+
+ ```python
+ blue_deployment = ManagedOnlineDeployment(
+ name="sklearn-deployment",
+ endpoint_name="my-endpoint",
+ model=model,
+ instance_type="Standard_F2s_v2",
+ instance_count=1,
+ )
+ ```
+
+
+1. Create the deployment and assign all the traffic to it.
- > [!IMPORTANT]
- > For MLflow no-code-deployment (NCD) to work, setting **`type`** to **`mlflow_model`** is required, `type: mlflow_modelΓÇï`. For more information, see [CLI (v2) model YAML schema](reference-yaml-model.md).
+ # [Azure ML CLI (v2)](#tab/cli)
- # [From a local model](#tab/fromlocal)
+ :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_sklearn_deployment":::
- The following example configures a deployment `sklearn-diabetes` to the endpoint created in the previous step using the local MLflow model:
-
- __sklearn-deployment.yaml__
-
- :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/sklearn-deployment.yaml":::
-
- > [!IMPORTANT]
- > For MLflow no-code-deployment (NCD) to work, setting **`type`** to **`mlflow_model`** is required, `type: mlflow_modelΓÇï`. For more information, see [CLI (v2) model YAML schema](reference-yaml-model.md).
-
-1. To create the deployment using the YAML configuration, use the following command:
-
- :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_sklearn_deployment":::
+ # [Azure ML SDK for Python (v2)](#tab/sdk)
+
+ ```python
+ ml_client.begin_create_or_update(blue_deployment)
+ endpoint.traffic = {"sklearn-deployment": 100}
+ ml_client.begin_create_or_update(endpoint)
+ ```
+
+1. Once the deployment completes, the service is ready to receive requests. If you're not sure how to submit requests to the service, see [Creating requests](#creating-requests).
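For example, with the SDK you could send a test request as follows; this is a hedged sketch, and `sample-request.json` is an assumed local file that holds the input payload in the format the deployment expects:

```python
# Sketch: invoke the deployment created above with a sample payload.
response = ml_client.online_endpoints.invoke(
    endpoint_name="my-endpoint",
    deployment_name="sklearn-deployment",
    request_file="sample-request.json",
)
print(response)
```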
## Deploy using Azure Machine Learning studio
You can use [Azure Machine Learning studio](https://ml.azure.com) to deploy mode
:::image type="content" source="./media/how-to-manage-models/register-model-as-asset.png" alt-text="Screenshot of the UI to register a model." lightbox="./media/how-to-manage-models/register-model-as-asset.png":::
-2. From [studio](https://ml.azure.com), select your workspace and then use either the __endpoints__ or __models__ page to create the endpoint deployment:
-
- # [Endpoints page](#tab/endpoint)
-
- 1. From the __Endpoints__ page, Select **+Create**.
-
- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::
-
- 1. Provide a name and authentication type for the endpoint, and then select __Next__.
- 1. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.
+2. From [studio](https://ml.azure.com), select your workspace and then use the __Endpoints__ page to create the endpoint deployment:
- 1. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.
+ a. From the __Endpoints__ page, select **+Create**.
- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::
- 1. Complete the wizard to deploy the model to the endpoint.
+ b. Provide a name and authentication type for the endpoint, and then select __Next__.
- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" alt-text="Screenshot showing NCD review screen.":::
+ c. When selecting a model, select the MLflow model registered previously. Select __Next__ to continue.
- # [Models page](#tab/models)
+ d. When you select a model registered in MLflow format, in the Environment step of the wizard, you don't need a scoring script or an environment.
- 1. Select the MLflow model, and then select __Deploy__. When prompted, select __Deploy to real-time endpoint__.
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models.":::
- :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/deploy-from-models-ui.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/deploy-from-models-ui.png" alt-text="Screenshot showing how to deploy model from Models UI.":::
+ e. Complete the wizard to deploy the model to the endpoint.
- 1. Complete the wizard to deploy the model to the endpoint.
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" alt-text="Screenshot showing NCD review screen.":::
## Considerations when deploying to real time inference
machine-learning How To Log Mlflow Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-log-mlflow-models.md
mlflow.pyfunc.log_model("classifier",
# [Using artifacts](#tab/artifacts)
-Wrapping your model may be simple, but sometimes your model needs multiple pieces to be loaded or it can't just be serialized simply as a Pickle file. In those cases, the `PythonModel` supports indicating an arbitrary list of **artifacts**. Each artifact will be packaged along with your model.
+Wrapping your model may be simple, but sometimes your model is composed of multiple pieces that need to be loaded, or it can't simply be serialized as a Pickle file. In those cases, the `PythonModel` supports indicating an arbitrary list of **artifacts**. Each artifact will be packaged along with your model.
Use this method when: > [!div class="checklist"]
mlflow.pyfunc.log_model("classifier",
# [Using a model loader](#tab/loader)
-Sometimes your model logic is complex and there are several source code files being used to make your model work. This would be the case when you have a Python library for your model for instance. In this scenario, you want to package the library all along with your model so it can move as a single piece.
+Sometimes your model logic is complex and there are several source files that your model loads at inference time. This would be the case when you have a Python library for your model, for instance. In this scenario, you want to package the library along with your model so it can move as a single piece.
Use this method when: > [!div class="checklist"] > * Your model can't be serialized in Pickle format or there is a better format available for that.
-> * Your model can be stored in a folder where all the requiered artifacts are placed.
-> * Your model's logic is complex and it requires multiple source files. Potentially, there is a library that supports your model.
+> * Your model artifacts can be stored in a folder where all the required artifacts are placed.
+> * Your model source code is complex and it requires multiple Python files. Potentially, there is a library that supports your model.
> * You want to customize the way the model is loaded and how the `predict` function works. MLflow supports this kind of model too by allowing you to specify any arbitrary source code to package along with the model, as long as it has a *loader module*. Loader modules can be specified in the `log_model()` instruction using the `loader_module` argument, which indicates the Python namespace where the loader is implemented. The `code_path` argument is also required; it indicates the source files where the `loader_module` is defined. In this namespace, you are required to implement a function called `_load_pyfunc(data_path: str)` that receives the path of the artifacts and returns an object with at least a `predict` method.
model.save_model(model_path)
mlflow.pyfunc.log_model("classifier", data_path=model_path,
- code_path=['loader_module.py'],
+ code_path=['src'],
loader_module='loader_module', signature=signature) ```
mlflow.pyfunc.log_model("classifier",
> * The model was saved using the save method of the framework used (it's not saved as a pickle). > * A new parameter, `data_path`, was added pointing to the folder where the model's artifacts are located. This can be a folder or a file. Whatever is in that folder or file will be packaged with the model. > * A new parameter, `code_path`, was added pointing to the location where the source code is placed. This can be a path or a single file. Whatever is in that folder or file will be packaged with the model.
+> * `loader_module` is the Python module where the function `_load_pyfunc` is defined.
-The corresponding `loader_module.py` implementation would be:
+The folder `src` contains a file called `loader_module.py` (which is the loader module):
-__loader_module.py__
+__src/loader_module.py__
```python
class MyModel():
machine-learning How To Monitor Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-monitor-online-endpoints.md
You can also create custom alerts to notify you of important status updates to y
1. Select **Add action groups** > **Create action groups** to specify what should happen when your alert is triggered. 1. Choose **Create alert rule** to finish creating your alert.
-1.
+ ## Logs There are three logs that can be enabled for online endpoints:
machine-learning How To Setup Customer Managed Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-setup-customer-managed-keys.md
To use the key when deploying a model to Azure Container Instance, create a new
For more information on creating and using a deployment configuration, see the following articles: * [AciWebservice.deploy_configuration()](/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--primary-key-none--secondary-key-none--collect-model-data-none--cmk-vault-base-url-none--cmk-key-name-none--cmk-key-version-none-) reference
-* [Where and how to deploy](v1/how-to-deploy-and-where.md)
+* [Where and how to deploy](how-to-deploy-managed-online-endpoints.md)
* [Deploy a model to Azure Container Instances](v1/how-to-deploy-azure-container-instance.md) For more information on using a customer-managed key with ACI, see [Encrypt data with a customer-managed key](../container-instances/container-instances-encrypt-data.md#encrypt-data-with-a-customer-managed-key).
machine-learning Concept Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/v1/concept-data.md
Title: Secure data access in the cloud
+ Title: Secure data access in the cloud v1
-description: Learn how to securely connect to your data storage on Azure with Azure Machine Learning datastores and datasets.
+description: Learn how to securely connect to your data storage on Azure with Azure Machine Learning datastores and datasets v1
--++ Last updated 10/21/2021 #Customer intent: As an experienced Python developer, I need to securely access my data in my Azure storage solutions and use it to accomplish my machine learning tasks.
-# Secure data access in Azure Machine Learning
+# Data in Azure Machine Learning v1
+ > [!div class="op_single_selector" title1="Select the version of Azure Machine Learning developer platform you are using:"] > * [v1](concept-data.md)
See the [Create a dataset monitor](how-to-monitor-datasets.md) article, to learn
## Next steps + Create a dataset in Azure Machine Learning studio or with the Python SDK [using these steps.](how-to-create-register-datasets.md)
-+ Try out dataset training examples with our [sample notebooks](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/work-with-data/).
++ Try out dataset training examples with our [sample notebooks](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/work-with-data/).
machine-learning Concept Network Data Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/v1/concept-network-data-access.md
+
+ Title: Network data access in studio
+
+description: Learn how data access works with Azure Machine Learning studio when your workspace or storage is in a virtual network.
+++++++ Last updated : 11/19/2021+++
+# Network data access with Azure Machine Learning studio
+
+Data access is complex and it's important to recognize that there are many pieces to it. For example, accessing data from Azure Machine Learning studio is different than using the SDK. When using the SDK on your local development environment, you're directly accessing data in the cloud. When using studio, you aren't always directly accessing the data store from your client. Studio relies on the workspace to access data on your behalf.
+
+> [!IMPORTANT]
+> The information in this article is intended for Azure administrators who are creating the infrastructure required for an Azure Machine Learning solution.
+
+> [!TIP]
+> Studio only supports reading data from the following datastore types in a VNet:
+>
+> * Azure Storage Account (blob & file)
+> * Azure Data Lake Storage Gen1
+> * Azure Data Lake Storage Gen2
+> * Azure SQL Database
+
+## Data access
+
+In general, data access from studio involves the following checks:
+
+1. Who is accessing?
+ - There are multiple different types of authentication depending on the storage type. For example, account key, token, service principal, managed identity, and user identity.
+ - If authentication is made using a user identity, then it's important to know *which* user is trying to access storage. Learn more about [identity-based data access](how-to-identity-based-data-access.md).
+2. Do they have permission?
+ - Are the credentials correct? If so, does the service principal, managed identity, etc., have the necessary permissions on the storage? Permissions are granted using Azure role-based access controls (Azure RBAC).
+ - [Reader](../../role-based-access-control/built-in-roles.md#reader) of the storage account reads metadata of the storage.
+ - [Storage Blob Data Reader](../../role-based-access-control/built-in-roles.md#storage-blob-data-reader) reads data within a blob container.
+ - [Contributor](../../role-based-access-control/built-in-roles.md#contributor) allows write access to a storage account.
+ - More roles may be required depending on the type of storage.
+3. Where is access from?
+ - User: Is the client IP address in the VNet/subnet range?
+ - Workspace: Is the workspace public or does it have a private endpoint in a VNet/subnet?
+ - Storage: Does the storage allow public access, or does it restrict access through a service endpoint or a private endpoint?
+4. What operation is being performed?
+ - Create, read, update, and delete (CRUD) operations on a data store/dataset are handled by Azure Machine Learning.
+ - Data Access calls (such as preview or schema) go to the underlying storage and need extra permissions.
+5. Where is this operation being run; compute resources in your Azure subscription or resources hosted in a Microsoft subscription?
+ - All calls to dataset and datastore services (except the "Generate Profile" option) use resources hosted in a __Microsoft subscription__ to run the operations.
+ - Jobs, including the "Generate Profile" option for datasets, run on a compute resource in __your subscription__, and access the data from there. So the compute identity needs permission to the storage rather than the identity of the user submitting the job.
+
+The following diagram shows the general flow of a data access call. In this example, a user is trying to make a data access call through a machine learning workspace, without using any compute resource.
++
+### Scenarios and identities
+
+The following table lists what identities should be used for specific scenarios:
+
+| Scenario | Use workspace</br>Managed Service Identity (MSI) | Identity to use |
+|--|--|--|
+| Access from UI | Yes | Workspace MSI |
+| Access from UI | No | User's identity |
+| Access from Job | Yes/No | Compute MSI |
+| Access from Notebook | Yes/No | User's identity |
+
+> [!TIP]
+> If you need to access data from outside Azure Machine Learning, such as using Azure Storage Explorer, _user_ identity is probably what is used. Consult the documentation for the tool or service you are using for specific information. For more information on how Azure Machine Learning works with data, see [Identity-based data access to storage services on Azure](how-to-identity-based-data-access.md).
+
+## Azure Storage Account
+
+When using an Azure Storage Account from Azure Machine Learning studio, you must add the managed identity of the workspace to the following Azure RBAC roles for the storage account:
+
+* [Storage Blob Data Reader](../../role-based-access-control/built-in-roles.md#storage-blob-data-reader)
+* If the storage account uses a private endpoint to connect to the VNet, you must grant the managed identity the [Reader](../../role-based-access-control/built-in-roles.md#reader) role for the storage account private endpoint.
+
+For more information, see [Use Azure Machine Learning studio in an Azure Virtual Network](../how-to-enable-studio-virtual-network.md).
+
+See the following sections for information on limitations when using Azure Storage Account with your workspace in a VNet.
+
+### Secure communication with Azure Storage Account
+
+To secure communication between Azure Machine Learning and Azure Storage Accounts, configure storage to [Grant access to trusted Azure services](../../storage/common/storage-network-security.md#grant-access-to-trusted-azure-services).
+
+### Azure Storage firewall
+
+When an Azure Storage account is behind a virtual network, the storage firewall can normally be used to allow your client to directly connect over the internet. However, when using studio it isn't your client that connects to the storage account; it's the Azure Machine Learning service that makes the request. The IP address of the service isn't documented and changes frequently. __Enabling the storage firewall will not allow studio to access the storage account in a VNet configuration__.
+
+### Azure Storage endpoint type
+
+When the workspace uses a private endpoint and the storage account is also in the VNet, there are extra validation requirements when using studio:
+
+* If the storage account uses a __service endpoint__, the workspace private endpoint and storage service endpoint must be in the same subnet of the VNet.
+* If the storage account uses a __private endpoint__, the workspace private endpoint and storage private endpoint must be in the same VNet. In this case, they can be in different subnets.
+
+## Azure Data Lake Storage Gen1
+
+When using Azure Data Lake Storage Gen1 as a datastore, you can only use POSIX-style access control lists. You can assign the workspace's managed identity access to resources just like any other security principal. For more information, see [Access control in Azure Data Lake Storage Gen1](../../data-lake-store/data-lake-store-access-control.md).
+
+## Azure Data Lake Storage Gen2
+
+When using Azure Data Lake Storage Gen2 as a datastore, you can use both Azure RBAC and POSIX-style access control lists (ACLs) to control data access inside of a virtual network.
+
+**To use Azure RBAC**, follow the steps in the [Datastore: Azure Storage Account](../how-to-enable-studio-virtual-network.md#datastore-azure-storage-account) section of the 'Use Azure Machine Learning studio in an Azure Virtual Network' article. Data Lake Storage Gen2 is based on Azure Storage, so the same steps apply when using Azure RBAC.
+
+**To use ACLs**, the managed identity of the workspace can be assigned access just like any other security principal. For more information, see [Access control lists on files and directories](../../storage/blobs/data-lake-storage-access-control.md#access-control-lists-on-files-and-directories).
+
+## Azure SQL Database
+
+To access data stored in an Azure SQL Database with a managed identity, you must create a SQL contained user that maps to the managed identity. For more information on creating a user from an external provider, see [Create contained users mapped to Azure AD identities](/azure/azure-sql/database/authentication-aad-configure#create-contained-users-mapped-to-azure-ad-identities).
+
+After you create a SQL contained user, grant permissions to it by using the [GRANT T-SQL command](/sql/t-sql/statements/grant-object-permissions-transact-sql).
+
+### Secure communication with Azure SQL Database
+
+To secure communication between Azure Machine Learning and Azure SQL Database, there are two options:
+
+> [!IMPORTANT]
+> Both options allow the possibility of services outside your subscription connecting to your SQL Database. Make sure that your SQL login and user permissions limit access to authorized users only.
+
+* __Allow Azure services and resources to access the Azure SQL Database server__. Enabling this setting _allows all connections from Azure_, including __connections from the subscriptions of other customers__, to your database server.
+
+ For information on enabling this setting, see [IP firewall rules - Azure SQL Database and Synapse Analytics](/azure/azure-sql/database/firewall-configure).
+
+* __Allow the IP address range of the Azure Machine Learning service in Firewalls and virtual networks__ for the Azure SQL Database. Allowing the IP addresses through the firewall limits __connections to the Azure Machine Learning service for a region__.
+
+ > [!WARNING]
+ > The IP ranges for the Azure Machine Learning service may change over time. There is no built-in way to automatically update the firewall rules when the IPs change.
+
+ To get a list of the IP addresses for Azure Machine Learning, download the [Azure IP Ranges and Service Tags](https://www.microsoft.com/download/details.aspx?id=56519) and search the file for `AzureMachineLearning.<region>`, where `<region>` is the Azure region that contains your Azure Machine Learning workspace. A short parsing sketch follows this list.
+
+ To add the IP addresses to your Azure SQL Database, see [IP firewall rules - Azure SQL Database and Synapse Analytics](/azure/azure-sql/database/firewall-configure).
+
+## Next steps
+
+For information on enabling studio in a network, see [Use Azure Machine Learning studio in an Azure Virtual Network](../how-to-enable-studio-virtual-network.md).
marketplace Commercial Marketplace Lead Management Instructions Https https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/partner-center-portal/commercial-marketplace-lead-management-instructions-https.md
description: Learn how to use Power Automate and an HTTPS endpoint to manage lea
-- Previously updated : 05/21/2021++ Last updated : 08/19/2022 # Use an HTTPS endpoint to manage commercial marketplace leads
marketplace Private Offers Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/private-offers-api.md
Previously updated : 06/07/2022 Last updated : 07/21/2022
-# Create and manage private offers via API (preview)
-
-> [!NOTE]
-> This API is in preview. If you have any questions about the preview program, contact [privateofferspreview@microsoft.com](mailto:privateofferspreview@microsoft.com).
+# Create and manage private offers via API
Private offers allow publishers and customers to transact one or more products in Azure Marketplace by creating time-bound pricing with customized terms. The private offers API enables ISVs to programmatically create and manage private offers for customers and resellers. This API is useful if your account manages many private offers and you want to automate and optimize their management workflows. This API uses Azure Active Directory (Azure AD) to authenticate the calls from your app or service.
To use this API, you may need to reference several different types of IDs associ
A private offer is based on an existing product in your Partner Center account. To see a list of products associated with your Partner Center account, use this API call:
-`GET https://graph.microsoft.com/rp/product-ingestion/product/`
+`GET https://graph.microsoft.com/rp/product-ingestion/product?$version=2022-07-01`
The response appears in the following sample format:
The response appears in the following sample format:
{ "value": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/product/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/product/2022-07-01",
"id": "string", "identity": { "externalId": "string"
The response appears in the following sample format:
For products that contain more than one plan, you may want to create a private offer based on one specific plan. If so, you'll need that plan's ID. Obtain a list of the plans (such as variants or SKUs) for the product using the following API call:
-`GET https://graph.microsoft.com/rp/product-ingestion/plan/?product=<product-id>`
+`GET https://graph.microsoft.com/rp/product-ingestion/plan?product=<product-id>&$version=2022-07-01`
The response appears in the following sample format:
The response appears in the following sample format:
{ "value": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/plan/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/plan/2022-07-01",
"product": "string", "id": "string", "identity": {
The response appears in the following sample format:
To see a list of all private offers associated with your seller account, use the following API call:
-`GET https://graph.microsoft.com/rp/product-ingestion/private-offer/query?`
+`GET https://graph.microsoft.com/rp/product-ingestion/private-offer/query?$version=2022-07-01`
## How to use the API
Use this method to create a private offer for a customer.
### Request
-`POST https://graph.microsoft.com/rp/product-ingestion/configure`
+`POST https://graph.microsoft.com/rp/product-ingestion/configure?$version=2022-07-01`
#### Request Header
Optional: clientID
#### Request parameters
-There are no parameters for this method.
+$version - required. This is the version of the schema being used in the request.
#### Request body
Provide the details of the private offer using the ISV to Customer private offer
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01",
"resources": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/private-offer/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/private-offer/2022-07-01",
"name": "privateOffercustomer1705", "state": "live", "privateOfferType": "customerPromotion",
If you're using absolute pricing instead of percentage-based discounting, you ca
Use this method to obtain the pricing resource for your existing public plan, edit the prices, and then use the edited resource for your private offer.
-`GET https://graph.microsoft.com/rp/product-ingestion/price-and-availability-private-offer-plan/{productId}?plan={planId}`
+`GET https://graph.microsoft.com/rp/product-ingestion/price-and-availability-private-offer-plan/{productId}?plan={planId}&$version=2022-07-01`
Sample absolute pricing resource: ```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/price-and-availability-private-offer-plan/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/price-and-availability-private-offer-plan/2022-07-01",
"resourceName": "newSimpleAbsolutePricing", "product": "product/7ba807c8-386a-4efe-80f1-b97bf8a554f8", "plan": "plan/987654",
The response will contain the jobId you can use later to poll the status:
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "notStarted", "jobResult": "pending",
Use this method to create a new private offer for a customer.
### Request
-`POST https://graph.microsoft.com/rp/product-ingestion/configure`
+`POST https://graph.microsoft.com/rp/product-ingestion/configure?$version=2022-07-01`
#### Request header
Use this method to create a new private offer for a customer.
#### Request parameters
-There are no parameters for this method.
+$version - required. This is the version of the schema being used in the request.
#### Request body
Provide the details of the private offer using the **ISV to reseller margin priv
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01",
"resources": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/private-offer/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/private-offer/2022-07-01",
"privateOfferType": "cspPromotion", "name": "privateOffercsp1034", "state": "live",
The response will contain the jobId you can use later to poll the status.
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "notStarted", "jobResult": "pending",
Use this method to delete an existing private offer while it's still in draft st
### Request
-`POST https://graph.microsoft.com/rp/product-ingestion/configure`
+`POST https://graph.microsoft.com/rp/product-ingestion/configure?$version=2022-07-01`
#### Request header
Use this method to delete an existing private offer while it's still in draft st
#### Request parameters
-There are no parameters for this method.
+$version - required. This is the version of the schema being used in the request.
#### Request body ```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2"
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01"
"resources": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/private-offer/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/private-offer/2022-07-01",
"id": "private-offer/456e-a345-c457-1234", "name": "privateOffercustomer1705", "state": "deleted"
The response will contain the jobId you can use later to poll the status.
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "notStarted", "jobResult": "pending",
You must use the private offer ID to specify which private offer you want to wit
### Request
-`POST https://graph.microsoft.com/rp/product-ingestion/configure`
+`POST https://graph.microsoft.com/rp/product-ingestion/configure?$version=2022-07-01`
#### Request header
You must use the private offer ID to specify which private offer you want to wit
#### Request parameters
-There are no parameters for this method.
+$version - required. This is the version of the schema being used in the request.
#### Request body ```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2"
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01"
"resources": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/private-offer/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/private-offer/2022-07-01",
"id": "private-offer/456e-a345-c457-1234", "name": "privateOffercustomer1705", "state": "withdrawn"
The response will contain the jobId you can later use to poll the status.
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "notStarted", "jobResult": "pending",
Use this method to upgrade an existing customer private offer. You must provide
### Request
-`POST https://graph.microsoft.com/rp/product-ingestion/configure`
+`POST https://graph.microsoft.com/rp/product-ingestion/configure?$version=2022-07-01`
#### Request header
Use this method to upgrade an existing customer private offer. You must provide
#### Request parameters
-There are no parameters for this method.
+$version - required. This is the version of the schema being used in the request.
#### Request body
You can use the same schemas as the two methods to create a new private offer de
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01",
"resources": [ {
- "$schema": "https://product-ingestion.azureedge.net/schema/private-offer/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/private-offer/2022-07-01",
"name": "publicApiCustAPIUpgrade1", "state": "live", "privateOfferType": "customerPromotion",
The response will contain the jobId you can use later to poll the status.
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "notStarted", "jobResult": "pending",
Use this method to query the status of an existing job. You can poll the status
### Request
-`GET https://graph.microsoft.com/rp/product-ingestion/configure/<jobId>/status`
+`GET https://graph.microsoft.com/rp/product-ingestion/configure/<jobId>/status?$version=2022-07-01`
#### Request header
Use this method to query the status of an existing job. You can poll the status
jobId - required. This is the ID of the job you want to query the status of. It's available in the response data generated during a previous request to either create, delete, withdraw, or upgrade a private offer.
+$version - required. This is the version of the schema being used in the request.
+ #### Request body Don't provide a request body for this method.
Sample outputs:
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "running", "jobResult": "pending",
Sample outputs:
```json {
- "$schema": " https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": " https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "b3f49dff-381f-480d-a10e-17f4ce49b65f", "jobStatus": "completed", "jobResult": "succeeded",
Sample outputs:
```json {
- "$schema": " https://product-ingestion.azureedge.net/schema/configure-status/2022-03-01-preview2",
+ "$schema": " https://schema.mp.microsoft.com/schema/configure-status/2022-07-01",
"jobId": "c32dd7e8-8619-462d-a96b-0ac1974bace5", "jobStatus": "completed", "jobResult": "failed",
There are two methods to do this depending on whether you have the resourceURI or t
### Request
-`GET https://graph.microsoft.com/rp/product-ingestion/private-offer/<id>`
+`GET https://graph.microsoft.com/rp/product-ingestion/private-offer/<id>?$version=2022-07-01`
or
-`GET https://graph.microsoft.com/rp/product-ingestion/configure/<jobId>`
+`GET https://graph.microsoft.com/rp/product-ingestion/configure/<jobId>?$version=2022-07-01`
#### Request header
ID - required. This is the ID of the private offer you want the full details of.
jobId - required. This is the ID of the job you want the full details of. This ID is available in the response data generated during a previous request to either create, delete, withdraw, or upgrade a private offer.
+$version - required. This is the version of the schema being used in the request.
+ #### Request body Don't provide a request body for this method.
You'll receive the full details of the private offer.
```json {
- "$schema": "https://product-ingestion.azureedge.net/schema/configure/2022-03-01-preview2",
+ "$schema": "https://schema.mp.microsoft.com/schema/configure/2022-07-01",
"resources": [ { "id": "private-offer/07380dd9-bcbb-cccbb-bbccbc",
migrate How To Delete Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/how-to-delete-project.md
Title: Delete an Azure Migrate project description: In this article, learn how you can delete an Azure Migrate project by using the Azure portal.--++ ms. Last updated 10/22/2019
migrate Hyper V Migration Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/hyper-v-migration-architecture.md
Title: How does Hyper-V migration work in Azure Migrate? description: Learn about Hyper-V migration with Azure Migrate --++ ms. Last updated 11/19/2019
migrate Migrate Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-services-overview.md
Title: About Azure Migrate description: Learn about the Azure Migrate service.--++ ms. Last updated 04/15/2020
migrate Migrate Support Matrix Hyper V Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-support-matrix-hyper-v-migration.md
Title: Support for Hyper-V migration in Azure Migrate description: Learn about support for Hyper-V migration with Azure Migrate.--++ ms. Last updated 04/15/2020
migrate Migrate Support Matrix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-support-matrix.md
Title: Azure Migrate support matrix description: Provides a summary of support settings and limitations for the Azure Migrate service.--++ ms. Last updated 07/23/2020
migrate Migrate V1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-v1.md
Title: Work with the previous version of Azure Migrate description: Describes how to work with the previous version of Azure Migrate.--++ ms. Last updated 9/23/2021
migrate Prepare Isv Movere https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/prepare-isv-movere.md
Title: Prepare Azure Migrate to work with an ISV tool/Movere description: This article describes how to prepare Azure Migrate to work with an ISV tool or Movere, and then how to start using the tool. --++ ms. Last updated 06/10/2020
migrate Resources Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/resources-faq.md
Title: Azure Migrate FAQ description: Get answers to common questions about the Azure Migrate service.--++ ms. Last updated 04/15/2020
migrate Troubleshoot General https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/troubleshoot-general.md
Title: Troubleshoot Azure Migrate issues | Microsoft Docs description: Provides an overview of known issues in the Azure Migrate service, as well as troubleshooting tips for common errors.--++ ms.
migrate Troubleshoot Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/troubleshoot-project.md
Title: Troubleshoot Azure Migrate projects description: Helps you to troubleshoot issues with creating and managing Azure Migrate projects.--++ ms. Last updated 01/01/2020
mysql Concepts Backup Restore https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/concepts-backup-restore.md
Previously updated : 05/24/2022 Last updated : 07/26/2022 # Backup and restore in Azure Database for MySQL Flexible Server
Azure Database for MySQL Flexible Server automatically creates server backups a
## Backup overview
+Flexible Server supports two types of backups, providing greater flexibility for maintaining backups of business-critical data.
+
+### Automated Backup
Flexible Server takes snapshot backups of the data files and stores them in locally redundant storage. The server also performs transaction log backups and stores them in locally redundant storage. These backups allow you to restore a server to any point in time within your configured backup retention period. The default backup retention period is seven days. You can optionally configure the backup retention from 1 to 35 days. All backups are encrypted using AES 256-bit encryption for the data stored at rest.
+### On-Demand Backup
+
+Flexible Server also allows you to trigger on-demand backups of the production workload, in addition to the automated backups taken by the Azure Database for MySQL Flexible Server service, and stores them in alignment with the server's backup retention policy. You can use these backups as the fastest restore point to perform a point-in-time restore, reducing restore times by up to 90%. The default backup retention period is seven days. You can optionally configure the backup retention from 1 to 35 days. You can trigger a total of 50 on-demand backups from the portal. All backups are encrypted using AES 256-bit encryption for the data stored at rest.
+ These backup files cannot be exported. The backups can only be used for restore operations in Flexible server. You can also use [mysqldump](../concepts-migrate-dump-restore.md#dump-and-restore-using-mysqldump-utility) from a MySQL client to copy a database. ## Backup frequency
The primary means of controlling the backup storage cost is by setting the appro
## View Available Full Backups
-The Backup and Restore blade in the Azure portal lists the automated full backups taken daily once. One can use this blade to view the completion timestamps for all available full backups within the server's retention period and to perform restore operations using these full backups. The list of available backups includes all full-automated backups within the retention period, a timestamp showing the successful completion, a timestamp indicating how long a backup will be retained, and a restore action.
+The Backup and Restore blade in the Azure portal provides a complete list of the full backups available to you at any given point in time. This includes automated backups as well as on-demand backups. You can use this blade to view the completion timestamps for all available full backups within the server's retention period and to perform restore operations using these full backups. The list of available backups includes all full backups within the retention period, a timestamp showing the successful completion, a timestamp indicating how long a backup will be retained, and a restore action.
## Restore
After a restore from either **latest restore point** or **custom restore point**
### Backup related questions - **How do I back up my server?**
-By default, Azure Database for MySQL enables automated backups of your entire server (encompassing all databases created) with a default 7-day retention period. The only way to manually take a backup is by using community tools such as mysqldump as documented [here](../concepts-migrate-dump-restore.md#dump-and-restore-using-mysqldump-utility) or mydumper as documented [here](../concepts-migrate-mydumper-myloader.md#create-a-backup-using-mydumper). If you wish to backup Azure Database for MySQL to a Blob storage, refer to our tech community blog [Backup Azure Database for MySQL to a Blob Storage](https://techcommunity.microsoft.com/t5/azure-database-for-mysql/backup-azure-database-for-mysql-to-a-blob-storage/ba-p/803830).
+By default, Azure Database for MySQL enables automated backups of your entire server (encompassing all databases created) with a default 7-day retention period. You can also trigger a manual backup using the On-Demand backup feature. The other way to manually take a backup is by using community tools such as mysqldump as documented [here](../concepts-migrate-dump-restore.md#dump-and-restore-using-mysqldump-utility) or mydumper as documented [here](../concepts-migrate-mydumper-myloader.md#create-a-backup-using-mydumper). If you wish to back up Azure Database for MySQL to Blob storage, refer to our tech community blog [Backup Azure Database for MySQL to a Blob Storage](https://techcommunity.microsoft.com/t5/azure-database-for-mysql/backup-azure-database-for-mysql-to-a-blob-storage/ba-p/803830).
- **Can I configure automatic backups to be retained for the long term?** No, currently we only support a maximum of 35 days of automated backup retention. You can take manual backups and use those for long-term retention requirements.
mysql How To Restore Server Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/how-to-restore-server-portal.md
Previously updated : 04/01/2021 Last updated : 07/26/2022 # Point-in-time restore of an Azure Database for MySQL Flexible Server using Azure portal
Follow these steps to restore your flexible server using an existing full backup
2. Click **Backup and Restore** from the left panel.
-3. View Available Backups page will be shown with the option to restore from all full automated backups taken for the server within the retention period.
+3. The View Available Backups page is shown with the option to restore from available full automated backups and on-demand backups taken for the server within the retention period.
4. Select the desired full backup from the list by clicking the corresponding **Restore** action.
mysql How To Trigger On Demand Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/how-to-trigger-on-demand-backup.md
+
+ Title: Trigger On-Demand Backup for Azure Database for MySQL Flexible Server with Azure portal.
+description: This article describes how to trigger On-Demand backup from Azure portal
++++++ Last updated : 07/26/2022++
+# Trigger On-Demand Backup of an Azure Database for MySQL Flexible Server using Azure portal
++
+This article provides a step-by-step procedure to trigger an on-demand backup from the Azure portal.
+
+## Prerequisites
+
+To complete this how-to guide, you need:
+
+- An Azure Database for MySQL Flexible Server.
+
+## Trigger On-Demand Backup
+
+Follow these steps to trigger a backup on demand:
+
+1. In the [Azure portal](https://portal.azure.com/), choose the flexible server that you want to back up.
+
+2. Select **Backup and Restore** from the left panel.
+
+3. From the Backup and Restore page, select **Backup Now**.
+
+4. The Take backup page is shown. Provide a custom name for the backup in the Backup name field.
+
+ :::image type="content" source="./media/how-to-trigger-on-demand-backup/trigger-ondemand-backup.png" alt-text="Screenshot showing how to trigger On-demand backup.":::
+
+5. Select **Trigger**.
+
+6. A notification shows that a backup has been initiated.
+
+7. Once completed, the on-demand backup is listed along with the automated backups on the View Available Backups page.
+
+## Restore from an On-Demand full backup
+
+Learn more about [Restore a server](how-to-restore-server-portal.md)
+
+## Next steps
+
+Learn more about [business continuity](concepts-business-continuity.md)
mysql Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/whats-new.md
+ Previously updated : 05/24/2022 Last updated : 08/16/2022 # What's new in Azure Database for MySQL - Flexible Server? [!INCLUDE[applies-to-mysql-flexible-server](../includes/applies-to-mysql-flexible-server.md)]
-[Azure Database for MySQL - Flexible Server](./overview.md) is a deployment mode that's designed to provide more granular control and flexibility over database management functions and configuration settings than does the Single Server deployment mode. The service currently supports community version of MySQL 5.7 and 8.0.
+[Azure Database for MySQL - Flexible Server](./overview.md) is a deployment mode designed to provide more granular control and flexibility over database management functions and configuration settings than the Single Server deployment mode. The service currently supports the community version of MySQL 5.7 and 8.0.
This article summarizes new releases and features in Azure Database for MySQL - Flexible Server beginning in January 2021. Listings appear in reverse chronological order, with the most recent updates first.+ ## August 2022
-**Server logs for Azure Database for MySQL - Flexible Server**
+- **Server logs for Azure Database for MySQL - Flexible Server**
- Server Logs will help customers to emit the server logs to server storage space in file format, which you can later download. Slow query logs are supported with server logs, which can help customers in performance troubleshooting and query tuning. Customers have ability to store logs up to a week or up-to 7 GB of logs size. You can configure or download them from [Azure portal](./how-to-server-logs-portal.md) or [Azure CLI](./how-to-server-logs-cli.md).[Learn more](./concepts-monitoring.md#server-logs)
+ Server logs help customers emit the server logs to the server storage space in file format, which you can later download. Slow query logs are supported with server logs, which can help customers with performance troubleshooting and query tuning. Customers can store logs for up to a week or up to 7 GB. You can configure or download them from the [Azure portal](./how-to-server-logs-portal.md) or the [Azure CLI](./how-to-server-logs-cli.md). [Learn more](./concepts-monitoring.md#server-logs)
+- **On-Demand Backup for Azure Database for MySQL - Flexible Server**
-## June 2022
+ The On-Demand backup feature allows customers to trigger on-demand backups of their production workload, in addition to the automated backups taken by the Azure Database for MySQL Flexible Server service, and store them in alignment with the server's backup retention policy. These backups can be used as the fastest restore point to perform a point-in-time restore for faster and more predictable restore times. [**Learn more**](how-to-trigger-on-demand-backup.md#trigger-on-demand-backup)
**Known Issues**
-On few servers where audit or slow logs are enabled, you may no longer see logs being uploaded to data sinks configured under diagnostics settings. Please verify whether your logs have the latest updated timestamp for the events, based on the [data sink](./tutorial-query-performance-insights.md#set-up-diagnostics) you've configured. If your server is affected by this issue, please open a [support ticket](https://azure.microsoft.com/support/create-ticket/) so that we can apply a quick fix on the server to resolve the issue. Alternatively, you can wait for our next deployment cycle, during which we'll apply a permanent fix in all regions.
+Server parameter aad_auth_only was exposed in this month's deployment. Enabling the server parameter aad_auth_only will block all non-Azure Active Directory MySQL connections to your Azure Database for MySQL Flexible Server. When you try to connect to the server, you'll receive the error "ERROR 9107 (HY000): Only Azure Active Directory accounts are allowed to connect to server".
+We're currently working on the additional configurations required for Azure AD authentication to be fully functional; the feature will be available in upcoming deployments. Don't enable the aad_auth_only parameter until then.
+++
+## June 2022
+
+- **Known Issues**
+
+On a few servers where audit or slow logs are enabled, you may no longer see logs uploaded to data sinks configured under diagnostics settings. Verify whether your logs have the latest updated timestamp for the events based on the [data sink](./tutorial-query-performance-insights.md#set-up-diagnostics) you've configured. If your server is affected by this issue, open a [support ticket](https://azure.microsoft.com/support/create-ticket/) so that we can apply a quick fix on the server to resolve the issue.
## May 2022 - **Announcing Azure Database for MySQL - Flexible Server for business-critical workloads**
- Azure Database for MySQL - Flexible Server Business Critical service tier is now generally available. Business Critical service tier is ideal for Tier 1 production workloads that require low latency, high concurrency, fast failover, and high scalability, such as gaming, e-commerce, and Internet-scale applications, to learn more about [Business Critical service Tier](https://techcommunity.microsoft.com/t5/azure-database-for-mysql-blog/announcing-azure-database-for-mysql-flexible-server-for-business/ba-p/3361718).
+ Azure Database for MySQL - Flexible Server Business Critical service tier is generally available. The Business Critical service tier is ideal for Tier 1 production workloads that require low latency, high concurrency, fast failover, and high scalability, such as gaming, e-commerce, and Internet-scale applications. To learn more, see the [Business Critical service tier](https://techcommunity.microsoft.com/t5/azure-database-for-mysql-blog/announcing-azure-database-for-mysql-flexible-server-for-business/ba-p/3361718) announcement.
- **Announcing the addition of new Burstable compute instances for Azure Database for MySQL - Flexible Server** We're announcing the addition of new Burstable compute instances to support customers' auto-scaling compute requirements from 1 vCore up to 20 vCores. Learn more about [Compute Option for Azure Database for MySQL - Flexible Server](./concepts-compute-storage.md). - **Known issues**
- - The Reserved instances (RI) feature in Azure Database for MySQL ΓÇô Flexible server isn't working properly for the Business Critical service tier, after its rebranding from the Memory Optimized service tier. Specifically, instance reservation has stopped working, and we're currently working to fix the issue.
- - Private DNS integration details aren't displayed on few Azure Database for MySQL Database flexible servers, which have HA option enabled. This issue doesn't have any impact on availability of the server or name resolution. We're working on a permanent fix to resolve the issue and it will be available in the next deployment. Meanwhile, if you want to view the Private DNS Zone details, you can either search under [Private DNS zones](../../dns/private-dns-getstarted-portal.md) in the Azure portal or you can perform a [manual failover](concepts-high-availability.md#planned-forced-failover) of the HA enabled flexible server and refresh the Azure portal.
+ - The Reserved instances (RI) feature in Azure Database for MySQL - Flexible Server isn't working properly for the Business Critical service tier after its rebranding from the Memory Optimized service tier. Specifically, instance reservation has stopped working, and we're currently working to fix the issue.
+ - Private DNS integration details aren't displayed on a few Azure Database for MySQL flexible servers that have the HA option enabled. This issue doesn't have any impact on availability of the server or name resolution. We're working on a permanent fix to resolve the issue, and it will be available in the next deployment. Meanwhile, if you want to view the Private DNS Zone details, you can either search under [Private DNS zones](../../dns/private-dns-getstarted-portal.md) in the Azure portal or you can perform a [manual failover](concepts-high-availability.md#planned-forced-failover) of the HA enabled flexible server and refresh the Azure portal.
## April 2022
On few servers where audit or slow logs are enabled, you may no longer see logs
Azure Database for MySQL - Flexible Server 8.0 is now running on minor version 8.0.28*. To learn more about the changes in this minor version, visit [Changes in MySQL 8.0.28 (2022-01-18, General Availability)](https://dev.mysql.com/doc/relnotes/mysql/8.0/en/news-8-0-28.html). - **Minor version upgrade for Azure Database for MySQL - Flexible Server to 5.7.37**
- Azure Database for MySQL - Flexible Server 5.7 now is running on minor version 5.7.37*, to learn more about changes coming in this minor version [visit Changes in MySQL 5.7.37 (2022-01-18, General Availability](https://dev.mysql.com/doc/relnotes/mysql/5.7/en/news-5-7-37.html)
+ Azure Database for MySQL - Flexible Server 5.7 is now running on minor version 5.7.37*. To learn more about the changes in this minor version, visit [Changes in MySQL 5.7.37 (2022-01-18, General Availability)](https://dev.mysql.com/doc/relnotes/mysql/5.7/en/news-5-7-37.html).
-* Please note that some regions are still running older minor versions of the Azure Database for MySQL and will be patched by end of April 2022.
+ > [!Note]
+ > Please note that some regions are still running older minor versions of Azure Database for MySQL and will be patched by the end of April 2022.
- **Deprecation of TLSv1 or TLSv1.1 protocols with Azure Database for MySQL - Flexible Server (8.0.28)**
- Starting version 8.0.28, MySQL community edition supports TLS protocol TLSv1.2 or TLSv1.3 only. Azure Database for MySQL ΓÇô Flexible Server will also stop supporting TLSv1 and TLSv1.1 protocols, to align with modern security standards. You'll no longer be able to configure TLSv1 or TLSv1.1 from the server parameter pane for newly created resources as well as for resources created previously. The default will be TLSv1.2. Resources created before the upgrade will still support communication through TLS protocol TLSv1 or TLSv1.1 through 1 May 2022.
+ Starting with version 8.0.28, the MySQL community edition supports only the TLSv1.2 and TLSv1.3 protocols. Azure Database for MySQL - Flexible Server will also stop supporting the TLSv1 and TLSv1.1 protocols to align with modern security standards. You'll no longer be able to configure TLSv1 or TLSv1.1 from the server parameter pane for new or existing resources. The default will be TLSv1.2. Resources created before the upgrade will still support communication through TLS protocol TLSv1 or TLSv1.1 through May 1, 2022.
## March 2022 This release of Azure Database for MySQL - Flexible Server includes the following updates. - **Migrate from locally redundant backup storage to geo-redundant backup storage for existing flexible server**
- Azure Database for MySQL - Flexible Server now provides the added flexibility to migrate to geo-redundant backup storage from locally redundant backup storage post server-create to provide higher data resiliency. Enabling geo-redundancy via the server's Compute + Storage blade empowers customers to recover their existing flexible servers from a geographic disaster or regional failure when they can't access the server in the primary region. With this feature enabled for their existing servers, customers can perform geo-restore and deploy a new server to the geo-paired Azure region leveraging the original server's latest available geo-redundant backup. [Learn more](concepts-backup-restore.md)
+ Azure Database for MySQL - Flexible Server now provides the added flexibility to migrate to geo-redundant backup storage from locally redundant backup storage post server-create to provide higher data resiliency. Enabling geo-redundancy via the server's Compute + Storage blade empowers customers to recover their existing flexible servers from a geographic disaster or regional failure when they can't access the server in the primary region. With this feature enabled for their existing servers, customers can perform geo-restore and deploy a new server to the geo-paired Azure region using the original server's latest available geo-redundant backup. [Learn more](concepts-backup-restore.md)
- **Simulate disaster recovery drills for your stopped servers**
- Azure Database for MySQL - Flexible Server now provides the ability to perform geo-restore on stopped servers helping users simulate disaster recovery drills for their workloads to estimate impact and recovery time.This will help users plan better to meet their disaster recovery and business continuity objectives by leveraging geo-redundancy feature offered by Azure Database for MySQL Flexible Server. [Learn more](how-to-restore-server-cli.md)
+ Azure Database for MySQL - Flexible Server now provides the ability to perform geo-restore on stopped servers, helping users simulate disaster recovery drills for their workloads to estimate impact and recovery time. This will help users plan better to meet their disaster recovery and business continuity objectives by using the geo-redundancy feature offered by Azure Database for MySQL Flexible Server. [Learn more](how-to-restore-server-cli.md)
## January 2022
This release of Azure Database for MySQL - Flexible Server includes the followin
- China North 2 - **Reserving 36 IOPs for Azure Database for MySQL Flexible Server which has HA enabled**
-
- We're adding 36 IOPs and reserving it for supporting standby failover operation on the servers, which has High Availability enabled. This IOPs is in addition to the configured IOPs on your servers, so additional charges per month would apply based on your Azure region. The extra IOPS help us ensure our commitment to providing smooth failover experience from primary to standby replica.The extra charge can be estimated by navigating to the [Azure Database for MySQL - Flexible Server pricing page](https://azure.microsoft.com/pricing/details/mysql/flexible-server), choosing the Azure region for your server, and multiplying IOPs/month cost by 36 IOPs. For example: if your server is hosted in East US, the extra IO costs, which you can expect is $0.05*36 = $1.8 USD per month.
+
+ We're adding 36 IOPS and reserving them to support the standby failover operation on servers that have High Availability enabled. These IOPS are in addition to the configured IOPS on your servers, so more charges per month would apply based on your Azure region. The extra IOPS help us ensure our commitment to providing a smooth failover experience from the primary to the standby replica. The extra charge can be estimated by navigating to the [Azure Database for MySQL - Flexible Server pricing page](https://azure.microsoft.com/pricing/details/mysql/flexible-server), choosing the Azure region for your server, and multiplying the IOPS/month cost by 36 IOPS. For example: if your server is hosted in East US, the extra IO cost you can expect is $0.05 * 36 = $1.80 USD per month.
- **Bug fixes**
- Restart workflow struck issue with servers with HA and Geo-redundant backup option enabled is fixed.
+ The issue where the restart workflow got stuck on servers with both HA and the geo-redundant backup option enabled is fixed.
- **Known issues**
- - When you're using ARM templates for provisioning or configuration changes for HA enabled servers, if a single deployment is made to enable/disable HA and along with other server properties like backup redundancy, storage etc. then deployment would fail. You can mitigate it by submitting the deployment request separately for to enable\disable and configuration changes. You wouldnΓÇÖt have issue with Portal or Azure CLI as these are request already separated.
+ - When you're using ARM templates for provisioning or configuration changes on HA-enabled servers, a single deployment that enables or disables HA together with other server properties (such as backup redundancy or storage) fails. You can mitigate this by submitting separate deployment requests for the HA change and the other configuration changes. The Azure portal and Azure CLI aren't affected, because these requests are already separated.
- - When you're viewing automated backups for a HA enabled server in Backup and Restore blade, if at some point in time a forced or automatic failover is performed, you may lose viewing rights to the server's backups on the Backup and Restore blade. Despite the invisibility of information regarding backups on the portal, the flexible server is successfully taking daily automated backups for the server in the backend and the server can be restored to any point in time within the retention period.
+ - When you're viewing automated backups for an HA-enabled server in the Backup and Restore blade, if a forced or automatic failover is performed at some point, you may lose visibility into the server's backups on that blade. Although backup information isn't visible in the portal, the flexible server continues to take daily automated backups in the backend, and the server can be restored to any point within the retention period.
## November 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Fastest restore points**
- With the fastest restore point option, you can restore a Flexible Server instance in the fastest time possible on a given day within the serverΓÇÖs retention period. This restore operation will simply restore the full snapshot backup without requiring restore or recovery of logs. With fastest restore point, customers will see three options while performing point in time restores from Azure portal viz latest restore point, custom restore point and fastest restore point. [Learn more](concepts-backup-restore.md#point-in-time-restore)
+ With the fastest restore point option, you can restore a Flexible Server instance in the fastest time possible on a given day within the server's retention period. This restore operation restores the full snapshot backup without requiring restore or recovery of logs. With this capability, customers see three options when performing point-in-time restores from the Azure portal: latest restore point, custom restore point, and fastest restore point. [Learn more](concepts-backup-restore.md#point-in-time-restore)
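  For reference, a hedged sketch of a custom point-in-time restore driven from Python via the Azure CLI; server names, resource group, and the timestamp are placeholders (the latest and fastest restore point choices are surfaced in the portal flow described above), and flags should be verified against your Azure CLI version.

  ```Python
  # Hedged sketch: restore a flexible server to a custom point in time.
  # All values are placeholders; verify flags against your Azure CLI version.
  import subprocess

  subprocess.run(
      [
          "az", "mysql", "flexible-server", "restore",
          "--resource-group", "myresourcegroup",    # placeholder resource group
          "--name", "myrestoredserver",             # new server to create
          "--source-server", "mysourceserver",      # server to restore from
          "--restore-time", "2021-11-01T10:00:00",  # placeholder timestamp (UTC)
      ],
      check=True,
  )
  ```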
- **FAQ blade in Azure portal**
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Restore a deleted Flexible server**
- The service now allows you to recover a deleted MySQL flexible server resource within five days from the time of server deletion. For a detailed guide on how to restore a deleted server, [refer documented steps](../flexible-server/how-to-restore-dropped-server.md). To protect server resources post deployment from accidental deletion or unexpected changes, we recommend administrators to leverage [management locks](../../azure-resource-manager/management/lock-resources.md).
+ The service now allows you to recover a deleted MySQL flexible server resource within five days from the time of server deletion. For a detailed guide on how to restore a deleted server, [see the documented steps](../flexible-server/how-to-restore-dropped-server.md). To protect server resources post deployment from accidental deletion or unexpected changes, we recommend that administrators use [management locks](../../azure-resource-manager/management/lock-resources.md).
- **Known issues**
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Geo-redundant backup restore to geo-paired region for DR scenarios**
- The service now provides the added flexibility to choose geo-redundant backup storage to provide higher data resiliency. Enabling geo-redundancy empowers customers to recover from a geographic disaster or regional failure when they canΓÇÖt access the server in the primary region. With this feature enabled, customers can perform geo-restore and deploy a new server to the geo-paired geographic region leveraging the original serverΓÇÖs latest available geo-redundant backup. [Learn more](../flexible-server/concepts-backup-restore.md).
+ The service now provides the added flexibility to choose geo-redundant backup storage to provide higher data resiliency. Enabling geo-redundancy empowers customers to recover from a geographic disaster or regional failure when they can't access the server in the primary region. With this feature enabled, customers can perform geo-restore and deploy a new server to the geo-paired geographic region using the original server's latest available geo-redundant backup. [Learn more](../flexible-server/concepts-backup-restore.md).
-- **Availability Zones Selection when creating Read replicas**
+- **Availability Zones Selection when creating Read replicas**
  When creating a read replica, you have the option to select the Availability Zone location of your choice. An Availability Zone is a high-availability offering that protects your applications and data from datacenter failures. Availability Zones are unique physical locations within an Azure region. [Learn more](../flexible-server/concepts-read-replicas.md).

- **Read replicas in Azure Database for MySQL - Flexible servers will no longer be available on Burstable SKUs**
- You wonΓÇÖt be able to create new or maintain existing read replicas on the Burstable tier server. In the interest of providing a good query and development experience for Burstable SKU tiers, the support for creating and maintaining read replica for servers in the Burstable pricing tier will be discontinued.
+ You won't be able to create new, or maintain existing, read replicas on Burstable tier servers. In the interest of providing a good query and development experience for Burstable SKU tiers, support for creating and maintaining read replicas for servers in the Burstable pricing tier will be discontinued.
  If you have an existing Azure Database for MySQL - Flexible Server with read replicas enabled, you'll have to scale up your server to either the General Purpose or Business Critical pricing tier, or delete the read replica, within 60 days. After the 60-day period, while you can continue to use the primary server for your read-write operations, replication to read replica servers will be stopped. For newly created servers, the read replica option will be available only for the General Purpose and Business Critical pricing tiers.
+- **Monitoring Azure Database for MySQL - Flexible Server with Azure Monitor Workbooks**
  Azure Database for MySQL - Flexible Server is now integrated with Azure Monitor Workbooks. Workbooks provide a flexible canvas for data analysis and the creation of rich visual reports within the Azure portal. With this integration, the server has a link to workbooks and a few sample templates, which help to monitor the service at scale. These templates can be edited, customized to customer requirements, and pinned to a dashboard to create a focused and organized view of Azure resources. [Query Performance Insights](./tutorial-query-performance-insights.md), [Auditing](./tutorial-configure-audit.md), and Instance Overview templates are currently available. [Learn more](./concepts-workbooks.md).
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Known Issues**

  When a primary Azure region is down, one can't create geo-redundant servers in its geo-paired region, as storage can't be provisioned in the primary Azure region. One must wait for the primary region to be up to provision geo-redundant servers in the geo-paired region.

## September 2021

This release of Azure Database for MySQL - Flexible Server includes the following updates.
This release of Azure Database for MySQL - Flexible Server includes the followin
The public preview of Azure Database for MySQL - Flexible Server is now available in the following Azure regions:
- - UK West
- - Canada East
- - Japan West
+ - UK West
+ - Canada East
+ - Japan West
- **Bug fixes**
- Same-zone HA creation is fixed in the following regions:
+ Same-zone HA creation is fixed in the following regions:
- - Central India
- - East Asia
- - Korea Central
- - South Africa North
- - Switzerland North
+ - Central India
+ - East Asia
+ - Korea Central
+ - South Africa North
+ - Switzerland North
## August 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Availability in four additional Azure regions**
- The public preview of Azure Database for MySQL - Flexible Server is now available in the following Azure regions:
+ The public preview of Azure Database for MySQL - Flexible Server is now available in the following Azure regions:
- - Australia Southeast
- - South Africa North
- - East Asia (Hong Kong)
- - Central India
+ - Australia Southeast
+ - South Africa North
+ - East Asia (Hong Kong)
+ - Central India
  [Learn more](overview.md#azure-regions).

- **Known issues**
- - Right after Zone-Redundant high availability server failover, clients fail to connect to the server if using SSL with ssl_mode VERIFY_IDENTITY. This issue can be mitigated by using ssl_mode as VERIFY_CA.
- - Unable to create Same-Zone High availability server in the following regions: Central India, East Asia, Korea Central, South Africa North, Switzerland North.
- - In a rare scenario and after HA failover, the primary server will be in read_only mode. Resolve the issue by updating ΓÇ£read_onlyΓÇ¥ value from the server parameters blade to OFF.
- - After successfully scaling Compute in the Compute+Storage blade, IOPS are reset to the SKU default. Customers can work around the issue by rescaling IOPs in the Compute+Storage blade to desired value (previously set) post the compute deployment and consequent IOPS reset.
+ - Right after Zone-Redundant high availability server failover, clients fail to connect to the server if using SSL with ssl_mode VERIFY_IDENTITY. This issue can be mitigated by using ssl_mode as VERIFY_CA.
+ - Unable to create Same-Zone High availability server in the following regions: Central India, East Asia, Korea Central, South Africa North, Switzerland North.
+ - In a rare scenario and after HA failover, the primary server will be in read_only mode. Resolve the issue by updating the "read_only" value from the server parameters blade to OFF.
+ - After successfully scaling Compute in the Compute+Storage blade, IOPS are reset to the SKU default. Customers can work around the issue by rescaling IOPs in the Compute+Storage blade to desired value (previously set) post the compute deployment and consequent IOPS reset.
## July 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Zone redundant HA forced failover fixes**
- This release includes fixes for known issues related to forced failover to ensure that server parameters and additional IOPS changes are persisted across failovers.
+ This release includes fixes for known issues related to forced failover, ensuring that server parameter changes and additional IOPS changes persist across failovers.
- **Known issues**
- - Trying to perform a compute scale up or scale down operation on an existing server with less than 20 GB of storage provisioned won't complete successfully. Resolve the issue by scaling up the provisioned storage to 20 GB and retrying the compute scaling operation.
+ - A compute scale-up or scale-down operation on an existing server with less than 20 GB of storage provisioned won't complete successfully. Resolve the issue by scaling up the provisioned storage to 20 GB and retrying the compute scaling operation.
## May 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Zone redundant HA available in UK South and Japan East region**
- Azure Database for MySQL - Flexible Server now offers zone-redundant high availability in two additional regions: UK South and Japan East. [Learn more](overview.md#azure-regions).
+ Azure Database for MySQL - Flexible Server now offers zone-redundant high availability in two more regions: UK South and Japan East. [Learn more](overview.md#azure-regions).
- **Known issues**
- - Additional IOPs changes donΓÇÖt take effect in zone redundant HA enabled servers. Customers can work around the issue by disabling HA, scaling IOPs, and the re-enabling zone redundant HA.
- - After force failover, the standby availability zone is inaccurately reflected in the portal. (No workaround)
- - Server parameter changes don't take effect in zone redundant HA enabled server after forced failover. (No workaround)
+ - Additional IOPS changes don't take effect in zone-redundant HA enabled servers. Customers can work around the issue by disabling HA, scaling IOPS, and then re-enabling zone-redundant HA.
+ - After forced failover, the standby availability zone is inaccurately reflected in the portal. (No workaround)
+ - Server parameter changes don't take effect in zone-redundant HA enabled servers after forced failover. (No workaround)
## April 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Ability to force failover to standby server with zone redundant high availability released**
- Customers can now manually force a failover to test functionality with their application scenarios, which can help them to prepare in case of any outages. [Learn more](https://techcommunity.microsoft.com/t5/azure-database-for-mysql/forced-failover-for-azure-database-for-mysql-flexible-server/ba-p/2280671).
+ Customers can now manually force a failover to test functionality with their application scenarios, which can help them to prepare for any outages. [Learn more](https://techcommunity.microsoft.com/t5/azure-database-for-mysql/forced-failover-for-azure-database-for-mysql-flexible-server/ba-p/2280671).
- **PowerShell module for Flexible Server released**
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Known issues**
- - SSL\TLS 1.2 is enforced and canΓÇÖt be disabled. (No workarounds)
- - There are intermittent provisioning failures for servers provisioned in a VNet. The workaround is to retry the server provisioning until it succeeds.
+ - SSL/TLS 1.2 is enforced and can't be disabled. (No workarounds)
+ - There are intermittent provisioning failures for servers provisioned in a VNet. The workaround is to retry the server provisioning until it succeeds.
## February 2021
This release of Azure Database for MySQL - Flexible Server includes the followin
- **Additional IOPS feature released**
- Azure Database for MySQL - Flexible Server supports provisioning additional [IOPS](concepts-compute-storage.md#iops) independent of the storage provisioned. Customers can use this feature to increase or decrease the number of IOPS anytime based on their workload requirements.
+ Azure Database for MySQL - Flexible Server supports provisioning more [IOPS](concepts-compute-storage.md#iops) independent of the storage provisioned. Customers can use this feature to increase or decrease the number of IOPS anytime based on their workload requirements.
- **Known issues**
If you have questions about or suggestions for working with Azure Database for M
- Learn more about [Azure Database for MySQL pricing](https://azure.microsoft.com/pricing/details/mysql/server/).
- Browse the [public documentation](index.yml) for Azure Database for MySQL – Flexible Server.
-- Review details on [troubleshooting common migration errors](../howto-troubleshoot-common-errors.md).
+- Review details on [troubleshooting common migration errors](../howto-troubleshoot-common-errors.md).
openshift Howto Encrypt Data Disks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/openshift/howto-encrypt-data-disks.md
Title: Encrypt persistent volume claims with a customer-managed key (CMK) on Azu
description: Bring your own key (BYOK) / Customer-managed key (CMK) deploy instructions for Azure Red Hat OpenShift Last updated 02/24/2021--++ keywords: encryption, byok, aro, cmk, openshift, red hat # Customer intent: My security policies dictate that data at rest must be encrypted. Beyond this, the key used to encrypt data must be able to be changed at-will by a person authorized to do so.
postgresql Connect Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/flexible-server/connect-python.md
This article assumes that you're familiar with developing using Python, but you'
- If you created your flexible server with *Public access (allowed IP addresses)*, you can add your local IP address to the list of firewall rules on your server. Refer to [Create and manage Azure Database for PostgreSQL - Flexible Server firewall rules using the Azure CLI](./how-to-manage-firewall-cli.md). ## Install the Python libraries for PostgreSQL
-The [psycopg2](https://pypi.python.org/pypi/psycopg2/) module enables connecting to and querying a PostgreSQL database, and is available as a Linux, macOS, or Windows [wheel](https://pythonwheels.com/) package. Install the binary version of the module, including all the dependencies. For more information about `psycopg2` installation and requirements, see [Installation](http://initd.org/psycopg/docs/install.html).
+The [psycopg2](https://pypi.python.org/pypi/psycopg2/) module enables connecting to and querying a PostgreSQL database, and is available as a Linux, macOS, or Windows [wheel](https://pythonwheels.com/) package. Install the binary version of the module, including all the dependencies.
To install `psycopg2`, open a terminal or command prompt and run the command `pip install psycopg2`.
For each code example in this article:
1. To run the file, change to your project folder in a command-line interface, and type `python` followed by the filename, for example `python postgres-insert.py`.

## Create a table and insert data
-The following code example connects to your Azure Database for PostgreSQL - Flexible Server database using the [psycopg2.connect](http://initd.org/psycopg/docs/connection.html) function, and loads data with a SQL **INSERT** statement. The [cursor.execute](http://initd.org/psycopg/docs/cursor.html#execute) function executes the SQL query against the database.
+The following code example connects to your Azure Database for PostgreSQL - Flexible Server database using the `psycopg2.connect` function, and loads data with a SQL **INSERT** statement. The `cursor.execute` function executes the SQL query against the database.
```Python
import psycopg2
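
# The body of this example is elided in this digest. The following is a
# hedged reconstruction, not the article's verbatim code: host, dbname,
# user, and password are placeholders you must replace with your own values.
conn = psycopg2.connect(
    host="<server-name>.postgres.database.azure.com",
    dbname="<database-name>",
    user="<admin-username>",
    password="<admin-password>",
    sslmode="require",
)
cursor = conn.cursor()

# Create the sample inventory table and insert a few rows.
cursor.execute("DROP TABLE IF EXISTS inventory;")
cursor.execute(
    "CREATE TABLE inventory (id serial PRIMARY KEY, name VARCHAR(50), quantity INTEGER);"
)
cursor.execute("INSERT INTO inventory (name, quantity) VALUES (%s, %s);", ("banana", 150))
cursor.execute("INSERT INTO inventory (name, quantity) VALUES (%s, %s);", ("orange", 154))
print("Finished inserting rows.")

# Commit the transaction and clean up.
conn.commit()
cursor.close()
conn.close()
```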
When the code runs successfully, it produces the following output:

![Command-line output](media/connect-python/2-example-python-output.png)

## Read data
![Command-line output](media/connect-python/2-example-python-output.png) ## Read data
-The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses [cursor.execute](http://initd.org/psycopg/docs/cursor.html#execute) with the SQL **SELECT** statement to read data. This function accepts a query and returns a result set to iterate over by using [cursor.fetchall()](http://initd.org/psycopg/docs/cursor.html#cursor.fetchall).
+The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses `cursor.execute` with the SQL **SELECT** statement to read data. This function accepts a query and returns a result set to iterate over by using `cursor.fetchall()`.
```Python
import psycopg2
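
# Elided in this digest; hedged reconstruction with placeholder connection
# values, as in the insert example above.
conn = psycopg2.connect(
    host="<server-name>.postgres.database.azure.com",
    dbname="<database-name>",
    user="<admin-username>",
    password="<admin-password>",
    sslmode="require",
)
cursor = conn.cursor()

# Fetch and print every row in the inventory table.
cursor.execute("SELECT * FROM inventory;")
rows = cursor.fetchall()
for row in rows:
    print(f"Data row = (id={row[0]}, name={row[1]}, quantity={row[2]})")

cursor.close()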
conn.close()
```

## Update data
-The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses [cursor.execute](http://initd.org/psycopg/docs/cursor.html#execute) with the SQL **UPDATE** statement to update data.
+The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses `cursor.execute` with the SQL **UPDATE** statement to update data.
```Python
import psycopg2
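
# Elided in this digest; hedged reconstruction with placeholder connection
# values, as in the examples above.
conn = psycopg2.connect(
    host="<server-name>.postgres.database.azure.com",
    dbname="<database-name>",
    user="<admin-username>",
    password="<admin-password>",
    sslmode="require",
)
cursor = conn.cursor()

# Update the quantity of one of the rows inserted earlier.
cursor.execute("UPDATE inventory SET quantity = %s WHERE name = %s;", (200, "banana"))
conn.commit()
cursor.close()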
conn.close()
```

## Delete data
-The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses [cursor.execute](http://initd.org/psycopg/docs/cursor.html#execute) with the SQL **DELETE** statement to delete an inventory item that you previously inserted.
+The following code example connects to your Azure Database for PostgreSQL - Flexible Server database and uses `cursor.execute` with the SQL **DELETE** statement to delete an inventory item that you previously inserted.
```Python
import psycopg2
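
# Elided in this digest; hedged reconstruction with placeholder connection
# values, as in the examples above.
conn = psycopg2.connect(
    host="<server-name>.postgres.database.azure.com",
    dbname="<database-name>",
    user="<admin-username>",
    password="<admin-password>",
    sslmode="require",
)
cursor = conn.cursor()

# Delete the inventory row inserted earlier.
cursor.execute("DELETE FROM inventory WHERE name = %s;", ("orange",))
conn.commit()
cursor.close()
conn.close()
```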
postgresql Concepts Connection Libraries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/single-server/concepts-connection-libraries.md
Most language client libraries used to connect to PostgreSQL server are external
| **Language** | **Client interface** | **Additional information** | **Download** |
|--|-|-|--|
-| Python | [psycopg](http://initd.org/psycopg/) | DB API 2.0-compliant | [Download](http://initd.org/psycopg/download/) |
+| Python | [psycopg](https://www.psycopg.org/) | DB API 2.0-compliant | [Download](https://pypi.org/project/psycopg2/) |
| PHP | [php-pgsql](https://secure.php.net/manual/en/book.pgsql.php) | Database extension | [Install](https://secure.php.net/manual/en/pgsql.installation.php) |
| Node.js | [Pg npm package](https://www.npmjs.com/package/pg) | Pure JavaScript non-blocking client | [Install](https://www.npmjs.com/package/pg) |
| Java | [JDBC](https://jdbc.postgresql.org/) | Type 4 JDBC driver | [Download](https://jdbc.postgresql.org/download.html) |
purview Concept Policies Data Owner https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-data-owner.md
+
+ Title: Microsoft Purview data owner policies concepts
+description: Understand Microsoft Purview data owner policies
+++++ Last updated : 03/20/2022++
+# Concepts for Microsoft Purview data owner policies
++
+This article discusses concepts related to managing access to data sources in your data estate from within the Microsoft Purview governance portal.
+
+> [!Note]
+> This capability is different from access control for Microsoft Purview itself, which is described in [Access control in Microsoft Purview](catalog-permissions.md).
+
+## Overview
+
+Access policies in Microsoft Purview enable you to manage access to different data systems across your entire data estate. For example:
+
+A user needs read access to an Azure Storage account that has been registered in Microsoft Purview. You can grant this access directly in Microsoft Purview by creating a data access policy through the **Policy management** app in the Microsoft Purview governance portal.
+
+Data access policies can be enforced through Purview on data systems that have been registered for policy.
+
+## Microsoft Purview policy concepts
+
+### Microsoft Purview policy
+
+A **policy** is a named collection of policy statements. When a policy is published to one or more data systems under Purview's governance, it's then enforced by them. A policy definition includes a policy name, description, and a list of one or more policy statements.
+
+### Policy statement
+
+A **policy statement** is a human-readable instruction that dictates how the data source should handle a specific data operation. The policy statement comprises **Effect**, **Action**, **Data Resource**, and **Subject**.
+
+#### Action
+
+An **action** is the operation being permitted or denied as part of this policy. For example: Read or Modify. These high-level logical actions map to one (or more) data actions in the data system where they are enforced.
+
+#### Effect
+
+The **effect** indicates what the resultant effect of this policy should be. Currently, the only supported value is **Allow**.
+
+#### Data resource
+
+The **data resource** is the fully qualified data asset path to which a policy statement is applicable. It conforms to the following format:
+
+*/subscription/\<subscription-id>/resourcegroups/\<resource-group-name>/providers/\<provider-name>/\<data-asset-path>*
+
+Azure Storage data-asset-path format:
+
+*Microsoft.Storage/storageAccounts/\<account-name>/blobservice/default/containers/\<container-name>*
+
+Azure SQL DB data-asset-path format:
+
+*Microsoft.Sql/servers/\<server-name>*
+
+#### Subject
+
+The **subject** is the end-user identity from Azure Active Directory for whom this policy statement is applicable. This identity can be a service principal, an individual user, a group, or a managed service identity (MSI).
+
+### Example
+
+Deny Read on Data Asset:
+*/subscription/finance/resourcegroups/prod/providers/Microsoft.Storage/storageAccounts/finDataLake/blobservice/default/containers/FinData to group Finance-analyst*
+
+In the above policy statement, the effect is *Deny*, the action is *Read*, the data resource is Azure Storage container *FinData*, and the subject is Azure Active Directory group *Finance-analyst*. If any user that belongs to this group attempts to read data from the storage container *FinData*, the request will be denied.
+
+### Hierarchical enforcement of policies
+
+The data resource specified in a policy statement is hierarchical by default. This means that the policy statement applies to the data object itself and to **all** the child objects contained by the data object. For example, a policy statement on an Azure Storage container applies to all the blobs contained within it.
+
+### Policy combining algorithm
+
+Microsoft Purview can have different policy statements that refer to the same data asset. When evaluating a decision for data access, Microsoft Purview combines all the applicable policies and provides a consolidated decision. The combining strategy picks the most restrictive policy.
+For example, let's assume two different policies on an Azure Storage container *FinData* as follows:
+
+Policy 1 - *Allow Read on Data Asset /subscription/…./containers/FinData
+To group Finance-analyst*
+
+Policy 2 - *Deny Read on Data Asset /subscription/…./containers/FinData
+To group Finance-contractors*
+
+Then let's assume that user 'user1', who is part of two groups:
+*Finance-analyst* and *Finance-contractors*, executes a call to blob read API. Since both policies will be applicable, Microsoft Purview will choose the most restrictive one, which is *Deny* of *Read*. Thus, the access request will be denied.
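
As an illustrative sketch only (this is not Purview's actual implementation), the combining strategy behaves like the following hypothetical function, where any applicable *Deny* outranks any *Allow*:

```Python
# Toy model of "most restrictive wins" policy combining. The decision
# strings and function are hypothetical, for illustration only.
def combine_decisions(decisions):
    """decisions: effects of all policy statements applicable to a request."""
    if "Deny" in decisions:
        return "Deny"       # any Deny overrides every Allow
    if "Allow" in decisions:
        return "Allow"
    return "NotApplicable"  # no policy matched the request

# user1 belongs to both groups, so both policies above apply:
print(combine_decisions(["Allow", "Deny"]))  # -> Deny
```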
+
+> [!Note]
+> Currently, the only supported effect is **Allow**.
+
+## Policy publishing
+
+A newly created policy exists in a draft state, visible only in Microsoft Purview. The act of publishing initiates enforcement of a policy in the specified data systems. It's an asynchronous action that can take between 5 minutes and 2 hours to take effect, depending on the enforcement code in the underlying data sources. For more information, consult the tutorials related to each data source.
+
+A policy published to a data source could contain references to an asset belonging to a different data source. Such references will be ignored since the asset in question does not exist in the data source where the policy is applied.
+
+## Next steps
+Check the tutorials on how to create policies in Microsoft Purview that work on specific data systems such as Azure Storage:
+
+* [Access provisioning by data owner to Azure Storage datasets](how-to-policies-data-owner-storage.md)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
purview Deployment Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/deployment-best-practices.md
More hardening steps can be taken:
* Fine-tune scope scan to improve scan performance * [Use REST APIs](tutorial-atlas-2-2-apis.md) to export critical metadata and properties for backup and recovery * [Use workflow](how-to-workflow-business-terms-approval.md) to automate ticketing and eventing to avoid human errors
-* [Use policies](concept-data-owner-policies.md) to manage access to data assets through the Microsoft Purview governance portal.
+* [Use policies](concept-policies-data-owner.md) to manage access to data assets through the Microsoft Purview governance portal.
## Lifecycle considerations
purview How To Deploy Profisee Purview Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-deploy-profisee-purview-integration.md
# Microsoft Purview - Profisee MDM Integration
-Master data management (MDM) is a key pillar of any unified data governance solution. Microsoft Purview supports master data management with our partner [Profisee](https://profisee.com/profisee-advantage/). This tutorial compiles reference and integration deployment materials in one place; firstly to put Microsoft Purview Unified Data Governance and MDM in the context of an Azure data estate; and more importantly, to get you started on your MDM journey with Microsoft Purview through our integration with Profisee.
+Master data management (MDM) is a key pillar of any unified data governance solution. Microsoft Purview supports master data management with our partner [Profisee](https://profisee.com/solutions/microsoft-enterprise/azure/). This tutorial compiles reference and integration deployment materials in one place; firstly to put Microsoft Purview Unified Data Governance and MDM in the context of an Azure data estate; and more importantly, to get you started on your MDM journey with Microsoft Purview through our integration with Profisee.
## Why are Data Governance and Master Data Management (MDM) essential to the modern Data Estate?
Microsoft Purview and Profisee MDM are often discussed as being a 'Better Toge
- Inherently multi-domain – Profisee offers a multi-domain approach to MDM where there are no limitations to the number or specificity of master data domains. This design aligns well with customers looking to modernize their data estate who may start with a limited number of domains, but ultimately will benefit from maximizing domain coverage (matched to their data governance coverage) across their whole data estate.
- Engineered for Azure – Profisee has been engineered to be cloud-native with options for both SaaS and managed IaaS/CaaS deployments on Azure (see next section)
+ >[!TIP]
+ > **Leverage Profisee's [MDS Migration Utility](https://profisee.com/solutions/microsoft-enterprise/master-data-services/) to upgrade from Microsoft MDS (Master Data Services) in a single click**
+
## Profisee MDM: Deployment Flexibility – Turnkey SaaS Experience or IaaS/CaaS Flexibility

Profisee MDM has been engineered for a cloud-native experience and may be deployed on Azure in two ways – SaaS and Azure IaaS/CaaS/Kubernetes Cluster.
Complete deployment flexibility and control, using the most efficient and low-ma
- **Complete Flexibility & Autonomy** - Available in Azure, AWS, Google Cloud or on-premises.
- **Fast to Deploy, Easy to Maintain** - 100% containerized configuration streamlines patches and upgrades.
-More Details on [Profisee MDM Benefits On Modern Cloud Architecture](https://profisee.com/our-technology/modern-cloud-architecture/), [Profisee Advantage Videos](https://profisee.com/profisee-advantage/) and why it fits best with [Microsoft Azure](https://azure.microsoft.com/) cloud deployments!
+More details on [Profisee MDM Benefits On Modern Cloud Architecture](https://profisee.com/our-technology/modern-cloud-architecture/), [Profisee MDM on Azure](https://profisee.com/solutions/microsoft-enterprise/azure/), and why it fits best with [Microsoft Azure](https://azure.microsoft.com/) deployments!
## Microsoft Purview - Profisee Reference Architecture
The reference architecture shows how both Microsoft Purview and Profisee MDM wor
3. Enrich master data model with governance details – Governance Data Stewards can enrich master data entity definitions with data dictionary and glossary information as well as ownership and sensitive data classifications, etc. in Microsoft Purview
4. Apply enriched governance data for data stewardship – any definitions and metadata available in Microsoft Purview are visible in real-time in Profisee as guidance for the MDM Data Stewards
5. Load source data from business applications – Azure Data Factory extracts data from source systems with 100+ pre-built connectors and/or REST gateway
- Transactional and unstructured data is loaded to downstream analytics solution ΓÇô All ΓÇÿrawΓÇÖ source data can be loaded to analytics database such as Synapse (Synapse is generally the preferred analytic database but others such as Snowflake are also common). Analysis on this raw information without proper master (ΓÇÿgoldenΓÇÖ) data will be subject to inaccuracy as data overlaps, mismatches and conflicts won't yet have been resolved.
+6. Transactional and unstructured data is loaded to a downstream analytics solution – all 'raw' source data can be loaded to an analytics database such as Synapse (Synapse is generally the preferred analytics database, but others such as Snowflake are also common). Analysis on this raw information without proper master ('golden') data will be subject to inaccuracy, as data overlaps, mismatches, and conflicts won't yet have been resolved.
7. Master data from source systems is loaded to Profisee MDM application – Multiple streams of 'master' data are loaded to Profisee MDM. Master data is the data that defines a domain entity such as customer, product, asset, location, vendor, patient, household, menu item, ingredient, and so on. This data is typically present in multiple systems and resolving differing definitions and matching and merging this data across systems is critical to the ability to use any cross-system data in a meaningful way.
8. Master data is standardized, matched, merged, enriched and validated according to governance rules – Although data quality and governance rules may be defined in other systems (such as Microsoft Purview), Profisee MDM is where they're enforced. Source records are matched and merged both within and across source systems to create the most complete and correct record possible. Data quality rules check each record for compliance with business and technical requirements.
9. Extra data stewardship to review and confirm matches, data quality, and data validation issues, as required – Any record failing validation or matching with only a low probability score is subject to remediation. To remediate failed validations, a workflow process assigns records requiring review to Data Stewards who are experts in their business data domain. Once records have been verified or corrected, they're ready to use as a 'golden record' master.
purview How To Enable Data Use Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-enable-data-use-management.md
Currently, a data owner can enable DUM on a data resource for these types of access policies:
-* [Data owner access policies](concept-data-owner-policies.md) - access policies authored via Microsoft Purview data policy experience.
+* [Data owner access policies](concept-policies-data-owner.md) - access policies authored via Microsoft Purview data policy experience.
* [Self-service access policies](concept-self-service-data-access-policy.md) - access policies automatically generated by Microsoft Purview after a [self-service access request](how-to-request-access.md) is approved.

To be able to create any data policy on a resource, DUM must first be enabled on that resource. This article will explain how to enable DUM on your resources in Microsoft Purview.
To disable Data Use Management for a source, resource group, or subscription, a
## Next steps

-- [Create data owner policies for your resources](how-to-data-owner-policy-authoring-generic.md)
-- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
-- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-data-owner-policies-storage.md)
+- [Create data owner policies for your resources](how-to-policies-data-owner-authoring-generic.md)
+- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
+- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md)
purview How To Link Azure Data Factory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-link-azure-data-factory.md
Multiple Azure Data Factories can connect to a single Microsoft Purview to push
>**Collection admins** role on the root collection. > > Also, it requires the users to be the data factory's "Owner" or "Contributor".
+>
+> Your data factory needs to have system assigned managed identity enabled.
Follow the steps below to connect an existing data factory to your Microsoft Purview account. You can also [connect Data Factory to Microsoft Purview account from ADF](../data-factory/connect-data-factory-to-azure-purview.md).
purview How To Policies Data Owner Arc Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-arc-sql-server.md
+
+ Title: Provision access by data owner for SQL Server on Azure Arc-enabled servers (preview)
+description: Step-by-step guide on how data owners can configure access to Arc-enabled SQL servers through Microsoft Purview access policies.
+++++ Last updated : 08/12/2022++
+# Provision access by data owner for SQL Server on Azure Arc-enabled servers (preview)
++
+[Access policies](concept-policies-data-owner.md) allow you to manage access from Microsoft Purview to data sources that have been registered for *Data Use Management*.
+
+This how-to guide describes how a data owner can delegate authoring policies in Microsoft Purview to enable access to SQL Server on Azure Arc-enabled servers. The following actions are currently enabled: *SQL Performance Monitoring*, *SQL Security Auditing*, and *Read*. These three actions are supported only for policies at the server level. *Modify* is not supported at this point.
+
+## Prerequisites
+- SQL Server 2022 (CTP 2.0 or later) running on Windows. [Follow this link](https://www.microsoft.com/sql-server/sql-server-2022)
+- Complete process to onboard that SQL server with Azure Arc and enable Azure AD Authentication. [Follow this guide to learn how](https://aka.ms/sql-on-arc-AADauth).
+
+**Enforcement of policies for this data source is available only in the following regions for Microsoft Purview**
+- East US
+- East US 2
+- South Central US
+- West US 3
+- Canada Central
+- West Europe
+- North Europe
+- UK South
+- France Central
+- UAE North
+- Central India
+- Korea Central
+- Japan East
+- Australia East
+
+## Security considerations
+- The Server admin can turn off the Microsoft Purview policy enforcement.
+- Arc Admin/Server admin permissions empower the Arc admin or Server admin with the ability to change the ARM path of the given server. Given that mappings in Microsoft Purview use ARM paths, this can lead to wrong policy enforcements.
+- SQL Admin (DBA) can gain the power of Server admin and can tamper with the cached policies from Microsoft Purview.
+- The recommended configuration is to create a separate App Registration per SQL server instance. This prevents SQL server2 from reading the policies meant for SQL server1, in case a rogue admin in SQL server2 tampers with the ARM path.
+
+## Configuration
+
+> [!Important]
+> You can assign the data source side permission (i.e., *IAM Owner*) **only** by entering Azure portal through this [special link](https://portal.azure.com/?feature.canmodifystamps=true&Microsoft_Azure_HybridData_Platform=sqlrbacmain#blade/Microsoft_Azure_HybridCompute/AzureArcCenterBlade/sqlServers). Alternatively, you can configure this permission at the parent resource group level so that it gets inherited by the "SQL Server - Azure Arc" data source.
+
+### SQL Server on Azure Arc-enabled server configuration
+This section describes the steps to configure the SQL Server on Azure Arc to use Microsoft Purview.
+
+1. Sign in to Azure portal with a [special link](https://portal.azure.com/?feature.canmodifystamps=true&Microsoft_Azure_HybridData_Platform=sqlrbacmain#blade/Microsoft_Azure_HybridCompute/AzureArcCenterBlade/sqlServers) that contains feature flags to list SQL Servers on Azure Arc
+
+1. Navigate to a SQL Server you want to configure
+
+1. Navigate to **Azure Active Directory** feature on the left pane
+
+1. Verify that Azure Active Directory Authentication is configured and scroll down.
+![Screenshot shows how to configure Microsoft Purview endpoint in Azure AD section.](./media/how-to-policies-data-owner-sql/setup-sql-on-arc-for-purview.png)
+
+1. Set **External Policy Based Authorization** to enabled
+
+1. Enter **Microsoft Purview Endpoint** in the format *https://\<purview-account-name\>.purview.azure.com*. You can see the names of Microsoft Purview accounts in your tenant through [this link](https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.Purview%2FAccounts). Optionally, you can confirm the endpoint by navigating to the Microsoft Purview account, then to the Properties section on the left menu and scrolling down until you see "Scan endpoint". The full endpoint path will be the one listed without the "/Scan" at the end.
+
+1. Make a note of the **App registration ID**, as you will need it when you register and enable this data source for *Data Use Management* in Microsoft Purview.
+
+1. Select the **Save** button to save the configuration.
+
+### Register data sources in Microsoft Purview
+Register each data source with Microsoft Purview to later define access policies.
+
+1. Sign in to Microsoft Purview Studio.
+
+1. Navigate to the **Data map** feature on the left pane, select **Sources**, then select **Register**. Type "Azure Arc" in the search box and select **SQL Server on Azure Arc**. Then select **Continue**
+![Screenshot shows how to select a source for registration.](./media/how-to-policies-data-owner-sql/select-arc-sql-server-for-registration.png)
+
+1. Enter a **Name** for this registration. It is best practice to make the name of the registration the same as the server name in the next step.
+
+1. Select an **Azure subscription**, **Server name**, and **Server endpoint**.
+
+1. **Select a collection** to put this registration in.
+
+1. Enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+1. Upon enabling Data Use Management, Microsoft Purview will automatically capture the **Application ID** of the App Registration related to this Arc-enabled SQL server. If the association between the Arc-enabled SQL server and the App Registration changes in the future, come back to this screen and select the refresh button next to it.
+
+1. Select **Register** or **Apply** at the bottom.
+
+Once your data source has the **Data Use Management** toggle *Enabled*, it will look like this picture.
+![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-arc-sql.png)
+
+> [!Note]
+> - If you want to create a policy on a resource group or subscription and have it enforced in Arc-enabled SQL servers, you will need to also register those servers independently for *Data use management* to provide their App ID. See this document on how to create policies at resource group or subscription level: [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md).
+
+## Create and publish a data owner policy
+
+Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to one of the examples shown in the images.
+
+**Example #1: SQL Performance Monitor policy**. This policy assigns the Azure AD principal 'Christie Cline' to the *SQL Performance monitoring* action, in the scope of Arc-enabled SQL server *DESKTOP-xxx*. This policy has also been published to that server. Note: Policies related to this action are not supported below server level.
+
+![Screenshot shows a sample data owner policy giving SQL Performance Monitor access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-arc-sql-server-performance-monitor.png)
+
+**Example #2: SQL Security Auditor policy**. Similar to example 1, but choose the *SQL Security auditing* action (instead of *SQL Performance monitoring*), when authoring the policy. Note: Policies related to this action are not supported below server level.
+
+**Example #3: Read policy**. This policy assigns the Azure AD principal 'sg-Finance' to the *SQL Data reader* action, in the scope of SQL server *DESKTOP-xxx*. This policy has also been published to that server. Note: Policies related to this action are not supported below server level.
+
+![Screenshot shows a sample data owner policy giving Data Reader access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-arc-sql-server-data-reader.png)
+
+> [!Note]
+> - Given that scan is not currently available for this data source, data reader policies can only be created at server level. Use the **Data sources** box instead of the Asset box when authoring the **data resources** part of the policy.
+> - There is a known issue with SQL Server Management Studio that prevents right-clicking on a table and choosing the option "Select Top 1000 rows".
++
+>[!Important]
+> - Publish is a background operation. It can take up to **5 minutes** for the changes to be reflected in this data source.
+> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
+
+### Test the policy
+
+The Azure AD Accounts referenced in the access policies should now be able to connect to any database in the server to which the policies are published.
+
+#### Force policy download
+It is possible to force an immediate download of the latest published policies to the current SQL database by running the following command. The minimal permission required to run it is membership in the ##MS_ServerStateManager## server role.
+
+```sql
+-- Force immediate download of latest published policies
+exec sp_external_policy_refresh reload
+```
+
+#### Analyze downloaded policy state from SQL
+The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD accounts. The minimal permission required to run them is VIEW DATABASE SECURITY STATE, or the assigned action group *SQL Security Auditor*.
+
+```sql
+
+-- Lists generally supported actions
+SELECT * FROM sys.dm_server_external_policy_actions
+
+-- Lists the roles that are part of a policy published to this server
+SELECT * FROM sys.dm_server_external_policy_roles
+
+-- Lists the links between the roles and actions, could be used to join the two
+SELECT * FROM sys.dm_server_external_policy_role_actions
+
+-- Lists all Azure AD principals that were given connect permissions
+SELECT * FROM sys.dm_server_external_policy_principals
+
+-- Lists Azure AD principals assigned to a given role on a given resource scope
+SELECT * FROM sys.dm_server_external_policy_role_members
+
+-- Lists Azure AD principals, joined with roles, joined with their data actions
+SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
+```
+++
+## Additional information
+
+### Policy action mapping
+
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in SQL Server on Azure Arc-enabled servers.
+
+| **Microsoft Purview policy action** | **Data source specific actions** |
+|-|--|
+| | |
+| *Read* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows|
+||Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows |
+|||
+| *SQL Performance Monitor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseMetadata/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseState/rows/select |
+|||
+| *SQL Security Auditor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityMetadata/rows/select |
+|||
+
+## Next steps
+Check blog, demo and related how-to guides
+* [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
+* [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
+* Blog: [Private preview: controlling access to Azure SQL at scale with policies in Purview](https://techcommunity.microsoft.com/t5/azure-sql-blog/private-preview-controlling-access-to-azure-sql-at-scale-with/ba-p/2945491)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
+* [Enable Microsoft Purview data owner policies on an Azure SQL DB](./how-to-policies-data-owner-azure-sql-db.md)
purview How To Policies Data Owner Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-authoring-generic.md
+
+ Title: Authoring and publishing data owner access policies (preview)
+description: Step-by-step guide on how a data owner can author and publish access policies in Microsoft Purview
++++++ Last updated : 05/27/2022++
+# Authoring and publishing data owner access policies (Preview)
++
+Access policies allow a data owner to delegate access management for a data source to Microsoft Purview. These policies can be authored directly in the Microsoft Purview governance portal, and after publishing, they get enforced by the data source. This tutorial describes how to create, update, and publish access policies in the Microsoft Purview governance portal.
+
+## Prerequisites
+
+## Configuration
+
+### Data source configuration
+
+Before authoring data policies in the Microsoft Purview governance portal, you'll need to configure the data sources so that they can enforce those policies.
+
+1. Follow any policy-specific prerequisites for your source. Check the [Microsoft Purview supported data sources table](microsoft-purview-connector-overview.md) and select the link in the **Access Policy** column for sources where access policies are available. Follow any steps listed in the Access policy or Prerequisites sections.
+1. Register the data source in Microsoft Purview. Follow the **Prerequisites** and **Register** sections of the [source pages](microsoft-purview-connector-overview.md) for your resources.
+1. [Enable the Data Use Management toggle on the data source](how-to-enable-data-use-management.md#enable-data-use-management). Additional permissions for this step are described in the linked document.
+
+## Create a new policy
+
+This section describes the steps to create a new policy in Microsoft Purview.
+Ensure you have the *Policy Author* permission as described [here](#permissions-for-policy-authoring-and-publishing).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
+
+1. Select the **New Policy** button in the policy page.
+
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-1.png" alt-text="Data owner can access the Policy functionality in Microsoft Purview when it wants to create policies.":::
+
+1. The new policy page will appear. Enter the policy **Name** and **Description**.
+
+1. To add policy statements to the new policy, select the **New policy statement** button. This will bring up the policy statement builder.
+
+ :::image type="content" source="./media/access-policies-common/create-new-policy.png" alt-text="Data owner can create a new policy statement.":::
+
+1. Select the **Effect** button and choose *Allow* from the drop-down list.
+
+1. Select the **Action** button and choose *Read* or *Modify* from the drop-down list.
+
+1. Select the **Data Resources** button to bring up the window to enter Data resource information, which will open to the right.
+
+1. Under the **Data Resources** Panel do **one of two things** depending on the granularity of the policy:
+ - To create a broad policy statement that covers an entire data source, resource group, or subscription that was previously registered, use the **Data sources** box and select its **Type**.
+ - To create a fine-grained policy, use the **Assets** box instead. Enter the **Data Source Type** and the **Name** of a previously registered and scanned data source. See example in the image.
+
+ :::image type="content" source="./media/access-policies-common/select-data-source-type.png" alt-text="Data owner can select a Data Resource when editing a policy statement.":::
+
+1. Select the **Continue** button and traverse the hierarchy to select an underlying data object (for example: folder, file, etc.). Select **Recursive** to apply the policy from that point in the hierarchy down to any child data objects. Then select the **Add** button. This will take you back to the policy editor.
+
+ :::image type="content" source="./media/access-policies-common/select-asset.png" alt-text="Data owner can select the asset when creating or editing a policy statement.":::
+
+1. Select the **Subjects** button and enter the subject identity as a principal, group, or MSI. Then select the **OK** button. This will take you back to the policy editor.
+
+ :::image type="content" source="./media/access-policies-common/select-subject.png" alt-text="Data owner can select the subject when creating or editing a policy statement.":::
+
+1. Repeat steps 5 to 11 to enter any more policy statements.
+
+1. Select the **Save** button to save the policy.
+
+Now that you have created your policy, you will need to publish it for it to become active.
+
+## Publish a policy
+A newly created policy is in the **draft** state. The process of publishing associates the new policy with one or more data sources under governance. This is called "binding" a policy to a data source.
+
+Ensure you have the *Data Source Admin* permission as described [here](#permissions-for-policy-authoring-and-publishing).
+
+The steps to publish a policy are as follows:
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
+
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owner can access the Policy functionality in Microsoft Purview when it wants to update a policy by selecting 'Data policies'.":::
+
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Locate the policy that needs to be published. Select the **Publish** button on the right top corner of the page.
+
+ :::image type="content" source="./media/access-policies-common/publish-policy.png" alt-text="Data owner can publish a policy.":::
+
+1. A list of data sources is displayed. You can enter a name to filter the list. Then, select each data source where this policy is to be published and then select the **Publish** button.
+
+ :::image type="content" source="./media/access-policies-common/select-data-sources-publish-policy.png" alt-text="Data owner can select the data source where the policy will be published.":::
+
+>[!Note]
+> After making changes to a policy, you don't need to publish it again for the changes to take effect, as long as the data source(s) remain the same.
+
+## Update or delete a policy
+
+Steps to update or delete a policy in Microsoft Purview are as follows.
+Ensure you have the *Policy Author* permission, as described [here](#permissions-for-policy-authoring-and-publishing).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **Data policies**.
+
+ :::image type="content" source="./media/access-policies-common/policy-onboard-guide-2.png" alt-text="Data owner can access the Policy functionality in Microsoft Purview when it wants to update a policy.":::
+
+1. The Policy portal will present the list of existing policies in Microsoft Purview. Select the policy that needs to be updated.
+
+1. The policy details page will appear, including Edit and Delete options. Select the **Edit** button, which brings up the policy statement builder. Now, any parts of the statements in this policy can be updated. To delete the policy, use the **Delete** button.
+
+ :::image type="content" source="./media/access-policies-common/edit-policy.png" alt-text="Data owner can edit or delete a policy statement.":::
+
+## Next steps
+
+For specific guides on creating policies, you can follow these tutorials:
+
+- [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
+- [Enable Microsoft Purview data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md)
purview How To Policies Data Owner Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-azure-sql-db.md
+
+ Title: Provision access by data owner for Azure SQL DB (preview)
+description: Step-by-step guide on how data owners can configure access for Azure SQL DB through Microsoft Purview access policies.
+++++ Last updated : 08/12/2022++
+# Provision access by data owner for Azure SQL DB (preview)
++
+[Access policies](concept-policies-data-owner.md) allow you to manage access from Microsoft Purview to data sources that have been registered for *Data Use Management*.
+
+This how-to guide describes how a data owner can delegate authoring policies in Microsoft Purview to enable access to Azure SQL DB. The following actions are currently enabled: *SQL Performance Monitoring*, *SQL Security Auditing*, and *Read*. The first two actions are supported only at server level. *Modify* is not supported at this point.
+
+## Prerequisites
+
+## Microsoft Purview Configuration
+
+### Register the data sources in Microsoft Purview
+The Azure SQL DB resources need to be registered first with Microsoft Purview to later define access policies. You can follow this guide:
+
+[Register and scan Azure SQL DB](./register-scan-azure-sql-database.md)
+
+After you've registered your resources, you'll need to enable Data Use Management. Data Use Management can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**:
+
+[How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+Once your data source has the **Data Use Management** toggle *Enabled*, it will look like this picture. This will enable the access policies to be used with the given SQL server and all its contained databases.
+![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
++
+## Create and publish a data owner policy
+
+Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to one of the examples shown in the images.
+
+**Example #1: SQL Performance Monitor policy**. This policy assigns the Azure AD principal 'Mateo Gomez' to the *SQL Performance monitoring* action, in the scope of SQL server *relecloud-sql-srv2*. This policy has also been published to that server. Note: Policies related to this action are not supported below server level.
+
+![Screenshot shows a sample data owner policy giving SQL Performance Monitor access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-azure-sql-db-performance-monitor.png)
+
+**Example #2: SQL Security Auditor policy**. Similar to example 1, but choose the *SQL Security auditing* action (instead of *SQL Performance monitoring*), when authoring the policy. Note: Policies related to this action are not supported below server level.
+
+**Example #3: Read policy**. This policy assigns the Azure AD principal 'Robert Murphy' to the *SQL Data reader* action, in the scope of SQL server *relecloud-sql-srv2*. This policy has also been published to that server. Note: Policies related to this action are supported below server level (for example, database or table).
+
+![Screenshot shows a sample data owner policy giving Data Reader access to an Azure SQL Database.](./media/how-to-policies-data-owner-sql/data-owner-policy-example-azure-sql-db-data-reader.png)
++
+>[!Important]
+> - Publish is a background operation. It can take up to **5 minutes** for the changes to be reflected in this data source.
+> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
+
+### Test the policy
+
+The Azure AD accounts referenced in the access policies should now be able to connect to any database in the server to which the policies are published.
+
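+As a quick check, an Azure AD principal covered by a *Read* policy can attempt a connection and a simple query. The following is a minimal sketch, assuming the Microsoft.Data.SqlClient package; the server name follows the examples above, and the database, schema, and table names are placeholders:
+
+```csharp
+using System;
+using Microsoft.Data.SqlClient;
+
+// Authenticate as the Azure AD principal named in the published policy.
+var connectionString =
+    "Server=tcp:relecloud-sql-srv2.database.windows.net,1433;" +
+    "Database=<your-database>;" +
+    "Authentication=Active Directory Default;Encrypt=True;";
+
+using var connection = new SqlConnection(connectionString);
+connection.Open(); // Fails if no policy granting access has reached the server yet.
+
+// A Read policy allows row-level SELECTs on tables and views (see the action mapping below).
+using var command = new SqlCommand("SELECT TOP 5 * FROM <schema>.<table>", connection);
+using var reader = command.ExecuteReader();
+while (reader.Read())
+{
+    Console.WriteLine(reader[0]);
+}
+```
+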
+#### Force policy download
+It is possible to force an immediate download of the latest published policies to the current SQL database by running the following command. The minimal permission required to run it is membership in the ##MS_ServerStateManager## server role.
+
+```sql
+-- Force immediate download of latest published policies
+exec sp_external_policy_refresh reload
+```
+
+#### Analyze downloaded policy state from SQL
+The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD accounts. The minimal permission required to run them is VIEW DATABASE SECURITY STATE, or the assigned action group *SQL Security Auditor*.
+
+```sql
+-- Lists generally supported actions
+SELECT * FROM sys.dm_server_external_policy_actions
+
+-- Lists the roles that are part of a policy published to this server
+SELECT * FROM sys.dm_server_external_policy_roles
+
+-- Lists the links between the roles and actions, could be used to join the two
+SELECT * FROM sys.dm_server_external_policy_role_actions
+
+-- Lists all Azure AD principals that were given connect permissions
+SELECT * FROM sys.dm_server_external_policy_principals
+
+-- Lists Azure AD principals assigned to a given role on a given resource scope
+SELECT * FROM sys.dm_server_external_policy_role_members
+
+-- Lists Azure AD principals, joined with roles, joined with their data actions
+SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
+```
+
+## Additional information
+
+### Policy action mapping
+
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure SQL DB.
+
+| **Microsoft Purview policy action** | **Data source specific actions** |
+|-|--|
+| | |
+| *Read* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Rows|
+||Microsoft.Sql/Sqlservers/Databases/Schemas/Views/Rows |
+|||
+| *SQL Performance Monitor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseMetadata/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseState/rows/select |
+|||
+| *SQL Security Auditor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityMetadata/rows/select |
+|||
+
+## Next steps
+Check the blog, demo, and related how-to guides:
+* [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
+* [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
+* Blog: [Microsoft Purview Data Policy for SQL DevOps access provisioning now in public preview](https://techcommunity.microsoft.com/t5/microsoft-purview-blog/microsoft-purview-data-policy-for-sql-devops-access-provisioning/ba-p/3403174)
+* Blog: [Controlling access to Azure SQL at scale with policies in Purview](https://techcommunity.microsoft.com/t5/azure-sql-blog/private-preview-controlling-access-to-azure-sql-at-scale-with/ba-p/2945491)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
+* [Enable Microsoft Purview data owner policies on an Arc-enabled SQL Server](./how-to-policies-data-owner-arc-sql-server.md)
purview How To Policies Data Owner Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-resource-group.md
+
+ Title: Resource group and subscription access provisioning by data owner (preview)
+description: Step-by-step guide showing how a data owner can create access policies to resource groups or subscriptions.
+++++ Last updated : 05/27/2022+++
+# Resource group and subscription access provisioning by data owner (Preview)
+
+[Access policies](concept-policies-data-owner.md) allow you to manage access from Microsoft Purview to data sources that have been registered for *Data Use Management*.
+
+You can also [register an entire resource group or subscription](register-scan-azure-multiple-sources.md), and create a single policy that will manage access to **all** data sources in that resource group or subscription. That single policy will cover all existing data sources and any data sources that are created afterwards. This article describes how this is done.
+
+## Prerequisites
+
+**Only these data sources are enabled for access policies on a resource group or subscription**. Follow the **Prerequisites** section that is specific to the data source(s) in these guides:
+* [Data owner policies on an Azure Storage account](./how-to-policies-data-owner-storage.md#prerequisites)
+* [Data owner policies on an Azure SQL Database](./how-to-policies-data-owner-azure-sql-db.md#prerequisites)*
+* [Data owner policies on an Arc-enabled SQL Server](./how-to-policies-data-owner-arc-sql-server.md#prerequisites)*
+
+(*) Only the *SQL Performance monitoring* and *Security auditing* actions are fully supported for SQL-type data sources. The *Read* action needs a workaround described later in this guide. The *Modify* action is not currently supported for SQL-type data sources.
+
+## Configuration
+
+### Register the subscription or resource group for Data Use Management
+The subscription or resource group needs to be registered with Microsoft Purview to later define access policies.
+
+To register your subscription or resource group, follow the **Prerequisites** and **Register** sections of this guide:
+
+- [Register multiple sources in Microsoft Purview](register-scan-azure-multiple-sources.md#prerequisites)
+
+After you've registered your resources, you'll need to enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+In the end, your resource will have the **Data Use Management** toggle **Enabled**, as shown in the picture:
+
+![Screenshot shows how to register a resource group or subscription for policy by toggling the enable tab in the resource editor.](./media/how-to-policies-data-owner-resource-group/register-resource-group-for-policy.png)
+
+>[!Important]
+> - If you want to create a policy on a resource group or subscription and have it enforced in Arc-enabled SQL servers, you will need to also register those servers independently for *Data use management* to provide their App ID.
+
+## Create and publish a data owner policy
+Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example shown in the image: a policy that provides security group *sg-Finance* *modify* access to resource group *finance-rg*. Use the Data source box in the Policy user experience.
+
+![Screenshot shows a sample data owner policy giving access to a resource group.](./media/how-to-policies-data-owner-resource-group/data-owner-policy-example-resource-group.png)
+
+>[!Important]
+> - Publish is a background operation. For example, Azure Storage accounts can take up to **2 hours** to reflect the changes.
+> - Changing a policy does not require a new publish operation. The changes will be picked up with the next pull.
+
+>[!Warning]
+> **Known Issues**
+> - No implicit connect permission is provided to SQL-type data sources (for example, Azure SQL DB or SQL Server on Azure Arc-enabled servers) when creating a policy with the *Read* action on a resource group or subscription. To support this scenario, grant the connect permission to the Azure AD principals locally, that is, directly in the SQL-type data sources, as sketched below.
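+
+As an illustration of that local workaround, the following is a hypothetical sketch that maps an Azure AD principal into a database so it gains connect permission. It assumes the Microsoft.Data.SqlClient package and an Azure AD admin connection; the server, database, and principal names are placeholders:
+
+```csharp
+using Microsoft.Data.SqlClient;
+
+// Connect as an Azure AD admin of the SQL logical server.
+var adminConnectionString =
+    "Server=tcp:<your-server>.database.windows.net,1433;" +
+    "Database=<your-database>;" +
+    "Authentication=Active Directory Default;Encrypt=True;";
+
+using var connection = new SqlConnection(adminConnectionString);
+connection.Open();
+
+// CREATE USER ... FROM EXTERNAL PROVIDER maps an Azure AD principal into the
+// database; a contained user receives CONNECT permission by default.
+using var command = new SqlCommand(
+    "CREATE USER [user@contoso.com] FROM EXTERNAL PROVIDER;", connection);
+command.ExecuteNonQuery();
+```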
+
+## Additional information
+- Creating a policy at subscription or resource group level will enable the Subjects to access Azure Storage system containers, for example, *$logs*. If this is undesired, first scan the data source and then create finer-grained policies for each (that is, at container or sub-container level).
+
+### Limits
+The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
+
+## Next steps
+Check the blog, demo, and related tutorials:
+
+* [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
+* [Blog: resource group-level governance can significantly reduce effort](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-resource-group-level-governance-can/ba-p/3096314)
+* [Video: Demo of data owner access policies for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
purview How To Policies Data Owner Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-storage.md
+
+ Title: Access provisioning by data owner to Azure Storage datasets (preview)
+description: Step-by-step guide showing how data owners can create access policies to datasets in Azure Storage
++++++ Last updated : 8/11/2022++
+# Access provisioning by data owner to Azure Storage datasets (Preview)
++
+[Access policies](concept-policies-data-owner.md) allow you to manage access from Microsoft Purview to data sources that have been registered for *Data Use Management*.
+
+This article describes how a data owner can use Microsoft Purview to delegate management of access to Azure Storage datasets. Currently, these two Azure Storage sources are supported:
+
+- Blob storage
+- Azure Data Lake Storage (ADLS) Gen2
+
+## Prerequisites
++
+## Configuration
+
+### Register the data sources in Microsoft Purview for Data Use Management
+The Azure Storage resources need to be registered first with Microsoft Purview to later define access policies.
+
+To register your resources, follow the **Prerequisites** and **Register** sections of these guides:
+
+- [Register and scan Azure Storage Blob - Microsoft Purview](register-scan-azure-blob-storage-source.md#prerequisites)
+
+- [Register and scan Azure Data Lake Storage (ADLS) Gen2 - Microsoft Purview](register-scan-adls-gen2.md#prerequisites)
+
+After you've registered your resources, you'll need to enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, as it delegates to certain Microsoft Purview roles to manage access to the data sources. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+Once your data source has the **Data Use Management** toggle **Enabled**, it will look like this picture:
++
+## Create and publish a data owner policy
+Execute the steps in the **Create a new policy** and **Publish a policy** sections of the [data-owner policy authoring tutorial](./how-to-policies-data-owner-authoring-generic.md#create-a-new-policy). The result will be a data owner policy similar to the example shown in the image: a policy that provides group *Contoso Team* *read* access to Storage account *marketinglake1*:
++
+>[!Important]
+> - Publish is a background operation. Azure Storage accounts can take up to **2 hours** to reflect the changes.
+
+## Data Consumption
+- Data consumers can access the requested dataset using tools such as Power BI or an Azure Synapse Analytics workspace.
+- Sub-container access: Policy statements set below container level on a Storage account are supported. However, users won't be able to browse to the data asset using the Azure portal's Storage Browser or the Microsoft Azure Storage Explorer tool if access is granted only at the file or folder level of the Azure Storage account. This is because these apps attempt to crawl down the hierarchy starting at container level, and the request fails because no access has been granted at that level. Instead, the app that requests the data must perform a direct access by providing a fully qualified name for the data object. The following documents show examples of how to perform a direct access; see also the blogs in the *Next steps* section of this how-to guide, and the sketch after this list.
+ - [*abfs* for ADLS Gen2](../hdinsight/hdinsight-hadoop-use-data-lake-storage-gen2.md#access-files-from-the-cluster)
+ - [*az storage blob download* for Blob Storage](../storage/blobs/storage-quickstart-blobs-cli.md#download-a-blob)
+
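+In the same spirit as the examples linked above, the following is a minimal C# sketch of a direct access, assuming the Azure.Storage.Blobs and Azure.Identity packages; the account, container, and path names are placeholders:
+
+```csharp
+using System;
+using Azure.Identity;
+using Azure.Storage.Blobs;
+
+// Address the blob by its fully qualified name; no container-level listing occurs.
+var blobUri = new Uri(
+    "https://<storage-account>.blob.core.windows.net/<container>/folder/sample.txt");
+
+// DefaultAzureCredential authenticates as the Azure AD principal named in the policy.
+var blobClient = new BlobClient(blobUri, new DefaultAzureCredential());
+
+// Download the blob directly to a local file.
+blobClient.DownloadTo("sample.txt");
+```
+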
+## Additional information
+- Creating a policy at Storage account level will enable the Subjects to access system containers, for example *$logs*. If this is undesired, first scan the data source(s) and then create finer-grained policies for each (that is, at container or subcontainer level).
+- The root blob in a container will be accessible to the Azure AD principals in a Microsoft Purview *allow*-type RBAC policy if the scope of such policy is either subscription, resource group, Storage account or container in Storage account.
+- The root container in a Storage account will be accessible to the Azure AD principals in a Microsoft Purview *allow*-type RBAC policy if the scope of such policy is either subscription, resource group, or Storage account.
+
+### Limits
+- The limit for Microsoft Purview policies that can be enforced by Storage accounts is 100 MB per subscription, which roughly equates to 5000 policies.
+
+### Known issues
+
+The following known issues relate to policy creation:
+- Do not create policy statements based on Microsoft Purview resource sets. Even if displayed in Microsoft Purview policy authoring UI, they are not yet enforced. Learn more about [resource sets](concept-resource-sets.md).
+
+### Policy action mapping
+
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure Storage.
+
+| **Microsoft Purview policy action** | **Data source specific actions** |
+||--|
+|||
+| *Read* |Microsoft.Storage/storageAccounts/blobServices/containers/read |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read |
+|||
+| *Modify* |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/write |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/add/action |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/move/action |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/read |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/write |
+| |Microsoft.Storage/storageAccounts/blobServices/containers/delete |
+|||
++
+## Next steps
+Check the blog, demo, and related tutorials:
+
+* [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
+* [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
+* [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
+* [Blog: What's New in Microsoft Purview at Microsoft Ignite 2021](https://techcommunity.microsoft.com/t5/azure-purview/what-s-new-in-azure-purview-at-microsoft-ignite-2021/ba-p/2915954)
+* [Blog: Accessing data when folder level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-folder-level-permission/ba-p/3109583)
+* [Blog: Accessing data when file level permission is granted](https://techcommunity.microsoft.com/t5/azure-purview-blog/data-policy-features-accessing-data-when-file-level-permission/ba-p/3102166)
purview How To Workflow Manage Requests Approvals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-requests-approvals.md
Previously updated : 03/09/2022 Last updated : 08/22/2022
Select the request to take action.
1. Select the correct status, add any comments, and select **Confirm**.
-### Re-assign requests
+### Reassign requests
-You can re-assign requests both approvals and tasks which are assigned to you to a different user.
+You can reassign requests, both approvals and tasks, that are assigned to you to a different user.
-1. To re-assign, select request or task you are assigned and click on **Reassign** in the following window.
+1. To reassign, select the request or task you're assigned and select **Reassign** in the following window.
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-button.png" alt-text="Screenshot showing the task selected and the Respond page is open, with details, a status, and a place for comments and re-assign button.":::
+ :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-button.png" alt-text="Screenshot showing the task selected and the Respond page is open, with details, a status, and a place for comments and reassign button.":::
-1. You will be not presented with a list of all the users who are assigned to the request. Click on **Assignee** where your user name or the group you are part of will be displayed and change it from your user name to the new user name. Click **Save** to complete the re-assignment.
+1. You'll be presented with a list of all the users who are assigned to the request. Select **Assignee** where your user name or the group you're part of is displayed and change it from your user name to the new user name. Select **Save** to complete the reassignment.
- :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-user.png" alt-text="Screenshot showing the request selected and the re-assign user.":::
+ :::image type="content" source="./media/how-to-workflow-manage-requests-approval/reassign-user.png" alt-text="Screenshot showing the request selected and the reassign user.":::
> [!NOTE] > You can only reassign your user ID, or a group you're part of, to another user or group. The other assignees will be greyed out and won't be available for reassignment.
Select the request to see the status and the outcomes for each approver/task own
### Cancel workflows
-You can cancel a submitted request and it's underlying workflow by clicking on **Cancel request and it's underlying workflow run**.
+You can cancel a submitted request and its underlying workflow by selecting **Cancel request and its underlying workflow run**.
:::image type="content" source="./media/how-to-workflow-manage-requests-approval/cancel-workflow-request.png" alt-text="Screenshot with the requests and approvals page shown on the 'My pending requests' tab, with cancel request."::: > [!NOTE]
- > You can only cancel workflows which are in progress. When you cancel a request from requests and approvals section, it will cancel underlying workflow run.
+ > You can only cancel workflows that are in progress. When you cancel a request from the **requests and approvals** section, it will cancel the underlying workflow run.
## History
purview How To Workflow Manage Runs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-workflow-manage-runs.md
This article outlines how to manage workflows that are already running.
:::image type="content" source="./media/how-to-workflow-manage-runs/select-stages.png" alt-text="Screenshot of the workflow runs page, with the workflow details page overlaid. Some workflow run actions in the request timeline have been expanded to show more information and sub steps.":::
-1. You can cancel a running workflow by clicking on **Cancel workflow run**.
+1. You can cancel a running workflow by selecting **Cancel workflow run**.
- :::image type="content" source="./media/how-to-workflow-manage-runs/cancel-workflows.png" alt-text="Screenshot of the workflow runs page, with the workflow details page overlaid and cancel button to cancel the workflow run.":::
+ :::image type="content" source="./media/how-to-workflow-manage-runs/cancel-workflows-inline.png" alt-text="Screenshot of the workflow runs page, with the workflow details page overlaid and cancel button to cancel the workflow run." lightbox="./media/how-to-workflow-manage-runs/cancel-workflows.png":::
> [!NOTE]
- > You can only cancel workflows which are in progress.
+ > You can only cancel workflows that are in progress.
## Next steps
purview Microsoft Purview Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/microsoft-purview-connector-overview.md
The table below shows the supported capabilities for each data source. Select th
|| [SAP HANA](register-scan-sap-hana.md) | [Yes](register-scan-sap-hana.md#register) | No | No | No | No | || [Snowflake](register-scan-snowflake.md) | [Yes](register-scan-snowflake.md#register) | No | [Yes](register-scan-snowflake.md#lineage) | No | No | || [SQL Server](register-scan-on-premises-sql-server.md)| [Yes](register-scan-on-premises-sql-server.md#register) |[Yes](register-scan-on-premises-sql-server.md#scan) | No* | No| No |
-|| SQL Server on Azure-Arc| No |No | No |[Yes (Preview)](how-to-data-owner-policies-arc-sql-server.md) | No |
+|| SQL Server on Azure-Arc| No |No | No |[Yes (Preview)](how-to-policies-data-owner-arc-sql-server.md) | No |
|| [Teradata](register-scan-teradata-source.md)| [Yes](register-scan-teradata-source.md#register)| [Yes](register-scan-teradata-source.md#scan)| [Yes*](register-scan-teradata-source.md#lineage) | No| No | |File|[Amazon S3](register-scan-amazon-s3.md)|[Yes](register-scan-amazon-s3.md)| [Yes](register-scan-amazon-s3.md)| Limited* | No| No | ||[HDFS](register-scan-hdfs.md)|[Yes](register-scan-hdfs.md)| [Yes](register-scan-hdfs.md)| No | No| No |
purview Quickstart Data Share Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-data-share-dotnet.md
+
+ Title: 'Quickstart: Share data using the .NET SDK'
+description: This article will guide you through sharing and receiving data using Microsoft Purview Data Sharing account through the .NET SDK.
++++
+ms.devlang: csharp
+ Last updated : 08/17/2022++
+# Quickstart: Share and receive data with the Microsoft Purview Data Sharing .NET SDK
++
+In this quickstart, you'll use the .NET SDK to share data and receive shares from Azure Data Lake Storage (ADLS Gen2) or Blob storage accounts. The article includes code snippets that will allow you to share and receive data using Microsoft Purview Data Sharing.
+
+For an overview of how data sharing works, watch this short [demo](https://aka.ms/purview-data-share/overview-demo).
++
+### Visual Studio
+
+The walkthrough in this article uses Visual Studio 2019. The procedures for Visual Studio 2013, 2015, or 2017 may differ slightly.
+
+### Azure .NET SDK
+
+Download and install [Azure .NET SDK](https://azure.microsoft.com/downloads/) on your machine.
+
+## Create an application in Azure Active Directory
+
+1. In [Create an Azure Active Directory application](../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal), create an application that represents the .NET application you're creating in this tutorial. For the sign-on URL, you can provide a dummy URL as shown in the article (`https://contoso.org/exampleapp`).
+1. In [Get values for signing in](../active-directory/develop/howto-create-service-principal-portal.md#get-tenant-and-app-id-values-for-signing-in), get the **application ID**, **tenant ID**, and **object ID**, and note down these values; you'll use them later in this tutorial (see the credential sketch after the roles table below).
+1. In [Certificates and secrets](../active-directory/develop/howto-create-service-principal-portal.md#authentication-two-options), get the **authentication key**, and note down this value; you'll use it later in this tutorial.
+1. [Assign the application to these roles:](../active-directory/develop/howto-create-service-principal-portal.md#assign-a-role-to-the-application)
+
+ | | Azure Storage Account Roles | Microsoft Purview Collection Roles |
+ |: |: |: |
+ | **Data Provider** |Owner OR Blob Storage Data Owner|Data Share Contributor|
+ | **Data Consumer** |Contributor OR Owner OR Storage Blob Data Contributor OR Blob Storage Data Owner|Data Share Contributor|
+
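+The code samples later in this quickstart authenticate with `DefaultAzureCredential`. As a minimal sketch of how the values recorded above reach that credential chain, you can either export them as the environment variables `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, and `AZURE_CLIENT_SECRET`, or construct a `ClientSecretCredential` directly (the three values below are placeholders for the ones you noted down):
+
+```csharp
+using Azure.Identity;
+
+// Values recorded when you registered the Azure AD application.
+var tenantId = "<tenant-id>";
+var clientId = "<application-id>";
+var clientSecret = "<authentication-key>";
+
+// Authenticates as the service principal itself.
+var credential = new ClientSecretCredential(tenantId, clientId, clientSecret);
+
+// Alternatively, DefaultAzureCredential (used in the samples below) picks up
+// AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET from the environment.
+```
+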
+## Create a Visual Studio project
+
+Next, create a C# .NET console application in Visual Studio:
+
+1. Launch **Visual Studio**.
+2. In the Start window, select **Create a new project** > **Console App (.NET Framework)**. .NET version 4.5.2 or above is required.
+3. In **Project name**, enter **PurviewDataSharingQuickStart**.
+4. Select **Create** to create the project.
+
+## Install NuGet packages
+
+1. Select **Tools** > **NuGet Package Manager** > **Package Manager Console**.
+2. In the **Package Manager Console** pane, run the following commands to install packages. For more information, see the [Azure.Analytics.Purview.Share NuGet package](https://www.nuget.org/packages/Azure.Analytics.Purview.Share/1.0.3-beta.20).
+
+ ```powershell
+ Install-Package Microsoft.Azure.Purview.Share.ManagementClient
+ Install-Package Microsoft.Azure.Management.ResourceManager -IncludePrerelease
+ Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory
+ Install-Package Azure.Analytics.Purview.Share
+ Install-Package Azure.Analytics.Purview.Account
+ ```
+
+## Create a sent share
+
+The below code will create a data share that you can send to internal or external users.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **sentShareName** - a name for your new data share
+- **description** - an optional description for your data share
+- **collectionName** - the name of the collection where your share will be housed.
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_01_Namespaces
+using System;
+using System.Text.Json;
+using Azure.Core;
+using Azure.Identity;
+// The following namespaces are assumed to match the installed package names.
+using Azure.Analytics.Purview.Account;
+using Azure.Analytics.Purview.Share;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+var sentShareClient = new SentSharesClient(endPoint, credential);
+var collectionName = "<name of your collection>";
+
+// Get collection internal reference name
+// Collections that are not the root collection have a friendlyName (the name you see in the Microsoft Purview Data Map) and an internal reference name used to access the collection programmatically.
+// An account client (from the Azure.Analytics.Purview.Account package installed earlier) is needed to list collections.
+var accountClient = new PurviewAccountClient(new Uri("https://<my-account-name>.purview.azure.com"), credential);
+var collectionsResponse = await accountClient.GetCollectionsAsync();
+
+var collectionsResponseDocument = JsonDocument.Parse(collectionsResponse.Content);
+var collections = collectionsResponseDocument.RootElement.GetProperty("value");
+
+foreach (var collection in collections.EnumerateArray())
+{
+ if (String.Equals(collectionName, collection.GetProperty("friendlyName").ToString(), StringComparison.OrdinalIgnoreCase))
+ {
+ collectionName = collection.GetProperty("name").ToString();
+ }
+}
+
+// Create sent share
+var sentShareName = "sample-Share";
+var inPlaceSentShareDto = new
+{
+ shareKind = "InPlace",
+ properties = new
+ {
+ description = "demo share",
+ collection = new
+ {
+ referenceName = collectionName,
+ type = "CollectionReference"
+ }
+ }
+};
+var sentShare = await sentShareClient.CreateOrUpdateAsync(sentShareName, RequestContent.Create(inPlaceSentShareDto));
+```
+
+## Add an asset to a sent share
+
+The below code will add a data asset to a share you're sending.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **sentShareName** - the name of the data share where you'll add the asset
+- **assetName** - a name for the asset you'll be adding
+- **senderStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data asset is stored
+- **senderStorageContainer** - the name of container where your asset is housed in your storage account
+- **senderPathToShare** - the folder and file path for the asset you'll be sharing
+- **pathNameForReceiver** - a (path-compliant) name for the share that will be visible to the data receiver
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_02_Namespaces
+using Azure.Core;
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+// Add asset to sent share
+var sentShareName = "sample-Share";
+var assetName = "fabrikam-blob-asset";
+var assetNameForReceiver = "receiver-visible-asset-name";
+var senderStorageResourceId = "<SENDER_STORAGE_ACCOUNT_RESOURCE_ID>";
+var senderStorageContainer = "fabrikamcontainer";
+var senderPathToShare = "folder/sample.txt";
+var pathNameForReceiver = "from-fabrikam";
+var assetData = new
+{
+ // For Adls Gen2 asset use "AdlsGen2Account"
+ kind = "blobAccount",
+ properties = new
+ {
+ storageAccountResourceId = senderStorageResourceId,
+ receiverAssetName = assetNameForReceiver,
+ paths = new[]
+ {
+ new
+ {
+ containerName = senderStorageContainer,
+ senderPath = senderPathToShare,
+ receiverPath = pathNameForReceiver
+ }
+ }
+ }
+};
+var assetsClient = new AssetsClient(endPoint, credential);
+await assetsClient.CreateAsync(WaitUntil.Started, sentShareName, assetName, RequestContent.Create(assetData));
+```
+
+## Send invitation
+
+The below code will send an invitation to a data share.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **sentShareName** - the name of the data share you're sending
+- **invitationName** - a name for your invitation
+- **targetEmail** or **targetActiveDirectoryId and targetObjectId** - the email address **or** objectId and tenantId for the user or service principal that will receive the data share. TenantId is optional.
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_03_Namespaces
+using Azure.Core;
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+// Send invitation
+var sentShareName = "sample-Share";
+var invitationName = "invitation-to-fabrikam";
+var invitationData = new
+{
+ invitationKind = "User",
+ properties = new
+ {
+ targetEmail = "user@domain.com"
+ }
+};
+// Instead of sending invitation to Azure login email of the user, you can send invitation to object ID of a service principal and tenant ID.
+// Tenant ID is optional. To use this method, comment out the previous declaration, and uncomment the next one.
+//var invitationData = new
+//{
+// invitationKind = "Application",
+// properties = new
+// {
+// targetActiveDirectoryId = "<targetActiveDirectoryId>",
+// targetObjectId = "<targetObjectId>"
+// }
+//};
+var sentShareInvitationsClient = new SentShareInvitationsClient(endPoint, credential);
+await sentShareInvitationsClient.CreateOrUpdateAsync(sentShareName, invitationName, RequestContent.Create(invitationData));
+```
+
+## View sent share invitations
+
+The below code will allow you to view your sent invitations.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **sentShareName** - the name of the share that an invitation was sent for
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_04_Namespaces
+using System.Linq;
+using System.Text.Json;
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+var sentShareName = "sample-Share";
+// View sent share invitations. (Pending/Rejected)
+var sentShareInvitationsClient = new SentShareInvitationsClient(endPoint, credential);
+var sentShareInvitations = sentShareInvitationsClient.GetSentShareInvitations(sentShareName);
+var responseInvitation = sentShareInvitations.FirstOrDefault();
+if (responseInvitation == null)
+{
+ //No invitations
+ return;
+}
+var responseInvitationDocument = JsonDocument.Parse(responseInvitation);
+var targetEmail = responseInvitationDocument.RootElement.GetProperty("properties").GetProperty("targetEmail");
+```
+
+## View received invitations
+
+The below code will allow you to view your received invitations.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_05_Namespaces
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+// View received invitations
+var receivedInvitationsClient = new ReceivedInvitationsClient(endPoint, credential);
+var receivedInvitations = receivedInvitationsClient.GetReceivedInvitations();
+```
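+
+As a follow-up sketch (assuming the same usings plus `System` and `System.Text.Json`), you can iterate the results and print each invitation's name:
+
+```csharp
+foreach (var invitation in receivedInvitations)
+{
+    // Each item is the JSON body of one received invitation.
+    var name = JsonDocument.Parse(invitation).RootElement.GetProperty("name").GetString();
+    Console.WriteLine(name);
+}
+```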
+
+## Create a received share
+
+The below code will allow you to receive a data share.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **receivedShareName** - a name for the share that is being received
+- **sentShareLocation** - the region where the share is housed. It should be the same region as the sent share and will be one of [Microsoft Purview's available regions](https://azure.microsoft.com/global-infrastructure/services/?products=purview&regions=all).
+- **collectionName** - the name of the collection where your share will be housed.
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_06_Namespaces
+using System;
+using System.Linq;
+using System.Text.Json;
+using Azure.Core;
+using Azure.Identity;
+// The following namespaces are assumed to match the installed package names.
+using Azure.Analytics.Purview.Account;
+using Azure.Analytics.Purview.Share;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+var collectionName = "<name of your collection>";
+
+// Create received share
+var receivedInvitationsClient = new ReceivedInvitationsClient(endPoint, credential);
+var receivedInvitations = receivedInvitationsClient.GetReceivedInvitations();
+var receivedShareName = "fabrikam-received-share";
+var receivedInvitation = receivedInvitations.LastOrDefault();
+
+// Get collection internal reference name
+// Collections that are not the root collection have a friendlyName (the name you see in the Microsoft Purview Data Map) and an internal reference name used to access the collection programmatically.
+// An account client (from the Azure.Analytics.Purview.Account package installed earlier) is needed to list collections.
+var accountClient = new PurviewAccountClient(new Uri("https://<my-account-name>.purview.azure.com"), credential);
+var collectionsResponse = await accountClient.GetCollectionsAsync();
+
+var collectionsResponseDocument = JsonDocument.Parse(collectionsResponse.Content);
+var collections = collectionsResponseDocument.RootElement.GetProperty("value");
+
+foreach (var collection in collections.EnumerateArray())
+{
+    if (String.Equals(collectionName, collection.GetProperty("friendlyName").ToString(), StringComparison.OrdinalIgnoreCase))
+ {
+ collectionName = collection.GetProperty("name").ToString();
+ }
+}
+
+if (receivedInvitation == null)
+{
+ //No received invitations
+ return;
+}
+var receivedInvitationDocument = JsonDocument.Parse(receivedInvitation).RootElement;
+var receivedInvitationId = receivedInvitationDocument.GetProperty("name");
+var receivedShareData = new
+{
+ shareKind = "InPlace",
+ properties = new
+ {
+ invitationId = receivedInvitationId,
+ sentShareLocation = "eastus",
+ collection = new
+ {
+ referenceName = collectionName,
+ type = "CollectionReference"
+ }
+ }
+};
+var receivedShareClient = new ReceivedSharesClient(endPoint, credential);
+var receivedShare = await receivedShareClient.CreateAsync(receivedShareName, RequestContent.Create(receivedShareData));
+```
+
+## View accepted shares
+
+The below code will allow you to view your accepted shares.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **sentShareName** - the name of the share you would like to view
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_07_Namespaces
+using System.Linq;
+using System.Text.Json;
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+var sentShareName = "sample-Share";
+// View accepted shares
+var acceptedSentSharesClient = new AcceptedSentSharesClient(endPoint, credential);
+var acceptedSentShares = acceptedSentSharesClient.GetAcceptedSentShares(sentShareName);
+var acceptedSentShare = acceptedSentShares.FirstOrDefault();
+if (acceptedSentShare == null)
+{
+ //No accepted sent shares
+ return;
+}
+var receiverEmail = JsonDocument.Parse(acceptedSentShare).RootElement.GetProperty("properties").GetProperty("receiverEmail").GetString();
+```
+
+## Get received assets
+
+The below code will allow you to see the assets received from a share.
+To use it, be sure to fill out these variables:
+
+- **endpoint** - "https://\<my-account-name>.purview.azure.com/share". Replace **\<my-account-name>** with the name of your Microsoft Purview instance
+- **receivedShareName** - the name of the share you would like to view the assets from
+- **assetMappingName** - consumer-provided input used as the identifier for the created asset mapping
+- **receiverContainerName** - the name of the container where the assets were housed
+- **receiverFolderName** - the name of the folder where the assets were housed
+- **receiverMountPath** - an optional input path parameter for destination mapping location
+- **receiverStorageResourceId** - the [resource ID for the storage account](../storage/common/storage-account-get-info.md#get-the-resource-id-for-a-storage-account) where the data asset is stored
+
+```C# Snippet:Azure_Analytics_Purview_Share_Samples_08_Namespaces
+using System;
+using System.Linq;
+using System.Text.Json;
+using Azure.Core;
+using Azure.Identity;
+
+var credential = new DefaultAzureCredential();
+var endPoint = "https://<my-account-name>.purview.azure.com/share";
+// Get received assets
+var receivedShareName = "fabrikam-received-share";
+var receivedAssetsClient = new ReceivedAssetsClient(endPoint, credential);
+var receivedAssets = receivedAssetsClient.GetReceivedAssets(receivedShareName);
+var receivedAssetName = JsonDocument.Parse(receivedAssets.First()).RootElement.GetProperty("name").GetString();
+string assetMappingName = "receiver-asset-mapping";
+string receiverContainerName = "receivedcontainer";
+string receiverFolderName = "receivedfolder";
+string receiverMountPath = "receivedmountpath";
+string receiverStorageResourceId = "<RECEIVER_STORAGE_ACCOUNT_RESOURCE_ID>";
+var assetMappingData = new
+{
+ // For Adls Gen2 asset use "AdlsGen2Account"
+ kind = "BlobAccount",
+ properties = new
+ {
+ assetId = Guid.Parse(receivedAssetName),
+ storageAccountResourceId = receiverStorageResourceId,
+ containerName = receiverContainerName,
+ folder = receiverFolderName,
+ mountPath = receiverMountPath
+ }
+};
+var assetMappingsClient = new AssetMappingsClient(endPoint, credential);
+var assetMapping = await assetMappingsClient.CreateAsync(WaitUntil.Completed, receivedShareName, assetMappingName, RequestContent.Create(assetMappingData));
+```
+
+## Clean up resources
+
+To clean up the resources created for the quickstart, follow the steps below:
+
+1. Within [Microsoft Purview governance portal](https://web.purview.azure.com/), [delete the sent share](how-to-share-data.md#delete-a-sent-share).
+1. Also [delete your received share](how-to-receive-share.md#delete-received-share).
+1. Once the shares are successfully deleted, delete the target container and folder that Microsoft Purview created in your target storage account when you received shared data.
+
+## Next steps
+
+* [FAQ for data sharing](how-to-data-share-faq.md)
+* [How to share data](how-to-share-data.md)
+* [How to receive share](how-to-receive-share.md)
+* [REST API reference](/rest/api/purview/)
purview Quickstart Data Share https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/quickstart-data-share.md
This article provides a quick guide on how to share data and receive shares from
For an overview of how data sharing works, watch this short [demo](https://aka.ms/purview-data-share/overview-demo).
-## Prerequisites
-
-### Microsoft Purview prerequisites
-
-* [A Microsoft Purview account](create-catalog-portal.md). You can also use two Microsoft Purview accounts, one for data provider and one for data consumer to test both workflows.
-* Your recipient's Azure sign-in email address that you can use to send the invitation to. The recipient's email alias won't work.
-
-### Azure Storage account prerequisites
-
-* Your Azure subscription must be registered for the **AllowDataSharing** preview feature. Follow the below steps using Azure portal or PowerShell.
-
- # [Portal](#tab/azure-portal)
- 1. In Azure portal, select your Azure subscription that you'll use to create the source and target storage account.
- 1. From the left menu, select **Preview features** under *Settings*.
- 1. Select **AllowDataSharing** and *Register*.
- 1. Refresh the *Preview features* screen to verify the *State* is **Registered**. It could take 15 minutes to 1 hour for registration to complete.
-
- For more information, see [Register preview feature](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
-
- # [PowerShell](#tab/powershell)
- ```azurepowershell
- Set-AzContext -SubscriptionId [Your Azure subscription ID]
- ```
- ```azurepowershell
- Register-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
- ```azurepowershell
- Get-AzProviderFeature -FeatureName "AllowDataSharing" -ProviderNamespace "Microsoft.Storage"
- ```
- The *RegistrationState* should be **Registered**. It could take 15 minutes to 1 hour for registration to complete. For more information, see [Register preview feature](../azure-resource-manager/management/preview-features.md?tabs=azure-portal#register-preview-feature).
-
-* Source and target storage accounts **created after** the registration step is completed. **Both storage accounts must be in the same Azure region as each other**. Both storage accounts need to be ADLS Gen2 or Blob Storage accounts. Your storage accounts can be in a different Azure region from your Microsoft Purview account.
-
- > [!NOTE]
- > The following are supported storage account configurations:
- >
- > - Azure regions: Canada Central, Canada East, UK South, UK West, Australia East, Japan East, Korea South, and South Africa North
- > - Performance: Standard
- > - Redundancy options: LRS, GRS, RA-GRS
-
-* If the storage accounts are in an Azure subscription different than Microsoft Purview account, [register the Microsoft.Purview resource provider](../azure-resource-manager/management/resource-providers-and-types.md) in the Azure subscription where the storage accounts are located.
-* Latest version of the storage SDK, PowerShell, CLI and Azure Storage Explorer. Storage REST API version must be February 2020 or later.
-* The storage accounts need to be registered in the collections where you'll send or receive the share. If you're using one Microsoft Purview account, this can be two different collections, or the same collection. For instructions to register, see the [ADLS Gen2](register-scan-adls-gen2.md) or [Blob storage](register-scan-azure-blob-storage-source.md) data source pages.
-
-### Required roles
-Below are required roles for sharing data and receiving shares.
-
-| | Azure Storage Account Roles | Microsoft Purview Collection Roles |
-|: |: |: |
-| **Data Provider** |Owner OR Storage Blob Data Owner|Data Share Contributor|
-| **Data Consumer** |Contributor OR Owner OR Storage Blob Data Contributor OR Storage Blob Data Owner|Data Share Contributor|
-
-Note: If you created the Microsoft Purview account, you're automatically assigned all the roles to the root collection. Refer to [Microsoft Purview permissions](catalog-permissions.md) to learn more about the Microsoft Purview collection and roles.
## Create a share
purview Register Scan Adls Gen2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-adls-gen2.md
It's important to register the data source in Microsoft Purview prior to setting
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-select-data-source.png" alt-text="Screenshot that allows selection of the data source":::
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, existing **Data Lake Store account name** and the **collection** and select **Apply**. Leave the **Data Use Management** toggle on the **disabled** position until you have a chance to carefully go over this [document](./how-to-access-policies-storage.md).
+1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, existing **Data Lake Store account name** and the **collection** and select **Apply**. Leave the **Data Use Management** toggle on the **disabled** position until you have a chance to carefully go over this [document](./how-to-policies-data-owner-storage.md).
:::image type="content" source="media/register-scan-adls-gen2/register-adls-gen2-data-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
Source storage account can support up to 20 targets, and target storage account
## Access policy To create an access policy for Azure Data Lake Storage Gen 2, follow these guides:
-* [Single storage account](./how-to-data-owner-policies-storage.md) - This guide will allow you to enable access policies on a single Azure Storage account in your subscription.
-* [All sources in a subscription or resource group](./how-to-data-owner-policies-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
+* [Single storage account](./how-to-policies-data-owner-storage.md) - This guide will allow you to enable access policies on a single Azure Storage account in your subscription.
+* [All sources in a subscription or resource group](./how-to-policies-data-owner-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
## Next steps
purview Register Scan Azure Blob Storage Source https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-blob-storage-source.md
It is important to register the data source in Microsoft Purview prior to settin
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-select-data-source.png" alt-text="Screenshot that allows selection of the data source":::
-1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, existing **Azure Blob Storage account name** and the **collection** and select **Apply**. Leave the **Data Use Management** toggle on the **disabled** position until you have a chance to carefully go over this [document](./how-to-access-policies-storage.md).
+1. Provide a suitable **Name** for the data source, select the relevant **Azure subscription**, existing **Azure Blob Storage account name** and the **collection** and select **Apply**. Leave the **Data Use Management** toggle on the **disabled** position until you have a chance to carefully go over this [document](./how-to-policies-data-owner-storage.md).
:::image type="content" source="media/register-scan-azure-blob-storage-source/register-blob-data-source-details.png" alt-text="Screenshot that shows the details to be entered in order to register the data source":::
Source storage account can support up to 20 targets, and target storage account
## Access policy To create an access policy for Azure Blob Storage, follow these guides:
-* [Single storage account](./how-to-data-owner-policies-storage.md) - This guide will allow you to enable access policies on a single Azure Storage account in your subscription.
-* [All sources in a subscription or resource group](./how-to-data-owner-policies-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
+* [Single storage account](./how-to-policies-data-owner-storage.md) - This guide will allow you to enable access policies on a single Azure Storage account in your subscription.
+* [All sources in a subscription or resource group](./how-to-policies-data-owner-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
## Next steps
purview Register Scan Azure Multiple Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-multiple-sources.md
This article outlines how to register multiple Azure sources and how to authenti
|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Access Policy**|**Lineage**|**Data Sharing**| |||||||||
-| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan)| [Yes](#scan)| [Yes](how-to-data-owner-policies-resource-group.md) | [Source Dependant](catalog-lineage-user-guide.md)| No |
+| [Yes](#register) | [Yes](#scan) | [Yes](#scan) | [Yes](#scan)| [Yes](#scan)| [Yes](how-to-policies-data-owner-resource-group.md) | [Source Dependant](catalog-lineage-user-guide.md)| No |
## Prerequisites
purview Register Scan Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/register-scan-azure-sql-database.md
Scans can be managed or run again on completion
## Access policy To create an access policy for Azure SQL Database, follow these guides:
-* [Single SQL account](./how-to-data-owner-policies-azure-sql-db.md) - This guide will allow you to enable access policies on a single Azure SQL Database account in your subscription.
-* [All data sources in a subscription or resource group](./how-to-data-owner-policies-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
+* [Single SQL account](./how-to-policies-data-owner-azure-sql-db.md) - This guide will allow you to enable access policies on a single Azure SQL Database account in your subscription.
+* [All data sources in a subscription or resource group](./how-to-policies-data-owner-resource-group.md) - This guide will allow you to enable access policies on all enabled and available sources in a resource group, or across an Azure subscription.
## Lineage (Preview) <a id="lineagepreview"></a>
purview Tutorial Data Owner Policies Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-data-owner-policies-storage.md
Last updated 04/08/2022
[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]
-[Policies](concept-data-owner-policies.md) in Microsoft Purview allow you to enable access to data sources that have been registered to a collection. This tutorial describes how a data owner can use Microsoft Purview to enable access to datasets in Azure Storage through Microsoft Purview.
+[Policies](concept-policies-data-owner.md) in Microsoft Purview allow you to enable access to data sources that have been registered to a collection. This tutorial describes how a data owner can use Microsoft Purview to enable access to datasets in Azure Storage through Microsoft Purview.
In this tutorial, you learn how to:
> [!div class="checklist"]
Check our demo and related tutorials:
> [!div class="nextstepaction"] > [Demo of access policy for Azure Storage](https://learn-video.azurefd.net/vod/player?id=caa25ad3-7927-4dcc-88dd-6b74bcae98a2)
-> [Concepts for Microsoft Purview data owner policies](./concept-data-owner-policies.md)
-> [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-data-owner-policies-resource-group.md)
+> [Concepts for Microsoft Purview data owner policies](./concept-policies-data-owner.md)
+> [Enable Microsoft Purview data owner policies on all data sources in a subscription or a resource group](./how-to-policies-data-owner-resource-group.md)
remote-rendering Override Materials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/remote-rendering/how-tos/conversion/override-materials.md
The full JSON schema for materials files is given here. Except for `unlit` and `
"useVertexColor": { "type" : "boolean" }, "isDoubleSided": { "type" : "boolean" }, "ignoreTextureMaps": { "$ref" : "#/definitions/listOfMaps" },
- "transparencyWriteDepth": {"type" : "boolean" }
+ "transparencyWritesDepth": {"type" : "boolean" }
}, "required": ["name"], "additionalProperties" : false
security Steps Secure Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/steps-secure-identity.md
Apps using their own legacy methods to authenticate with Azure AD and access com
We recommend the following actions:
-1. Discover legacy authentication in your organization with Azure AD Sign-In logs and Log Analytic workbooks.
+1. Discover legacy authentication in your organization with Azure AD Sign-In logs and Log Analytics workbooks (see the sample query after this list).
1. Set up SharePoint Online and Exchange Online to use modern authentication.
-1. If you have Azure AD Premium licenses, use Conditional Access policies to block legacy authentication. For Azure AD free tier, use Azure AD Security Defaults.
-1. Block legacy authentication if you use AD FS.
-1. Block Legacy Authentication with Exchange Server 2019.
-1. Disable legacy authentication in Exchange Online.
+1. If you have Azure AD Premium licenses, use Conditional Access policies to block legacy authentication. For Azure AD free tier, use Azure AD Security Defaults.
+1. Block legacy authentication if you use AD FS.
+1. Block Legacy Authentication with Exchange Server 2019.
+1. Disable legacy authentication in Exchange Online.
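For step 1, a Log Analytics query along the following lines can surface sign-ins that used legacy clients. Treat it as a starting sketch: it assumes Azure AD sign-in logs are already exported to a Log Analytics workspace, and the `ClientAppUsed` values shown are common examples that you may need to adjust for your tenant:

```kusto
// Sign-ins from clients that used legacy authentication (adjust the filter for your tenant)
SigninLogs
| where ClientAppUsed in ("Exchange ActiveSync", "IMAP4", "POP3", "Authenticated SMTP", "Other clients")
| summarize SignInCount = count() by UserPrincipalName, ClientAppUsed
| order by SignInCount desc
```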
-For more information, see the article [Blocking legacy authentication protocols in Azure AD](../../active-directory/fundamentals/concept-fundamentals-block-legacy-authentication.md).
+For more information, see the article [Blocking legacy authentication protocols in Azure AD](../../active-directory/conditional-access/block-legacy-authentication.md).
### Block invalid authentication entry points
service-fabric Service Fabric Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-fabric/service-fabric-versions.md
The tables in this article outline the Service Fabric and platform versions that
## Windows
+### Current versions
| Service Fabric runtime |Can upgrade directly from|Can downgrade to*|Compatible SDK or NuGet package version|Supported .NET runtimes** |OS Version |End of support |
| | | | | | | |
| 9.0 CU2<br>9.0.1048.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 6.0 | .NET 6.0 (GA), >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | Current version |
| 9.0 CU1<br>9.0.1028.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 6.0 | .NET 6.0 (GA), >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | Current version |
| 9.0 RTO<br>9.0.1017.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 6.0 | .NET 6.0 (GA), >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | Current version |
+
+
+| Service Fabric runtime |Can upgrade directly from|Can downgrade to*|Compatible SDK or NuGet package version|Supported .NET runtimes** |OS Version |End of support |
+| | | | | | | |
| 8.2 CU4<br>8.2.1659.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 5.2 | .NET 5.0, >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | November 1, 2022 |
| 8.2 CU3<br>8.2.1620.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 5.2 | .NET 5.0, >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | November 1, 2022 |
| 8.2 CU2.1<br>8.2.1571.9590 | 8.0 CU3<br>8.0.536.9590 | 8.0 | Less than or equal to version 5.2 | .NET 5.0, >= .NET Core 3.1, <br>All >= .NET Framework 4.5 | [See supported OS version](#supported-windows-versions-and-support-end-date) | November 1, 2022 |
Support for Service Fabric on a specific OS ends when support for the OS version
## Linux
+### Current versions
+| Service Fabric runtime | Can upgrade directly from |Can downgrade to*|Compatible SDK or NuGet package version | Supported .NET runtimes** | OS version | End of support |
+| | | | | | | |
+| 9.0 CU2.1<br>9.0.1086.1 | 8.0 CU3<br>8.0.527.1 | 8.2 CU 5.1<br>8.2.1483.1 | Less than or equal to version 6.0 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | Current version |
+| 8.2 CU5.1<br>8.2.1483.1 | 8.0 CU3<br>8.0.527.1 | N/A | Less than or equal to version 5.2 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | December 1, 2022 |
+
| Service Fabric runtime | Can upgrade directly from |Can downgrade to*|Compatible SDK or NuGet package version | Supported .NET runtimes** | OS version | End of support |
| | | | | | | |
-| 9.0 CU2.1<br>9.0.1086.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 6.0 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | Current version |
| 9.0 CU2<br>9.0.1056.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 6.0 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
| 9.0 CU1<br>9.0.1035.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 6.0 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
| 9.0 RTO<br>9.0.1018.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 6.0 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
-| 8.2 CU5.1<br>8.2.1483.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 5.2 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | December 1, 2022 |
| 8.2 CU4<br>8.2.1458.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 5.2 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
| 8.2 CU3<br>8.2.1434.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 5.2 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
| 8.2 CU2.1<br>8.2.1397.1 | 8.0 CU3<br>8.0.527.1 | 8.0 | Less than or equal to version 5.2 | >= .NET Core 2.1 | [See supported OS version](#supported-linux-versions-and-support-end-date) | August 19, 2022 |
The following table lists the version names of Service Fabric and their correspo
| Version name | Windows version number | Linux version number |
| | | |
| 9.0 CU2.1 | Not applicable | 9.0.1086.1 |
+| 8.2 CU5.1 | Not applicable | 8.2.1483.1 |
| 9.0 CU2 | 9.0.1048.9590 | 9.0.1056.1 |
| 9.0 CU1 | 9.0.1028.9590 | 9.0.1035.1 |
| 9.0 RTO | 9.0.1017.9590 | 9.0.1018.1 |
site-recovery Avs Tutorial Dr Drill Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/avs-tutorial-dr-drill-azure.md
Complete the previous tutorials:
## Verify VM properties
-Before you run a test failover, verify the VM properties, and make sure that the [VMware VM](vmware-physical-azure-support-matrix.md#replicated-machines) complies with Azure requirements.
+Before you run a test failover, verify the VM properties, and make sure that the [VMware vSphere VM](vmware-physical-azure-support-matrix.md#replicated-machines) complies with Azure requirements.
1. In **Protected Items**, click **Replicated Items** > and the VM.
2. In the **Replicated item** pane, there's a summary of VM information, health status, and the
site-recovery Avs Tutorial Failover https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/avs-tutorial-failover.md
Verify properties as follows:
In some scenarios, failover requires additional processing that takes around 8 to 10 minutes to complete. You might notice longer test failover times for:
-* VMware VMs running a Mobility service version older than 9.8.
-* VMware Linux VMs.
-* VMware VMs that don't have the DHCP service enabled.
-* VMware VMs that don't have the following boot drivers: storvsc, vmbus, storflt, intelide, atapi.
+* VMware vSphere VMs running a Mobility service version older than 9.8.
+* VMware vSphere Linux VMs.
+* VMware vSphere VMs that don't have the DHCP service enabled.
+* VMware vSphere VMs that don't have the following boot drivers: storvsc, vmbus, storflt, intelide, atapi.
> [!WARNING]
> Don't cancel a failover in progress. Before failover is started, VM replication is stopped. If you cancel a failover in progress, failover stops, but the VM won't replicate again.
site-recovery Avs Tutorial Prepare Avs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/avs-tutorial-prepare-avs.md
This is the second tutorial in a series that shows you how to set up disaster re
In this article, you learn how to: > [!div class="checklist"]
-> * Prepare an account on the vCenter server or vSphere ESXi host, to automate VM discovery.
-> * Prepare an account for automatic installation of the Mobility service on VMware VMs.
-> * Review VMware server and VM requirements and support.
+> * Prepare an account on the vCenter Server to automate VM discovery.
+> * Prepare an account for automatic installation of the Mobility service on VMware vSphere VMs.
+> * Review VMware vCenter Server and VM requirements and support.
> * Prepare to connect to Azure VMs after failover.
> [!NOTE]
Site Recovery needs access to Azure VMware Solution servers to:
Create the account as follows:
-1. To use a dedicated account, create a role at the vCenter level. Give the role a name such as
+1. To use a dedicated account, create a role at the vCenter Server level. Give the role a name such as
**Azure_Site_Recovery**.
2. Assign the role the permissions summarized in the table below.
-3. Create a user on the vCenter server or vSphere host. Assign the role to the user.
+3. Create a user on the vCenter Server. Assign the role to the user.
### VMware account permissions
Create the account as follows:
## Prepare an account for Mobility service installation
-The Mobility service must be installed on machines you want to replicate. Site Recovery can do a push installation of this service when you enable replication for a machine, or you can install it manually, or using installation tools.
+The Mobility service must be installed on machines you want to replicate. Azure Site Recovery can do a push installation of this service when you enable replication for a machine, or you can install it manually or by using installation tools.
- In this tutorial, we're going to install the Mobility service with the push installation.
-- For this push installation, you need to prepare an account that Site Recovery can use to access the VM. You specify this account
+- For this push installation, you need to prepare an account that Azure Site Recovery can use to access the VM. You specify this account
when you set up disaster recovery in the Azure console.
Prepare the account as follows:
-Prepare a domain or local account with permissions to install on the VM.
-
+- Prepare a domain or local account with permissions to install on the VM.
- **Windows VMs**: To install on Windows VMs if you're not using a domain account, disable Remote User Access control on the local machine. To do this, in the registry > **HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System**, add the DWORD entry **LocalAccountTokenFilterPolicy**, with a value of 1 (a sample command follows this list).
- **Linux VMs**: To install on Linux VMs, prepare a root account on the source Linux server.
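As a convenience, the registry change described above can be scripted. This is a sketch of one way to do it, run from an elevated command prompt on the source Windows VM, and is equivalent to adding the DWORD value by hand:

```cmd
reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f
```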
-## Check VMware requirements
+## Check Azure VMware Solution requirements
-Make sure VMware servers and VMs comply with requirements.
+Make sure VMware vCenter Server and VMs comply with requirements.
-1. Verify Azure VMware solution [software versions](../azure-vmware/concepts-private-clouds-clusters.md#vmware-software-versions).
-2. Verify [VMware server requirements](vmware-physical-azure-support-matrix.md#on-premises-virtualization-servers).
+1. Verify Azure VMware Solution [software versions](../azure-vmware/concepts-private-clouds-clusters.md#vmware-software-versions).
+2. Verify [VMware vCenter Server requirements](vmware-physical-azure-support-matrix.md#on-premises-virtualization-servers).
3. For Linux VMs, [check](vmware-physical-azure-support-matrix.md#linux-file-systemsguest-storage) file system and storage requirements.
4. Check [network](vmware-physical-azure-support-matrix.md#network) and [storage](vmware-physical-azure-support-matrix.md#storage) support.
5. Check what's supported for [Azure networking](vmware-physical-azure-support-matrix.md#azure-vm-network-after-failover), [storage](vmware-physical-azure-support-matrix.md#azure-storage), and [compute](vmware-physical-azure-support-matrix.md#azure-compute), after failover.
site-recovery Avs Tutorial Replication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/avs-tutorial-replication.md
In this tutorial, you learn how to:
> [!div class="checklist"] >
-> * Set up the source replication settings, and a Site Recovery configuration server in Azure VMware Solution private cloud
+> * Set up the source replication settings, and an Azure Site Recovery configuration server in Azure VMware Solution private cloud
> * Set up the replication target settings. > * Create a replication policy.
-> * Enable replication for a VMware VM.
+> * Enable replication for a VMware vSphere VM.
> [!NOTE]
> Tutorials show you the simplest deployment path for a scenario. They use default options where possible, and don't show all possible settings and paths. For detailed instructions, review the article in the How To section of the Site Recovery Table of Contents.
Complete the previous tutorials:
2. Follow [these steps](avs-tutorial-prepare-avs.md) to prepare your Azure VMware Solution deployment for disaster recovery to Azure.
3. In this tutorial, we show you how to replicate a single VM. If you're deploying multiple VMs, you should use the [Deployment Planner Tool](https://aka.ms/asr-deployment-planner). [Learn more](site-recovery-deployment-planner.md) about this tool.
4. This tutorial uses a number of options you might want to do differently:
- - The tutorial uses an OVA template to create the configuration server VMware VM. If you can't do this for some reason, follow [these instructions](physical-manage-configuration-server.md) to set up the configuration server manually.
+ - The tutorial uses an OVA template to create the configuration server VMware vSphere VM. If you can't do this for some reason, follow [these instructions](physical-manage-configuration-server.md) to set up the configuration server manually.
- In this tutorial, Site Recovery automatically downloads and installs MySQL to the configuration server. If you prefer, you can set it up manually instead. [Learn more](vmware-azure-deploy-configuration-server.md#configure-settings).
In your source environment, you need a single, highly available, on-premises mac
- **Master target server**: The master target server handles replication data during failback from Azure.
-All of these components are installed together on a single Azure VMware Solution machine that's known as the *configuration server*. By default, for Azure VMware Solution disaster recovery, we set up the configuration server as a highly available VMware VM. To do this, you download a prepared Open Virtualization Application (OVA) template, and import the template into VMware to create the VM.
+All of these components are installed together on a single Azure VMware Solution machine that's known as the *configuration server*. By default, for Azure VMware Solution disaster recovery, we set up the configuration server as a highly available VMware vSphere VM. To do this, you download a prepared Open Virtualization Application (OVA) template, and import the template into VMware vCenter Server to create the VM.
- The latest version of the configuration server is available in the portal. You can also download it directly from the [Microsoft Download Center](https://aka.ms/asrconfigurationserver).
- If for some reason you can't use an OVA template to set up a VM, follow [these instructions](physical-manage-configuration-server.md) to set up the configuration server manually.
All of these components are installed together on a single Azure VMware Solution
## Import the template in VMware
-1. Sign in to the VMware vCenter server or vSphere ESXi host with the VMware vSphere Client.
+1. Sign in to the VMware vCenter Server with the VMware vSphere Client.
2. On the **File** menu, select **Deploy OVF Template** to start the **Deploy OVF Template Wizard**.
- ![Screenshot of the Deploy OVF template command in the VMWare vSphere Client.](./media/vmware-azure-tutorial/vcenter-wizard.png)
+ ![Screenshot of the Deploy OVF template command in the VMware vSphere Client.](./media/vmware-azure-tutorial/vcenter-wizard.png)
3. On **Select source**, enter the location of the downloaded OVF.
4. On **Review details**, select **Next**.
If you want to add an additional NIC to the configuration server, add it before
## Register the configuration server
-After the configuration server is set up, you register it in the vault.
+After the configuration server is set up, you register it in the vault.
1. From the VMware vSphere Client console, turn on the VM.
2. The VM boots up into a Windows Server 2016 installation experience. Accept the license agreement, and enter an administrator password.
3. After the installation finishes, sign in to the VM as the administrator.
4. The first time you sign in, the Azure Site Recovery Configuration Tool starts within a few seconds.
-5. Enter a name that's used to register the configuration server with Site Recovery. Then select **Next**.
+5. Enter a name that's used to register the configuration server with Azure Site Recovery. Then select **Next**.
6. The tool checks that the VM can connect to Azure. After the connection is established, select **Sign in** to sign in to your Azure subscription. The credentials must have access to the vault in which you want to register the configuration server. Ensure that necessary [roles](vmware-azure-deploy-configuration-server.md#azure-active-directory-permission-requirements) are assigned to this user.
7. The tool performs some configuration tasks and then reboots.
8. Sign in to the machine again. In a few seconds, the Configuration Server Management Wizard starts automatically.
-### Configure settings and add the VMware server
+### Configure settings and add the VMware vCenter Server
Finish setting up and registering the configuration server. Before proceeding, ensure all [pre-requisites](vmware-azure-deploy-configuration-server.md#prerequisites) are met for successful setup of the configuration server.
Finish setting up and registering the configuration server. Before proceeding, e
2. In **Select Recovery Services vault**, select your Azure subscription and the relevant resource group and vault.
3. In **Install third-party software**, accept the license agreement. Select **Download and Install** to install MySQL Server. If you placed MySQL in the path, this step can be skipped. [Learn more](vmware-azure-deploy-configuration-server.md#configure-settings).
4. In **Validate appliance configuration**, prerequisites are verified before you continue.
-5. In **Configure vCenter Server/vSphere ESXi server**, enter the FQDN or IP address of the vCenter server, or vSphere host, where the VMs you want to replicate are located. Enter the port on which the server is listening. Enter a friendly name to be used for the VMware server in the vault.
-6. Enter user credentials to be used by the configuration server to connect to the VMware server. Ensure that the user name and password are correct and is a part of the Administrators group of the virtual machine to be protected. Site Recovery uses these credentials to automatically discover VMware VMs that are available for replication. Select **Add**, and then select **Continue**.
+5. In **Configure vCenter Server/vSphere ESXi server**, enter the FQDN or IP address of the vCenter Server where the VMs you want to replicate are located. Enter the port on which the server is listening. Enter a friendly name to be used for the VMware vCenter Server in the vault.
+6. Enter user credentials to be used by the configuration server to connect to the VMware vCenter Server. Ensure that the user name and password are correct and that the account is part of the Administrators group of the virtual machine to be protected. Azure Site Recovery uses these credentials to automatically discover VMware vSphere VMs that are available for replication. Select **Add**, and then select **Continue**.
7. In **Configure virtual machine credentials**, enter the user name and password that will be used to automatically install Mobility Service on VMs when replication is enabled.
   - For Windows machines, the account needs local administrator privileges on the machines you want to replicate.
   - For Linux, provide details for the root account.
Finish setting up and registering the configuration server. Before proceeding, e
9. After registration finishes, open the Azure portal and verify that the configuration server and VMware server are listed on **Recovery Services Vault** > **Manage** > **Site Recovery Infrastructure** > **Configuration Servers**.
-After the configuration server is registered, Site Recovery connects to VMware servers by using the specified settings, and discovers VMs.
+After the configuration server is registered, Site Recovery connects to VMware vCenter Server by using the specified settings, and discovers VMs.
> [!NOTE]
> It can take 15 minutes or more for the account name to appear in the portal. To update
After the configuration server is registered, Site Recovery connects to VMware s
Select and verify target resources.
1. Select **Prepare infrastructure** > **Target**. Select the Azure subscription you want to use. We're using a Resource Manager model.
-2. Site Recovery checks that you have one or more virtual networks. You should have these when you set up the Azure components in the [first tutorial](tutorial-prepare-azure.md) in this tutorial series.
+2. Azure Site Recovery checks that you have one or more virtual networks. You should have these when you set up the Azure components in the [first tutorial](tutorial-prepare-azure.md) in this tutorial series.
![Screenshot of the Prepare infrastructure > Target options.](./media/vmware-azure-tutorial/storage-network.png)
Select and verify target resources.
- The policy is automatically associated with the configuration server.
- A matching policy is automatically created for failback by default. For example, if the replication policy is **rep-policy**, then the failback policy is **rep-policy-failback**. This policy isn't used until you initiate a failback from Azure.
-Note: In VMware-to-Azure scenario the crash-consistent snapshot is taken at 5 min interval.
+Note: In the VMware vSphere-to-Azure scenario, the crash-consistent snapshot is taken at a 5-minute interval.
## Enable replication
Enable replication for VMs as follows:
1. Select **Replicate application** > **Source**.
2. In **Source**, select **On-premises**, and select the configuration server in **Source location**.
3. In **Machine type**, select **Virtual Machines**.
-4. In **vCenter/vSphere Hypervisor**, select the vSphere host, or vCenter server that manages the host.
+4. In **vCenter/vSphere Hypervisor**, select the vCenter Server that manages the host.
5. Select the process server (installed by default on the configuration server VM). Then select **OK**. Health status of each process server is indicated as per recommended limits and other parameters. Choose a healthy process server. A [critical](vmware-physical-azure-monitor-process-server.md#process-server-alerts) process server cannot be chosen. You can either [troubleshoot and resolve](vmware-physical-azure-troubleshoot-process-server.md) the errors **or** set up a [scale-out process server](vmware-azure-set-up-process-server-scale.md).
6. In **Target**, select the subscription and the resource group in which you want to create the failed-over VMs. We're using the Resource Manager deployment model.
7. Select the Azure network and subnet to which Azure VMs connect when they're created after failover.
site-recovery Avs Tutorial Reprotect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/avs-tutorial-reprotect.md
After [failover](avs-tutorial-failover.md) of Azure VMware Solution VMs to Azure
4. If you're reprotecting VMs gathered into a replication group for multi-VM consistency, make sure they all have the same operating system (Windows or Linux) and make sure that the master target server you deploy has the same type of operating system. All VMs in a replication group must use the same master target server.
5. Open [the required ports](vmware-azure-prepare-failback.md#ports-for-reprotectionfailback) for failback.
6. Ensure that the vCenter Server is connected before failback. Otherwise, disconnecting disks and attaching them back to the virtual machine fails.
-7. If a vCenter server manages the VMs to which you'll fail back, make sure that you have the required permissions. If you perform a read-only user vCenter discovery and protect virtual machines, protection succeeds, and failover works. However, during reprotection, failover fails because the datastores can't be discovered, and aren't listed during reprotection. To resolve this problem, you can update the vCenter credentials with an [appropriate account/permissions](avs-tutorial-prepare-avs.md#prepare-an-account-for-automatic-discovery), and then retry the job.
+7. If a vCenter Server manages the VMs to which you'll fail back, make sure that you have the required permissions. If you perform a read-only user vCenter Server discovery and protect virtual machines, protection succeeds, and failover works. However, during reprotection, failover is unsuccessful because the datastores can't be discovered, and aren't listed during reprotection. To resolve this problem, you can update the vCenter Server credentials with an [appropriate account/permissions](avs-tutorial-prepare-avs.md#prepare-an-account-for-automatic-discovery), and then retry the job.
8. If you used a template to create your virtual machines, ensure that each VM has its own UUID for the disks. If the Azure VMware Solution VM UUID clashes with the UUID of the master target server because both were created from the same template, reprotection fails. Deploy from a different template.
9. If you're failing back to an alternate vCenter Server, make sure that the new vCenter Server and the master target server are discovered. Typically, if they're not, the datastores aren't accessible or aren't visible in **Reprotect**.
10. Verify the following scenarios in which you can't fail back:
    - If you're using either the ESXi 5.5 free edition or the vSphere 6 Hypervisor free edition. Upgrade to a different version.
    - If you have a Windows Server 2008 R2 SP1 physical server.
- - VMware VMs can't fail back to Hyper-V.
+ - VMware vSphere VMs can't fail back to Hyper-V.
    - VMs that have been migrated.
    - A VM that's been moved to another resource group.
    - A replica Azure VM that's been deleted.
spring-apps Access App Virtual Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-apps/access-app-virtual-network.md
When **Assign Endpoint** on applications in an Azure Spring Apps service instanc
3. In the filtered result, find the **Device** connected to the service runtime **Subnet** of the service instance, and copy its **IP Address**. In this sample, the IP Address is *10.1.0.7*.
- [ ![Create DNS record](media/spring-cloud-access-app-vnet/create-dns-record.png) ](media/spring-cloud-access-app-vnet/create-dns-record.png)
+ [ ![Create DNS record](media/spring-cloud-access-app-vnet/create-dns-record.png) ](media/spring-cloud-access-app-vnet/create-dns-record.png#lightbox)
#### [CLI](#tab/azure-CLI)
spring-apps How To Circuit Breaker Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-apps/how-to-circuit-breaker-metrics.md
az spring app deploy -n reactive-resilience4j \
1. Select the **Application Insights** Blade from Azure Spring Apps portal, and select **Application Insights**.
- [ ![resilience4J 0](media/spring-cloud-resilience4j/resilience4J-0.png)](media/spring-cloud-resilience4j/resilience4J-0.PNG)
+ [ ![resilience4J 0](media/spring-cloud-resilience4j/resilience4J-0.png)](media/spring-cloud-resilience4j/resilience4J-0.png#lightbox)
2. Select **Metrics** from the **Application Insights** page. Select **azure.applicationinsights** from **Metrics Namespace**. Also select **resilience4j_circuitbreaker_buffered_calls** metrics with **Average**.
- [ ![resilience4J 1](media/spring-cloud-resilience4j/resilience4J-1.png)](media/spring-cloud-resilience4j/resilience4J-1.PNG)
+ [ ![resilience4J 1](media/spring-cloud-resilience4j/resilience4J-1.png)](media/spring-cloud-resilience4j/resilience4J-1.png#lightbox)
3. Select **resilience4j_circuitbreaker_calls** metrics and **Average**.
- [ ![resilience4J 2](media/spring-cloud-resilience4j/resilience4J-2.png)](media/spring-cloud-resilience4j/resilience4J-2.PNG)
+ [ ![resilience4J 2](media/spring-cloud-resilience4j/resilience4J-2.png)](media/spring-cloud-resilience4j/resilience4J-2.png#lightbox)
4. Select **resilience4j_circuitbreaker_calls** metrics and **Average**. Select **Add filter**, and then select name as **createNewAccount**.
- [ ![resilience4J 3](media/spring-cloud-resilience4j/resilience4J-3.png)](media/spring-cloud-resilience4j/resilience4J-3.PNG)
+ [ ![resilience4J 3](media/spring-cloud-resilience4j/resilience4J-3.png)](media/spring-cloud-resilience4j/resilience4J-3.png#lightbox)
5. Select **resilience4j_circuitbreaker_calls** metrics and **Average**. Then select **Apply splitting**, and select **kind**.
- [ ![resilience4J 4](media/spring-cloud-resilience4j/resilience4J-4.png)](media/spring-cloud-resilience4j/resilience4J-4.PNG)
+ [ ![resilience4J 4](media/spring-cloud-resilience4j/resilience4J-4.png)](media/spring-cloud-resilience4j/resilience4J-4.png#lightbox)
6. Select **resilience4j_circuitbreaker_calls**, **resilience4j_circuitbreaker_buffered_calls**, and **resilience4j_circuitbreaker_slow_calls** metrics with **Average**.
- [ ![resilience4J 5](media/spring-cloud-resilience4j/resilience4j-5.png)](media/spring-cloud-resilience4j/resilience4j-5.PNG)
+ [ ![resilience4J 5](media/spring-cloud-resilience4j/resilience4j-5.png)](media/spring-cloud-resilience4j/resilience4j-5.png#lightbox)
## Next steps
spring-apps Quickstart Deploy Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-apps/quickstart-deploy-apps.md
Use the following steps to create and deploy apps on Azure Spring Apps using th
```azurecli
az spring app deploy \
    --name api-gateway \
- --jar-path spring-petclinic-api-gateway/target/spring-petclinic-api-gateway-2.5.1.jar \
+ --artifact-path spring-petclinic-api-gateway/target/spring-petclinic-api-gateway-2.5.1.jar \
--jvm-options="-Xms2048m -Xmx2048m" az spring app deploy \ --name customers-service \
- --jar-path spring-petclinic-customers-service/target/spring-petclinic-customers-service-2.5.1.jar \
+ --artifact-path spring-petclinic-customers-service/target/spring-petclinic-customers-service-2.5.1.jar \
--jvm-options="-Xms2048m -Xmx2048m" ```
storage Data Lake Storage Directory File Acl Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/data-lake-storage-directory-file-acl-java.md
First, create a **DataLakeFileClient** instance that represents the file that yo
## List directory contents
-This example, prints the names of each file that is located in a directory named `my-directory`.
+This example prints the names of each file that is located in a directory named `my-directory`.
:::code language="java" source="~/azure-storage-snippets/blobs/howto/Java/Java-v12/src/main/java/com/datalake/manage/CRUD_DataLake.java" id="Snippet_ListFilesInDirectory":::
storsimple Storsimple 8000 Deactivate And Delete Device https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-8000-deactivate-and-delete-device.md
This article describes how to deactivate and delete a StorSimple device that is
Deactivation severs the connection between the device and the corresponding StorSimple Device Manager service. You may wish to take a StorSimple device out of service (for example, if you are replacing or upgrading your device or if you are no longer using StorSimple). If so, you need to deactivate the device before you can delete it.
-When you deactivate a device, any data that was stored locally on the device is no longer accessible. Only the data associated with the device that was stored in the cloud can be recovered.
+When you deactivate a device, any data that was stored locally on the device is no longer accessible. Only the data associated with the device that was stored in the cloud can be recovered. After you have deactivated the device, if you would like to keep it, go to [Keep my StorSimple 8000 series appliance](storsimple-8000-migration-from-8000-options.md#q-what-happens-if-i-want-to-keep-my-storsimple-8000-series-appliancebeyond-the-end-of-life-date).
> [!WARNING]
> Deactivation is a PERMANENT operation and cannot be undone. A deactivated device cannot be registered with the StorSimple Device Manager service unless it is reset to factory defaults.
After the cloud appliance is deactivated, you can delete it from the list of dev
* To restore the deactivated device to factory defaults, go to [Reset the device to factory default settings](storsimple-8000-manage-device-controller.md#reset-the-device-to-factory-default-settings).
* For technical assistance, [contact Microsoft Support](storsimple-8000-contact-microsoft-support.md).
* To learn more about how to use the StorSimple Device Manager service, go to [Use the StorSimple Device Manager service to administer your StorSimple device](storsimple-8000-manager-service-administration.md).
-
+* After you have deactivated the device, if you would like to keep it, go to [Keep my StorSimple 8000 series appliance](storsimple-8000-migration-from-8000-options.md#q-what-happens-if-i-want-to-keep-my-storsimple-8000-series-appliancebeyond-the-end-of-life-date).
storsimple Storsimple Adapter For Sharepoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-adapter-for-sharepoint.md
NA Previously updated : 06/06/2017 Last updated : 08/22/2022 # Install and configure the StorSimple Adapter for SharePoint++ ## Overview The StorSimple Adapter for SharePoint is a component that lets you provide Microsoft Azure StorSimple flexible storage and data protection to SharePoint Server farms. You can use the adapter to move Binary Large Object (BLOB) content from the SQL Server content databases to the Microsoft Azure StorSimple hybrid cloud storage device.
storsimple Storsimple Configure Backup Target Using Backup Exec https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-configure-backup-target-using-backup-exec.md
na Previously updated : 12/05/2016 Last updated : 08/22/2022 # StorSimple as a backup target with Backup Exec + ## Overview Azure StorSimple is a hybrid cloud storage solution from Microsoft. StorSimple addresses the complexities of exponential data growth by using an Azure storage account as an extension of the on-premises solution, and automatically tiering data across on-premises storage and cloud storage.
storsimple Storsimple Configure Backup Target Veeam https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-configure-backup-target-veeam.md
na Previously updated : 12/06/2016 Last updated : 08/22/2022 # StorSimple as a backup target with Veeam + ## Overview Azure StorSimple is a hybrid cloud storage solution from Microsoft. StorSimple addresses the complexities of exponential data growth by using an Azure Storage account as an extension of the on-premises solution and automatically tiering data across on-premises storage and cloud storage.
storsimple Storsimple Configure Backuptarget Netbackup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-configure-backuptarget-netbackup.md
na Previously updated : 06/15/2017 Last updated : 08/22/2022 # StorSimple as a backup target with NetBackup + ## Overview Azure StorSimple is a hybrid cloud storage solution from Microsoft. StorSimple addresses the complexities of exponential data growth by using an Azure storage account as an extension of the on-premises solution, and automatically tiering data across on-premises storage and cloud storage.
storsimple Storsimple Configure Mpio On Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-configure-mpio-on-linux.md
ms.assetid: ca289eed-12b7-4e2e-9117-adf7e2034f2f Previously updated : 06/12/2019 Last updated : 08/22/2022 # Configure MPIO on a StorSimple host running CentOS++ This article explains the steps required to configure Multipathing IO (MPIO) on your Centos 6.6 host server. The host server is connected to your Microsoft Azure StorSimple device for high availability via iSCSI initiators. It describes in detail the automatic discovery of multipath devices and the specific setup only for StorSimple volumes. This procedure is applicable to all the models of StorSimple 8000 series devices.
storsimple Storsimple Disaster Recovery Using Azure Site Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-disaster-recovery-using-azure-site-recovery.md
NA Previously updated : 10/13/2017 Last updated : 08/22/2022 # Automated Disaster Recovery solution using Azure Site Recovery for file shares hosted on StorSimple + [!INCLUDE [updated-for-az](../../includes/updated-for-az.md)] ## Overview
storsimple Storsimple Monitoring Indicators https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-monitoring-indicators.md
NA Previously updated : 11/03/2017 Last updated : 08/22/2022 # Use StorSimple monitoring indicators to manage your device ## Overview Your StorSimple device includes light-emitting diodes (LEDs) and alarms that you can use to monitor the modules and overall status of the StorSimple device. The monitoring indicators can be found on the hardware components of the device's primary enclosure and the EBOD enclosure. The monitoring indicators can be either LEDs or audible alarms.
storsimple Storsimple Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-overview.md
NA Previously updated : 05/18/2022 Last updated : 08/22/2022 # StorSimple 8000 series: a hybrid cloud storage solution + ## Overview Welcome to Microsoft Azure StorSimple, an integrated storage solution that manages storage tasks between on-premises devices and Microsoft Azure cloud storage. StorSimple is an efficient, cost-effective, and easy to manage storage area network (SAN) solution that eliminates many of the issues and expenses that are associated with enterprise storage and data protection. It uses the proprietary StorSimple 8000 series device, integrates with cloud services, and provides a set of management tools for a seamless view of all enterprise storage, including cloud storage. (The StorSimple deployment information published on the Microsoft Azure website applies to StorSimple 8000 series devices only. If you're using a StorSimple 5000/7000 series device, go to [StorSimple Help](http://onlinehelp.storsimple.com/).)
storsimple Storsimple Snapshot Manager Admin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-admin.md
NA Previously updated : 06/05/2016 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to administer your StorSimple solution + ## Overview StorSimple Snapshot Manager is a Microsoft Management Console (MMC) snap-in that simplifies data protection and backup management in a Microsoft Azure StorSimple environment. With StorSimple Snapshot Manager, you can manage Microsoft Azure StorSimple data in the data center and in the cloud as a single integrated storage solution, thus simplifying backup processes and reducing costs.
storsimple Storsimple Snapshot Manager Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-deployment.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Deploy the StorSimple Snapshot Manager MMC snap-in + ## Overview The StorSimple Snapshot Manager is a Microsoft Management Console (MMC) snap-in that simplifies data protection and backup management in a Microsoft Azure StorSimple environment. With StorSimple Snapshot Manager, you can manage Microsoft Azure StorSimple on-premises and cloud storage as if it were a fully integrated storage system, thus simplifying backup and restore processes and reducing costs.
storsimple Storsimple Snapshot Manager Manage Backup Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-backup-catalog.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to manage the backup catalog + ## Overview The primary function of StorSimple Snapshot Manager is to allow you to create application-consistent backup copies of StorSimple volumes in the form of snapshots. Snapshots are then listed in an XML file called a *backup catalog*. The backup catalog organizes snapshots by volume group and then by local snapshot or cloud snapshot.
storsimple Storsimple Snapshot Manager Manage Backup Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-backup-jobs.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to view and manage backup jobs + ## Overview The **Jobs** node in the **Scope** pane shows the **Scheduled**, **Last 24 hours**, and **Running** backup tasks that you initiated interactively or by a configured policy.
storsimple Storsimple Snapshot Manager Manage Backup Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-backup-policies.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to create and manage backup policies++ ## Overview A backup policy creates a schedule for backing up volume data locally or in the cloud. When you create a backup policy, you can also specify a retention policy. (You can retain a maximum of 64 snapshots.) For more information about backup policies, see [Backup types](storsimple-what-is-snapshot-manager.md#backup-types-and-backup-policies) in [StorSimple 8000 series: a hybrid cloud solution](storsimple-overview.md).
storsimple Storsimple Snapshot Manager Manage Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-devices.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to connect and manage StorSimple devices++ ## Overview You can use nodes in the StorSimple Snapshot Manager **Scope** pane to verify imported StorSimple device data and refresh connected storage devices. Additionally, when you click the **Devices** node, you can see a list of connected devices and corresponding status information in the **Results** pane.
storsimple Storsimple Snapshot Manager Manage Volume Groups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-volume-groups.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to create and manage volume groups++ ## Overview You can use the **Volume Groups** node on the **Scope** pane to assign volumes to volume groups, view information about a volume group, schedule backups, and edit volume groups.
storsimple Storsimple Snapshot Manager Manage Volumes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-manage-volumes.md
NA Previously updated : 04/18/2016 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager to view and manage volumes++ ## Overview You can use the StorSimple Snapshot Manager **Volumes** node (on the **Scope** pane) to select volumes and view information about them. The volumes are presented as drives that correspond to the volumes mounted by the host. The **Volumes** node shows local volumes and volume types that are supported by StorSimple, including volumes discovered through the use of iSCSI and a device.
storsimple Storsimple Snapshot Manager Mmc Menu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-snapshot-manager-mmc-menu.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use the MMC menu actions in StorSimple Snapshot Manager + ## Overview In StorSimple Snapshot Manager, you will see the following actions listed on all action menus and all variations of the **Actions** pane.
storsimple Storsimple Troubleshoot Operational Device https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-troubleshoot-operational-device.md
NA Previously updated : 11/03/2017 Last updated : 08/22/2022 # Troubleshoot an operational StorSimple device++ > [!NOTE] > The classic portal for StorSimple is deprecated. Your StorSimple Device Managers will automatically move to the new Azure portal as per the deprecation schedule. You will receive an email and a portal notification for this move. This document will also be retired soon. For any questions regarding the move, see [FAQ: Move to Azure portal](./index.yml).
storsimple Storsimple Turn Device On Or Off https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-turn-device-on-or-off.md
ms.assetid: 8e9c6e6c-965c-4a81-81bd-e1c523a14c82 Previously updated : 01/09/2018 Last updated : 08/22/2022 # Turn on or turn off your StorSimple 8000 series device + ## Overview Shutting down a Microsoft Azure StorSimple device is not required as a part of normal system operation. However, you may need to turn on a new device or a device that had to be shut down. Generally, a shutdown is required in cases in which you must replace failed hardware, physically move a unit, or take a device out of service. This tutorial describes the required procedure for turning on and shutting down your StorSimple device in different scenarios.
storsimple Storsimple Update21 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-update21-release-notes.md
NA Previously updated : 11/03/2017 Last updated : 08/22/2022 # StorSimple 8000 Series Update 2.2 release notes + ## Overview The following release notes describe the new features and identify the critical open issues for StorSimple 8000 Series Update 2.2. They also contain a list of the StorSimple software updates included in this release. Update 2.2 can be applied to any StorSimple device running Release (GA) or Update 0.1 through Update 2.1. The device version associated with Update 2.2 is 6.3.9600.17708.
-Please review the information contained in the release notes before you deploy the update in your StorSimple solution.
+Review the information contained in the release notes before you deploy the update in your StorSimple solution.
> [!IMPORTANT]
> * Update 2.2 has software-only updates. It takes approximately 1.5-2 hours to install this update.
Please review the information contained in the release notes before you deploy t
The following key improvements have been made in Update 2.2.
* **Automated space reclamation optimization** – When data is deleted on thinly provisioned volumes, the unused storage blocks need to be reclaimed. This release improves the space reclamation process from the cloud, so unused space becomes available faster than in previous versions.
-* **Snapshot performance enhancements** – Update 2.2 has improved the time to process a cloud snapshot in certain scenarios where large volumes are being used and there is minimal to no data churn. A scenario that would benefit from this enhancement would be the archive volumes.
+* **Snapshot performance enhancements** – Update 2.2 improves the time to process a cloud snapshot in certain scenarios where large volumes are being used and there's minimal to no data churn. Archive volumes are one scenario that benefits from this enhancement.
* **Hardening of Support package gathering** – This release improves the way the Support package is gathered and uploaded.
* **Update reliability improvements** – This release has bug fixes that improve update reliability.
## Issues fixed in Update 2.2
-The following tables provides a summary of issues that were fixed in Updates 2.2 and 2.1.
+The following tables provide a summary of issues that were fixed in Updates 2.2 and 2.1.
| No | Feature | Issue | Applies to physical device | Applies to virtual device |
| | | | | |
The following table provides a summary of known issues in this release.
| No. | Feature | Issue | Comments / workaround | Applies to physical device | Applies to virtual device |
| | | | | | |
-| 1 |Disk quorum |In rare instances, if the majority of disks in the EBOD enclosure of an 8600 device are disconnected resulting in no disk quorum, then the storage pool will go offline. It will stay offline even if the disks are reconnected. |You will need to reboot the device. If the issue persists, please contact Microsoft Support for next steps. |Yes |No |
+| 1 |Disk quorum |In rare instances, if most disks in the EBOD enclosure of an 8600 device are disconnected resulting in no disk quorum, then the storage pool will go offline. It will stay offline even if the disks are reconnected. |You'll need to reboot the device. If the issue persists, please contact Microsoft Support for next steps. |Yes |No |
| 2 |Incorrect controller ID |When a controller replacement is performed, controller 0 may show up as controller 1. During controller replacement, when the image is loaded from the peer node, the controller ID can show up initially as the peer controller's ID. In rare instances, this behavior may also be seen after a system reboot. |No user action is required. This situation will resolve itself after the controller replacement is complete. |Yes |No |
-| 3 |Storage accounts |Using the Storage service to delete the storage account is an unsupported scenario. This will lead to a situation in which user data cannot be retrieved. | |Yes |Yes |
-| 4 |Device failover |Multiple failovers of a volume container from the same source device to different target devices is not supported. Failover from a single dead device to multiple devices will make the volume containers on the first failed over device lose data ownership. After such a failover, these volume containers will appear or behave differently when you view them in the Azure classic portal. | |Yes |No |
+| 3 |Storage accounts |Using the Storage service to delete the storage account is an unsupported scenario. This will lead to a situation in which user data can't be retrieved. | |Yes |Yes |
+| 4 |Device failover |Multiple failovers of a volume container from the same source device to different target devices isn't supported. Failover from a single dead device to multiple devices will make the volume containers on the first failed over device lose data ownership. After such a failover, these volume containers will appear or behave differently when you view them in the Azure classic portal. | |Yes |No |
| 5 |Installation |During StorSimple Adapter for SharePoint installation, you need to provide a device IP in order for the install to finish successfully. | |Yes |No |
| 6 |Web proxy |If your web proxy configuration has HTTPS as the specified protocol, then your device-to-service communication will be affected and the device will go offline. Support packages will also be generated in the process, consuming significant resources on your device. |Make sure that the web proxy URL has HTTP as the specified protocol. For more information, go to [Configure web proxy for your device](./storsimple-8000-configure-web-proxy.md). |Yes |No |
-| 7 |Web proxy |If you configure and enable web proxy on a registered device, then you will need to restart the active controller on your device. | |Yes |No |
-| 8 |High cloud latency and high I/O workload |When your StorSimple device encounters a combination of very high cloud latencies (order of seconds) and high I/O workload, the device volumes go into a degraded state and the I/Os may fail with a "device not ready" error. |You will need to manually reboot the device controllers or perform a device failover to recover from this situation. |Yes |No |
+| 7 |Web proxy |If you configure and enable web proxy on a registered device, then you'll need to restart the active controller on your device. | |Yes |No |
+| 8 |High cloud latency and high I/O workload |When your StorSimple device encounters a combination of high cloud latencies (order of seconds) and high I/O workload, the device volumes go into a degraded state, and the I/Os may fail with a "device not ready" error. |You'll need to manually reboot the device controllers or perform a device failover to recover from this situation. |Yes |No |
| 9 |Azure PowerShell |When you use the StorSimple cmdlet **Get-AzureStorSimpleStorageAccountCredential &#124; Select-Object -First 1 -Wait** to select the first object so that you can create a new **VolumeContainer** object, the cmdlet returns all the objects. |Wrap the cmdlet in parentheses as follows: **(Get-AzureStorSimpleStorageAccountCredential) &#124; Select-Object -First 1 -Wait** |Yes |Yes |
-| 10 |Migration |When multiple volume containers are passed for migration, the ETA for latest backup is accurate only for the first volume container. Additionally, parallel migration will start after the first 4 backups in the first volume container are migrated. |We recommend that you migrate one volume container at a time. |Yes |No |
-| 11 |Migration |After the restore, volumes are not added to the backup policy or the virtual disk group. |You will need to add these volumes to a backup policy in order to create backups. |Yes |Yes |
+| 10 |Migration |When multiple volume containers are passed for migration, the ETA for latest backup is accurate only for the first volume container. Additionally, parallel migration will start after the first four backups in the first volume container are migrated. |We recommend that you migrate one volume container at a time. |Yes |No |
+| 11 |Migration |After the restore, volumes aren't added to the backup policy or the virtual disk group. |You'll need to add these volumes to a backup policy in order to create backups. |Yes |Yes |
| 12 |Migration |After the migration is complete, the 5000/7000 series device must not access the migrated data containers. |We recommend that you delete the migrated data containers after the migration is complete and committed. |Yes |No |
-| 13 |Clone and DR |A StorSimple device running Update 1 cannot clone or perform disaster recovery to a device running pre-update 1 software. |You will need to update the target device to Update 1 to allow these operations |Yes |Yes |
+| 13 |Clone and DR |A StorSimple device running Update 1 can't clone or perform disaster recovery to a device running pre-update 1 software. |You'll need to update the target device to Update 1 to allow these operations. |Yes |Yes |
| 14 |Migration |Configuration backup for migration may fail on a 5000-7000 series device when there are volume groups with no associated volumes. |Delete all the empty volume groups with no associated volumes and then retry the configuration backup. |Yes |No |
-| 15 |Azure PowerShell cmdlets and locally pinned volumes |You cannot create a locally pinned volume via Azure PowerShell cmdlets. (Any volume you create via Azure PowerShell will be tiered.) |Always use the StorSimple Manager service to configure locally pinned volumes. |Yes |No |
+| 15 |Azure PowerShell cmdlets and locally pinned volumes |You can't create a locally pinned volume via Azure PowerShell cmdlets. (Any volume you create via Azure PowerShell will be tiered.) |Always use the StorSimple Manager service to configure locally pinned volumes. |Yes |No |
| 16 |Space available for locally pinned volumes |If you delete a locally pinned volume, the space available for new volumes may not be updated immediately. The StorSimple Manager service updates the local space available approximately every hour. |Wait for an hour before you try to create the new volume. |Yes |No |
-| 17 |Locally pinned volumes |Your restore job exposes the temporary snapshot backup in the Backup Catalog, but only for the duration of the restore job. Additionally, it exposes a virtual disk group with prefix **tmpCollection** on the **Backup Policies** page, but only for the duration of the restore job. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior will not occur. No user intervention is required. |Yes |No |
-| 18 |Locally pinned volumes |If you cancel a restore job and a controller failover occurs immediately afterwards, the restore job will show **Failed** instead of **Canceled**. If a restore job fails and a controller failover occurs immediately afterwards, the restore job will show **Canceled** instead of **Failed**. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior will not occur. No user intervention is required. |Yes |No |
-| 19 |Locally pinned volumes |If you cancel a restore job or if a restore fails and then a controller failover occurs, an additional restore job appears on the **Jobs** page. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior will not occur. No user intervention is required. |Yes |No |
-| 20 |Locally pinned volumes |If you try to convert a tiered volume (created and cloned with Update 1.2 or earlier) to a locally pinned volume and your device is running out of space or there is a cloud outage, then the clone(s) can be corrupted. |This problem occurs only with volumes that were created and cloned with pre-Update 2.1 software. This should be an infrequent scenario. | | |
-| 21 |Volume conversion |Do not update the ACRs attached to a volume while a volume conversion is in progress (tiered to locally pinned or vice versa). Updating the ACRs could result in data corruption. |If needed, update the ACRs prior to the volume conversion and do not make any further ACR updates while the conversion is in progress. | | |
+| 17 |Locally pinned volumes |Your restore job exposes the temporary snapshot backup in the Backup Catalog, but only during the restore job. Additionally, it exposes a virtual disk group with prefix **tmpCollection** on the **Backup Policies** page, but only during the restore job. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior won't occur. No user intervention is required. |Yes |No |
+| 18 |Locally pinned volumes |If you cancel a restore job and a controller failover occurs immediately afterwards, the restore job will show **Failed** instead of **Canceled**. If a restore job fails and a controller failover occurs immediately afterwards, the restore job will show **Canceled** instead of **Failed**. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior won't occur. No user intervention is required. |Yes |No |
+| 19 |Locally pinned volumes |If you cancel a restore job or if a restore fails and then a controller failover occurs, an additional restore job appears on the **Jobs** page. |This behavior can occur if your restore job has only locally pinned volumes or a mix of locally pinned and tiered volumes. If the restore job includes only tiered volumes, then this behavior won't occur. No user intervention is required. |Yes |No |
+| 20 |Locally pinned volumes |If you try to convert a tiered volume (created and cloned with Update 1.2 or earlier) to a locally pinned volume and your device is running out of space or there's a cloud outage, then the clone(s) can be corrupted. |This problem occurs only with volumes that were created and cloned with pre-Update 2.1 software. This should be an infrequent scenario. | | |
+| 21 |Volume conversion |Don't update the ACRs attached to a volume while a volume conversion is in progress (tiered to locally pinned or vice versa). Updating the ACRs could result in data corruption. |If needed, update the ACRs prior to the volume conversion and don't make any further ACR updates while the conversion is in progress. | | |
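To make known issue 9 concrete, here's a short PowerShell sketch of the workaround, using the legacy StorSimple cmdlet named in the table (the variable name is illustrative):

```powershell
# Without parentheses, the pipeline behavior of this StorSimple cmdlet
# defeats -First 1, and every credential object is returned:
Get-AzureStorSimpleStorageAccountCredential | Select-Object -First 1 -Wait

# Workaround from the table: parentheses force the cmdlet to complete and
# return a full collection, so Select-Object can pick a single object:
$credential = (Get-AzureStorSimpleStorageAccountCredential) | Select-Object -First 1 -Wait

# $credential can then be used when creating a new VolumeContainer object.
```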
## Controller and firmware updates in Update 2.2
-This release has software-only updates. However, if you are updating from a version prior to Update 2, you will need to install driver, Storport, Spaceport, and (in some cases) disk firmware updates on your device.
+This release has software-only updates. However, if you're updating from a version prior to Update 2, you'll need to install driver, Storport, Spaceport, and (in some cases) disk firmware updates on your device.
For more information on how to install the driver, Storport, Spaceport, and disk firmware updates, see [install Update 2.2](./storsimple-8000-install-update-5.md) on your StorSimple device.
## Virtual device updates in Update 2.2
-This update cannot be applied to the virtual device. New virtual devices will need to be created.
+This update can't be applied to the virtual device. New virtual devices will need to be created.
## Next step
Learn how to [install Update 2.2](./storsimple-8000-install-update-5.md) on your StorSimple device.
storsimple Storsimple Update3 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-update3-release-notes.md
ms.assetid: 2158aa7a-4ac3-42ba-8796-610d1adb984d Previously updated : 01/09/2018 Last updated : 08/22/2022 # Update 3 release notes for your StorSimple 8000 series device + ## Overview The following release notes describe the new features and identify the critical open issues for StorSimple 8000 Series Update 3. They also contain a list of the StorSimple software updates included in this release.
storsimple Storsimple Update4 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-update4-release-notes.md
NA Previously updated : 01/23/2018 Last updated : 08/22/2022 # StorSimple 8000 Series Update 4 release notes + ## Overview The following release notes describe the new features and identify the critical open issues for StorSimple 8000 Series Update 4. They also contain a list of the StorSimple software updates included in this release.
storsimple Storsimple Update5 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-update5-release-notes.md
ms.assetid: Previously updated : 11/13/2017 Last updated : 08/22/2022 # StorSimple 8000 Series Update 5 release notes + ## Overview The following release notes describe the new features and identify the critical open issues for StorSimple 8000 Series Update 5. They also contain a list of the StorSimple software updates included in this release.
storsimple Storsimple Update51 Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-update51-release-notes.md
ms.assetid: Previously updated : 04/14/2021 Last updated : 08/22/2022 # StorSimple 8000 Series Update 5.1 release notes + ## Overview The following release notes describe the new features and identify the critical open issues for StorSimple 8000 Series Update 5.1. They also contain a list of the StorSimple software updates included in this release.
storsimple Storsimple Use Snapshot Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-use-snapshot-manager.md
NA Previously updated : 06/05/2017 Last updated : 08/22/2022 # Use StorSimple Snapshot Manager user interface to manage backup jobs and backup catalog + ## Overview The StorSimple Snapshot Manager has an intuitive user interface that you can use to take and manage backups. This tutorial provides an introduction to the user interface, and then explains how to use each of the components. For a detailed description of the StorSimple Snapshot Manager, see [What is StorSimple Snapshot Manager?](storsimple-what-is-snapshot-manager.md)
storsimple Storsimple What Is Snapshot Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storsimple/storsimple-what-is-snapshot-manager.md
NA Previously updated : 02/27/2017 Last updated : 08/22/2022 # An introduction to StorSimple Snapshot Manager + ## Overview StorSimple Snapshot Manager is a Microsoft Management Console (MMC) snap-in that simplifies data protection and backup management in a Microsoft Azure StorSimple environment. With StorSimple Snapshot Manager, you can manage Microsoft Azure StorSimple data in the data center and in the cloud as a single integrated storage solution, thus simplifying backup processes and reducing costs.
synapse-analytics Continuous Integration Delivery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/cicd/continuous-integration-delivery.md
description: Learn how to use continuous integration and continuous delivery (CI
-ms.search.form: home
-+ Last updated 10/08/2021
Use the [Synapse workspace deployment](https://marketplace.visualstudio.com/item
The deployment task supports three types of operations: validate only, validate and deploy, and deploy only.
+ > [!NOTE]
+ > This workspace deployment extension is not backward compatible. Make sure that the latest version is installed and used. You can read the release notes in the [overview](https://marketplace.visualstudio.com/items?itemName=AzureSynapseWorkspace.synapsecicd-deploy&ssr=false#overview) in Azure DevOps and find the [latest version](https://github.com/marketplace/actions/synapse-workspace-deployment) in GitHub Actions.
+ **Validate** validates the Synapse artifacts in a non-publish branch and generates the workspace template and parameter template files. The validation operation works only in a YAML pipeline. A sample YAML file is shown below: ```yaml
synapse-analytics Source Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/cicd/source-control.md
description: Learn how to configure source control in Azure Synapse Studio
-ms.search.form: datahub
Last updated 11/20/2020
synapse-analytics Shared Databases Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/sql/shared-databases-access-control.md
Executing the code script below will allow non-admin users to have ser
GRANT SELECT ALL USER SECURABLES TO [login@contoso.com]; GO; ```
-`CONNECT ANY DATABASE` permission will allow a user to access connection to any database, but it doesn't grant any permission in any database beyond connect. When `SELECT ALL USER SECURABLES` permission is granted, a login can view data from all schema-level objects, such as external tables and views (any schema except sys and INFORMATION_SCHEMA). This permission has effect in all databases that the user can connect to. Read more about [GRANT SERVER permissions](/sql/t-sql/statements/grant-server-permissions-transact-sql?view=sql-server-ver15#remarks&preserve-view=true).
+> [!NOTE]
+> These statements should be executed on the master database, as they are all server-level permissions.
After creating a login and granting permissions, users can run queries on top of the synchronized external tables. This mitigation can also be applied to Azure AD security groups.
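As a consolidated sketch of the steps above, the following PowerShell (using `Invoke-Sqlcmd` from the `SqlServer` module) runs the login creation and both grants against the master database. The workspace endpoint and login are placeholders, and an Az.Accounts version that returns the access token as plain text is assumed:

```powershell
Import-Module SqlServer    # provides Invoke-Sqlcmd
Import-Module Az.Accounts  # provides Get-AzAccessToken

# These are server-level permissions, so everything below must run on master.
$ddl = @"
CREATE LOGIN [login@contoso.com] FROM EXTERNAL PROVIDER;
GRANT CONNECT ANY DATABASE TO [login@contoso.com];
GRANT SELECT ALL USER SECURABLES TO [login@contoso.com];
"@

# '<workspace>-ondemand' is the serverless SQL endpoint naming pattern;
# 'contoso' is a placeholder workspace name.
$token = (Get-AzAccessToken -ResourceUrl 'https://database.windows.net').Token
Invoke-Sqlcmd -ServerInstance 'contoso-ondemand.sql.azuresynapse.net' `
              -Database 'master' `
              -AccessToken $token `
              -Query $ddl
```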
virtual-desktop Connect Ios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/user-documentation/connect-ios.md
# Connect to Azure Virtual Desktop with the iOS client
-> Applies to: iOS 13.0 or later. Compatible with iPhone, iPad, and iPod touch.
+> Applies to: iOS 14.0 or later. Compatible with iPhone, iPad, and iPod touch.
>[!IMPORTANT] >This content applies to Azure Virtual Desktop with Azure Resource Manager Azure Virtual Desktop objects. If you're using Azure Virtual Desktop (classic) without Azure Resource Manager objects, see [this article](../virtual-desktop-fall-2019/connect-ios-2019.md).
virtual-machine-scale-sets Virtual Machine Scale Sets Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/virtual-machine-scale-sets-troubleshoot.md
ms.reviewer: jushiman
# Troubleshooting autoscale with Virtual Machine Scale Sets
-**Problem** – you've created an autoscaling infrastructure in Azure Resource Manager using virtual machine scale sets – for example, by deploying a template like this one: https://github.com/Azure/azure-quickstart-templates/tree/master/application-workloads/python/vmss-bottle-autoscale/azuredeploy.json – you have your scale rules defined and it works great, except no matter how much load you put on the VMs, it doesn't autoscale.
+**Problem** – you've created an autoscaling infrastructure in Azure Resource Manager using virtual machine scale sets – for example, by deploying a template like this one: https://github.com/Azure/azure-quickstart-templates/blob/master/application-workloads/python/vmss-bottle-autoscale/azuredeploy.parameters.json – you have your scale rules defined and it works great, except no matter how much load you put on the VMs, it doesn't autoscale.
## Troubleshooting steps
Some things to consider include:
Some things to consider include:
The Azure Resource Explorer is an indispensable troubleshooting tool that shows you the state of your Azure Resource Manager resources. Click on your subscription and look at the Resource Group you are troubleshooting. Under the Compute resource provider, look at the virtual machine scale set you created and check the Instance View, which shows you the state of a deployment. Also, check the instance view of VMs in the virtual machine scale set. Then, go into the Microsoft.Insights resource provider and check that the autoscale rules look right.
* Is the Diagnostic extension working and emitting performance data?
- **Update:** Azure autoscale has been enhanced to use a host-based metrics pipeline, which no longer requires a diagnostics extension to be installed. The next few paragraphs no longer apply if you create an autoscaling application using the new pipeline. An example of Azure templates that have been converted to use the host pipeline is available here: https://github.com/Azure/azure-quickstart-templates/tree/master/application-workloads/python/vmss-bottle-autoscale/azuredeploy.json.
+ **Update:** Azure autoscale has been enhanced to use a host-based metrics pipeline, which no longer requires a diagnostics extension to be installed. The next few paragraphs no longer apply if you create an autoscaling application using the new pipeline. An example of Azure templates that have been converted to use the host pipeline is available here: https://github.com/Azure/azure-quickstart-templates/blob/master/application-workloads/python/vmss-bottle-autoscale/azuredeploy.parameters.json.
Using host-based metrics for autoscale is better for the following reasons:
Some things to consider include:
[audit]: ./media/virtual-machine-scale-sets-troubleshoot/image3.png [explorer]: ./media/virtual-machine-scale-sets-troubleshoot/image1.png
-[tables]: ./media/virtual-machine-scale-sets-troubleshoot/image4.png
+[tables]: ./media/virtual-machine-scale-sets-troubleshoot/image4.png
virtual-machines Dedicated Hosts How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/dedicated-hosts-how-to.md
This article guides you through how to create an Azure [dedicated host](dedicate
## Limitations - The sizes and hardware types available for dedicated hosts vary by region. Refer to the host [pricing page](https://aka.ms/ADHPricing) to learn more.-- Not all Azure VM SKUs, regions and availability zones support ultra disks, for more information about this topic, see [Azure ultra disks](disks-enable-ultra-ssd.md) . Ultra disk support for dedicated hosts is currently in preview.
+- Not all Azure VM SKUs, regions and availability zones support ultra disks. For more information about this topic, see [Azure ultra disks](disks-enable-ultra-ssd.md). Ultra disk support for dedicated hosts is currently in preview.
- The fault domain count of the virtual machine scale set can't exceed the fault domain count of the host group.
## Create a host group
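For example, a minimal Az PowerShell sketch of creating a host group and a single host (resource names, region, zone, and SKU are placeholders; the cmdlets are from the Az.Compute module):

```powershell
# The host group's fault domain count (2 here) caps the fault domain count
# of any virtual machine scale set you later deploy onto the group.
New-AzHostGroup -ResourceGroupName 'myDHrg' -Name 'myHostGroup' `
    -Location 'eastus' -PlatformFaultDomain 2 -Zone '1'

# Add a dedicated host to the group. Available SKUs (DSv3-Type3 is just an
# example) vary by region; check the pricing page linked above.
New-AzHost -ResourceGroupName 'myDHrg' -HostGroupName 'myHostGroup' `
    -Name 'myHost' -Location 'eastus' -Sku 'DSv3-Type3'
```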
virtual-machines Build Image With Packer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/build-image-with-packer.md
Create a file named *windows.json* and paste the following content. Enter your o
}] } ```
-You can also create a filed named *windows.pkr.hcl* and paste the following content with your own values as used for the above parameters table.
+You can also create a file named *windows.pkr.hcl* and paste the following content with your own values as used for the above parameters table.
```HCL source "azure-arm" "autogenerated_1" {
virtual-machines Extensions Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/windows/extensions-diagnostics.md
Title: Azure Diagnostics Extension for Windows description: Monitor Azure Windows VMs using the Azure Diagnostics Extension-+
Last updated 04/06/2018-+ ms.devlang: azurecli
virtual-machines Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/get-started.md
ms.assetid: ad8e5c75-0cf6-4564-ae62-ea1246b4e5f2
vm-linux Previously updated : 08/09/2022 Last updated : 08/22/2022 - # Use Azure to host and run SAP workload scenarios When you use Microsoft Azure, you can reliably run your mission-critical SAP workloads and scenarios on a scalable, compliant, and enterprise-proven platform. You get the scalability, flexibility, and cost savings of Azure. With the expanded partnership between Microsoft and SAP, you can run SAP applications across development and test and production scenarios in Azure and be fully supported. From SAP NetWeaver to SAP S/4HANA, SAP BI on Linux to Windows, and SAP HANA to SQL Server, Oracle, Db2, etc, we've got you covered.
-Besides hosting SAP NetWeaver and S/4HANA scenarios with the different DBMS on Azure, you can host other SAP workload scenarios, like SAP BI on Azure.
+Besides hosting SAP NetWeaver and S/4HANA scenarios with the different DBMS on Azure, you can host other SAP workload scenarios, like SAP BI on Azure.
We just announced our new services, Azure Center for SAP solutions and Azure Monitor for SAP 2.0, entering the public preview stage. These services will let you deploy SAP workloads on Azure in a highly automated manner with an optimal architecture and configuration, and monitor your Azure infrastructure, OS, DBMS, and ABAP stack deployments in a single pane of glass.
For customers and partners who are focused on deploying and operating their ass
Hosting SAP workload scenarios in Azure can also create requirements for identity integration and single sign-on. This situation can occur when you use Azure Active Directory (Azure AD) to connect different SAP components and SAP software-as-a-service (SaaS) or platform-as-a-service (PaaS) offers. A list of such integration and single sign-on scenarios with Azure AD and SAP entities is described and documented in the section "Azure AD SAP identity integration and single sign-on." ## Changes to the SAP workload section+ Changes to documents in the SAP on Azure workload section are listed at the [end of this article](./get-started.md#change-log). The entries in the change log are kept for around 180 days. ## You want to know+ If you have specific questions, we are going to point you to specific documents or flows in this section of the start page. You want to know: - What Azure VMs and HANA Large Instance units are supported for which SAP software releases and which operating system versions. Read the document [What SAP software is supported for Azure deployment](./sap-supported-product-on-azure.md) for answers and the process to find the information - What SAP deployment scenarios are supported with Azure VMs and HANA Large Instances. Information about the supported scenarios can be found in the documents:
- - [SAP workload on Azure virtual machine supported scenarios](./sap-planning-supported-configurations.md)
- - [Supported scenarios for HANA Large Instance](./hana-supported-scenario.md)
-- What Azure Services, Azure VM types and Azure storage services are available in the different Azure regions, check the site [Products available by region](https://azure.microsoft.com/global-infrastructure/services/)
+ - [SAP workload on Azure virtual machine supported scenarios](./sap-planning-supported-configurations.md)
+ - [Supported scenarios for HANA Large Instance](./hana-supported-scenario.md)
+- What Azure Services, Azure VM types and Azure storage services are available in the different Azure regions, check the site [Products available by region](https://azure.microsoft.com/global-infrastructure/services/)
- Are third-party HA frameworks, besides Windows and Pacemaker, supported? Check the bottom part of [SAP support note #1928533](https://launchpad.support.sap.com/#/notes/1928533)
- What Azure storage is best for my scenario? Read [Azure Storage types for SAP workload](./planning-guide-storage.md)
- Is the Red Hat kernel in Oracle Enterprise Linux supported by SAP? Read [SAP support note #1565179](https://launchpad.support.sap.com/#/notes/1565179)
- Why are the Azure [Da(s)v4](../../dav4-dasv4-series.md)/[Ea(s)](../../eav4-easv4-series.md) VM families not certified for SAP HANA? The Azure Das/Eas VM families are based on AMD processor-driven hardware. SAP HANA does not support AMD processors, not even in virtualized scenarios
- Why am I still getting the message: 'The cpu flags for the RDTSCP instruction or the cpu flags for constant_tsc or nonstop_tsc are not set or current_clocksource and available_clocksource are not correctly configured' with SAP HANA, despite the fact that I am running the most recent Linux kernels? For the answer, check [SAP support note #2791572](https://launchpad.support.sap.com/#/notes/2791572)
+- Where can I find architectures for deploying SAP Fiori on Azure? Check out the blog [SAP on Azure: Application Gateway Web Application Firewall (WAF) v2 Setup for Internet facing SAP Fiori Apps](https://blogs.sap.com/2020/12/03/sap-on-azure-application-gateway-web-application-firewall-waf-v2-setup-for-internet-facing-sap-fiori-apps/)
-
## Documentation space+ In the SAP workload documentation space, you can find the following areas: - **SAP on Azure Large Instances**: This documentation section covers a bare-metal service that was originally named HANA Large Instances. Different topics around this technology are covered in this section
In the SAP workload documentation space, you can find the following areas:
- **High Availability (Azure VMs)**: In this section, many of the high availability configurations around SAP workload on Azure are covered. This section includes detailed documentation around deploying Windows clustering and Pacemaker cluster configuration for the different SAP components and different database systems
- **Automation Framework (Azure VMs)**: Automation Framework documentation covers a [Terraform and Ansible based automation framework](https://github.com/Azure/sap-automation) that allows automation of Azure infrastructure and SAP software
- **Azure Monitor for SAP solutions**: Microsoft developed a monitoring solution specifically for SAP supported OS and DBMS, as well as S/4HANA and NetWeaver. This section documents the deployment and usage of the service
+- **Integration with Microsoft Services** and **References** contain different links to integration between SAP and other Microsoft services. The list may not be complete.
## Change Log -- August 09, 2022: Release of scenario [HA for SAP ASCS/ERS with NFS simple mount](./high-availability-guide-suse-nfs-simple-mount.md) on SLES 15 for SAP Applications
+- August 22, 2022: Release of cost optimization scenario [Deploy PAS and AAS with SAP NetWeaver HA cluster](high-availability-guide-rhel-with-dialog-instance.md) on RHEL.
+- August 09, 2022: Release of scenario [HA for SAP ASCS/ERS with NFS simple mount](./high-availability-guide-suse-nfs-simple-mount.md) on SLES 15 for SAP Applications.
- July 18, 2022: Clarify statement around Pacemaker support on Oracle Linux in [Azure Virtual Machines Oracle DBMS deployment for SAP workload](./dbms_guide_oracle.md) - June 29, 2022: Add recommendation and links to Pacemaker usage for Db2 versions 11.5.6 and higher in the documents [IBM Db2 Azure Virtual Machines DBMS deployment for SAP workload](./dbms_guide_ibm.md), [High availability of IBM Db2 LUW on Azure VMs on SUSE Linux Enterprise Server with Pacemaker](./dbms-guide-ha-ibm.md), and [High availability of IBM Db2 LUW on Azure VMs on Red Hat Enterprise Linux Server](./high-availability-guide-rhel-ibm-db2-luw.md) - June 08, 2022: Change in [HA for SAP NW on Azure VMs on SLES with ANF](./high-availability-guide-suse-netapp-files.md) and [HA for SAP NW on Azure VMs on RHEL with ANF](./high-availability-guide-rhel-netapp-files.md) to adjust timeouts when using NFSv4.1 (related to NFSv4.1 lease renewal) for more resilient Pacemaker configuration
In the SAP workload documentation space, you can find the following areas:
- February 08, 2022: Style changes in [SQL Server Azure Virtual Machines DBMS deployment for SAP NetWeaver](./dbms_guide_sqlserver.md) - February 07, 2022: Adding new functionality [ANF application volume groups for HANA](../../../azure-netapp-files/application-volume-group-introduction.md) in documents [NFS v4.1 volumes on Azure NetApp Files for SAP HANA](./hana-vm-operations-netapp.md) and [Azure proximity placement groups for optimal network latency with SAP applications](./sap-proximity-placement-scenarios.md) - January 30, 2022: Adding context about SQL Server proprotional fill and expectations that SQL Server data files should be the same size and should have the same free space in [SQL Server Azure Virtual Machines DBMS deployment for SAP NetWeaver](./dbms_guide_sqlserver.md)-- January 24, 2022: Change in [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md), [HA for SAP NW on Azure VMs on SLES with ANF](./high-availability-guide-suse-netapp-files.md), [HA for SAP NW on Azure VMs on SLES for SAP applications](./high-availability-guide-suse.md), [HA for NFS on Azure VMs on SLES](./high-availability-guide-suse-nfs.md), [HA for SAP NNW on Azure VMs on SLES multi-SID guide](./high-availability-guide-suse-multi-sid.md), [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md), [HA for SAP NW on Azure VMs on RHEL for SAP applications](./high-availability-guide-rhel.md) and [HA for SAP NW on Azure VMs on RHEL with ANF](./high-availability-guide-rhel-netapp-files.md) and [HA for SAP NW on Azure VMs on RHEL multi-SID guide](./high-availability-guide-rhel-multi-sid.md) to remove cidr_netmask from Pacemaker configuration to allow the resource agent to determine the value automatically -- January 12, 2022: Change in [HA for SAP NetWeaver on Azure VMs on Windows with Azure NetApp Files(SMB)](./high-availability-guide-windows-netapp-files-smb.md) to remove obsolete information for the SAP kernel that supports the scenario. -- December 08, 2021: Change in [SQL Server Azure Virtual Machines DBMS deployment for SAP NetWeaver](./dbms_guide_sqlserver.md) to clarify Azure Load Balancer settings
+- January 24, 2022: Change in [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md), [HA for SAP NW on Azure VMs on SLES with ANF](./high-availability-guide-suse-netapp-files.md), [HA for SAP NW on Azure VMs on SLES for SAP applications](./high-availability-guide-suse.md), [HA for NFS on Azure VMs on SLES](./high-availability-guide-suse-nfs.md), [HA for SAP NNW on Azure VMs on SLES multi-SID guide](./high-availability-guide-suse-multi-sid.md), [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md), [HA for SAP NW on Azure VMs on RHEL for SAP applications](./high-availability-guide-rhel.md) and [HA for SAP NW on Azure VMs on RHEL with ANF](./high-availability-guide-rhel-netapp-files.md) and [HA for SAP NW on Azure VMs on RHEL multi-SID guide](./high-availability-guide-rhel-multi-sid.md) to remove cidr_netmask from Pacemaker configuration to allow the resource agent to determine the value automatically.
+- January 12, 2022: Change in [HA for SAP NetWeaver on Azure VMs on Windows with Azure NetApp Files(SMB)](./high-availability-guide-windows-netapp-files-smb.md) to remove obsolete information for the SAP kernel that supports the scenario.
+- December 08, 2021: Change in [SQL Server Azure Virtual Machines DBMS deployment for SAP NetWeaver](./dbms_guide_sqlserver.md) to clarify Azure Load Balancer settings.
- December 08, 2021: Release of scenario [HA of SAP HANA Scale-up with Azure NetApp Files on SLES](./sap-hana-high-availability-netapp-files-suse.md) - December 07, 2021: Change in [Setting up Pacemaker on RHEL in Azure](./high-availability-guide-rhel-pacemaker.md) to clarify that the instructions are applicable for both RHEL 7 and RHEL 8 - December 07, 2021: Change in [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md), [HA for SAP NW on Azure VMs on SLES with ANF](./high-availability-guide-suse-netapp-files.md) and [HA for SAP NW on Azure VMs on SLES for SAP applications](./high-availability-guide-suse.md) to adjust the instructions for configuring SWAP file.
In the SAP workload documentation space, you can find the following areas:
- December 01, 2021: Change in [SAP ASCS/SCS instance with WSFC and file share](./sap-high-availability-guide-wsfc-file-share.md), [HA for SAP NetWeaver on Azure VMs on Windows with Azure NetApp Files(SMB)](./high-availability-guide-windows-netapp-files-smb.md) and [HA for SAP NetWeaver on Azure VMs on Windows with Azure Files(SMB)](./high-availability-guide-windows-azure-files-smb.md) to update the SAP kernel version, required to support clustering SAP on Windows with file share - November 30, 2021: Added [Using Windows DFS-N to support flexible SAPMNT share creation for SMB-based file share](./high-availability-guide-windows-dfs.md) - November 22, 2021: Change in [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md) and [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md) to clarify the guidelines for J2EE SAP systems and share consolidations per storage account.-- November 16, 2021: Release of high availability guides for SAP ASCS/ERS with NFS on Azure files [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md) and [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md)
+- November 16, 2021: Release of high availability guides for SAP ASCS/ERS with NFS on Azure files [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md) and [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md)
- November 15, 2021: Introduction of new proximity placement architecture for zonal deployments in [Azure proximity placement groups for optimal network latency with SAP applications](./sap-proximity-placement-scenarios.md) - November 02, 2021: Changed [Azure Storage types for SAP workload](./planning-guide-storage.md) and [SAP ASE Azure Virtual Machines DBMS deployment for SAP workload](./dbms_guide_sapase.md) to declare SAP ASE support for NFS on Azure NetApp Files. - November 02, 2021: Changed [SAP workload configurations with Azure Availability Zones](./sap-ha-availability-zones.md) to move Singapore SouthEast to regions for active/active configurations
In the SAP workload documentation space, you can find the following areas:
- June 23, 2020: Changes to [Azure Virtual Machines planning and implementation for SAP NetWeaver](./planning-guide.md) guide and introduction of [Azure Storage types for SAP workload](./planning-guide-storage.md) guide - June 22, 2020: Add installation steps for new VM Extension for SAP to the [Deployment Guide](deployment-guide.md) - June 16, 2020: Change in [Public endpoint connectivity for VMs using Azure Standard ILB in SAP HA scenarios](./high-availability-guide-standard-load-balancer-outbound-connections.md) to add a link to SUSE Public Cloud Infrastructure 101 documentation -- June 10, 2020: Adding new HLI SKUs into [Available SKUs for HLI](./hana-available-skus.md) and [SAP HANA (Large Instances) storage architecture](./hana-storage-architecture.md)-
+- June 10, 2020: Adding new HLI SKUs into [Available SKUs for HLI](./hana-available-skus.md) and [SAP HANA (Large Instances) storage architecture](./hana-storage-architecture.md)
virtual-network-manager Concept Azure Policy Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network-manager/concept-azure-policy-integration.md
Policy definitions and assignments can be created with API/PS/CLI or [Azu
## Required permissions To use Azure Policy with network groups, users need the following permissions:-- `Microsoft.ApiManagement/service/apis/operations/policy/write` is needed at the scope you're assigning.
+- `Microsoft.Authorization/policyassignments/Write` and `Microsoft.Authorization/policydefinitions/Write` are needed at the scope you're assigning.
- `Microsoft.Network/networkManagers/networkGroups/join/action` action is needed on the target network group referenced in the **Add to network group** section. This permission allows for the adding and removing of objects from the target network group.
- When using set definitions to assign multiple policies at the same time, concurrent `networkGroup/join/action` permissions are needed on all definitions being assigned at the time of assignment. A sketch of packaging these permissions into a custom role follows this list.
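If you need to grant all of these together, one option is a custom role. The following Az PowerShell sketch (the role name, description, and scope are placeholders, not an official role definition) clones a built-in role and swaps in the actions listed above:

```powershell
# A sketch only: build a custom role carrying the three actions above.
$role = Get-AzRoleDefinition -Name 'Reader'
$role.Id = $null
$role.Name = 'Network Group Policy Author (example)'
$role.Description = 'Writes policy definitions/assignments and joins AVNM network groups.'
$role.Actions.Clear()
$role.Actions.Add('Microsoft.Authorization/policyassignments/Write')
$role.Actions.Add('Microsoft.Authorization/policydefinitions/Write')
$role.Actions.Add('Microsoft.Network/networkManagers/networkGroups/join/action')
$role.AssignableScopes.Clear()
$role.AssignableScopes.Add('/subscriptions/<subscription-id>')
New-AzRoleDefinition -Role $role
```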